Forum Replies Created

in reply to: Blue line around running SARNDbox projected image? #100769
    Oliver Kreylos
    Keymaster

    If you want to try different Nvidia driver versions, it’s easier to use the binary blobs directly from Nvidia’s site instead of the distribution-supplied drivers, but to be honest it’s quite the pain in the buns.

    I don’t currently have a driver version that’s known to work. When we built our initial prototype in 2012, it worked flawlessly, and when we rolled out the sandbox for open-house day in 2013, we were able to roll back to a then-recent driver version that worked, but currently, due to a newly-installed OS, we have the blue rectangle in our sandbox as well. It’s very close to the physical edges, so it’s not too annoying, but it’s there.

    in reply to: BENQ MH630 #100765
    Oliver Kreylos
    Keymaster

The Kinect camera has a 4:3 aspect ratio, meaning that the sandbox itself should ideally also have a 4:3 aspect ratio, or there will be parts of it that won’t be scanned, and that would be confusing/irritating.

This projector has a 16:9 aspect ratio, meaning that it would have to over-project the area scanned by the Kinect. This is not necessarily a problem, but it takes some extra software setup to ensure that the overprojected parts are black, and therefore close to invisible, instead of some random color. When overprojecting to the full size of the scanned sandbox area, the remaining resolution of the projector would be 1440×1080 (at 1080p, a 4:3 area at full image height is 1080 × 4/3 = 1440 pixels wide), which is somewhat higher than the 620’s 1024×768. But the sandbox’s effective resolution is limited by the Kinect camera’s 640×480, so the benefit would be marginal.

    The bigger issue is throw distance. The 620’s throw ratio matches that of the Kinect camera, meaning it can be mounted at the same height as the Kinect. This projector needs to be mounted significantly higher to achieve the same projection size, which makes the overall sandbox design more complex (probably needs a mirror), and could potentially lead to problems when the Kinect itself ends up in the projector’s light path and casts a shadow onto the sand surface.

    Based on the projector specs and the throw distance calculator at Projector Central, the minimal throw distance to create a 30″ tall image is 61″.

    Here is an existing AR Sandbox installation with a long-throw projector and a mirror: British Geological Survey

    in reply to: Using SARndbox with libfreenect or Iannix #100750
    Oliver Kreylos
    Keymaster

    That would require code modification. The main SARndbox application receives depth image data in a central callback, from where it is dispatched to any clients who need depth data for processing. It would be straightforward to add another client that converts the raw depth data into whatever format necessary to control external applications, and then stream it out through OSC or a custom protocol.

The relevant callback is called rawDepthFrameDispatcher and can be found in Sandbox.cpp.
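
The conversion/streaming part of such a client could look roughly like the sketch below. It is deliberately generic: streamDepthFrame, the destination address/port, and the one-row-per-UDP-datagram protocol are all hypothetical, and the function would still have to be hooked into rawDepthFrameDispatcher (or a client registered with it) to receive the actual Kinect frames.

// Hypothetical streaming client: sends each raw depth frame over UDP,
// one image row per datagram, so an external application or an OSC bridge
// can pick it up and reassemble the frame.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <cstring>
#include <vector>

void streamDepthFrame(const unsigned short* depthPixels, int width, int height)
{
    static int sock = socket(AF_INET, SOCK_DGRAM, 0); // Created once, reused for every frame
    sockaddr_in dest = {};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(7000);                       // Hypothetical receiver port
    inet_pton(AF_INET, "127.0.0.1", &dest.sin_addr);   // Hypothetical receiver address

    std::vector<unsigned short> packet(1 + width);
    for (int y = 0; y < height; ++y)
    {
        packet[0] = (unsigned short)y;                 // Row index so the receiver can reassemble frames
        std::memcpy(&packet[1], depthPixels + y * width, width * sizeof(unsigned short));
        sendto(sock, packet.data(), packet.size() * sizeof(unsigned short), 0,
               (const sockaddr*)&dest, sizeof(dest));
    }
}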

    in reply to: Blue line around running SARNDbox projected image? #100749
    Oliver Kreylos
    Keymaster

    The last four lines are the corner points in 3D. It looks like your fourth corner point is off (52 instead of 62 in x, 39 instead of 44 in y), which would pull in the entire box, so you might want to re-measure that.

    You can manually push out the box by adding/subtracting from the x,y positions. For example:

    (-65.6, -49.7, -147.3)
    (64.0, -47.4, -148.9)
    (-63.1, 45.6, -154.4)
    (53.3, 40.4, -155.3)
    

    will push your box outward by 1cm in each direction.

    in reply to: Blue line around running SARNDbox projected image? #100745
    Oliver Kreylos
    Keymaster

    Thanks. This is no guarantee of success, but you could try installing a newer Nvidia driver. I’m on Fedora 20, and the stock Nvidia driver for that distro is at version 331.104. (The graphics card on this laptop is a GeForce 8600M GT.) I can’t test right now whether this driver version works or not. The stock version on Fedora 21 is 343.36 right now.

    Let’s collect version numbers of Nvidia drivers that don’t exhibit the problem.

    Oliver Kreylos
    Keymaster

    There are two methods to achieve this.

The simpler method is to start the CalibrateProjector utility and bind all desired tools manually, as usual. Once all tools are bound, open the application’s main menu and select “Save Input Graph…” from the “Vrui System”->“Devices” sub-menu. In the file selection dialog that pops up, either pick a file name and location (to type with the keyboard, press “F1” once to enter text mode, and press “F1” again to leave it when done) or keep the default, and select “OK.”

    The saved input graph file, which has the same general format as any Vrui configuration file, can be loaded back into an application either manually via the “Load Input Graph…” entry in the same sub-menu, or via the command line:

    $ CalibrateProjector [usual arguments] -loadInputGraph <input graph file name>
    

    This method works for all Vrui applications, including RawKinectViewer, KinectViewer, and SARndbox itself.

    The other method is to directly edit the Vrui.cfg configuration file, or create an application-specific patch configuration file (see the Vrui documentation). Vrui.cfg contains a section called “DefaultTools,” which in turn contains sub-sections for every tool that is to be bound at application start-up. The format for the tool binding sections is as follows:

    section SomeUniqueName
      toolClass CaptureTool
      bindings ((Mouse, 1, 2))
    endsection
    

    This will bind CalibrateProjector’s capture tool to buttons 1 and 2 on the “Mouse” device (which includes keyboard keys and mouse buttons), in that order.
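
If you go the patch configuration file route instead of editing Vrui.cfg directly, the -mergeConfig mechanism shown for SARndbox further down this page should work for CalibrateProjector as well. Assuming the tool binding section above is wrapped in the usual Vrui -> Desktop -> Tools -> DefaultTools nesting (the button-box example further down shows the full nesting) and saved as a hypothetical file CalibrateTools.cfg, the command line would be:

$ CalibrateProjector [usual arguments] -mergeConfig CalibrateTools.cfg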

    in reply to: Blue line around running SARNDbox projected image? #100742
    Oliver Kreylos
    Keymaster

    Unfortunately, I don’t know yet how to completely remove the blue border.

    The reason it’s there is either a bug in the Nvidia graphics driver, or an edge case that is triggered by unorthodox texture image handling inside the AR Sandbox software (I have not yet found where the latter would be in the code). The blue rectangle appears only in specific versions of the Nvidia driver; would you please post your exact version number? It is displayed in the “X Server Information” panel in the Nvidia control dialog, nvidia-settings.

The blue rectangle runs exactly around the sandbox’s water simulation area, as set up during calibration step 5 (measure extents of sand surface). That area coincides with the extents of the 2D texture images containing the water simulation’s state, and the rectangle is an effect of either a) the graphics driver wrongly accessing edge pixels from that texture image, or b) the sandbox software wrongly relying on unspecified texture edge behavior.

    The blue rectangle can be pushed further outside the box by extending the 3D box corner positions measured in calibration step 5, but pushing the box corners outside the physical limits of the sandbox is not recommended because it might lead to artifacts in the water simulation.

    in reply to: Problem with calibration #100739
    Oliver Kreylos
    Keymaster

I just checked the sources, and this is a documentation error. I’ll post a top-level reply addressing the issue.

    in reply to: kinect viewer glitch #100736
    Oliver Kreylos
    Keymaster

    The GL Error problem is most probably due to running in a virtual machine. On those, Linux only gets simulated access to the graphics hardware and has to use software-emulated OpenGL, which doesn’t support all features used by the software, and is very slow.

    The Kinect problem might be due to the virtual machine, too. The picture looks like there are transmission problems on the USB bus into which the Kinect is plugged, and that could be caused by virtual machine emulation, but I’m not sure.

    In general, it’s not good to run the AR Sandbox through a virtual machine. The sandbox requires a high-performance graphics card, and through the virtualization layer, it does not get access to the installed card — it’s the same as not having one.
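
A quick way to check whether a given system (virtual or not) has fallen back to software-emulated OpenGL is to look at the renderer string, assuming the glxinfo utility (from the mesa-utils or glx-utils package, depending on distribution) is installed:

$ glxinfo | grep "OpenGL renderer"

If the output mentions “llvmpipe” or a software rasterizer instead of the actual graphics card, the sandbox will only get emulated rendering.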

    in reply to: Problem with calibration #100732
    Oliver Kreylos
    Keymaster

Before you line up the depth image grid, ensure that the “Average Frames” button in the main menu is selected. That will collect several depth frames, calculate their average, and display it. While the button is selected, the depth image no longer updates live. After you have placed the grid and collected a tie point, uncheck the “Average Frames” button, move the calibration target, and repeat.

    in reply to: Running with Windows #100731
    Oliver Kreylos
    Keymaster

I haven’t tried building under Cygwin in a very long time, and last time I did, it didn’t work.

One issue with running under Windows, either via Cygwin (if it works) or in a virtual machine, is that the sandbox application won’t get access to the graphics hardware; it can only run in emulated mode.

    This means the performance of the sandbox, especially the water simulation, will be very poor. The ideal installation is with a dedicated Linux PC, or a Linux installation run natively via dual-boot.

    in reply to: Full Screen #100730
    Oliver Kreylos
    Keymaster

    Please check the topic “Full Screen Mode.”

    in reply to: Water Drain #100723
    Oliver Kreylos
    Keymaster

    There are many brands/makes of USB buttons, but they vary widely in their capabilities, and it’s usually not specified what exactly they do. That’s dangerous.

    In our AR Sandbox, we use a two-button switch, Delcom 706502-F, where the red button drains and the blue button deposits water.

    Delcom also has one-button switches, such as this one, but they are rather pricey. On the upside, they are robust and known to work.

    Delcom’s switches are programmable via a Windows-based utility available from their web site. They can be programmed as mouse buttons, keyboard keys, or joystick buttons. The ideal configuration for the AR Sandbox is one or two joystick buttons. Most cheap USB buttons only emulate keyboard keys, and usually in a dumb way that makes them unusable for the AR Sandbox.

    To use a USB switch that’s configured as joystick button(s), one has to create a patch configuration file that makes it available as an additional input device, and then bind a water management tool to the new device’s button(s) using Vrui’s standard tool binding interface. Alternatively, one can bind a water management tool automatically via the same patch configuration file. Here’s the file I use to bind our two-button box to the add water / remove water functions:

    section Vrui
      section Desktop
        inputDeviceAdapterNames (MouseAdapter, HIDAdapter)
    
        section HIDAdapter
          inputDeviceAdapterType HID
          inputDeviceNames (ButtonBox)
          
          section ButtonBox
            name ButtonBox
            deviceVendorProductId 0fc5:b080
          endsection
        endsection
        
        section Tools
          section DefaultTools
            section WaterTool
              toolClass GlobalWaterTool
              bindings ((ButtonBox, Button167, Button166))
            endsection
          endsection
        endsection
      endsection
    endsection
    

    The deviceVendorProductId setting would need to be adapted to the actual switch’s ID, and the actual button names (Button166 and Button167 in our case) might have to be changed.
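
One way to find the vendor:product ID to put into the deviceVendorProductId setting is to list the connected USB devices (lsusb is part of the standard usbutils package on most distributions):

$ lsusb

The switch will show up in the output with an “ID xxxx:xxxx” field; that value, in our case 0fc5:b080, is what goes into the configuration file.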

    The name of this patch configuration file, say SandboxButtons.cfg, is then added to SARndbox’s command line as such:

    $ SARndbox [usual options] -mergeConfig SandboxButtons.cfg
    

in reply to: Hardware requirements #100721
    Oliver Kreylos
    Keymaster

The computing hardware requirements are not strict, per se. The AR Sandbox software runs fine on my vintage 2008 MacBook Pro (Intel Core 2 Duo 2.2GHz, 2GB RAM, Nvidia GeForce 8600M) as long as there is little or no water (or water simulation is disabled entirely).

The problem is that the water simulation will run slowly on low-end systems like that, which leads to noticeable and annoying choppiness when the water is agitated and moves quickly, even if only in some small areas. We found that GTX 770-level hardware is able to handle almost any circumstance without noticeable slow-downs.

The AR Sandbox is GPU-limited, meaning that the choice of main CPU is secondary. A Core i7 with a lower clock rate, or an i5, should cause no problems at all.

The water simulation method I implemented is conservative, meaning that it maintains physical accuracy at the expense of run-time. To be honest, that’s not an ideal choice for an application that requires interactive response and where accuracy is secondary, if not irrelevant, but that’s the way it is. An always-robust, less accurate simulation method like smoothed-particle hydrodynamics would be more appropriate, but that wouldn’t be usable for my other applications.

    in reply to: Water to Lava #100720
    Oliver Kreylos
    Keymaster

    The water effect is achieved by a GLSL fragment shader, whose source code is in SurfaceAddWaterColor.fs in the SARndbox package’s shader directory, share/SARndbox-<version>/Shaders.

    To switch the water to appear as lava (this only affects appearance, not behavior), change the following lines in the source file:

    Uncomment line 170:

    float colorW=max(turb(vec3(fragCoord*0.05,waterAnimationTime*0.25)),0.0); // Turbulence noise
    

    Comment out line 177:

    // float colorW=pow(dot(wn,normalize(vec3(0.075,0.075,1.0))),100.0)*1.0-0.0;
    

    Comment/uncomment lines 179 and 180:

    // vec4 waterColor=vec4(colorW,colorW,1.0,1.0); // Water
    vec4 waterColor=vec4(1.0-colorW,1.0-colorW*2.0,0.0,1.0); // Lava
    

    Then save the source file. If the AR Sandbox is already running, the change will take effect immediately; otherwise, it will take effect when the SARndbox executable is started the next time.

    To go back to water-colored water, reverse the steps above.

    Another option is to create two copies of the shader file, say SurfaceAddWaterColor-Water.fs and SurfaceAddWaterColor-Lava.fs, make the above changes in the second file, and then use a script on an icon or hotkey to copy either file onto the one that’s used by the Sandbox, such as

    $ cp SurfaceAddWaterColor-Lava.fs SurfaceAddWaterColor.fs
    

    to set “lava mode,” and

    $ cp SurfaceAddWaterColor-Water.fs SurfaceAddWaterColor.fs
    

    to set “water mode.”
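
A small toggle script along these lines could then be put behind a single icon or hotkey. This is only a sketch; it assumes it is run from the SARndbox installation directory, that both shader copies described above already exist, and that <version> is replaced by the installed SARndbox version:

#!/bin/bash
# Toggle the AR Sandbox water shader between "water" and "lava" appearance
cd share/SARndbox-<version>/Shaders
if cmp -s SurfaceAddWaterColor.fs SurfaceAddWaterColor-Lava.fs; then
  # Currently in lava mode; switch back to water
  cp SurfaceAddWaterColor-Water.fs SurfaceAddWaterColor.fs
else
  # Currently in water mode (or unknown); switch to lava
  cp SurfaceAddWaterColor-Lava.fs SurfaceAddWaterColor.fs
fi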
