Forum Replies Created

Viewing 15 posts - 331 through 345 (of 459 total)
    in reply to: water simulation #101185

    Oliver Kreylos
    Keymaster

    You can control the water simulation via two command line parameters: -wt to set the simulation grid size, and -ws to set the relative simulation speed. Check the output of SARndbox -h for details.
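    For example (the parameter values and argument counts shown here are assumptions for illustration; SARndbox -h lists the exact syntax):

```shell
$ ./bin/SARndbox -wt 800 600 -ws 0.5
```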

    in reply to: Record data #101184

    Oliver Kreylos
    Keymaster

    It’s a bit complicated. The Kinect camera captures elevation in a projective coordinate system, and the AR Sandbox only unprojects those measurements into a regular 2D grid on the GPU, as part of the water simulation.

    The best way to intercept that grid is to patch the updateBathymetry method in WaterTable2.cpp. After line 273, you could read the contents of the 2D texture containing the just-rendered bathymetry to a 2D array on the CPU side using glGetTexImage, and write that to a file. Note that the bathymetry texture is a single-component texture with 32-bit floating-point components.
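    As a sketch of the CPU side (writeBathymetryFrame is a hypothetical helper, not part of the SARndbox sources; the glGetTexImage readback itself needs the live GL context inside updateBathymetry and is only indicated in the comment):

```cpp
// Sketch only: writing a just-read bathymetry grid to a raw binary file.
// Inside WaterTable2::updateBathymetry you would first fill `grid` via
// something like
//   glGetTexImage(GL_TEXTURE_RECTANGLE_ARB,0,GL_RED,GL_FLOAT,grid.data());
// (the exact target/format must match the bathymetry texture, which holds
// single-component 32-bit floats). The file-writing part is plain C++:
#include <cstdio>
#include <vector>

bool writeBathymetryFrame(const char* fileName,
                          const std::vector<float>& grid,
                          unsigned width, unsigned height)
{
    std::FILE* f = std::fopen(fileName, "wb");
    if (f == 0)
        return false;
    // Minimal header so a reader knows the grid dimensions:
    std::fwrite(&width, sizeof(unsigned), 1, f);
    std::fwrite(&height, sizeof(unsigned), 1, f);
    // Grid samples, row-major, 32-bit floats:
    std::fwrite(grid.data(), sizeof(float), grid.size(), f);
    std::fclose(f);
    return true;
}
```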

    in reply to: Add to the visuals #101183

    Oliver Kreylos
    Keymaster

    Color mapping is done via a GLSL shader. This particular shader is programmatically generated inside SurfaceRenderer.cpp, and compiled on-demand. To modify it, you have to embed GLSL code into the C++ code via string manipulations.

    Look at line 310 in SurfaceRenderer.cpp:

    vec4 baseColor=texture1D(heightColorMapSampler,heightColorMapTexCoord);\n\

    This is where the fragment shader queries a color from the height color map texture based on the fragment’s elevation relative to the base plane. Here you would add code to check the fragment’s elevation against your upper and lower bounds and, if it lies inside, fetch the base color from a second 2D texture image based on the fragment’s (x, y) position instead of from the 1D color map texture.

    You would then also have to add code to create and manage that 2D texture, upload it to an unused texture unit before shader execution, and add a uniform sampler variable to the shader to access the texture image.
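    As a rough illustration of the string-splicing approach (all GLSL names besides heightColorMapSampler and heightColorMapTexCoord are hypothetical; you would have to declare them as uniforms/varyings in the shader and set them up on the C++ side):

```cpp
// Sketch only: building fragment shader source as a C++ string, in the
// style SurfaceRenderer.cpp uses. overlaySampler, overlayTexCoord,
// elevationMin, and elevationMax are assumed names, not existing code.
#include <string>

std::string makeColorLookup(bool useOverlayTexture)
{
    std::string src;
    if (useOverlayTexture)
    {
        src += "vec4 baseColor;\n";
        src += "if(elevation>=elevationMin&&elevation<=elevationMax)\n";
        src += "\tbaseColor=texture2D(overlaySampler,overlayTexCoord);\n";
        src += "else\n";
        src += "\tbaseColor=texture1D(heightColorMapSampler,heightColorMapTexCoord);\n";
    }
    else
        src += "vec4 baseColor=texture1D(heightColorMapSampler,heightColorMapTexCoord);\n";
    return src;
}
```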

    in reply to: Intel i5 use, can it be done? #101182

    Oliver Kreylos
    Keymaster

    The water simulation is entirely run on the GPU and mostly independent of CPU performance.

    We generally recommend Core i7 CPUs because most of our software is aggressively multi-threaded and benefits from more cores and hyperthreading.

    But I think the AR Sandbox uses four threads at most, so it should be fine on Core i5 CPUs.

    in reply to: Sandbox Size #101181

    Oliver Kreylos
    Keymaster

    Going smaller doesn’t cause technical issues, but you lose play space.

    in reply to: Model Export #101180

    Oliver Kreylos
    Keymaster

    293 frames were probably all you had. I might have forgotten to check for end-of-file in LWOWriter.

    I recommend passing the min and max frame indices on the command line, as in

    $ ./bin/LWOWriter FooFile 60 60

    That should export only a single LWO file, from video frame 60 (two seconds in, to get around early start-up problems).
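    The frame arithmetic assumes the Kinect’s nominal 30 frames per second; a trivial helper (hypothetical, for illustration) makes the conversion explicit:

```cpp
// Sketch: mapping a time offset in a saved stream to a frame index,
// assuming the Kinect's nominal 30 frames per second.
unsigned frameIndexForSeconds(double seconds, double fps = 30.0)
{
    return static_cast<unsigned>(seconds * fps + 0.5);
}
```

At 30 fps, two seconds corresponds to frame 60, hence the "60 60" range above.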

    in reply to: Model Export #101178

    Oliver Kreylos
    Keymaster

    There is no export functionality built into the AR Sandbox itself, but the Kinect package contains an (unsupported) utility to convert a previously saved 3D video stream to a sequence of 3D mesh files in Lightwave Object (.lwo) format, which in turn can be read by most 3D modeling software. This is not the same as exporting from inside the AR Sandbox, as that contains special filter algorithms to ensure a watertight surface.

    Unfortunately, I have not maintained that utility, and it fell prey to some API changes. I am currently packaging an updated version of the Kinect package which will fix those issues.

    With the new package, here is the sequence to build the utility, called LWOWriter:

    1. Change into the Kinect package’s source directory.
    2. Build the LWOWriter utility:
      $ make PACKAGES=MYKINECT LWOWriter
    3. Run KinectViewer and save a stretch of 3D video data via the main menu’s “Save Streams…” entry.
    4. Exit KinectViewer, locate the saved video stream (a pair of files with .color and .depth extension), and run it through LWOWriter:

      $ ./bin/LWOWriter <video stream base name> <first exported frame> <last exported frame>

      This will create one Lightwave Object file for each video frame between the given start and end indices. Base name is just the name of the video stream files without the .depth/.color extension.

    If you don’t want to wait for Kinect-2.8-002, you can fix LWOWriter.cpp yourself: Change line 304 from
    projector.setFilterDepthFrames(true);
    to
    projector.setFilterDepthFrames(false,true);
    and replace line 325,
    const Kinect::MeshBuffer& mesh=projector.processDepthFrame(depth);
    with

    Kinect::MeshBuffer mesh;
    projector.processDepthFrame(depth,mesh);
    
    in reply to: USB projector use, will it work? #101149

    Oliver Kreylos
    Keymaster

    This will not work. The projector needs a custom graphics driver to receive a video signal over USB, and I’m betting there is no such driver for Linux.

    Even if it did work, a 100-lumen projector would not be bright enough (our recommended projector has 3000 lumens).

    in reply to: Terminal with calibration info #101148

    Oliver Kreylos
    Keymaster

    If you installed Mint with Mate desktop, as in the video, open the main menu (lower-left corner), and there should be a “Terminal” icon right there (see 20:46 in the video). If you have Gnome Shell, press the Windows key, type “term”, and press Enter.

    in reply to: Kinect not detected #101147

    Oliver Kreylos
    Keymaster

    This should work, but the Kinect is sometimes temperamental. Run

    $ ~/Vrui-3.1/bin/KinectUtil reset 0
    

    first, or if that doesn’t help, unplug and replug the Kinect.

    in reply to: Hardware check #101146

    Oliver Kreylos
    Keymaster

    Yes, that’s correct. You should still be able to get new first-generation Kinects (“Kinect-for-Xbox-360”) from Amazon.

    in reply to: SARndbox User Guide? #101145

    Oliver Kreylos
    Keymaster

    We don’t have a guide for running the AR Sandbox (yet).

    To your question: That should work. To enable the button in the software, start the AR Sandbox as usual, and then press and hold some key, say “9”. This will pop up a tool selection menu. Move the mouse to select “Manage Water” from the bottom, and let go of the key you pressed. This will bring up a dialog window asking you to press another button to assign to the “Drain” function. Now press the USB button, and the dialog window will go away.

    Now, if you press and hold “9”, or whatever key you picked, it will rain, and if you press and hold the USB button, the water will drain.

    To make the assignment permanent, press the right mouse button to bring up the main menu, move down to “Vrui System,” then to “Devices,” and finally select “Save Input Graph…”. This will suggest a file name; take note of the file name and location, and select “OK” to save.

    To load the new input graph on startup, run the sandbox as

    $ <SARndbox location>/SARndbox [usual command line arguments] -loadInputGraph <location>/<file name>
    

    You can keep the original file name, or rename the file and move it to a convenient location.

    in reply to: Issues and errors #101144

    Oliver Kreylos
    Keymaster

    The error means that the projector calibration matrix created as part of the final calibration step doesn’t exist (I assume you haven’t run that step yet). You can run the sandbox without it, but you have to leave off the -fpv command line argument. In that mode, the topography is drawn as a regular 3D surface, and you can use mouse and keyboard to rotate, translate, and scale it. See the Vrui Application User Guide for details.

    Did you install Linux Mint according to the instruction video, i.e., using Mate as a desktop environment? Then the fullscreen shortcut should be there.

    If you installed another desktop environment, it might not be. In that case, you can force the sandbox to start in fullscreen mode by creating a new file Vrui.cfg inside the /home/ardadmin/src/SARndbox-1.5-001/ directory, with the following (exact) contents:

    section Vrui
      section Desktop
        section Window
          windowFullscreen true
        endsection
      endsection
    endsection
    

    If you then run the sandbox as follows:

    $ cd /home/ardadmin/src/SARndbox-1.5-001
    $ ./bin/SARndbox
    

    (adding the usual command line parameters), it will start in fullscreen.

    in reply to: Compatibility with Kinect for Windows v2? #101143

    Oliver Kreylos
    Keymaster

    It’s going to be a while, but you can still order new Kinect-for-Xbox-360 devices, for example via Amazon.

    in reply to: Modification of the current software #101081

    Oliver Kreylos
    Keymaster

    This is trickier than it appears, because the water data lives only on the graphics card, not in CPU-accessible main memory where you would need it to run simulations. You can download the current water state grid after it has been updated in the display method, at line 934 in Sandbox.cpp. Use WaterTable2::bindQuantityTexture to bind the water texture to the active texture unit, and then use glGetTexImage with a texture target of GL_TEXTURE_RECTANGLE_ARB, an RGB format, and a floating-point data type. The first texture component is the water surface level (not water height above ground), and the second and third are the horizontal and vertical flux components.
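    Once the grid is on the CPU side, the rest is plain array arithmetic. A sketch (assuming you have read the RGB water state into waterState with glGetTexImage and have a matching bathymetry grid; waterHeights is a hypothetical helper):

```cpp
// Sketch: deriving water height above ground from the downloaded grids.
// waterState holds width*height*3 floats (surface level, x flux, y flux
// per cell); bathymetry holds width*height floats; both row-major.
#include <algorithm>
#include <cstddef>
#include <vector>

std::vector<float> waterHeights(const std::vector<float>& waterState,
                                const std::vector<float>& bathymetry)
{
    std::vector<float> heights(bathymetry.size());
    for (std::size_t i = 0; i < bathymetry.size(); ++i)
        // Surface level minus ground elevation, clamped at zero for dry cells:
        heights[i] = std::max(waterState[i * 3] - bathymetry[i], 0.0f);
    return heights;
}
```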
