Forum Replies Created

Viewing 15 posts - 316 through 330 (of 452 total)
    in reply to: Lava #101345

    Oliver Kreylos
    Keymaster

    I haven’t tried this, but you could slow down the water simulation by passing -ws <speed factor> 30 on SARndbox’s command line. <speed factor> is a multiplier, with 1.0 being normal speed. A small number, such as -ws 0.1 30, might make the fluid behave more like lava. Some experimentation would be required to find the best-feeling speed factor.
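    For example, assuming SARndbox was built in place in its source directory, and using the -uhm -fpv flags from the standard run instructions:

    $ ./bin/SARndbox -uhm -fpv -ws 0.1 30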

    in reply to: Legal or Not Legal to Crowd Source #101344

    Oliver Kreylos
    Keymaster

    Note: The following applies primarily to the USA, but should be similar in most other places.

    The GNU General Public License (GPL) explicitly does not forbid selling a GPL-licensed software product. Without going into legalese, the only thing a licensee of a GPL software product cannot do is re-license that software product to a third party under a non-GPL-compatible license, such as, for example, a closed-source license. This is how Red Hat can sell Linux.

    What this means in practice is that you can sell the SARndbox software (or any software derived from the SARndbox software) to a third party, for any amount of money you see fit. What you cannot do is take those same rights away from your licensees: you have to give them the full source code of the software you sold, and you cannot prevent them from copying or modifying the software, or re-selling the software to a fourth party.

    “Free” in the GPL means unencumbered by restrictions, not “free” as in no cost. Or in Richard Stallman’s words, “free as in speech, not free as in beer.”

    Regarding the other hardware components (PC, projector, Kinect, etc.): These are goods as defined by ownership law, and legal owners can do with them as they please, including reselling them, say during a yard sale, or on eBay, or as part of an AR Sandbox sale. Some argue that complex goods such as projectors or 3D cameras also contain intellectual property, such as firmware or industrial designs, but fortunately, the US Supreme Court in 2013 upheld the first-sale doctrine, which explicitly allows resale, including resale for profit, of copyright- or trademark-protected physical objects by their legal owners.

    in reply to: Thoughts on build appreciated! #101285

    Oliver Kreylos
    Keymaster

    You’ll be fine with the Core i5 CPU, but think twice about dropping from a 970 to a 960. The 970 is almost twice as fast at running the water simulation as the 960. The 960 is roughly comparable to a 770, which we currently have in our AR Sandbox, and water gets choppy sometimes. It’s not a dealbreaker, but it’s pretty annoying when it happens.

    in reply to: Xbox Kinect not being found: Our Solution #101281

    Oliver Kreylos
    Keymaster

    Did you originally connect your Kinect to a USB 3.0 port or to a USB 2.0 port? If it was USB 3.0, please check whether the USB controller on your motherboard is a NEC 720200, or something else. If it is something else, that might be the cause of your problems. The Kinect’s power draw is well within the specifications for USB 2.0, so that should not have been the issue.

    You can check by running

    $ lspci | grep "USB 3.0"

    from a terminal, and looking for “NEC” or “Renesas” (different names for the same chip) and model “uPD720200” or “uPD720202” (or similar).

    in reply to: Xbox Kinect not being found: Our Solution #101280

    Oliver Kreylos
    Keymaster

    I have been using this no-name PCI Express 2-port USB 3.0 card (USD 11.30) to reliably run two Kinects at the same time. The most important thing to watch out for is that the card uses the NEC 720200 chip, which is the best-supported USB 3.0 controller in the Linux kernel.

    in reply to: Projector options? #101278

    Oliver Kreylos
    Keymaster

    The BenQ MX620ST has been discontinued, which is probably why it’s more expensive now. It used to be around USD 550. The new model is BenQ MX631ST, which currently (as of 10/07/2015) retails for USD 529 on newegg.com.

    The reason we recommend BenQ projectors is that they have good image properties, and are among the least expensive short-throw projectors.

    in reply to: Credit, where credit is due #101274

    Oliver Kreylos
    Keymaster

    That’s great! We usually recommend using a paragraph like this:

    The Augmented Reality (AR) Sandbox was developed by the UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCAVES, http://www.keckcaves.org), as part of an informal science education project funded by the National Science Foundation under grant DRL 11-14663. More information about the AR Sandbox can be found at http://idav.ucdavis.edu/~okreylos/ResDev/SARndbox.

    If you manage to take some pictures while exhibiting your AR Sandbox, I’d be happy to add them to my list of external installations.


    Oliver Kreylos
    Keymaster

    If the KinectUtil program can’t see your Kinect, the SARndbox won’t be able to, either. You need to get this to work first. Try plugging your Kinect into a different USB port, and check the output of lsusb to see whether you get something like this:

    Bus 001 Device 061: ID 045e:02ae Microsoft Corp. Xbox NUI Camera
    Bus 001 Device 057: ID 045e:02b0 Microsoft Corp. Xbox NUI Motor
    Bus 001 Device 059: ID 045e:02ad Microsoft Corp. Xbox NUI Audio

    If you do see that, but KinectUtil still doesn’t work, try KinectUtil reset all first. If that still doesn’t work, you might have a defective Kinect device.
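    Concretely, assuming KinectUtil was built in place in the Kinect package’s source directory (the USB reset may require root privileges, depending on your system’s USB permissions):

    $ ./bin/KinectUtil reset all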

    in reply to: water simulation #101185

    Oliver Kreylos
    Keymaster

    You can control the water simulation via two command line parameters: -wt to set the simulation grid size, and -ws to set the relative simulation speed. Check the output of SARndbox -h for details.
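    For example (the argument values and their exact syntax here are only illustrative; SARndbox -h lists the real syntax and default values):

    $ ./bin/SARndbox -uhm -fpv -wt 640 480 -ws 1.0 30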

    in reply to: Record data #101184

    Oliver Kreylos
    Keymaster

    It’s a bit complicated. The Kinect camera captures elevation in a projective coordinate system, and the AR Sandbox only unprojects those measurements into a regular 2D grid on the GPU, as part of the water simulation.

    The best way to intercept that grid is to patch the updateBathymetry method in WaterTable2.cpp. After line 273, you could read the contents of the 2D texture containing the just-rendered bathymetry to a 2D array on the CPU side using glGetTexImage, and write that to a file. Note that the bathymetry texture is a single-component texture with 32-bit floating-point components.
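    A minimal sketch of such a patch, written as a stand-alone helper for clarity (the function name, its parameters, and the GL_TEXTURE_RECTANGLE_ARB target are assumptions; use WaterTable2’s actual texture object and grid size, and add #include <fstream> and #include <vector> at the top of the file):

    /* Hypothetical read-back helper, called after the bathymetry rendering pass: */
    void saveBathymetry(GLuint bathymetryTextureObject,GLsizei width,GLsizei height)
    	{
    	std::vector<GLfloat> grid(size_t(width)*size_t(height));
    	
    	/* Bind the just-rendered bathymetry texture and read it back to CPU memory: */
    	glBindTexture(GL_TEXTURE_RECTANGLE_ARB,bathymetryTextureObject);
    	glGetTexImage(GL_TEXTURE_RECTANGLE_ARB,0,GL_RED,GL_FLOAT,grid.data());
    	
    	/* Dump the raw 32-bit float grid to a binary file: */
    	std::ofstream file("bathymetry.dat",std::ios::binary);
    	file.write(reinterpret_cast<const char*>(grid.data()),std::streamsize(grid.size()*sizeof(GLfloat)));
    	}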

    in reply to: Add to the visuals #101183

    Oliver Kreylos
    Keymaster

    Color mapping is done via a GLSL shader. This particular shader is generated programmatically inside SurfaceRenderer.cpp and compiled on demand. To modify it, you have to embed GLSL code into the C++ code via string manipulation.

    Look at line 310 in SurfaceRenderer.cpp:

    vec4 baseColor=texture1D(heightColorMapSampler,heightColorMapTexCoord);\n\

    This is where the fragment shader queries a color from the height color map texture, based on the fragment’s elevation relative to the base plane. Here you would add code to check the fragment’s elevation against your upper and lower bounds and, if it lies inside them, get the base color from a second 2D texture image based on the fragment’s (x, y) position, instead of from the 1D color map texture.

    You would then also have to add code to create and manage that 2D texture, upload it to an unused texture unit before shader execution, and add a uniform sampler variable to the shader to access the texture image.
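    As a sketch only (imageSampler, imageTexCoord, lowerElevation, and upperElevation are invented names; computing imageTexCoord from the fragment’s (x, y) position is left out, and the bounds are expressed in the same scaled space as the 1D texture coordinate), the embedded GLSL lines around line 310 could become:

    vec4 baseColor;\n\
    if(heightColorMapTexCoord>=lowerElevation&&heightColorMapTexCoord<=upperElevation)\n\
    	baseColor=texture2D(imageSampler,imageTexCoord);\n\
    else\n\
    	baseColor=texture1D(heightColorMapSampler,heightColorMapTexCoord);\n\

    together with a uniform sampler2D imageSampler; declaration added to the generated shader, and a matching glUniform1i call on the C++ side pointing it at the texture unit holding your image.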

    in reply to: Intel i5 use, can it be done? #101182

    Oliver Kreylos
    Keymaster

    The water simulation is entirely run on the GPU and mostly independent of CPU performance.

    We generally recommend Core i7 CPUs because most of our software is aggressively multi-threaded and benefits from more cores and hyperthreading.

    But I think the AR Sandbox uses four threads at most, so it should be fine on Core i5 CPUs.

    in reply to: Sandbox Size #101181

    Oliver Kreylos
    Keymaster

    Going smaller doesn’t cause technical issues, but you lose play space.

    in reply to: Model Export #101180

    Oliver Kreylos
    Keymaster

    293 frames were probably all you had. I might have forgotten to check for end-of-file in LWOWriter.

    I recommend passing the min and max frame indices on the command line, as in

    $ ./bin/LWOWriter FooFile 60 60

    That should export a single LWO file from video frame 60 only (two seconds in, to get around early start-up problems).

    in reply to: Model Export #101178

    Oliver Kreylos
    Keymaster

    There is no export functionality built into the AR Sandbox itself, but the Kinect package contains an (unsupported) utility to convert a previously saved 3D video stream to a sequence of 3D mesh files in Lightwave Object (.lwo) format, which in turn can be read by most 3D modeling software. This is not the same as exporting from inside the AR Sandbox, as the AR Sandbox applies special filtering algorithms to ensure a watertight surface.

    Unfortunately, I have not maintained that utility, and it fell prey to some API changes. I am currently packaging an updated version of the Kinect package which will fix those issues.

    With the new package, here is the sequence to build the utility, called LWOWriter:

    1. Change into the Kinect package’s source directory.
    2. Build the LWOWriter utility:
      $ make PACKAGES=MYKINECT LWOWriter
    3. Run KinectViewer and save a stretch of 3D video data via the main menu’s “Save Streams…” entry.
    4. Exit KinectViewer, locate the saved video stream (a pair of files with .color and .depth extension), and run it through LWOWriter:

      $ ./bin/LWOWriter <video stream base name> <first exported frame> <last exported frame>

      This will create one Lightwave Object file for each video frame between the given start and end indices. Base name is just the name of the video stream files without the .depth/.color extension.

    If you don’t want to wait for Kinect-2.8-002, you can fix LWOWriter.cpp yourself. Change line 304 from

    projector.setFilterDepthFrames(true);

    to

    projector.setFilterDepthFrames(false,true);

    and replace line 325,

    const Kinect::MeshBuffer& mesh=projector.processDepthFrame(depth);

    with

    Kinect::MeshBuffer mesh;
    projector.processDepthFrame(depth,mesh);
    