Forum Replies Created

Viewing 15 posts - 31 through 45 (of 498 total)
  • in reply to: Could not find resource libScriptExecutorTool.so #118564
    Oliver Kreylos
    Keymaster

    One of your error messages refers to Vrui-4.2, which does not contain the script executor tool. I recommend updating to the current software versions across the board. Don’t forget to back up your configuration files!

    You can change “viscosity” through a slider in the water simulation dialog, or through commands to the AR Sandbox’s control pipe. I’m putting viscosity in quotes because you’re actually changing velocity attenuation, which is close enough in visible effect to stand in as a proxy for viscosity.

    in reply to: Calibration process checking #118561
    Oliver Kreylos
    Keymaster

    1) CalibrateProjector always starts from scratch. It ignores any previous ProjectorMatrix.dat.

    2) Yes, it should.

    3) Do you mean after you collected an additional 12 tie points? If not, see 1).

    4) Generally, yes. CalibrateProjector only accepts tie points if they fall within the box boundaries. Capturing corners in the wrong order would result in most or all tie points being rejected.

    in reply to: Error while loading shared libraries #118560
    Oliver Kreylos
    Keymaster

    I don’t know. Did you run Vrui’s installation script, and did the spinning globe show up? Which version of the Kinect package did you download?

    Make sure that you were following these instructions, and not an outdated version.

    in reply to: Is this a suitable short throw projector? #118559
    Oliver Kreylos
    Keymaster

That’s not going to work well. The projector’s brightness is only 25 lumens; we recommend projectors of at least 3000 lumens, and ideally more. The image will be almost invisibly dim.

    in reply to: Complete Installation Instructions #118557
    Oliver Kreylos
    Keymaster

    It will work, but 72cm x 54cm is on the small side for a sandbox. The problem with normal-throw projectors is that the height they need typically requires folding their light paths with a mirror, which works, but adds complexity and reduces stability.

    in reply to: Off-the-shelf desktop PC for AR Sandbox #118531
    Oliver Kreylos
    Keymaster

    “OPENGL vendor string:VMWare,Inc.”

    It looks like you installed Linux inside a virtual machine. That will not work, because Linux won’t be able to directly access the graphics card, hence no image over HDMI.

    in reply to: GTX 1060 Driver Issues #118510
    Oliver Kreylos
    Keymaster

    I’ve recently run across a similar problem. First off, don’t ignore and continue; things won’t work. The AR Sandbox software won’t be able to access the graphics card’s full feature set.

Check your BIOS settings the next time you reboot. If your boot options are set to “secure boot” (sometimes called something like “boot for Windows”), try switching to legacy boot mode. That might unfortunately require a re-install of Linux, so first check whether you can still boot into your existing Linux installation.

    While “secure boot” should work with Linux Mint, there might be the odd motherboard where it causes issues. Not being able to access the graphics card can be one of those issues.

    If you’re already in “legacy mode,” switch to the other mode.

    in reply to: RealSense V2.0 #118450
    Oliver Kreylos
    Keymaster

    If you have a RealSense API v1 camera, i.e., anything before D415/D435, and installed Intel’s RealSense driver before building the Kinect package, it should just work.

    If you have a D415/D435 camera, it will not work. I don’t have a wrapper for the new driver library for lack of hardware.

    in reply to: Can I turn off ability make rain by hand? #118447
    Oliver Kreylos
    Keymaster

    You can turn off the water simulation entirely by adding -ws 0.0 0 to SARndbox’s command line, but right now there is no switch to only turn off hand detection.

    in reply to: Theoretical background for Calibration method #118416
    Oliver Kreylos
    Keymaster

    Base plane equation:

    1. Collect a bunch of 3D points d_i in depth image space from the camera’s depth image.

    2. Convert depth image-space points to metric camera space using camera’s intrinsic depth un-projection matrix (read from camera firmware): c_i = DP * d_i.

    3. Find the plane equation (nx, ny, nz) * c_i = o that best fits all points c_i using a standard least-squares linear system solver A^T * A * x = A^T * b.

    Projector calibration matrix:

    1. Collect a set of tie points that associate a 2D point p_i in projector image space with a 3D point c_i in metric camera space.

    2. Create an over-determined linear system for (p_ix * w_i, p_iy * w_i) = H_2 * c_i where H = (h0, …, h3, h4, …, h7, h8, …, h11) is an unknown 3×4 matrix, H_2 is the 2×4 matrix consisting of the upper two rows of H, and w_i = h8*c_ix + h9*c_iy + h10*c_iz + h11 is the homogeneous weight of H * (c_ix, c_iy, c_iz, 1).

    3. Solve the system A^T * A * x = 0 by finding the eigenvector of A^T * A that has the smallest eigenvalue.

    4. Extend the resulting matrix H to an OpenGL projection matrix by inserting a third row (0, 0, 0, -1) and pre-multiplying it with the inverse of a viewport matrix that maps the projector’s screen rectangle to [-1, 1] in both x and y, and the camera-space tie points’ z range, after multiplication with the extended H, to [-0.5, 0.5] in z. Z maps to [-0.5, 0.5] instead of [-1, 1] to avoid clipping away geometry that is lower or higher than the lowest or highest tie point, respectively.

    in reply to: Hot key in Projector/Camera Calibration? How move picture? #118415
    Oliver Kreylos
    Keymaster

    You cannot move the picture in the CalibrateProjector calibration utility; that is by design. If you are referring to the calibration steps involving the RawKinectViewer utility (base plane equation and 3D box corners), see step 3 of the detailed software installation guide.

    in reply to: Can I use Kinect 1520? #118414
    Oliver Kreylos
    Keymaster

    Kinect model 1520 (“Kinect-for-Xbox-One”) is supported, but it is not ideal for the AR Sandbox, and there are some issues that need to be addressed during calibration (see the detailed software installation guide).

    in reply to: Toggle Contour Lines via Control Pipe? #118392
    Oliver Kreylos
    Keymaster

    Nice job!

    Minor source code critique:

    contourLineSpacing and useContourLines should not be members of class Sandbox; they should be local variables of their respective code branches inside the control pipe handling code in the frame() method. There’s no need to modify Sandbox.h.

    In the contourLineSpacing block: read the second token into a local variable of type GLfloat; then there’s no need to cast to GLfloat later. You don’t need to check whether contour lines are enabled before setting their distance; if they’re disabled, the surfaceRenderer will ignore the distance setting. This will come in handy later when you globally enable contour lines on renderers that might have had them disabled before. You also don’t have to set the contour line distance in the RenderSettings structure itself; those values are only used to store command line options before the surfaceRenderer is created. Finally, there’s no need to wrap the code in a try/catch block; none of the methods called in the block can throw exceptions.

    In the useContourLines block: the same basic comments as above apply. In addition, I advise against using integers to encode boolean values at textual interfaces. Instead of parsing the second token into an int and comparing that int to 0 or 1, directly compare the second token to "on" and "off" using, e.g., isToken(tokens[1],"on") and act accordingly. That way the command pipe syntax is in line with other boolean-valued commands such as "dippingBed off".

    I.e., I would do

    else if(isToken(tokens[0],"useContourLines"))
      {
      if(tokens.size()==2)
        {
        /* Parse the command parameter: */
        bool useContourLines=isToken(tokens[1],"on");
        if(useContourLines||isToken(tokens[1],"off"))
          {
          /* Enable or disable contour lines on all surface renderers: */
          for(std::vector<RenderSettings>::iterator rsIt=renderSettings.begin();rsIt!=renderSettings.end();++rsIt)
            rsIt->surfaceRenderer->setDrawContourLines(useContourLines);
          }
        else
          std::cerr<<"Invalid parameter "<<tokens[1]<<" for useContourLines control pipe command"<<std::endl;
        }
      else
        std::cerr<<"Wrong number of arguments for useContourLines control pipe command"<<std::endl;
      }
    else if(isToken(tokens[0],"contourLineSpacing"))
      {
      if(tokens.size()==2)
        {
        /* Parse the contour line distance: */
        GLfloat contourLineSpacing=GLfloat(atof(tokens[1].c_str()));
        
        /* Check if the requested spacing is valid: */
        if(contourLineSpacing>0.0f)
          {
          /* Override the contour line spacing of all surface renderers: */
          for(std::vector<RenderSettings>::iterator rsIt=renderSettings.begin();rsIt!=renderSettings.end();++rsIt)
            rsIt->surfaceRenderer->setContourLineDistance(contourLineSpacing);
          }
        else
          std::cerr<<"Invalid parameter "<<contourLineSpacing<<" for contourLineSpacing control pipe command"<<std::endl;
        }
      else
        std::cerr<<"Wrong number of arguments for contourLineSpacing control pipe command"<<std::endl;
      }
    in reply to: What recommended configuration PC? #118372
    Oliver Kreylos
    Keymaster

    Right now the most cost-effective GPU for an AR Sandbox is probably a GeForce GTX 1060. There is also the Turing-generation GeForce RTX 2060, but it appears to still be a lot more expensive.

    in reply to: Can I use Distance info from Kinect in OpenCv? #118371
    Oliver Kreylos
    Keymaster

    The SARndbox application talks to the Kinect camera at the USB device level, and only one application can open a USB device at a time.

    If you want to feed a depth map into another application, you will have to modify the SARndbox application’s source code to intercept depth maps either directly where it receives them from the Kinect driver or after depth filtering, and send them to another library in-process, or to another application via IPC.
