Using SARndbox with libfreenect or Iannix


I’ve got my sandbox up and running nicely, but now I’m wondering if there is a way to get some of the data from the Kinect while it’s running?

I’ve been interested in using the sandbox with the 3D sequencer Iannix, which can take input from a Kinect via libfreenect (video demo here), or from OpenSoundControl messages in general.

I think I’ll have to use two Kinects, but since reading the source is beyond my abilities, I’m wondering how you might get all of these to play nicely together?


    Oliver Kreylos

    That would require code modification. The main SARndbox application receives depth image data in a central callback, from where it is dispatched to any clients who need depth data for processing. It would be straightforward to add another client that converts the raw depth data into whatever format necessary to control external applications, and then stream it out through OSC or a custom protocol.

The relevant callback is called rawDepthFrameDispatcher and is found in Sandbox.cpp.
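The dispatch pattern Oliver describes could be sketched like this. Note that FrameBuffer here is only a minimal stand-in for the real Kinect::FrameBuffer class, and dispatchRawDepthFrame and DepthClient are hypothetical names; only rawDepthFrameDispatcher and getBuffer() come from this thread:

```cpp
#include <vector>
#include <functional>

typedef unsigned short RawDepth;

// Minimal stand-in for Kinect::FrameBuffer; the real class comes from
// Oliver's Kinect package and is only sketched here.
class FrameBuffer
	{
	std::vector<RawDepth> pixels;
	public:
	FrameBuffer(size_t numPixels):pixels(numPixels,0){}
	const void* getBuffer(void) const {return pixels.data();}
	};

typedef std::function<void(const FrameBuffer&)> DepthClient;

// Sketch of the central callback's job: hand each incoming depth frame
// to every registered client, e.g. a new OSC-sending client.
void dispatchRawDepthFrame(const FrameBuffer& frameBuffer,const std::vector<DepthClient>& clients)
	{
	for(size_t i=0;i<clients.size();++i)
		clients[i](frameBuffer);
	}
```

A new OSC client would then just be one more entry in the client list, receiving the same frame buffer as the existing consumers.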


    Thanks Oliver!

I’ll have a look into adding OSC output. That would make setting everything up much easier.


    So I’m looking through the code for rawDepthFrameDispatcher, and sending the buffer to a new callback.

    Now when I want to get the depthBuffer (I’m just trying to get it to print out right now), do I call the FrameBuffer objects getBuffer() method?

Sorry to bother you; I’m just not very used to C++. Getting a running printout of the depth would be a huge step for me.



Ok, progress update: I’ve had some success getting values out by calling getBuffer() and casting to RawDepth, as you do in FrameFilter.cpp. I think I’m getting somewhere.

Hopefully I’m on the right track, and now I just need to interpret the data! I’ve not worked with a Kinect before, so I’m a little unsure of what I’m getting out; is it every pixel at 11-bit depth?

    Oliver Kreylos

    To process an incoming depth image within the rawDepthFrameDispatcher method, or any method called by it, you access the frame contents via the getBuffer method:

    const RawDepth* depthImage=static_cast<const RawDepth*>(frameBuffer.getBuffer());

    RawDepth is a typedef for unsigned short, i.e., each pixel in the depth image is a 16-bit unsigned integer. You can get the size of the depth image from the frame buffer’s getSize() method, but for Kinect v1 the size is always 640×480.

    The contents of the depth image are raw pattern displacement values, meaning they are not metric distances. The conversion formula from displacement values d to metric z values is slightly different for each Kinect camera; figuring out the parameters is part of intrinsic calibration. The precise formula to get z from d is:

    float z = A / (B - float(d))

    where A and B are intrinsic calibration parameters. Approximate values for A and B are 34000 and 1090, respectively, which will yield z values in centimeters.
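Wrapped in a small helper, the conversion reads as follows. The A and B values are the approximate ones Oliver quotes above; a real sandbox should substitute the parameters from its own intrinsic calibration:

```cpp
typedef unsigned short RawDepth; // 16-bit raw pattern displacement value

// Convert a raw displacement value d into a metric distance in cm,
// using the approximate calibration parameters quoted in this thread.
float rawDepthToCm(RawDepth d)
	{
	const float A=34000.0f; // per-camera intrinsic calibration parameter (approximate)
	const float B=1090.0f;  // per-camera intrinsic calibration parameter (approximate)
	return A/(B-float(d));
	}
```

With these parameters, for example, a raw value of 750 maps to 34000/(1090-750) = 100 cm.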

    There is also pretty significant non-linear distortion in the depth image, due to lens imperfections in both the IR pattern projector and the IR camera. My Kinect software uses per-pixel depth correction formulas to account for those; they are applied to the raw displacement values before conversion to metric distances. See the FrameFilter class’ filterThreadMethod method to see how they are applied.
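The ordering described above (correct the raw value per pixel, then convert to metric) could be sketched like this; the linear scale/offset form of the correction is purely an assumption for illustration, and the real per-pixel formulas live in FrameFilter::filterThreadMethod:

```cpp
typedef unsigned short RawDepth;

// Hypothetical per-pixel correction coefficients; the actual model used by
// the Kinect package may differ, this linear form is only an assumption.
struct PixelCorrection
	{
	float scale,offset;
	};

// Apply the (assumed linear) per-pixel correction to a raw displacement
// value first, then convert to cm with the thread's approximate parameters:
float correctedDepthToCm(RawDepth d,const PixelCorrection& pc)
	{
	float corrected=float(d)*pc.scale+pc.offset; // undo per-pixel distortion
	return 34000.0f/(1090.0f-corrected);         // then displacement -> metric
	}
```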


    Just read your reply, thanks Oliver!


    I hope you will update this thread if you have made progress on getting the elevation data from the system. We have a sandbox at my university (East Carolina U.), and I think if we had the data… it would be very useful for teaching more advanced concepts/methods.

    Please let us know your progress or problems.



Yes J.P., I’ve managed to get OSC data from SARndbox successfully! Both raw and filtered.

I’m now using it to drive SuperCollider and Max/MSP patches for sound.

One problem is managing the large output; right now I’m sampling every other pixel and throttling it with the system clock. I’m sure there’s a better way, but I was getting array-bounds errors using the C++11 time libraries and a newer compiler, and I have no idea why.

The other is that this is my first time trying to use C++ 🙂

I’ll upload my ‘modded’ version to GitHub on Monday. Hopefully I’ll be able to tidy things up into classes etc. at some point.


    Hi ernusame,

    Can you post a link to your github? I’d like access to the data stream to implement usage tracking.


