Depth and color data?

    #108241
    niklaskp
    Participant

    Hi,

    I have extracted the depth and color data from the code and am sending it forward over OSC, and I have also added the Tonic audio library; I'll of course upload it all to GitHub when it's ready.
    Right now I am just trying to find better ways to interact with the data; I am not really that used to coding.
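    For reference, the OSC forwarding is roughly like this (a simplified sketch using liblo; the target address, port and OSC path here are placeholders, not necessarily what I ended up with):

        #include <lo/lo.h>

        // Rough sketch of the OSC forwarding: one packet per hand position.
        // "127.0.0.1", port "9000" and the "/sandbox/hand" path are placeholders.
        void sendHandOsc(float x, float y, float height)
        {
            static lo_address target = lo_address_new("127.0.0.1", "9000");
            lo_send(target, "/sandbox/hand", "fff", x, y, height);
        }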

    How do I extract the frame buffer data? I need the buffer that is saved as the output image, because I want to extract the color and depth information from it.
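    To be concrete, what I am imagining is something like reading the rendered frame back with glReadPixels right after the sandbox draws it (just a sketch; I am not sure yet where in the rendering code this would have to go):

        #include <vector>
        #include <GL/gl.h>

        // Sketch: read back the current framebuffer's color and depth right after
        // the frame is drawn (must be called with the GL context current).
        void readBackFrame(std::vector<GLubyte>& color, std::vector<GLfloat>& depth)
        {
            GLint vp[4];
            glGetIntegerv(GL_VIEWPORT, vp);
            const int w = vp[2], h = vp[3];

            color.resize(size_t(w) * h * 4);   // RGBA, 8 bits per channel
            depth.resize(size_t(w) * h);       // non-linear depth values in [0,1]

            glReadPixels(vp[0], vp[1], w, h, GL_RGBA, GL_UNSIGNED_BYTE, color.data());
            glReadPixels(vp[0], vp[1], w, h, GL_DEPTH_COMPONENT, GL_FLOAT, depth.data());
        }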

    #108242
    niklaskp
    Participant

    Right now I am just reading the color off the screen, and that is not working well enough.

    I am also having some problems extracting the handPosition properly. I am taking the data directly from the corner data in handExtractor instead of using the tracked hand, because I found that data harder to use for the sound. But this brings other problems: the tracking only works from one direction, so I need to work out which corner is on the hand side of the blob.
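    The heuristic I am trying: since the arm always reaches in over an edge of the box, the blob corner closest to the centre of the frame should be the hand end. Roughly like this (a sketch with a made-up Corner struct, not the actual handExtractor types):

        // Sketch: pick the blob corner nearest the frame centre as the "hand side",
        // so it works no matter which edge the arm enters from. Corner is made up.
        struct Corner { float x, y; };

        Corner handSideCorner(const Corner corners[4], float frameWidth, float frameHeight)
        {
            const float cx = frameWidth * 0.5f, cy = frameHeight * 0.5f;
            int best = 0;
            float bestDist = 1e30f;
            for (int i = 0; i < 4; ++i) {
                const float dx = corners[i].x - cx, dy = corners[i].y - cy;
                const float d = dx * dx + dy * dy;   // squared distance to centre
                if (d < bestDist) { bestDist = d; best = i; }
            }
            return corners[best];
        }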

    I would then need to map this position data to the depth (or height) of the sandbox surface at that point, which has to come from the buffer. Any tips on how to get started with that?
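    What I had in mind is something like sampling one depth value under the hand's window position and converting it back to an eye-space distance (a sketch; zNear and zFar would have to match whatever projection the sandbox actually uses, which I do not know yet):

        #include <GL/gl.h>

        // Sketch: read one depth-buffer value at window pixel (px, py) and linearize it.
        // zNear and zFar are assumptions; they must match the real projection.
        float sampleEyeDepth(int px, int py, float zNear, float zFar)
        {
            GLfloat zBuf = 0.0f;
            glReadPixels(px, py, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &zBuf);

            const float zNdc = zBuf * 2.0f - 1.0f;             // [0,1] -> [-1,1]
            return (2.0f * zNear * zFar) /
                   (zFar + zNear - zNdc * (zFar - zNear));     // eye-space depth
        }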
