Reply To: Other applications for AR sandbox


#101000
StephenGuerin
Participant

I just browsed this forum for the first time today and noticed this thread.

Cousin Eddie, thank you for pointing out our video. I am the inventor and owner of Simtable.

The hardware cost for the Simtable is a relatively small component of our price. By the time you add up the projector, camera, computer, sand media, table, cabling, stands, mounts, etc., hardware runs about $3500 for us. We have been working on a mobile version that reduces this to just a projector and a mobile phone; assuming one already has a phone, that could be as low as $100-$200 as projector prices fall.

Most of our current cost as a small business goes to developer and staff salaries: developing interactive GIS simulations (fire, flood, hazmat plumes, traffic, and forest ecology), traveling to meet with users, and setting up the tables and training users. Further, we provide real-time networked GIS data to feed the simulations, as well as bi-directional syncing between the Simtable, mobile clients, and laptop browsers. We have not been funded by public grant money; at the end of the day, developers need to eat 🙂

We recognize not everyone has the budget, so we have donated our software to schools and volunteer fire departments. We've also open-sourced our agent-based modeling framework, agentscript.org, for users to develop their own models.

Oliver, we haven’t met but I’ve followed your project since the release of the Kinect and the hacking fervor that followed. I’ve been meaning to contact you for some time but haven’t. You can reach me at 505-577-5828 or stephen@simtable.com. We have a mutual friend in Jim Crutchfield who was in Santa Fe before coming to Davis.

BTW, very nice implementation! I think you’ve nailed a nice interface with a very playful and engaging demo that requires no instruction on use. And nice work to package up an open-source project for others to experiment with!

I just wanted to correct a bit in your post. The Simtable *does* do interactive 3D scanning and gives feedback for sculpting to match a given DEM. You can see examples of the scanning we were doing back in 2008. Here is an old archive page that shows it: http://redfish.com/simtable/

I did experiment with the early ZCam before Microsoft purchased the company, prior to Kinect development. I decided against using depth cameras for ambient computing: we already have controllable light coming from the projector, and it seemed redundant to add laser light from a depth camera. We think rooms can be covered with multiple projectors (lamps) and cheap cameras to do depth scanning, auto-warping, and blending.

While there's some advantage to having that light in the infrared so it's imperceptible to the user (as the Kinect does), we are implementing selective scanning and leveraging the projected interface itself to determine depth from the offset seen in the camera, without distracting the user with structured-light patterns. This can be done at real-time refresh rates. As we replace industrial cameras like the PtGrey with mobile phones, we're finding that the phones' high resolutions yield extremely high-resolution depth scans; some consumer mobile phone cameras are now at 18 megapixels or better. You can see some of our recent direction, including depth scanning and warp correction, here:
http://bit.ly/AnySurfaceReview
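For readers curious how depth can come from the offset between a projected feature and where the camera sees it: a rectified projector-camera pair behaves like a stereo pair, so depth follows the standard triangulation relation Z = f·B/d. A minimal sketch below; the function name and the baseline/focal-length values are hypothetical, not Simtable's actual implementation:

```python
def depth_from_disparity(disparity_px: float, baseline_m: float, focal_px: float) -> float:
    """Triangulate depth for a rectified projector-camera pair.

    disparity_px: pixel offset between the projected and observed feature
    baseline_m:   distance between projector and camera optical centers
    focal_px:     camera focal length expressed in pixels
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    # Standard stereo relation: depth Z = focal * baseline / disparity
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 0.2 m baseline, 1000 px focal length,
# a 200 px offset puts the surface at 1.0 m from the camera.
print(depth_from_disparity(200, 0.2, 1000))
```

Note how resolution matters here: a higher-resolution camera measures disparity in finer pixel steps, which is why high-megapixel phone cameras yield finer depth quantization at the same baseline.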

I hope we can meet up sometime. Again, awesome work!
