May 16, 2014
Hobbyist Creates Virtual Reality Holodeck Using Kinect And Oculus Rift
Peter Suciu for redOrbit.com - Your Universe Online
Star Trek's Mr. Scott would be proud. 3D video expert Oliver Kreylos has created a virtual reality system of sorts by using off-the-shelf technology. He didn't exactly boldly go where no one had gone before, but he did transport himself into a virtual reality version of his conference room.
Kreylos, a computer science professor and researcher at the University of California, Davis, was able to do this by combining three Microsoft Kinect motion control systems for the Xbox 360 with an Oculus Rift virtual reality headset. The Oculus Rift served as the main user interface that immersed Kreylos into the virtual surroundings, while the three Kinect sensors were able to track body movements and project a virtual avatar of his body into the world seen through the headset.
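Kreylos's own software is written in C++ (his Vrui VR toolkit), but the core idea of merging several Kinects' views into a single virtual body is straightforward to sketch. The snippet below is an illustration only, not his implementation: each sensor's point cloud is carried into a shared world frame by a 4x4 rigid-body transform (the sensor's extrinsic calibration), and the transformed clouds are stacked together. The function names and the dict-based layout are assumptions for the sketch.

```python
import numpy as np

def transform_points(points, extrinsic):
    """Apply a 4x4 rigid-body transform to an (N, 3) array of 3D points."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    return (homogeneous @ extrinsic.T)[:, :3]

def fuse_point_clouds(clouds, extrinsics):
    """Merge per-sensor clouds (dict: name -> (N, 3) array) into one
    world-frame cloud, using each sensor's extrinsic transform."""
    return np.vstack([transform_points(cloud, extrinsics[name])
                      for name, cloud in clouds.items()])
```

In a real multi-Kinect rig the extrinsics are recovered by calibration, typically by having all sensors observe a common target; once known, fusing the live clouds each frame is just this transform-and-stack step.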
In addition to the Kinect units and Oculus Rift, the system runs on a single Linux computer powered by a Core i7 processor running at 3.5GHz, with 8GB of RAM and an Nvidia GeForce GTX 770 graphics processor. The current version leaves Kreylos somewhat tethered, a constraint he has mitigated with a 15-foot HDMI cable, an 11-foot USB cable and a 12-foot power cord.
Unlike other immersion technologies, this one is notable in that users can look around, see their own hands, and, by looking down, see their own bodies. The results seen in Kreylos's video are still choppy and not exactly high definition, but this is the first of what will likely be many steps. With each step forward he'll work to address technical issues, including the lag that comes from pulling in so much information in real time.
"One of the things that always echos [sic] right back when I bring up Kinect and VR is latency, or rather, that the Kinect's latency is too high to be useful for VR," Kreylos posted on his official blog. "Well, we need to be careful here, and distinguish between the latency of the skeletal reconstruction algorithm that’s used by the Xbox and that's deservedly knocked for being too slow, and the latency of raw depth and color video received from the Kinect. In my applications I'm using the latter, and while I still haven't managed to properly measure the end-to-end latency of the Kinect in 'raw' mode, it appears to be much lower than that of skeletal reconstruction. Which makes sense, because skeletal reconstruction is a very involved process that runs on the general-purpose Xbox processors, whereas raw depth image creation runs on the Kinect itself, in dedicated custom silicon."
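The "raw mode" Kreylos describes delivers a depth image straight from the Kinect's onboard silicon, and turning that image into the 3D points one actually renders is a simple pinhole back-projection. As a hedged illustration of that step (not Kreylos's code), the sketch below uses approximate, nominal Kinect depth-camera intrinsics; real applications would substitute each unit's calibrated values.

```python
import numpy as np

# Nominal Kinect depth-camera intrinsics (approximate, for illustration only).
FX, FY = 594.2, 591.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point (image center)

def depth_to_points(depth_m):
    """Back-project an (H, W) depth image in meters into an (H*W, 3)
    point cloud via the pinhole camera model."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)
```

Because this back-projection is pure per-pixel arithmetic on data the Kinect has already produced in hardware, it adds essentially no latency on top of the raw stream, which is the distinction Kreylos draws against the much slower skeletal-reconstruction pipeline.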
As first steps go this one could turn heads – and not just those donning the VR headset. This is technology that could be adapted quickly for actual applications.
"This actually showcases some amazing progress though physical motion, smell, and touch remain elusive if we really want to create a 'Holodeck' kind of experience," said Rob Enderle, principal analyst at the Enderle Group. "We have more senses than sight and sound and need to be able to move freely in at least two dimensions. But this does make a huge step forward with regard to instrumenting the space."
Kreylos also believes that the graphical limitations could make this experience feel more 'real' than if it were higher resolution.
"I believe it's related to the uncanny valley principle, in that fuzzy 3D video that moves in a very lifelike fashion is more believable to the brain than high-quality avatars that don’t quite move right," Kreylos added. "But that's a topic for another post."
The next steps in making this truly immersive will require greater freedom of movement and possibly a way to ditch the cords. Beyond that, for VR to be truly believable it needs to address all the senses, said Enderle.
This would include "two dimensional movement that is undetectable, touch and smell, all of which are under development," Enderle told redOrbit. "I actually think the eventual solution, because it would be far easier long term, is to tap into the brain directly, trick the user's body into thinking it was asleep, and feed in a dream without putting the brain into sleep mode (so it is at full cognitive capability).
"In theory you should then be able to hijack all of the senses and you'd need far less equipment to make this work," he added. "This is harder to do as we don't yet have the interfaces into the brain mapped, but once that is done then you could actually have a Holodeck-like experience generated by something the size of an old iPod. What we are seeing now is relatively crude compared to what we'll likely end up with."