Playing a first-person shooter like Call of Duty with a virtual reality headset like the Oculus Rift or Sony's Project Morpheus seems like a recipe for ultra-realistic gaming bliss – until you sprint after virtual Germans and plow right into your actual living room wall.
When you wear a screen on your face, “there’s a lack of situational awareness of the actual world,” says Leap Motion co-founder Michael Buckwald, sipping a soda at a demo at the company offices in San Francisco. “Also, every time I want to take a sip of Coke, I have to take the Oculus Rift off.”
“In a first-person shooter, you need to be able to turn around quickly,” says Omar Khan, the lead Unity developer at Austin-based Chaotic Moon, which designed a humorous Oculus Rift game called Shark Punch. “And if you’re sitting in a chair, tethered to a computer, it’s very difficult to be able to turn. And to aim — it’s very awkward, it’s kind of weird.”
Leap Motion has announced new hardware and software for its infrared controller, which allows users to interact with computers using just hand and finger gestures. (Popular Science profiled the original Leap Motion device in 2013). When paired with a virtual reality headset like the Oculus Rift, the system gives a user full awareness of the real world while immersed in a virtual one.
The Oculus Rift headset, whose maker Facebook acquired in March for $2 billion, offers stereoscopic 3-D and 360-degree visuals for a richly immersive virtual world. But when it comes to seeing the real world, wearing one is like strapping a bucket on your head. Leap wants to become the go-to controller for VR. The company’s just-announced tweaks include a clip-on mount for VR headsets like the Rift, and improved “top-down” tracking of the hands, even when the user reaches out or turns around.
Developers like Khan have experimented with using the gestural interface on the first-generation Leap controller to manipulate the Rift’s virtual reality, attaching the hand-detecting infrared Leap sensor to the front of the Rift headset, sometimes with duct tape. For first-person shooter games, where players hold a weapon, this can feel awkward, because the user must keep both hands raised in front of the sensor for proper tracking. The new top-down tracking orientation, clip-on mount, and larger field of view should make gesturing in front of the sensor more comfortable.
Most importantly, Leap has released an application programming interface (API) that will essentially turn the device into a 3-D camera, by letting programmers access infrared data about the user’s surroundings. Using this API, raw infrared imagery of the real world can blend with a representation of the virtual realm. This is a crucial milestone on the way to true augmented reality.
“We’re able to take the live feed of what [users] see and convert it on the fly into a virtual world,” says Khan. “That is something that is quite different.”
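For developers who want a feel for what that raw access looks like, here is a minimal Python sketch built on Leap’s v2 SDK bindings. The policy flag and property names (POLICY_IMAGES, frame.images, image.data) follow our reading of the documentation, so treat it as illustrative rather than as Leap’s own sample code; a real application would poll for frames inside its render loop or register a Leap.Listener instead of sleeping.

```python
# A rough sketch, not official sample code: grabbing raw infrared frames
# from a Leap Motion controller using the v2 Python bindings' image policy.
import time
import Leap

controller = Leap.Controller()
# Opt in to receiving the raw camera images alongside hand-tracking data.
controller.set_policy(Leap.Controller.POLICY_IMAGES)
time.sleep(0.5)  # give the Leap service a moment to start streaming

frame = controller.frame()    # the most recent tracking frame
images = frame.images         # stereo pair from the two infrared cameras
if images.count > 0:
    image = images[0]
    # image.data is a flat buffer of 8-bit infrared brightness values,
    # image.width x image.height pixels, which a game engine could
    # texture onto the virtual scene to blend the real room into the game.
    print(image.width, image.height)
```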
Google Glass bombed with many early adopters because it was essentially a “phone for your face” that did little more than display output from conventional apps. The new Leap API, by contrast, will pull the room itself into the game. That is true augmented reality.
“Let’s say we wanted to create a game where we create creatures that come out of the environment around us, like my office,” says Khan. “Because of the ability of [the new API] to give you raw image data, you could make a tree grow right in the middle of your office. Or, if you want enemies to come spawning at you, there’s your desk, and all of a sudden, there’s this goblin in your desk. So there’s a lot of things you can do that haven’t been seen before.”
Medical students might take nanoscale walking tours of the brain. Or high schoolers might go on intergalactic journeys. The device’s ability to track finger motions as small as 0.01 millimeters, far finer than the Microsoft Kinect can detect, permits the sculpting of virtual 3-D objects, a capability that might give rise to a new generation of claymation artists.
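To give a sense of the data a virtual-sculpting app would work with, here is a small Python sketch against the same Leap bindings. The property names (frame.hands, hand.fingers, finger.tip_position) come from the shipping tracking API, but the snippet itself is a hypothetical example, not code from Leap or Chaotic Moon.

```python
# Illustrative only: reading fingertip positions (reported in millimeters)
# from the Leap Motion Python bindings, the kind of fine-grained input a
# virtual clay-sculpting app would turn into surface deformations.
import Leap

controller = Leap.Controller()
frame = controller.frame()    # latest tracking frame

for hand in frame.hands:
    for finger in hand.fingers:
        tip = finger.tip_position    # Leap.Vector: x, y, z in millimeters
        # Sub-millimeter changes in tip position between frames are what
        # would let an app carve fine detail into a virtual model.
        print(finger.id, tip.x, tip.y, tip.z)
```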
But in the short term, virtual reality will sink or swim with gaming. Khan thinks game studios will be attracted by the possibility of doing something entirely new. After all, in theory, the raw image data will let you turn your ordinary living room into the infested space station from Aliens.
“We know that our office looks like this, has a wall over here, but we want to render something from Aliens. We could do that. Or you could create a Nerf battle virtually, or a paintball game virtually, in your office,” Khan says. “Everyone is walking around [with] Oculus Rifts and Leap Motion cameras, pulling data from where they’re at.”
Such games could get players off their behinds far more thoroughly than the Wii or the Kinect ever did, because you’re no longer tethered to your TV or to your same-old surroundings.