Intel has unveiled a virtual-reality headset capable of bringing near-field objects into virtual worlds. Dubbed a “merged reality” device, Intel's Project Alloy sounds like something straight out of a science fiction novel, allowing the wearer to see their hands and other objects in virtual worlds and environments. The question is, who actually wants a VR experience with their hands?
Project Alloy works by using Intel's RealSense depth-sensing cameras, the same ones used in Windows 10 laptops and computers for facial recognition. These cameras accurately capture and map a user's hands and place them into the virtual space. By merging that data with the virtual world shown on the headset's screen, wearers can use their hands to physically interact with their environment.
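For a rough sense of what those depth cameras provide, here's a minimal sketch using Intel's open-source librealsense Python bindings (pyrealsense2). It grabs a single depth frame and keeps only the pixels within arm's reach, a crude stand-in for the near-field capture that lifts hands and held objects into a scene. This isn't Alloy's actual pipeline: the 1-metre cut-off and the stream settings are assumptions, and it needs a RealSense camera plugged in to run.

```python
import numpy as np
import pyrealsense2 as rs

NEAR_FIELD_METRES = 1.0  # illustrative threshold, not an Intel-specified value

# Start a depth stream from an attached RealSense camera.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
profile = pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth_frame = frames.get_depth_frame()

    # Convert the raw 16-bit depth units to metres using the sensor's depth scale.
    depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()
    depth_metres = np.asanyarray(depth_frame.get_data()) * depth_scale

    # Mask of pixels close enough to count as "near field" (hands, a held
    # object); everything beyond it would remain purely virtual.
    near_mask = (depth_metres > 0) & (depth_metres < NEAR_FIELD_METRES)
    print(f"{near_mask.mean():.1%} of the frame is within {NEAR_FIELD_METRES} m")
finally:
    pipeline.stop()
```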
During Project Alloy's unveiling at the Intel Developer Forum in San Francisco, Intel's CEO Brian Krzanich claimed the device would, according to the BBC, “redefine what is possible with computing”.
From footage shown on-stage, it's clear that Intel's RealSense camera can pick up low-resolution scans of real-life objects to import into a virtual space. It's also evident that objects are only detected when held close to a user's face.
Still, as a Medium post by Krzanich points out, the technology does have advantages over the likes of the Oculus Rift or HTC Vive, which require users to install and set up external sensors around a room for positional tracking. Here, Intel's headset does the heavy lifting itself, using its depth-sensing cameras to work out your position within a given space.
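To give a feel for what “working out your position” involves, inside-out tracking systems repeatedly estimate how the headset has moved between one camera frame and the next. The snippet below is a toy illustration only, not Intel's method: it applies the standard Kabsch algorithm to recover a rigid rotation and translation from two sets of already-matched 3D points (synthetic here), whereas a real headset runs full visual-inertial SLAM on live depth data.

```python
import numpy as np

def estimate_rigid_motion(prev_pts, curr_pts):
    """Kabsch algorithm: find R, t such that R @ prev + t best matches curr."""
    prev_c, curr_c = prev_pts.mean(axis=0), curr_pts.mean(axis=0)
    H = (prev_pts - prev_c).T @ (curr_pts - curr_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = curr_c - R @ prev_c
    return R, t

# Synthetic scene: 3D points seen in one frame...
rng = np.random.default_rng(0)
frame1 = rng.uniform(-2.0, 2.0, size=(100, 3))

# ...and the same points after an apparent 5-degree rotation and 10 cm shift
# (how the scene appears to move when the wearer moves; a headset would invert
# this transform to get its own motion).
angle = np.radians(5.0)
R_true = np.array([[np.cos(angle), 0.0, np.sin(angle)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(angle), 0.0, np.cos(angle)]])
t_true = np.array([0.0, 0.0, 0.10])
frame2 = frame1 @ R_true.T + t_true

R_est, t_est = estimate_rigid_motion(frame1, frame2)
print("recovered shift (m):", np.round(t_est, 3))     # ~ [0, 0, 0.1]
```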
Working out the kinks
Interestingly, Project Alloy is like HoloLens in that it's a completely wireless system. That has its own advantages, such as never becoming tangled in a wire or being “jolted” out of an experience by the limits of a cord. However, by processing everything onboard, Project Alloy can't replicate the silky-smooth experience of a proper VR headset running on a high-end PC. If anything, it'll be something akin to Samsung's Gear VR, except with positional tracking added.
It's also hard to imagine exactly what something like Project Alloy would feel like without controllers. Krzanich explains how Intel sees Alloy working: you could “pick up your real-world tennis racket in your living room and step virtually onto the court at Wimbledon… fully lean into practicing your back swing – bringing both your hand and racket into the virtual field of play.” While that may work for Krzanich and his tennis court-sized living room, I'm a little nervous about swinging a full-sized tennis racket around my house.
Even if you can bring real-world elements into a virtual world, it's hard to imagine the result being truly convincing without haptic feedback. Having used many VR devices, I've found the best experiences come from those that provide a sense of feedback to help immerse you in a virtual space. That said, Microsoft's HoloLens doesn't use a controller except in specific situations and, if Tim Cook's words are to be taken as gospel, Apple's AR project may not use a physical controller either.
Intel isn't looking to actually manufacture Project Alloy itself. Instead, it wants to ensure it doesn't miss out on a vital new market like it did with the rise of the smartphone. Intel plans to sell its Alloy technology to manufacturing partners, who can use it to develop their own “mixed reality” products. No partners have been named so far, but Microsoft has stated that Windows 10 will soon support Alloy via an update.