ZED Mini Turns Rift and Vive into an AR Headset From the Future
Stereo camera company Stereolabs has launched pre-orders for the ZED Mini, a smaller version of their stereo depth-mapping camera which fits on a mount made to attach to VR headsets like the Rift and Vive. When attached, the camera provides stereo pass-through video and real-time depth and environment mapping, turning the headsets into dev kits emulating the capabilities of high-end AR headsets of the future. The ZED Mini will launch in November.
Apple’s ARKit and Google’s ARCore may be bringing AR tracking capabilities to smartphones, but it will be years yet before high-end immersive augmented reality headsets hit the consumer market. Today’s AR headsets, like the HoloLens and ODG R8, have very small fields of view compared to today’s VR headsets (~40 degrees compared to ~100).
The ZED Mini camera, attached to a VR headset, effectively emulates the sort of experience that AR glasses will hopefully achieve in the future—an immersive, wide field of view with real-time tracking and environment mapping.
With a special mount designed to attach to the Rift or Vive, the ZED Mini has two cameras spaced at 65mm (close to the human IPD average), allowing for comfortable pass-through stereo video into the VR headset. In addition to pass-through video, the camera compares the two different images from each camera to build a depth-map of the scene. The company claims the camera can detect depth out to an impressive 15 meters (49 feet). The camera also builds a geometric map of the environment in real time and fuses the data with an onboard IMU enabling positional tracking of the headset within the AR environment.
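The article doesn’t specify how the ZED software computes depth, but stereo cameras like this one typically triangulate it from the disparity between the two views. A minimal sketch of that idea (not the actual ZED SDK implementation; the focal length here is an assumed, illustrative value, while the 65mm baseline comes from the spec above):

```python
# Illustrative stereo triangulation: depth = focal_length * baseline / disparity.
# focal_length_px is a made-up intrinsic for illustration; baseline_m is the
# 65mm camera spacing mentioned above. Not the ZED SDK's actual pipeline.

def depth_from_disparity(disparity_px: float,
                         focal_length_px: float = 700.0,
                         baseline_m: float = 0.065) -> float:
    """Triangulate distance (meters) to a point seen in both cameras,
    given its disparity (horizontal pixel offset between the two views)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# With these assumed intrinsics, a 10 px disparity puts a point ~4.5 m away;
# smaller disparities correspond to farther points, which is why depth
# accuracy degrades toward the camera's claimed 15-meter limit.
print(round(depth_from_disparity(10.0), 2))
```

The inverse relationship is also why distant depth readings get noisy: at long range, a sub-pixel matching error swings the depth estimate by meters.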
I got a chance to try the ZED Mini attached to a Rift for myself at VRDC Fall 2017 last week and came away quite impressed. Although the camera’s own field of view isn’t quite wide enough to completely fill the Rift’s field of view, it still presented a large, immersive view of the real world in front of me, far surpassing today’s AR headsets.
At first I was just looking at the crowd standing in front of me. Then a floating menu appeared, offering several demo experiences to launch. I picked the droid attack game, which began spawning spherical, floating droids in the world around me. As they approached, I saw convincing occlusion as the drones flew behind nearby people and objects. That meant I could duck behind the table in front of me and a drone would be hidden from view, as if it were really behind the table. As I moved my head around to get a feel for the occlusion, the inside-out positional tracking held up fairly well, though I’d want a more extensive session to judge the tracking and pass-through latency; I was quite distracted by the impressive occlusion.
In another demo I held a lightsaber prop which was sensed by the camera and a properly glowing end was overlaid on top of it. Using the prop I was able to bat the drones and send them flying to an explosive death. A strong swing would send them flying fast while a gentle poke would be reflected as a mere shove; a rather convincing fusion of an arbitrary, unmarked prop into an AR scene.
There are pros and cons to pass-through AR compared to transparent AR. On the plus side, the augmented parts of the world can look quite a bit more real because the pixels are drawn directly on top of the image of the real world, eliminating the semi-transparent ‘hologram’ look you’ll find on transparent AR displays (which stems from the difficulty of creating pixels that can occlude 100% of the light from the outside world on a transparent display). On the flip side, our eyes have far better resolution and contrast than any camera-and-display pipeline, so pass-through AR limits our dynamic range (the ability to see dark and light areas of a scene at the same time) and overall visual fidelity, and can introduce latency.
Either way, if devs want to get a head start on AR development for high field of view AR headsets of the future, the ZED Mini seems like it warrants serious consideration.
Last week the company launched pre-orders for the device, priced at $450, with plans to begin shipping in November. As an add-on to the Rift or Vive, it will appeal most to developers who already own those headsets, while developers just getting started in immersive computing might also consider the Meta 2.