Why Your Next Phone Should Have Two Cameras for MR

Posted on 2019-11-25

The other day, my wife and I were talking about how we could apply augmented and virtual reality to real-world cases. We smoothly drifted to the idea that AR and VR have a lot in common in what they solve, but differ in how they approach it.
Just look: room-scale VR headsets are so exciting because they let the user move around freely. We call that "6 degrees of freedom". They achieve it with a sophisticated system of sensors. ARKit and ARCore are great because they let the user move around an object. It's really the same 6 degrees of freedom, but without a hardware setup. Mobile AR relies only on software and a single camera.
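
For context, here is roughly what that software-only 6DoF looks like on iOS: a minimal ARKit sketch, not code from our project (the `ARViewController` class and the `sceneView` outlet are my own illustration).

```swift
import UIKit
import ARKit

// A minimal sketch of 6DoF world tracking with a single camera,
// assuming an ARSCNView named `sceneView` is wired up in the storyboard.
class ARViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking fuses camera frames with IMU data to estimate
        // both rotation and translation -- all six degrees of freedom.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.delegate = self
        sceneView.session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // frame.camera.transform is a 4x4 matrix holding the device pose
        // in world space: 3 rotational + 3 translational degrees of freedom.
        let pose = frame.camera.transform
        _ = pose
    }
}
```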

Once you realize this, life is never the same. We felt as if we had been ripped off. To prove the theory, we built an AR demo. Here the user doesn't move around a model; he moves inside it and looks around. To make the demo even more exciting, we built it for a real, live project.

My wife is an interior designer. She has used 360° renders, VR scenes and AR for furniture objects. It lets her do a lot of work remotely. So there was a situation when she needed to show a kitchen design to a customer who was thousands of km away, on the construction site. So we built an ARKit demo of his kitchen design. Everything was modelled, and we made the window glass transparent, to be sure the projected and real worlds were perfectly aligned. And it worked!
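
A rough sketch of that idea, assuming the kitchen is exported as a SceneKit scene and the glass surfaces use a material named "WindowGlass" (both names are hypothetical, not from the actual demo):

```swift
import ARKit
import SceneKit

// Hypothetical sketch: anchor a 1:1 kitchen model at a chosen world pose
// and make the window glass fully transparent so the real view shows through,
// which also makes any misalignment with the real room immediately visible.
func placeKitchen(in sceneView: ARSCNView, at transform: simd_float4x4) {
    // "Kitchen.scn" and the "WindowGlass" material name are assumptions.
    guard let kitchenScene = SCNScene(named: "Kitchen.scn") else { return }
    let kitchenNode = SCNNode()
    for child in kitchenScene.rootNode.childNodes {
        kitchenNode.addChildNode(child)
    }

    // Keep the model at real-world scale (ARKit units are meters).
    kitchenNode.scale = SCNVector3(1, 1, 1)
    kitchenNode.simdTransform = transform

    // See-through glass: the real window lines up with the virtual one.
    kitchenNode.enumerateChildNodes { node, _ in
        for material in node.geometry?.materials ?? [] where material.name == "WindowGlass" {
            material.transparency = 0.0
        }
    }

    sceneView.scene.rootNode.addChildNode(kitchenNode)
}
```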

(Image: innasparrow.com)

Using such an application on a construction site, or anywhere else, you can naturally get close to small details and freely move around to feel the 1:1 scale of the future apartment. It's even more exciting than an ordinary room-scale VR scene.

(Image: innasparrow.com)

Let's be honest: a VR demo has its advantages, as the user is fully immersed in the scene through a stereo headset. But wait, an AR-ready phone can also show a stereo picture in VR goggles. So why not use a simple cardboard viewer, keep a hole for the phone camera, and use it for 6DoF tracking?
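
Here is a sketch of what I mean, assuming we take the single tracked pose from ARKit and shift it by half the interpupillary distance for each eye (the 0.063 m IPD below is just a typical value, not a measured one):

```swift
import simd

// Given one tracked head pose (e.g. ARFrame.camera.transform), derive the
// left and right eye poses for a cardboard-style stereo render by shifting
// half the interpupillary distance along the head's local X axis.
func eyeTransforms(head: simd_float4x4,
                   interpupillaryDistance: Float = 0.063) -> (left: simd_float4x4, right: simd_float4x4) {
    func translation(x: Float) -> simd_float4x4 {
        var t = matrix_identity_float4x4
        t.columns.3.x = x
        return t
    }
    let half = interpupillaryDistance / 2
    // Post-multiplying applies the offset in the head's local frame.
    let left = simd_mul(head, translation(x: -half))
    let right = simd_mul(head, translation(x: +half))
    return (left, right)
}

// The view matrix used to render each eye is the inverse of its pose.
func viewMatrix(forEye eye: simd_float4x4) -> simd_float4x4 {
    return simd_inverse(eye)
}
```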

Really, we are not so far from true mixed reality. What if your phone had two cameras spaced apart at the same distance as your eyes? It would let us feed two separate video streams to the two eyes and overlay virtual content from a slightly different perspective for each. The stereo effect would create a realistic feeling of depth. I hope that this distance would also help to estimate where real objects sit in the scene, so that the covered parts of augmented objects could be masked out.
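
For reference, this is the classic stereo geometry behind that feeling of depth: with two cameras separated by a baseline, the depth of a point follows from its disparity between the two images (the numbers below are purely illustrative):

```swift
// Classic pinhole stereo relation: depth = focalLength * baseline / disparity.
// focalLengthPixels: focal length in pixels, baselineMeters: camera spacing,
// disparityPixels: horizontal shift of the same point between the two images.
func depth(focalLengthPixels: Float, baselineMeters: Float, disparityPixels: Float) -> Float? {
    guard disparityPixels > 0 else { return nil } // zero disparity = point at infinity
    return focalLengthPixels * baselineMeters / disparityPixels
}

// Illustrative numbers only: with a ~1500 px focal length and an eye-like
// 0.063 m baseline, a 20 px disparity puts the point roughly 4.7 m away.
let z = depth(focalLengthPixels: 1500, baselineMeters: 0.063, disparityPixels: 20)
```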

It sounds so simple that it looks like magic. But in practice, it doesn't seem that phone hardware is powerful enough to do twice as many calculations. Even now, the projected environment isn't stable enough and often drifts a little. On a phone screen that's OK, but in a fully immersive virtual reality headset it doesn't work: the user will feel dizzy very fast.

I'm looking forward to the time when we finally get the next level of AR hardware.