Meta Prototype Show — Next Major VR Hardware Breakthrough?

Lucid Reality Labs
13 min read · Jul 5, 2022


Meta has yet again caused a stir among the tech community. This time it was with the unveiling of a whole set of VR headset prototypes, and a background sneak peek of some devices that looked very much like AR glasses. Counting 24 devices (see image below), Meta representatives, including Mark Zuckerberg and Chief Scientist Michael Abrash, spoke about four main names: Meta Half Dome, Meta Butterscotch, Meta Starburst and Meta Holocake, as well as a couple of previous iterations of these prototypes. While one could suspect this is part of a grand promotional campaign for the upcoming launch of Project Cambria, Meta’s newest and so far most powerful headset, we decided to look more closely at the key announcements that were made. A while back we already looked at Project Cambria in one of our previous articles, “XR Headsets to Look Forward to in 2022”, back when the headset itself and its capabilities were still a bundle of speculation and rumors. Today, we still don’t have a full picture of Project Cambria’s specs, aside from an estimated 2160x2160 per-eye resolution, a 90Hz refresh rate and a couple of other predictions, even though it seems rather obvious that Meta is planning to launch the VR headset at the end of 2022.

Image Source: Meta

Nevertheless, if we look beyond Project Cambria and the potential marketing move, Meta’s media roundtable and behind-the-scenes series “Inside the Lab”, this time hosted by both Mark Zuckerberg and Michael Abrash from Meta Reality Labs, revealed quite a few exciting technologies, ones that, if developed into final products, could become the next big VR headset breakthroughs of the decade. Combined with the extensive interview and on-site hands-on experience from Adam Savage’s Tested at Meta Reality Labs, we got a grand sneak peek of what is “cooking” inside Meta’s technology research centres, which also sets our expectations for more technological revelations to take the stage in the coming couple of years.

All in all, as some say, in their attempt to flex, Meta still uncovered quite a bit of their Virtual Reality (VR) technology, spoke a few words about Augmented Reality (AR), set the stage for some potential new features and capabilities, and dropped a couple of hints about their ongoing developments.

Meta Reality Labs claims to be working on advancing every aspect of the hardware technology, from lenses and sensors to silicon and software. Even though some of the items they showed are yet to take final shape, there is a good chance they could move from concept to prototype and further into products in the foreseeable future.

To begin with, both the media roundtable and the interview revolved around the VR headset prototypes, while mentioning some of the essential features an AR headset should also include. Most importantly, Meta highlighted a number of their key research directions, which include display resolution, optical focus shifting (varifocal technology), distortion, brightness, high dynamic range and headset form factor. They also talked quite a bit about the important milestones that ought to be achieved in technological development in order to pass the Visual Turing Test, which, according to Meta’s Chief Scientist Michael Abrash, no one has so far passed.

MICHAEL ABRASH, META CHIEF SCIENTIST

“The Visual Turing Test, which is the phrase we adopted along with other academic researchers, is a way to evaluate whether what’s displayed in VR can be distinguished from the real world. And it’s a completely subjective test, because what is important here is the human perception of what they’re seeing, the human experience, rather than technical measurements. And it’s a test that no VR technology can pass today. While VR already does create a strong sense of presence, of being in virtual places in a genuinely convincing way, it’s not yet at the level where anyone would wonder whether what they are looking at is real or virtual.”

While Meta concentrated their prototype demonstration efforts on VR headsets and on solving fundamental technical challenges in their development, they did mention AR headsets within the context of the Visual Turing Test. However, the potential timeframe for AR hardware shipment that Mark mentioned was estimated for the second half of this decade. Moreover, yet another major breakthrough in display technology will be required to achieve the level of realism and visual fidelity needed to pass the Visual Turing Test for any of Meta’s future AR and VR headsets.

As already mentioned, today there are still quite a few challenges and limitations in VR headset technology that need to be overcome, like resolution, distortion, eye focus and fatigue, before we even start approaching the Visual Turing Test. An ideal VR headset of the future would be a stand-alone device: compact, lightweight, capable of running for prolonged periods of time on a single battery charge, and with enough processing power to deliver photorealism and next-level immersion. The displays of these next-generation headsets would be sharp and stereoscopic (in 3D), with a wide field of view, more pixels and dynamic focus capabilities. Mark talks about these displays and the vision for their potential use.

MARK ZUCKERBERG, META FOUNDER & CEO

“Displays that match the full capacity of human vision are going to unlock some really important things. The first is a realistic sense of presence, that’s the feeling of being with someone or in some place as if you are physically there. And given our focus on helping people connect you can see why this is such a big deal.”

While the opportunities these new displays could deliver may be unlimited, Meta concentrates on connecting users: giving rise to new forms of art and self-expression, the next step in creativity, richness of experience, and the feeling of being physically present in a range of interactions, presumably in the Metaverse. But looking a bit further than simply connecting users in photorealistic environments, these displays hold great potential for a vast scope of businesses and institutions to integrate the new technology for education, upskilling, reskilling and training. This is especially essential for industries where photorealism will become the required benchmark set by the industries themselves, like MedTech and especially Healthcare.

But for now, let’s go back and look at the introduced VR headset prototypes, concepts and experiments that could be the key milestones in coming closer to passing the Visual Turing Test. Before we realistically get anywhere close to photorealism, quite a few fundamental challenges will need to be solved, taking into account how the human eye perceives visual information and how our brain processes and reconstructs it. Mark talks more about the general challenges of currently existing VR headsets, some of which have already been solved in the shown prototypes and some that still need further work.

MARK ZUCKERBERG, META FOUNDER & CEO

“You need stereoscopic displays to create 3D images, and you need to be able to render objects and focus your eyes at different distances, which is different from a traditional screen or display where you only need to focus at one distance. You need a display that can cover a much wider field of view than traditional displays. And having retina-level resolution across that whole field of view requires way more pixels than any traditional display. You need screens that can approximate the brightness and the dynamic range of the physical world, which requires 10 times or even more brightness than what we get on HD TVs today. You need realistic motion tracking that is low latency. You need to build a new graphics pipeline to power this type of display that can get the best performance out of CPUs and GPUs. It needs enough power but must not drain the battery or overheat, and it all has to be combined in a device that can fit on one’s face.”

The Chief Scientist of Meta also talks quite a bit about one of the more complex issues: the need to reduce lens distortion, which will have to be corrected via software, in real time and at the pace of eye movement. Michael Abrash also outlines the next-level challenges that will be set for later stages of VR hardware development. These include the vergence-accommodation conflict (VAC), the issue when the brain receives conflicting information regarding the distance of objects in the virtual environment; chromatic aberration (CA), the so-called “fringes” of color that can occur on the boundaries between light and dark areas of the image; ocular parallax, the minor depth-dependent image shifts that impact depth perception and realism; and pupil swim, which refers to the level of distortion associated with the pupil’s movement relative to the lens.
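To give a sense of the vergence-accommodation conflict in numbers (our own illustration, not a figure from Meta): the eyes converge on the depth at which a virtual object is rendered, while the headset optics keep the focal plane at a fixed distance, and the mismatch is commonly expressed in diopters (inverse meters).

```latex
% Hedged illustration: VAC expressed as the dioptric mismatch between where the
% eyes converge and where the fixed headset optics force them to focus.
\Delta D \;=\; \left| \frac{1}{d_{\text{vergence}}} - \frac{1}{d_{\text{focal}}} \right|
% Example: a virtual object rendered 0.5 m away, viewed through optics with an
% assumed fixed ~1.5 m focal plane, gives |1/0.5 - 1/1.5| \approx 1.33 diopters,
% well beyond the roughly half-diopter mismatch often cited as comfortable.
```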

Finally, let’s actually talk about the VR headset prototypes that were introduced: the Half Dome for varifocal technology, the Starburst with high dynamic range, the Butterscotch with high resolution, and of course the Holocake that demonstrated the lightweight form factor.

META HALF DOME PROTOTYPE

The Half Dome prototypes are a series of at least three devices demonstrating varifocal technology, introduced to show how auto focus could potentially work in a VR headset. Half Dome 1 was built back in 2018 and is the fourth varifocal headset among Meta’s prototypes, based in large part on the Oculus Rift. The headsets attempted to incorporate reliable eye tracking, which is essential for varifocal technology, with each generation experimenting with a different set of lenses and mechanical engineering to create a comfortable, low-vibration, low-noise dynamic focusing mechanism.

For these headset iterations, the concept was to change the way the lenses are built into the headset. Instead of sitting in a fixed position, the lenses move back and forth to change focus, replicating how a user’s eyes refocus in real life as they look at objects located close up and far away. Today’s VR devices don’t have the ability to adjust and switch focus, which puts a significant strain on users’ eyes and causes discomfort and eye fatigue.
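As a minimal sketch of how such a system could be driven in software (our own illustration; the eye tracker and lens actuator interfaces here are hypothetical, not Meta’s): the eye tracker estimates the depth the user is fixating on, and the lens is commanded to present the matching focal power.

```python
# Minimal varifocal control loop sketch (illustrative only, not Meta's implementation).
# `eye_tracker` and `lens_actuator` are hypothetical hardware interfaces.

def depth_to_diopters(depth_m: float) -> float:
    """Convert a fixation depth in meters to the focal power (diopters) the optics should present."""
    return 1.0 / max(depth_m, 0.1)  # clamp very near fixations to avoid extreme values

def varifocal_step(eye_tracker, lens_actuator, current_diopters: float, smoothing: float = 0.2) -> float:
    """One update tick: read gaze depth, low-pass filter the target, and drive the lens."""
    target = depth_to_diopters(eye_tracker.gaze_depth_m())
    # Smooth the command so the focusing mechanism doesn't chatter on every noisy eye-tracking sample.
    new_diopters = current_diopters + smoothing * (target - current_diopters)
    lens_actuator.set_focal_power(new_diopters)
    return new_diopters
```

In a real headset, the hard part is everything this sketch hides: reliable gaze-depth estimation, actuation speed, and keeping the mechanism quiet and vibration-free, which is exactly what the Half Dome generations experimented with.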

While working with these prototypes, distortion correction posed quite a challenge, as the perceived distortion varies with eye movement and has to be corrected by software. Ultimately, Meta reached a point where they built Half Dome 3 using pancake lenses, balancing weight, comfort, field of view (FoV) and enough power to run an electronic focusing mechanism.

At the end of the day, having an autofocusing VR headset could be a major game-changer for the content, usability and capabilities of VR headsets: technology that instantly adjusts to and replicates the mechanics of how human eyes work is something we would grow accustomed to and come to expect in our AR and VR devices.

META BUTTERSCOTCH PROTOTYPE

This particular prototype is aimed both at pushing display resolution and at benchmarking the point of diminishing returns for VR headset resolution. Aiming to match the retina-level resolution of 60 pixels per degree (PPD), the display could stay comfortable over prolonged hours of use. At the same time, it has a significant limitation when it comes to rendering 3D objects and environments in real time. Today, the chips that power VR headsets do not yet have enough processing power, are not compact enough, and cannot run off currently existing batteries without overheating or making the device form factor too bulky and uncomfortable to be head-worn.

At the same time, the software stack has yet to reach such a level of rendering and will have to develop in parallel with the hardware to take full advantage of retina-level resolution. While foveated rendering, which uses eye tracking to reduce resolution outside of the area the eye is focused on, is considered a solution, it will need to develop much further in its capabilities, to the extent where eye tracking can predict where the user will be looking milliseconds prior to the eye movement.
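To make the idea concrete, here is a toy sketch of the kind of resolution falloff foveated rendering applies (a generic illustration with made-up thresholds, not Meta’s pipeline):

```python
# Toy foveated-rendering falloff: shade the region around the gaze point at full
# resolution and drop the shading rate with angular distance (eccentricity) from it.
# The thresholds and scale factors below are made up for illustration.

def foveation_scale(eccentricity_deg: float) -> float:
    """Return a resolution scale factor (1.0 = full resolution) for a given
    angular distance from the current gaze point."""
    if eccentricity_deg < 5.0:       # foveal region: full detail
        return 1.0
    elif eccentricity_deg < 15.0:    # near periphery: half resolution
        return 0.5
    else:                            # far periphery: quarter resolution
        return 0.25

# e.g. a tile 20 degrees away from the gaze point would be shaded at quarter resolution:
print(foveation_scale(20.0))  # 0.25
```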

Nevertheless, the Butterscotch prototype aims to recreate something close to 20/20 human eyesight, which according to Mark works out to around 54.5 PPD and a total display resolution of roughly 8K. Today, that can only be achieved with the prototype wired to a PC; the future would have to deliver the next generation of lightweight and powerful chips for it to operate as a stand-alone device.
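As a rough back-of-the-envelope check (our own arithmetic, with an assumed field of view rather than figures from Meta), pixels per degree is simply the horizontal pixel count divided by the horizontal field of view:

```python
# Back-of-the-envelope pixels-per-degree (PPD) arithmetic.
# The ~90 degree field of view below is an assumption for illustration.

def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    return horizontal_pixels / horizontal_fov_deg

print(pixels_per_degree(1832, 90))   # Quest 2-class panel (1832 px per eye): ~20 PPD
print(55 * 90)                       # ~55 PPD over the same FoV needs ~5,000 horizontal
                                     # pixels per eye, which lands in the 8K-class range
```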

The other major challenge that would have to be solved for this prototype to start shaping into a real product is dynamic distortion, which is so far difficult to overcome at such a high level of resolution. At this point, Meta places its bets on the advancement of software rather than mechanics inside the VR headset. What is interesting, though, is that while testing out different lens options to achieve the desired resolution, Meta wrote light field simulation software that allows them to simulate a variety of lenses without restrictions and without having to actually build a variety of prototypes.

But to sum up, without doubt, having retina-level resolution could be a major game changer for experiences far beyond social interaction and communication.

META STARBURST PROTOTYPE

This particular prototype is most definitely a curious find, demonstrating a significantly high dynamic range. Inside the prototype, Meta uses a backlight roughly 200 times brighter than in any currently existing VR headset, equivalent to around 11K nits.

This level of light can unlock absolutely new possibilities for content, as it brings the next level of contrast between the light and dark ends of the spectrum, making it possible to build VR objects and environments with completely different levels of depth from what we experience in VR today, and thus making the next level of realism and immersion possible.
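For a rough sense of scale (our own arithmetic, assuming a typical ~100-nit VR panel as the baseline), every doubling of peak luminance adds one photographic “stop” of highlight headroom:

```python
# Rough highlight-headroom arithmetic; the 100-nit baseline is an assumption,
# the 11,000-nit figure is the one quoted in the article.
import math

baseline_nits = 100
starburst_nits = 11_000

extra_stops = math.log2(starburst_nits / baseline_nits)
print(f"~{extra_stops:.1f} extra stops of highlight range")  # ~6.8 stops
```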

What’s interesting in this particular case is that the Starburst research team was not limited in any way by form factor, hence the bulky exterior. This allowed them to demonstrate the possibilities of adding high visual contrast and the difference it could make to the visual perception of the experience altogether.

Further research, especially in combination with retina-level resolution and accurate eye tracking, could impact how the VR industry and virtual existence are perceived as a whole, making Meta’s vision of a massively engaging Metaverse more possible than ever before.

META HOLOCAKE PROTOTYPE

The last but not least of the prototypes demonstrated was the Holocake, where the form factor played an essential role, covering both visual comfort and effective weight. Today, this VR prototype works tethered to a PC, but it could potentially become a stand-alone device of the next generation.

Described as having holographic optics and polarized reflection, this much thinner and lighter prototype could be the benchmark for many future VR and even AR devices. Generally speaking, immersive hardware would have to make a major leap into the future for VR to fit into something like a sunglasses frame. There will need to be a tremendous leap in display, silicon, software and many other technologies to make it happen. At this point, Meta is still looking for effective solutions to power the headset; however, looking at how developments are going, we can predict it becoming possible in the second half of the decade.

Still, the second generation of the prototype, Holocake 2, is planned to include varifocal optics and eye tracking. Even though today it is still a concept, once the architecture is proven, if it actually can be, it would without doubt be a significant breakthrough in both AR and VR technology.

Let’s be honest, the introduced prototypes and the technology being developed sound not just promising but like a potential major step forward for the next generation of immersive devices. While Project Cambria promises to feature pancake lenses, delivering advanced levels of resolution, it seems that Meta is set on integrating them into further generations of both AR and VR devices as well.

While working on the optical stack, Meta has tried incorporating a spectrum of different lenses that could be activated in different combinations, as well as lenses that could be electronically actuated to move back and forth depending on the focus range. The vision is to have a lens that could be rapidly activated, eliminating the need to physically move the lens inside the headset, which is the main idea behind varifocal lens technology. Realistically, Meta expects it to take another five to six years, considering the milestones that have already been achieved today.

Let’s not forget that Meta also mentioned AR headsets. Even though no actual prototypes were revealed, we can count on another big reveal of augmented reality technology, displays and hardware, of what could be considered AR glasses or headsets, in the near future.

With all that being said, Meta still plans to make their hardware affordable for most consumers, so that the technology becomes as widespread as mobile phones are today. This goes hand in hand with their vision of unlocking a new field of social interactions, collaboration, work and social gaming. At the same time, they do not exclude another major segment aimed at professional and enterprise use.

Thus, there would be two price points: one closer to a mobile phone, potentially around the current Quest 2 price of $299 USD, and a second within the professional PC range, yet still affordable for many; here we can only assume it to fall within the $1,000-$2,000 USD range. The more advanced headsets would originally launch for the high-end business segment and then transition to the consumer level. One thing is clear: they will without doubt be competing with Apple, their long-time technological rival, and with the coming generations of Apple’s much-anticipated MR headsets and AR glasses.

So, what we have learned from Meta is that they currently have a couple of Quest generations in the works, as well as potential AR headsets. With all that, the big question today still remains: when can we actually expect a VR headset developer to build wearable technology that feels visually indistinguishable from a real-world environment? One thing is certain: Meta is developing many of its devices to be smaller, lighter and, importantly, affordable for the general public, aiming to take over the Metaverse-connected hardware market.

Authors: Alex Dzyuba, Lucid Reality Labs Founder & CEO | Anna Rohi, Lucid Reality Labs Senior Marketing & Communications Manager. You can find the original article published here; for more articles and news on immersive technology, follow the Lucid Reality Labs Blog.
