VividQ and Dispelix create 3D wearable augmented reality technology

VividQ, a developer of 3D holographic display technology for augmented reality applications, has teamed up with waveguide designer Dispelix to create a new 3D imaging technology.

The companies said the technology was considered nearly impossible just two years ago. They said they designed and fabricated a “waveguide combiner” that can simultaneously and accurately display variable-depth 3D content within a user’s environment. For the first time, users will be able to enjoy immersive AR gaming experiences where digital content can be placed in their physical world and interacted with naturally and comfortably. The technology can be used in wearable devices such as AR headsets or smart glasses.

The two companies also announced a business partnership to develop the new 3D waveguide technology for mass-production readiness, enabling headset manufacturers to start their AR product roadmaps now.


Early augmented reality experiences seen so far through headsets such as Magic Leap, Microsoft HoloLens, and Vuzix produce 2D stereoscopic images at fixed focal distances, or one focal distance at a time. This often causes eye strain and nausea for users and doesn’t deliver the necessary immersive 3D experiences: for example, objects can’t be interacted with naturally at arm’s length, and they are not convincingly placed in the real world.

In order to deliver the kinds of immersive experiences needed for AR to reach mass-market adoption, consumers need a sufficient field of view and the ability to focus on 3D images across the full range of natural distances, anywhere from 10 cm to visual infinity, simultaneously, just as they naturally focus on physical objects.
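That range is easiest to appreciate in diopters (inverse meters), the unit optical designers use to describe focal demand. The following sketch uses illustrative numbers only; the "2 m" fixed-focus figure is a common industry convention, not a spec from either company:

```python
# Focal demand in diopters (1/metres), the unit optical designers use.
# A fixed-focus display sits at a single point on this scale, while
# natural scenes span the whole range the article cites.

def diopters(distance_m: float) -> float:
    """Optical power needed to focus at the given distance."""
    return 1.0 / distance_m

for label, d in [("10 cm (close interaction)", 0.10),
                 ("2 m (typical fixed-focus headset)", 2.0),
                 ("optical infinity", float("inf"))]:
    print(f"{label}: {diopters(d):.1f} D")
```

A fixed-focus headset covers just one of these values; natural vision sweeps the full 0–10 D range continuously.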

The waveguide combiner is the industry’s preferred method for displaying AR images in a compact form factor. Optimized for 3D applications such as games, this next-generation waveguide and its accompanying software mean consumer brands around the world can unlock their full market potential.

Waveguides (also known as “combiners” or “waveguide combiners”) give AR headsets a lightweight, traditional-looking front end (i.e., one that looks like a regular glass lens), and are critical to their widespread adoption. Aside from the form-factor advantages, the waveguides on the market today perform a process called pupil replication. This means they can take a small image (known as the “eyebox”) and effectively make it larger by creating a grid of tiny copies of the image in front of the viewer’s eye, a bit like binoculars that create multiple views instead of displaying one. This is necessary to make wearable AR comfortable and easy to use.
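Schematically, pupil replication just tiles one small exit pupil into a grid in front of the eye. The sketch below is purely illustrative: the real copies are made optically inside the glass, not computed on pixels:

```python
# Pupil replication, schematically: the waveguide copies one small exit
# pupil into a grid in front of the eye, so the viewer's pupil can pick
# up the image from many positions.

single = [[0, 1, 0],
          [1, 1, 1],
          [0, 1, 0]]  # one tiny copy of the projected image

def replicate(tile, rows, cols):
    """Tile a small 2D pattern into a rows x cols grid of copies."""
    return [row * cols for row in tile] * rows

grid = replicate(single, rows=3, cols=4)
print(len(grid), len(grid[0]))  # 9 12
```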

Small eyeboxes are notoriously difficult to line up with the pupil, and the eye can easily “lose” the image if it is not aligned properly. This requires the headset to be a perfect fit for the user, since differences in interpupillary distance (IPD) between users can mean their pupils do not line up with the eyebox, leaving them unable to see the virtual image.

Because there is a fundamental trade-off between image size (called the “eyebox” or “exit pupil”) and field of view (FoV), this replication allows the optical designer to make the eyebox very small, relying on the replication process to present a large effective image to the viewer while also maximizing FoV.
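The trade-off follows from conservation of étendue: for a fixed optical budget, eyebox width times the sine of the half field of view is roughly constant, so widening the FoV shrinks the single eyebox. The numbers below are invented for illustration and are not VividQ’s or Dispelix’s specifications:

```python
import math

# Conservation of etendue: eyebox width x sin(half-FoV) is roughly
# constant for a fixed optical budget, so a wider FoV forces a smaller
# single eyebox -- which is why pupil replication is needed.

INVARIANT_MM = 4.0  # assumed eyebox (mm) x sin(half-FoV) budget

def eyebox_mm(fov_deg: float) -> float:
    """Single-eyebox width implied by the invariant at this full FoV."""
    return INVARIANT_MM / math.sin(math.radians(fov_deg / 2))

for fov in (20, 30, 40, 50):
    print(f"FoV {fov:2d} deg -> single eyebox ~{eyebox_mm(fov):.1f} mm")
```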

“There has been significant investment and research into technology that can create the kinds of augmented reality experiences we dreamed of, but it just doesn’t live up to basic user expectations,” said Darran Milne, CEO of VividQ. “In an industry that has already seen its fair share of hype, it can be easy to dismiss any new invention as more of the same, but the fundamental problem has always been the complexity of viewing 3D images placed in the real world with a decent field of view and with an eyebox large enough to accommodate a wide range of IPDs (interpupillary distances, or the distance between a user’s pupils), all delivered through a lightweight lens.”

Added Milne: “We solved that problem, designed something that could be manufactured, tested and proven, and built the manufacturing partnership needed to mass-produce it. It’s a breakthrough because without 3D imaging you can’t deliver augmented reality. Simply put, while others have been developing a 2D display to wear on your face, we have developed the window through which you will experience the real and digital worlds in one place.”

Concept image of a simulation game where the user can interact with a digital world at arm’s length.

VividQ’s patented 3D waveguide combiner is designed to work with the company’s software, both of which can be licensed by wearable manufacturers in order to build a wearable product roadmap. VividQ’s holographic rendering software works with standard game engines like Unity and Unreal Engine, making it easy for game developers to create new experiences. The 3D waveguide can be manufactured and supplied at scale through VividQ’s manufacturing partner Dispelix, a company based in Espoo, Finland, that makes transparent waveguides for wearable devices.

“Wearable augmented reality devices have huge potential all over the world. For applications such as gaming and professional use, where the user needs to be immersed for long periods of time, it is critical that the 3D content is realistic and placed in the user’s environment. This also overcomes the problems of nausea and fatigue. We are very excited to work with VividQ as its waveguide design and manufacturing partner on this amazing 3D waveguide,” Dispelix said.

At its headquarters in Cambridge, UK, VividQ has demonstrated its software and 3D waveguide technology to device manufacturers and consumer tech brands, with whom it works closely to deliver the next generation of wearable augmented reality devices and make immersive AR gaming a reality.

The task the companies achieved was described as “almost impossible” in a paper published in the journal Nanophotonics in 2021.

Current waveguide combiners assume that the incoming light rays are parallel (and hence form a two-dimensional image), because they require the light bouncing inside the structure to follow paths of the same length. If you were to input diverging rays (a hologram), the light paths would all be different, depending on where on the input hologram each ray originated.

This is a big problem: it effectively means the extracted light has traveled different distances, and the result is that the viewer sees multiple partially overlapping copies of the input image, all at arbitrary distances, which makes it essentially useless for any application. This new 3D waveguide combiner, by contrast, is able to adapt to diverging rays and display images correctly.
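The path-length issue can be seen with simple geometry. In a toy model of a planar waveguide, parallel (collimated) rays all share one propagation angle, so every internal bounce covers the same in-glass distance; diverging rays from a hologram enter at different angles, so their per-bounce paths differ. The thickness and angles below are invented for illustration:

```python
import math

# Toy model of light bouncing inside a planar waveguide by total internal
# reflection. Parallel rays share one propagation angle, so each bounce
# covers the same in-glass path; diverging rays (as from a hologram)
# enter at different angles, so their paths differ.

THICKNESS_MM = 1.0  # assumed waveguide thickness

def path_per_bounce(angle_deg: float) -> float:
    """In-glass path length for one top-to-bottom bounce at this angle
    (measured from the waveguide's surface normal)."""
    return THICKNESS_MM / math.cos(math.radians(angle_deg))

# Parallel (collimated) input: one angle, identical path lengths.
parallel = [path_per_bounce(50.0) for _ in range(3)]

# Diverging input: rays leave a point at slightly different angles.
diverging = [path_per_bounce(a) for a in (45.0, 50.0, 55.0)]

print("parallel ", [f"{p:.3f}" for p in parallel])   # all equal
print("diverging", [f"{p:.3f}" for p in diverging])  # all different
```

It is these unequal path lengths that smear a naive hologram into overlapping copies at random depths.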

VividQ’s 3D waveguide consists of two components: first, a modification of the standard pupil-replicating waveguide design described above; and second, an algorithm that computes a hologram that corrects the distortion caused by the waveguide. The hardware and software components work in harmony with each other, so the VividQ waveguide cannot be used with anyone else’s software or system.
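The general idea of correcting a known distortion in software is a classic inverse filter: if the waveguide’s effect on the input can be modeled, the software applies the inverse of that model before the light enters the glass, so the two effects cancel. The sketch below is a generic illustration of that principle with a made-up per-pixel model; it is not VividQ’s actual algorithm:

```python
# Toy predistortion sketch: model the waveguide's effect as a known
# per-pixel attenuation, apply the inverse model to what is sent, and
# the distortion cancels so the viewer sees the intended image.
# Generic inverse-filter illustration, not VividQ's algorithm.

target = [1.0, 0.5, 0.8, 0.2]        # intended image (toy 1D "pixels")
distortion = [0.9, 0.7, 0.95, 0.6]   # assumed per-pixel waveguide model

# What the software sends: the target with the inverse model applied.
precompensated = [t / d for t, d in zip(target, distortion)]

# What the waveguide delivers: its distortion applied to that input.
observed = [p * d for p, d in zip(precompensated, distortion)]

print([round(o, 6) for o in observed])  # matches the target
```

The real problem is far harder, since the waveguide’s distortion varies with depth and angle, but the hardware-plus-software pairing described in the article follows this same cancel-the-known-model shape.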

VividQ has more than 50 people in Cambridge, London, Tokyo, and Taipei. The two companies began working together in late 2021. VividQ was founded in 2017 and traces its origins to the photonics department at the University of Cambridge and Cambridge Judge Business School.

The company has so far raised $23 million in investments from deep tech funds in the UK, Austria, Germany, Japan, and Silicon Valley. When asked about the inspiration, Tom Durant, CTO of VividQ, said in an email to GamesBeat: “Understanding the limitations and then working around them. Once this path was identified, our multidisciplinary team of researchers and engineers across optics and software set out to solve each one. Rather than simply considering this an optics issue, our solution relies on hardware and software that are designed to work in tandem.”

As to how this differs from competing technologies, the company said that existing waveguide combiners on the market can only present images in two dimensions at specific focal distances, usually about two meters in front of the viewer.

“You can’t bring them closer to focus on, or focus on other digital objects that are farther away,” the company said. “And when you look at these digital objects floating in front of you, you can very quickly experience eye strain and VAC (vergence-accommodation conflict), which causes nausea. For games, that makes them very limiting. You want to create experiences where the user can pick up an item in their hand and do something with it, without the need for a controller. You also want to place multiple digital elements in the real world, with the freedom to focus on them and on real objects as close as you like, resulting in a strong sense of immersion.”
