Development of See-Through AR Glasses (2011-2013)
Product Design of See-Through AR Glasses (2013)

Our journey into augmented reality (AR) eyewear began with a bold question in 2010: "Could we augment real-world events in real time?"

That year, we secured nearly $1 million in funding to develop a proof of concept (PoC) for the Emir of Qatar as part of the 2011 AFC Asian Cup. Qatar was hosting the tournament in January 2011 and was exploring cutting-edge technologies for the opening ceremony and other special events. The goal? To augment the opening ceremony by displaying a massive AR football player in Khalifa Stadium, visible to special guests.

That project was both thrilling and intense, as we had just seven months to pull together a working PoC that could demonstrate the power of augmented reality on a grand scale. Our small team rapidly expanded to 22 engineers, creatives, and software specialists, all focused on pushing the limits of AR technology at the time. I was burning the candle at both ends, leading the product design, creative, and software engineering teams, while also being hands-on with most aspects of the project. Whether it was solving technical challenges or guiding the overall vision, I felt deeply immersed in the work, and it made me feel very much alive.

Paul Kouppas demonstrating a see-through AR experience at Khalifa Stadium, Qatar. The system used real-time tracking of a large fiducial marker, while the Unity 3D animated experience - visible through the AR glasses - could also be seen on the computer screen.

There's much more to explore about this project, but I'll save the full story for another post.

It did, however, set the stage and was the catalyst for much of the innovation that followed, forcing us to tackle significant technical challenges like drift, form factor, and latency - all of which shaped our approach to developing see-through AR glasses. While the AFC project laid the groundwork, our focus shifted to miniaturisation, as our initial see-through prototype was very large and clunky. We began exploring how we could take the large-scale AR concept we had developed and refine it into something more portable, wearable, and suited to everyday use.

Our AFC see-through AR prototype featured Immersion (USA) technology with our computer vision and Unity 3D solution.

Some of our early, smaller prototypes were video passthrough AR devices, built by modifying video glasses marketed and sold as a “cinema in your pocket” at the time, sourced from a manufacturer in Hong Kong. We attached a webcam to our modified video glasses to track a large fiducial marker, which we hand-sewed from sailcloth.


Cameron Griffin (our lead software engineer) demonstrates one of our video passthrough AR prototypes.

These initial experiments laid the foundation for our ultimate goal: making AR accessible through see-through eyewear.


Defining the Objective

Our primary objective was to design and build AR glasses that would be lightweight, ergonomic, and equipped with the necessary technology to deliver a smooth AR experience. The glasses had to support real-time AR rendering and deliver interactive content directly into the user’s field of view. We wanted a device that not only delivered on functionality but could also be used in various environments—from a busy street to open outdoor spaces like stadiums.

Key objectives included:

  • Integrating AR optical parts into a lightweight, ergonomic frame.
  • Achieving real-time 3D rendering and precise 3D tracking for interactive overlays.
  • Providing seamless user interaction without visible latency or disconnection between virtual objects and the real world.


Hardware and Technical Details

As I look back at the early development of our AR glasses, I’m reminded of a comprehensive list of components that Peter Koch and I outlined in an R&D document back in July 2011. At the time, we were ambitiously defining the technical elements that we believed would be essential for delivering high-quality AR experiences through see-through eyewear. It’s fascinating to see that many of the features we proposed back then have become standard in the top AR and VR devices on the market today.

Our goal was to make sure the glasses could handle real-time data transmission, video output, and sensor management, while still keeping the hardware lightweight enough to be wearable. We divided the components into inputs and outputs, each with specific purposes to enhance the user experience.

Here’s a snapshot from that document:

Inputs:

  • 9DOF inertial positioning sensor for tracking head orientation, using a 3-axis gyroscope, magnetometer, and accelerometer.
  • Forward light sensor to adjust display brightness based on the surrounding environment.
  • Camera input for tracking the user’s viewpoint, with VGA 30 FPS as the minimum standard. We also considered stereoscopic tracking and using additional cameras for things like gesture input.
  • Control buttons that could be software-configurable for home or select functions.
  • Altitude sensor to track vertical movement and adjust accordingly.
  • Voice and sound inputs to filter environmental noise and process data such as digital tones.
  • Eye pupil tracking module to optimise display content based on where the user was looking.
  • RFID tags to help identify the glasses and allow for external tracking.

Outputs:

  • Multi-channel audio for immersive sound through integrated headphones or speakers.
  • Monocular and stereoscopic displays, with a DVI signal for 2D monocular use at 800x600, 60Hz, or 3D stereoscopic video using interleaved frames via HDMI 1.4.
  • Status LEDs to indicate operational states and provide alerts to the user.

Of course, not everything on that list made it into our minimum viable product (MVP). We had to make some tough decisions and ultimately focused on the core components—things like 9DOF tracking, a front-facing camera for SLAM (Simultaneous Localization and Mapping), and basic control inputs. Features like voice and sound inputs, eye pupil tracking, and camera tracking for gesture input had to be shelved so we could prioritise the elements that would ensure the product’s stability and effectiveness.


SLED Design

As we added more sensors, cameras, and components, the strain on the processing unit grew significantly. To address this, we implemented a multiplexed digital protocol between the computer and the glasses, ensuring efficient real-time data transmission while managing the load on the system.

This is where the SLED (our product name for it) came into play - an embedded processor specifically designed to handle tasks like sensor management. By processing key functions locally before sending data to the glasses, the SLED reduced the burden on the main system, allowing us to streamline operations and maintain performance.
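To make the idea concrete, here's a minimal sketch of what "processing key functions locally" looks like - written in Python rather than embedded firmware, with the class name, sample rates, and averaging step all being illustrative assumptions rather than how our SLED actually worked. The embedded side samples the IMU quickly, condenses the data, and only the reduced updates cross the link to the host.

```python
import statistics

class SensorAggregator:
    """Illustrative sketch of offloading sensor management to an embedded board:
    sample the IMU at a high rate locally, then forward a reduced, averaged
    update to the host at a lower rate. Names and rates are assumptions."""

    def __init__(self, forward_hz=60, sample_hz=480):
        self.samples_per_update = sample_hz // forward_hz
        self.buffer = []

    def push_sample(self, gyro_xyz):
        """Called for every raw IMU reading taken on the embedded side."""
        self.buffer.append(gyro_xyz)
        if len(self.buffer) >= self.samples_per_update:
            update = self._reduce()
            self.buffer.clear()
            return update          # only this condensed record crosses the link
        return None

    def _reduce(self):
        # Average each axis over the window - a crude noise filter that also
        # cuts the host-bound data rate by a factor of sample_hz / forward_hz.
        return tuple(statistics.fmean(axis) for axis in zip(*self.buffer))

# Feed in fake 480 Hz gyro readings; only one in eight results in host traffic.
agg = SensorAggregator()
sent = sum(1 for i in range(480) if agg.push_sample((0.01 * i, 0.0, 0.0)) is not None)
print(sent, "updates forwarded for 480 raw samples")
```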


Solving for Performance and Efficiency

One of the main challenges we faced was managing I/O bandwidth. While we aimed to transmit all input and output data through the digital interface, it quickly became clear that video output had to be handled directly by the hardware (via DVI or HDMI) to avoid overwhelming the CPU. This division of labour—allowing hardware to handle video output and passing sensor, camera, and audio data through a multiplexed digital connection—was essential for maintaining smooth performance.
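A quick back-of-the-envelope calculation shows why. The display mode (800x600 at 60Hz) and the VGA 30 FPS camera come straight from the component list above; the 24-bit colour depth, 100Hz IMU rate, and CD-quality audio format are assumptions I've added purely for illustration.

```python
def mbps(width, height, fps, bits_per_pixel):
    """Raw (uncompressed) video bandwidth in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

# Display output from the spec: 800x600 at 60 Hz (24-bit colour assumed).
display = mbps(800, 600, 60, 24)           # ~691 Mbps
# VGA tracking camera at 30 FPS, assuming no compression.
camera = mbps(640, 480, 30, 24)            # ~221 Mbps
# 9DOF IMU: 9 float32 channels at an assumed 100 Hz update rate.
imu = 9 * 32 * 100 / 1e6                   # ~0.03 Mbps
# Stereo 16-bit audio at 44.1 kHz.
audio = 2 * 16 * 44_100 / 1e6              # ~1.4 Mbps

print(f"display out: {display:.0f} Mbps, camera in: {camera:.0f} Mbps, "
      f"IMU: {imu:.3f} Mbps, audio: {audio:.1f} Mbps")
```

Raw display video alone runs to hundreds of megabits per second - orders of magnitude more than the sensor and audio traffic - which is why keeping it on a dedicated DVI/HDMI path made sense. (The camera figure above assumes an uncompressed feed; how the tracking stream was actually reduced is beyond what I'll cover here.)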

The remaining I/O (sensor data, camera feeds, and audio input) was transmitted through a single digital connection. To accommodate this, we developed a robust communication protocol capable of multiplexing various data types, similar to USB but operating at the software layer.


The Multiplexed Protocol and Communication Strategy

To ensure scalability and flexibility, we designed the protocol to function as a single-byte stream with encoded data messages. This abstraction allowed communication across multiple transmission types—whether wired (serial), wireless (Bluetooth), or via a network. For early prototypes, we opted for a reliable, error-corrected stream, essential given the close proximity and point-to-point connections of the devices.
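As an illustration of that single-byte-stream idea - not our actual protocol, whose message IDs and framing I won't reproduce here - a minimal Python sketch of checksummed, length-prefixed frames multiplexing different message types over one stream might look like this:

```python
import struct
import zlib

# Hypothetical message type IDs - purely illustrative.
MSG_IMU = 0x01
MSG_CAMERA_FRAME = 0x02
MSG_AUDIO = 0x03
MSG_BUTTON = 0x04

SYNC = 0xAA  # start-of-frame marker

def encode_frame(msg_type: int, payload: bytes) -> bytes:
    """Pack one message as: sync byte, type, 2-byte length, payload, CRC32."""
    header = struct.pack("<BBH", SYNC, msg_type, len(payload))
    crc = struct.pack("<I", zlib.crc32(header + payload))
    return header + payload + crc

def decode_stream(buffer: bytes):
    """Yield (msg_type, payload) tuples from a byte stream, skipping corrupt frames."""
    i = 0
    while i + 4 <= len(buffer):
        if buffer[i] != SYNC:
            i += 1  # resynchronise on the next sync byte
            continue
        msg_type, length = struct.unpack_from("<BH", buffer, i + 1)
        end = i + 4 + length + 4
        if end > len(buffer):
            break  # incomplete frame; wait for more data
        payload = buffer[i + 4 : i + 4 + length]
        (crc,) = struct.unpack_from("<I", buffer, i + 4 + length)
        if crc == zlib.crc32(buffer[i : i + 4 + length]):
            yield msg_type, payload
            i = end
        else:
            i += 1  # bad checksum: treat as noise and resync

# Example: multiplex an IMU sample and a button event over the same stream.
imu_payload = struct.pack("<9f", *([0.0] * 9))        # gyro, accel, magnetometer
stream = encode_frame(MSG_IMU, imu_payload) + encode_frame(MSG_BUTTON, b"\x01")
for msg_type, payload in decode_stream(stream):
    print(msg_type, len(payload))
```

Because everything reduces to framed bytes, the same encoder and decoder work whether the bytes travel over a serial cable, Bluetooth, or a network socket.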

Naturally, you might be wondering how we planned to display digital information on a clear lens. While I had confidence in our product design, software development, and ability to create immersive AR experiences, optics and miniature displays were not our areas of expertise. To address this, I sought out specialists in the field. At the time, experts in miniature displays or projection technology were relatively scarce.

We explored various optical technologies from companies such as Immersion (USA), AAXA Technologies (USA), and Optinvent (France). We selected the team at Optinvent, which had originally specialised in rear-screen TV projection and had successfully pivoted to miniaturised optical modules after the television market transitioned to plasma screens. Their expertise (and the fact they were great guys too) made them an ideal partner for our needs.


Optinvent's Clear-vu Modules

They had developed a patented technology called Clear-vu, based on moulded plastic components. This innovation allowed for low-cost mobile video and informative eyewear applications, offering see-through video or data display capability. Recognising the potential of the Clear-vu modules, we integrated them into our product design.

We embedded their early prototype optical modules into our glasses and worked closely with Kayvan Mirza, Khaled Sarayeddine, and their team, communicating our requirements and vision for how their product could be adapted for our specific use case in AR glasses. Although their initial field of view (FOV) was somewhat limited, we were confident they could expand it to meet our target of 40 degrees. I'm happy to report that they did, and they have since improved their technology further and now focus on engineering AR display technologies.


Peter Koch’s Early iPod Touch Prototype

One of our earliest breakthroughs came when Peter Koch rigged an iPod Touch (4th generation) to the Optinvent modules to test some of our core concepts. Using its 3-axis gyroscope for head tracking, the iPod Touch provided 3 Degrees of Freedom (3DOF), capturing rotational movements—pitch, yaw, and roll—so we could track how users turned their heads while wearing the glasses. However, it lacked the ability to track positional movements in space, such as forward/backward or up/down motion, which are essential for full 6DOF tracking.
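For readers who haven't worked with head tracking before, here's a small Python sketch of what 3DOF gives you: the head's orientation is applied (inverted) to world-anchored content so it appears to stay put as you look around. The axis conventions and rotation order here are illustrative assumptions, not the exact maths from our Unity project.

```python
import math

def rotation_from_euler(yaw, pitch, roll):
    """Head orientation as a 3x3 rotation matrix from yaw (Y), pitch (X), roll (Z)
    in radians. The axis conventions are an assumption; real devices define their own."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    ry = [[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]
    rx = [[1, 0, 0], [0, cp, -sp], [0, sp, cp]]
    rz = [[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]]
    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
    return matmul(matmul(ry, rx), rz)

def world_to_view(direction, head_rotation):
    """Rotate a world-space direction into the head's view space (inverse = transpose)."""
    return [sum(head_rotation[k][i] * direction[k] for k in range(3)) for i in range(3)]

# A virtual object anchored one metre straight ahead of the user's starting pose.
anchor = [0.0, 0.0, 1.0]

# As the head yaws 30 degrees, the anchored point moves the opposite way in view
# space, which is exactly what makes it appear fixed in the world.
head = rotation_from_euler(math.radians(-30), 0.0, 0.0)
print(world_to_view(anchor, head))
```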

Despite these limitations, the setup allowed us to perform critical early tests with Unity 3D to render AR content in real time. Unity’s flexible platform let us build interactive 3D experiences that could be projected into the user’s real-world environment. Its real-time rendering engine was invaluable for simulating AR interactions and making adjustments quickly. We were able to experiment with object positioning and how AR content responded dynamically to head movements. Unity’s versatility was key in bridging the gap between digital content and the physical world.

By tracking rotational motion, we created an interactive experience where digital content appeared to float in front of the user's eyes, reacting to their head movements. However, to make the AR experience fully immersive, we needed to go beyond rotation and incorporate positional tracking. To address this, we began experimenting with SLAM, but the limitations of the iPod Touch highlighted the need for a more advanced sensor solution. This led us to develop our own 9DOF sensor, which combined a gyroscope with an accelerometer and magnetometer, giving us drift-corrected rotational tracking plus the motion data needed to work towards positional tracking.
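To illustrate one standard way of fusing those sensors - a textbook complementary filter, not necessarily the algorithm we actually shipped - here's a short Python sketch showing how an accelerometer's gravity reading can cancel the slow drift you get from integrating a gyro alone. In a full 9DOF solution, the magnetometer plays a similar role for yaw.

```python
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update step: integrate the gyro for responsiveness, then nudge the
    estimate toward the accelerometer's gravity-derived angle to cancel drift.
    alpha near 1 trusts the gyro short-term; (1 - alpha) trusts the accelerometer."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def pitch_from_accel(ax, ay, az):
    """Pitch angle (radians) implied by the measured gravity vector."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

# Simulated stream: the head is actually still, but the gyro has a small bias.
# Raw integration wanders; the fused estimate stays near zero.
dt, gyro_bias = 0.01, math.radians(0.5)    # 100 Hz updates, 0.5 deg/s bias
raw, fused = 0.0, 0.0
for _ in range(1000):                      # ten seconds of samples
    accel_pitch = pitch_from_accel(0.0, 0.0, 9.81)   # gravity says "level"
    raw += gyro_bias * dt
    fused = complementary_filter(fused, gyro_bias, accel_pitch, dt)
print(math.degrees(raw), math.degrees(fused))  # raw drifts ~5 deg; fused stays a fraction of a degree
```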

We also built a light sensor into the hardware to adjust the brightness of the digital content based on real-world lighting conditions. The idea was for the digital content to brighten in well-lit environments and dim in darker settings, allowing it to blend more seamlessly with the real world. I'm not sure we ever got around to fully implementing this feature, as it slipped down the priority list given the number of cutting-edge tasks we had set ourselves, but it was definitely a cool thing to have engineered into the hardware.
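The mapping itself is simple in principle. Here's an illustrative Python sketch - the lux thresholds, output range, and logarithmic curve are my assumptions, not values from our firmware:

```python
import math

def display_brightness(ambient_lux, low_lux=10.0, high_lux=10000.0,
                       min_level=0.15, max_level=1.0):
    """Map an ambient light reading to a display brightness level in [min_level, max_level].
    Thresholds and the log-style curve are illustrative, not measured values."""
    lux = max(low_lux, min(ambient_lux, high_lux))
    # Brightness perception is roughly logarithmic, so interpolate in log space.
    t = (math.log10(lux) - math.log10(low_lux)) / (math.log10(high_lux) - math.log10(low_lux))
    return min_level + t * (max_level - min_level)

for lux in (5, 50, 500, 5000, 20000):   # dark room through to bright daylight
    print(lux, round(display_brightness(lux), 2))
```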

This rig was one of the first integrations of see-through AR eyewear connected to a mobile device, allowing us to quickly prototype and test different AR experiences. Interestingly, over a decade later, companies like XREAL, Qualcomm, and Intel are developing similar concepts where AR glasses are powered by smartphones.


Designing the AR Glasses

When I set out to design the AR glasses, I had to consider several key components that needed to be integrated seamlessly into a compact, wearable form. The design had to accommodate the Optinvent display module, HD cameras, light sensor, and 9DOF sensors, as well as the cable that connected to our SLED, which itself required an enclosure design. Additionally, I wanted the glasses to feature fashionable covers that could house prescription lenses, ensuring both style and functionality.

I worked through many 3D iterations and developed rapid prototypes to test the fit, form factor, and overall assembly of the initial prototypes. This approach allowed us to refine the design, ensuring that each component fit perfectly, was easy to assemble, and remained ergonomic and lightweight for the user. The goal was to create AR glasses that were functional, practical, and aesthetically pleasing.



2013 Visionary Video

In 2013, we produced a visionary video to showcase the potential of AR eyewear by 2020. The video illustrated how AR could transform everyday experiences—from enhanced navigation to interactive workspaces. The goal was to encourage investment by showing investors what would be possible with AR eyewear in the near future.

In the video, we made a point of accurately demonstrating the FOV of our working prototype, which was 26 degrees, and compared it with our discussions with Optinvent, who were (at the time) improving their technology to achieve a 40-degree FOV. The video garnered attention and even earned us a feature on Cybershack, a popular Australian TV show known for its in-depth coverage of consumer technology, gaming, and gadgets.
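To give a feel for what that difference means, the width of the virtual image at a given viewing distance scales with tan(FOV/2), so 40 degrees is roughly one and a half times as wide as 26 degrees. A quick Python check (the 2-metre viewing distance is just an illustrative choice):

```python
import math

def virtual_width(fov_degrees, distance_m):
    """Width of the virtual image plane at a given distance for a horizontal FOV."""
    return 2.0 * distance_m * math.tan(math.radians(fov_degrees) / 2.0)

# Compare the 26-degree prototype against the 40-degree target.
for fov in (26, 40):
    print(f"{fov} deg -> {virtual_width(fov, 2.0):.2f} m wide at 2 m")
```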

But despite the positive feedback and the potential of our AR glasses, we couldn’t sustain the financial burden of development. We funded the project through profits from our professional services company—primarily from the many white-labelled AR apps we created—and secured government grants. However, the strain on resources and finances forced us to make the difficult decision to shelve the product. To this day, I still have a suitcase full of prototypes in my garage, a reminder of how much we invested in the journey.


We sought investment, but in late 2013, it was challenging to secure funding in Australia without giving up significant equity. Additionally, the market wasn’t quite ready for AR; many people didn’t fully grasp how this technology could be integrated into their daily lives.

In hindsight, we were well ahead of the curve (on the bleeding edge), but the experience underscored our belief in the transformative power of augmented reality. It was an unforgettable journey and a true testament to the potential of AR.

Stay tuned for another instalment, where I’ll dive into more of our groundbreaking projects and the lessons learned along the way.

Matthew Burley

Non-Executive Director, founder

4 months ago

Meta still using Waveguide on Orion Glasses.


A trip down memory lane.

Matthias K.

Director @ Augment Reality Holdings, Inc.

6 months ago

Here's part four of Paul Kouppas' ExploreEngage/Auggd augmented reality recap series. A few significant things about this:

  • Paul and the AR eyewear prototype he presented to me were the reasons why I invested in the business in April 2011 and got committed to bringing AR to the market.
  • It's a blueprint to building AR eyewear, in case you've got the coin.
  • In 2012/13, we presented our solution to the US military's Joint Improvised Explosive Device Defeat Organization. (Microsoft swooped in to steal our thunder with their eyewear shortly afterwards. They still have shit to show for their $23 billion price tag. I digress.)
  • The lessons learned from this helped us tremendously in our future AR developments.

It's a bit of a technical read, but if you're interested in what needs to go into AR eyewear then this post is for you. If that's not your shtick, go to the end of the article and watch our conceptual video from 2012 called Project 2020AR. Enjoy.
