November 10, 2025

Since the earliest signs of civilization, humans have sought to augment ourselves with technology to make our lives better. Consider the many advances we have used to make ever higher-performing garments, for example. Animal hides were our earliest form of protection, until spinning and weaving technologies gave us the ability to improve warmth with wool and breathability with cotton. Advances in materials gave us better protection as armor progressed from leather to bronze and steel, and eventually to carbon fiber and Kevlar.

Beyond protecting ourselves, we seek to directly improve our bodies. Prosthetic toes dating back some 3,000 years have been discovered in ancient Egypt. In the Roman era, wooden and iron arms and legs began to be fabricated. In the 1500s, the French surgeon Ambroise Paré created the first prosthetics with functional joints, further extending his patients’ ability to use technology in a human-like manner.


Modern prosthetics perform so well that they are banned from many sports competitions for giving an unfair advantage. We are beginning to see cases where the technologically augmented human performs better than the natural one.

An astounding number of parts of our body can now be replaced with, or augmented by, technology. We augment our hearts with pacemakers. We improve our hearing with hearing aids. Wearable devices monitor and optimize blood sugar. Artificial knees are commonplace.

It is our inherent desire to make ourselves better that convinces me that Augmented Reality is a killer app. Of all of our senses, it could be argued that we value vision the most.

We began correcting our vision in 1286, when Italian friars first used glass lenses and coined the term eyeglasses. Galileo was among the early pioneers of the telescope, invented in 1608 to augment the distance we can see. Binoculars followed about 200 years later, and in 1888 Adolf Fick created the first contact lens.

But humans have a desire to see well beyond what our natural eyes can do. We’ve developed headsets to give us night vision and thermal scanning. Advanced science has led to the first successful artificial eyes, connecting CMOS camera technology directly to the optic nerve. While the first prototypes are still fairly low resolution, “Terminator”-style vision is not far away.

The Terminator movie provided one of the earliest widespread visions of what augmented reality could become. Computer-generated data, overlaid directly onto his line of sight, gave Arnold Schwarzenegger’s character a huge advantage over mere humans. (Well, that and his titanium robotic skeleton.) The whole concept of the Terminator was that he was a human, highly augmented with technology.

So why hasn’t AR taken off? There is no significant technology limitation at this point. Yes, headsets are still bulky and uncomfortable. And there are challenges dissipating heat from the device processors when they are worn in a small form factor like a pair of eyeglasses. But these are solvable problems that create friction as the AR market expands; they are not inherent barriers to market growth.

The root challenge holding AR back today is privacy. None of the AR/VR headset companies today provide developers access to the front-facing cameras on their headsets. The result is that developers can create virtual reality applications but are handcuffed when it comes to creating augmented reality ones.

VR is cool, but far less practical. Augmenting ourselves with technology means that we can conduct our day-to-day activities, only better. It doesn’t require a change in our own behavior or in our daily workflows. But what is different this time is that we are augmenting ourselves with a technology that is delivered over the internet.

Your hearing aid improves your hearing, but only for you. Your eyeglasses do the same for your vision – privately. The world isn’t yet sure how to manage private data, delivered across both public and proprietary networks. Let’s look at the technology stack to explain.

A typical AR headset includes either a lens or a screen on which the human eye focuses. If it is a lens, like a traditional eyeglass, the headset has the ability to project images directly onto that lens. Many new cars have “heads-up displays” that do exactly this: you look through the windshield and additional data is reflected onto the glass. In the case of the car, the data shown may not have anything to do with what the car “sees”; it may be speedometer information or GPS-enabled turn-by-turn directions. But headsets rely on front-facing cameras, built into the eyeglasses, to inform what the user is looking at.

More sophisticated headsets, like the new Apple Vision Pro, use a digital screen instead of a transparent lens. In this case, when users put them on, they still see the real world in real time, but as a camera-generated view projected onto the screen. When done properly, with low enough latency and a sufficient peripheral camera view, the effect can feel as though the user isn’t wearing anything at all. The visual experience is mimicked quite well.


The actual augmentation – the data overlaid on what we see – is cloud-provided. It relies on heavy computer-vision AI that can’t yet run entirely on the device. These devices therefore send what you see to the internet and receive data back. Algorithms in the cloud extract context from the video feed and overlay data useful to the AR application, whether that’s identifying the bird you’re watching or showing instructions on how to fix the garbage disposal.
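To make that round trip concrete, here is a minimal sketch of what a frame-to-annotation loop could look like. The endpoint URL, request fields, response shape, and helper functions (capture_camera_frame, draw_overlay) are illustrative assumptions for this sketch, not any vendor’s actual AR API.

```python
# Minimal sketch of the device-to-cloud round trip described above.
# Endpoint, request fields, and response format are hypothetical.
import requests

CLOUD_ENDPOINT = "https://example-ar-cloud.invalid/v1/analyze"  # hypothetical

def annotate_frame(jpeg_bytes: bytes, app_context: str) -> list[dict]:
    """Send one camera frame to the cloud and return overlay annotations."""
    response = requests.post(
        CLOUD_ENDPOINT,
        files={"frame": ("frame.jpg", jpeg_bytes, "image/jpeg")},
        data={"context": app_context},   # e.g. "bird_identification"
        timeout=0.2,                     # AR overlays need low latency
    )
    response.raise_for_status()
    # Assumed response shape:
    # {"annotations": [{"label": "Northern Cardinal", "box": [x, y, w, h]}, ...]}
    return response.json()["annotations"]

# On the headset, each annotation would then be drawn over the user's view
# (capture_camera_frame and draw_overlay are hypothetical device helpers):
# for item in annotate_frame(capture_camera_frame(), "bird_identification"):
#     draw_overlay(item["label"], item["box"])
```

The tight timeout is the point: every frame the headset ships to the cloud has to come back fast enough for the overlay to stay pinned to what the wearer is actually looking at.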

There are about a dozen major AR/VR headset companies – Apple, HTC, Meta, Sony, Microsoft, Samsung, Valve, Lenovo, HP, Pico and Magic Leap are the global players. (Google is heavily investing, but not selling devices.) All have some form of front-facing camera capability. But none are giving developers outside their own companies access to those cameras. This could be an effort to capture 100% of the AR application market. But I don’t think so.

The hardware device industry has long understood that fostering an open developer community to build a rich library of applications is key to driving hardware sales. But none of the major players is confident that, once that front-facing camera is accessible, there won’t be a major privacy hack.

What happens when any developer suddenly has the capability to see what you are seeing, directly and in real time? Will bad actors turn on the camera without users knowing it is on? Will they store your vision feed when you aren’t aware you are being recorded? Will they analyze what is in your view, beyond what you expect from the application you are using?

Of course, this risk exists within the big parent companies themselves. We each make that trust decision when we decide whether or not to buy a headset. There is some confidence that the major brands will manage our most private data appropriately. But the temptation to misbehave is real.

You may recall the criticism Zoom came under when it emerged that the company was using user sessions to train AI models for new features like meeting transcription. While this practice was covered under Zoom’s terms of service, it wasn’t readily understood by its users.

What is to stop AR developers from putting similar obscure language into the “click-through” user agreement that permits the capture of everything you see during your day?

One day, we may see completely standalone AR devices. When processing power becomes strong enough, perhaps all the “smarts” can be pulled onto the device itself, disconnecting it from the internet. But that is still years away. For the foreseeable future, the digital overlays on what we view will be provided via a connected cloud.

Will we see one of the hardware providers break rank and open that front-facing camera? I think the benefits of the many AR applications that have not yet been coded will pressure the market to fully adopt the technology. The positives outweigh the risks. And there is lots of money to be made. This is a huge market.

We’re a long way from becoming human Terminators. But our inherent desire to augment ourselves keeps us moving in that direction.

The post Tom Snyder: AR could be a killer app; data privacy concerns keep evolution slow first appeared on WRAL TechWire.
