It’s been more than a decade since Google Glass was announced in 2013, followed by its swift withdrawal, partly due to low adoption. A subsequent (and lesser-known) second iteration was released in 2017 and aimed at the workplace; it was withdrawn in 2023.
In December 2025, Google made a new promise for smart glasses, with two new products set to launch in 2026. But why have Google’s smart glasses struggled where others have thrived? And will Google succeed on its third attempt?
Glasses are a type of accessory that has been around for centuries and is now accepted as normal in society.
Some of the latest academic research uses this approach, building sensors into jewellery that people would actually want to wear. That research developed a scale to measure the social acceptability of wearable technologies (the Wearable Acceptability Range, or WEAR), which includes questions such as: “I think my colleagues would find this device acceptable to wear.”
Noreen Kelly of Iowa State University and colleagues showed that, at its core, the scale measured two things: whether the device helped people achieve a goal (making it worth wearing), and whether it avoided creating social anxiety about privacy or being seen as rude.
This last issue was most prominently highlighted by a term that emerged for Google Glass users: Glassholes. Although many studies have considered the potential benefits of smart glasses, from mental health to use in surgery, privacy concerns and other issues persist with newer smart glasses.
All that said, “look and feel” remains the most common concern of potential buyers. The most successful products have been designed to be desirable as accessories first and smart technologies second, typically in partnership with designer brands.
Learning from the competition
After Google Glass, Snapchat released smart glasses called “Spectacles” that had built-in cameras, focused on fashion, and were more readily accepted by society. Today, the most prominent smart glasses come from Meta (Facebook’s parent company), in collaboration with designer brands such as Ray-Ban and Oakley. Most of these products include front-facing cameras and support for a conversational voice agent powered by Meta AI.
So what can we expect from Google’s smart glasses in 2026? Google has promised two products: one that is audio-only, and one that has “screens” displayed on the lenses (like Google Glass).
The biggest expectation (based on the promo videos) is a significant change in form factor: from the futuristic, unfamiliar and, to some, unsettling design of Google Glass to something more commonly perceived as ordinary glasses.
Google’s announcement also focused on the addition of AI (it actually described them as “AI glasses” rather than smart glasses). However, these two types of product (audio-only AI glasses and AI glasses with field-of-view projections) are not particularly new, even when combined with AI.
Meta’s Ray-Ban products are available in both modes and include voice interaction with Meta’s own AI. These have been more successful than the recent Humane AI Pin, which included a front-facing camera, additional sensors, and voice support from an AI agent, and was the closest thing yet to a Star Trek lapel communicator.
Driving innovation
The main directions of innovation here are likely, first, to reduce the bulk of smart glasses, which have necessarily been chunky to house their electronics, so that they appear normally proportioned.
“Building glasses you’ll want to wear” is how Google puts it, so we can expect innovation from the company that improves the aesthetics of smart glasses, likely alongside its popular brand partners. Google also announced the launch of tethered XR (mixed reality) glasses, which are significantly scaled down compared with the virtual reality headsets on the market.
Second, we can expect greater integration with other Google products and services: Google has many more widely used products than Meta, including Google Search, Google Maps, and Gmail. Its promotional material shows examples of information from Google Maps being displayed in the AI glasses while walking through the streets.
Finally, perhaps the biggest area of opportunity is to incorporate more sensors, possibly integrating with Google’s other wearable health products, an area where we see many of its current ventures, including its own smart rings.
Much research has focused on what can be sensed from common contact points on the head, including heart rate, body temperature, galvanic skin response (skin moisture that changes with stress, for example), and even brain activity, such as through EEG. With current advances in consumer neurotechnology, we could easily see smart glasses using EEG to track brain data over the next few years.
This edited article is republished from The Conversation under a Creative Commons license. Read the original article.