Every major tech company is working on computer glasses. None of them really want to go first.
They all remember how Google Glass, and the “Glassholes” who wore them in public, became the laughingstock of the world. So they’ve been waiting, biding their time, refining their prototypes, and every so often making sure investors know that, no, they’re not going to let the first potentially iPhone-sized opportunity since the iPhone slip by.
But now, Google itself is taking the next step. And whether you’ve been dreading the moment when Big Tech’s all-seeing eyes reappear on people’s heads or merely counting the days until you can own a hands-free camera-computer, you should know we’re on the verge of contending with them once again.
Last Tuesday, Google revealed that it will begin testing camera-equipped augmented reality glasses in public, and the company’s blog post contains numerous statements designed to assure you that this won’t be the era of Glassholes all over again. Google claims it’s starting with “a few dozen” testers, and the cameras and microphones on its glasses “don’t support photography and videography.” They do collect visual data, but Google wants you to imagine use cases like “translating the menu in front of you” — not recording someone across from you at a bar.
The company’s support page also contains an entire list of FAQs like “What is image data used for?”; “How long is it stored?”; and “How will I know if I’m in close proximity to products being tested?” Turns out there’s an LED that lights up if Google decides to save images for analysis, and it promises to delete them 30 days later.
For now, Google says its testers won’t be using them in schools, hospitals, churches, playgrounds, and the like — though it says nothing about restaurants and bars, where Glass famously got wearers in trouble years ago.
And in 2022, I wouldn’t bet on a repeat of that disgust, mainly because a decade of pointing phones at things in public, documenting every element of our lives, has prepared us for what’s to come.
Since the day in 2012 when a team of Google skydivers landed on Moscone Center with the first public Google Glass prototypes, mobile camera use has exploded. Not only have phone cameras utterly destroyed point-and-shoots, but they’ve also changed social norms. In 2012, it was still a little weird to whip out a camera in a bar or restaurant; now, it’d be weird not to nab a selfie with friends or snap some shots of a particularly tasty-looking meal. And the fear you might accidentally capture a stranger in your shot? It’s such an everyday occurrence that Google markets Magic Eraser, a tool that removes strangers from the background of a photo, as a selling point for its Pixel phones.
Besides, Google isn’t the first to dip a toe back in these waters. Snapchat is now on the fourth generation of its Spectacles camera glasses, Meta has its Ray-Ban Stories, and you could argue Meta’s Project Aria test is pretty similar to what Google’s doing now. None has yet kicked up the kind of stink that Google Glass did a decade ago.
Sure, that could change if a future pair of glasses proves to be more intrusive than our existing phones and drones. There are definitely going to be serious questions about data collection and privacy, particularly given the track record of some of the companies building them.
But in 2022, I think the bigger challenge facing Apple, Google, Meta, Microsoft, and Snap is figuring out how to build AR experiences we’d actually pay for — experiences more compelling or convenient than what phones already offer. As we wrote in May when Google teased some real-time language translation glasses, the company does have an intriguing idea there: