“I said, ‘It’s a wearable computer,’ ” Starner recalls. “I gave them a demo.” He asked them for their business cards so he could demonstrate how he could enter their information into his computer’s address book. They handed them over. Guy Number One: Larry Page. Phone number at Stanford. Guy Number Two: Sergey Brin. They were working on some kind of web search project, they said.
A decade later, Starner—by then a researcher at Georgia Tech—looked into his head-up display and realized he still had Brin’s email address. He clicked out a note: You haven’t seen wearable stuff in a few years. Come have a look. “Next thing I knew, I was in Mountain View giving demos, not realizing it was actually a job interview,” Starner says. Page and Brin were working on something related, they said. And they had a job for him.
Starner called what he saw through his lens “augmented reality”—a term for the superimposition of the digital world onto the real. After agreeing to work with Page and Brin at Google, he was put in charge of designing the first full-on commercial augmented reality system: Google Glass. It would burn ultrabright for a few months in 2012 and 2013, ascending to the acme of cultural hotness only to plummet, Wile E. Coyote–like, to failure. Or so it seemed.
Glass arrived with fanfare, and people had to lobby via hashtag for the privilege of dropping $1,500 for one. But it was more or less a beta release, with a veneer of high fashion and a promise of constant access to the internet that couldn’t disguise a Rent-a-RoboCop vibe. Turning on the screen required a wearer to assume a discomfitingly dorky head angle; if you wanted to do anything else you had to say “OK, Glass” at what Google’s engineers had, it seemed, parametrically determined would be the most awkward moment in any conversation. It didn’t actually do much except let you take creepshots. People started calling anyone wearing Google Glass a “Glasshole.”
Google pulled the product in January 2015, but anyone who’s seen a movie can tell you that cyborgs are hard to kill. Technology that at first seems irrelevant often becomes unavoidable—or inevitable. Sure, Glass became a gossip-tinged metonym for tech bro narcissism. But before you toss Glass into the drawer with your CueCat and your PalmPilot, let’s talk about the wristwatch.
Until World War I, men wore pocket watches. Wristwatches were bracelets, and bracelets were for women. But then macho military dudes learned that having the time on your wrist made it easier to operate heavy machinery and blow stuff up. So would-be macho dudes started wearing them too. Click! The world looked different. “The same thing happened with sunglasses,” says Clarissa Esguerra, a curator of costume and textiles at the Los Angeles County Museum of Art. “And the zipper too.” Same with headphones; when people started wearing Walkmans in public, it seemed creepy. But now? All cyborgs are revolutionaries. Their modifications look weird and affected until they don’t.
At CES 2018, half a dozen companies showed working prototypes of eyeglass-based computing. Magic Leap became one of the most hyped startups of the past few years on the strength of its augmented reality goggles. Police in China are using glasses-enabled facial recognition to expand their panopticon surveillance. The end of Google Glass wasn’t even the end of Google Glass; X, the incubator that oversees the product, sells it as a head-up display for assembly workers. But more than that, the idea of augmented reality has normalized. Capturing everyday moments with a ubiquitous camera and inserting digital elements into your field of view—well, that’s just Snapchat.
Technology, connectivity, and culture have finally caught up with Starner, who says eyeglass computers like Glass aren’t science projects anymore—now you can hardly tell they’re there. “People no longer think of it as a separate device,” he says. “They think of it as them.” Which was his vision all along.
Adam Rogers (@jetjocko) wrote about director Luc Besson in issue 25.07.
This article appears in the May issue.