Google Announces Nexus Tablet At Its I/O Developers Conference
Updated: June 28, 2012 4:10PM
Google’s first “for real” public demo of Google Glass during Wednesday’s developer conference generated three genuine gasps from the crowd. Remember Google Glass? It’s the computer you wear like a pair of eyeglasses. It can take photos and live video from your personal perspective, and post directly to Google Plus and other social sites.
It’ll also annotate your field of view with augmented reality data. Stuff like a green line on the street that helps you navigate from the subway station to the coffee shop where you’re meeting your friend, and a text from that same friend saying “Why are you late? I thought you said that those crazy Google glasses you insist on wearing everywhere made that impossible.”
A skydiver leaped from a plane circling San Francisco and transmitted live video of his descent to a rooftop next to the convention center through the Google Glass he was wearing. He handed the glasses off to a BMX rider, who jumped the roof gap over to the adjoining building where Google’s presentation was underway (gasp number one). The rider handed the pair to someone who rappelled to the sidewalk, who then handed them to a second biker who tore off down the sidewalk, into and through the lobby, nearly crashing into several unaware pedestrians on the way (gasp two), before leaping triumphantly to the stage.
(Yes, several innocent people died, but oh, boy, the looks on their faces, captured with agility and clarity by Google Glass, were priceless. The imagery was so good that a hundred personal-injury lawyers instinctively filed to subpoena the video evidence before they realized that no injury had actually happened.)
What an interesting trip Google Glass has taken. When Google first shared the details of Project Glass with the world, many people scoffed. And why not? It was a PowerPoint presentation. There was no indication of how these things worked, their limitations (Battery life? Connection to the Internet? Do they need to be wired to another device in your pocket? How heavy are they?) or even whether Google had anything functional. Comparisons were made to Apple’s Knowledge Navigator video, in which Apple demonstrated a futuristic computer it had no intention of ever making (the thing in the video was a prop made out of wood) simply as a way of illustrating important technologies, and to Microsoft’s Courier demo video, which showed a research group’s concept of a two-screen tablet that could be opened and written in like a book.
The PowerPoint edition of Project Glass left everyone with one key question: Is this thing for real? Will it ever be a commercial product, or is it just a research project developed by a core team of Google employees who want to tell their friends they’re working on something more interesting than “a new analytic that correctly places an ad for patio furniture in front of people who’ve recently had a new deck installed”?
The on-stage demo of Google Glass was unusually detailed. It appears to be a self-contained device (no external wires or batteries). It’s lighter than many pairs of sunglasses (we hopefully assume that we’re talking about Ray-Bans, not the solid gold frames favored by Kanye West). It can be intimately connected to Google+ for live posting. And yes, you can wear them over your prescription glasses.
And as the skydiving demo demonstrated . . . the damned things actually work. Google co-founder Sergey Brin apologized that many of Glass’ features are hard to demonstrate in front of a crowd (“You’d need to wear a pair in front of a pair”), so Google didn’t show off things like street directions and text messages.
But they did the most important thing of all: They explained the core philosophy of the device. Augmented reality glasses have been the stuff of science fiction for decades. Popular skepticism has dismissed them as a means to cut off people from their environment and feed their vision with distractions.
This perception was certainly fed by videos of Professor Steve Mann, an augmented-reality pioneer who’s been building wearable computers since the days when the computer had to be carried in a backpack and the video was delivered by the CRT eyepiece of a camcorder, hard-mounted onto a bicycle helmet, which obscured half of his vision. Videos of Mann wearing his (increasingly svelte) devices usually showed his eyes darting left and right, never making eye contact with the person he was speaking with, his tongue even sometimes darting in and out as he immersed himself in a reality that seemed completely disconnected from the room he was in.
The second perceptual difficulty of Project Glass? Well, we already own some wonderful and mature augmented-reality devices: smartphones. We can easily snap and share a photo from wherever we are, collect new and helpful information about our immediate environment, and throw birds at pigs. Why replace this with a pair of super-digital-specs?
The point of Google Glass, as explained by Brin, is to allow people to maintain eye contact with their world. Smartphones work great, but they’re useless to you unless you take your focus off of the person you’re talking to or the space you’re in and spend a minute staring at your hands.
In a sense, the difference between Glass — as explained by Brin — and a smartphone is like the difference between a set of iPhone earbuds and a set of noise-canceling Bose earmuffs. The Bose headphones completely replace all environmental sounds with artificial ones of your choice, and when you’re not listening to music, they replace sound with silence.
Earbuds simply add a soundtrack to the natural sound environment. And when you don’t want anything added to the natural environment, they block nothing.
Google played dirty during its keynote. The skydiving demo was cool in a Red Bull/Axe Body Spray kind of way. But then they cued up another demo: a Google engineer who’d just had a baby. Yes, she wears her prototype Glass from time to time. She captures smiles and moments that would ordinarily be fleeting. Best of all, she can maintain eye contact with her (oh-so-cute widdle) kid while they’re playing, rather than fishing out a smartphone and placing this sheet of technology between herself and her youngling. Five seconds of video of a baby making eye contact with the camera — and realizing that this is the delighted facial expression of a kid looking into its mother’s eyes — is enough to make silly technology columnists sigh.
(Oh, shut up. The baby was adorable.)
The presentation came during Google I/O, the company’s annual developer conference. There are several Oprah moments during any Google developer keynote. They don’t show off a new piece of technology without ultimately saying to the audience “YOU get a new phone and YOU get a new notebook and YOU get a new tablet and YOU…”
Suffice to say that there was a certain amount of new excitement in the room when Sergey Brin started talking about hardware availability, and how eager they are to put Project Glass in the hands of developers.
Alas, he didn’t end this spiel by saying “Check underneath your seats.” But all U.S. developers who were at Google I/O could pre-order a special early, development-grade (meaning: flaky, but useful enough to test out and write code for) edition of Glass.
They’ll ship out early next year.
The cost: $1,500.
And yes, that was the third gasp. Followed by a clinical silence, as thousands of people in the room thought about the free Nexus 7 tablet and the free Nexus phone and the free Nexus Q device they were promised at the end of the earlier keynote, and they did the math to work out whether they could get $1,500 for the three of them on Craigslist.
In a follow-up with reporters, Brin said that Google was hoping to release Glass to consumers in 2014, and that while the consumer edition would sell for less than the developer edition, Google considered it a “premium device” that would cost more than a smartphone.
And many other problems need to be solved, beyond things like battery life and software. What happens when a roomful of people contains a nonzero number of folks wearing Glass? What happens to public social interaction when people don’t really have a clue as to when they’re being photographed, or when everything they say and every move they make is being transmitted live to a Google+ hangout? When someone is holding up a smartphone in your direction, you have options. Such as: quickly pulling your finger out of your nose.
But Google’s trying something brand-new, and that’s usually a good thing. And unusually, they’re doing it completely in the open. Apple wouldn’t let you see a new product until millions of shrinkwrapped units were sitting in warehouses ready to ship.
Good for Google. And good for us. I suspect that it’s going to take us all a year or two to get our heads around Project Glass before we decide to put our heads inside them.