In one demo, a Google product manager tells someone wearing the glasses, “You should be seeing what I’m saying, just transcribed for you in real time — kind of like subtitles for the world.” Later, the video shows what you might see if you’re wearing the glasses: with the speaker in front of you, the translated language appears in real time in your line of sight.
I’m sure we all remember Google’s first foray into connected eyewear, Google Glass, with a little fondness. The glasses were ugly and didn’t work very well.
But we thought they were cool.
But if this new model ever becomes a real product, imagine how helpful real-time translation could be when someone speaks to you in another language.

And if you have hearing issues, the subtitles would help, too.
The real question will be what Google does with the data it gathers from all those eyeballs.
Oh, and then there’s the whole “why is that creeper continuing to stare at me with those weird glasses” issue that I’m sure will come up in a courtroom somewhere.