Google has shown off a prototype of augmented reality glasses that can translate languages in real time.
The slick-looking specs appear in a concept video shown during the keynote.
After showing off an early version of the translating specs during its I/O developer conference opening keynote, Google hinted at a return to mass-market smartglasses.
While it didn’t provide any specifics about the glasses’ specifications, it did show a prototype in a brief video, revealing a design that’s a far cry from Google’s Glass AR eyewear, which it first tried to bring to the public back in 2012.
The demo showed how the glasses could interpret and transcribe languages in real time, including sign language. In the video (below), Google said it aims to break down language barriers, drawing on years of research and its work on Google Translate to bring the technology to spectacles.
The video may also be the first visible result of Google’s purchase of North, the firm behind the impressive Focals AR glasses. When Google swooped in and bought North, the company was in the middle of developing its second-generation smartglasses with a holographic display.
Despite the failure of the consumer version of Glass, Google remains committed to what it calls the “next frontier of computing.” It has previously demonstrated AR on Android phones through various apps and software, and it was widely reported earlier this year that Google is working on an AR headset codenamed Project Iris.
And while Google Glass flopped as a consumer product, the eyewear lives on in the enterprise market, with Glass Enterprise Edition 2 being its most recent incarnation.
With Google, Apple, and Meta all in the field, the race is on to be the first to pack genuinely useful augmented reality smarts into eyeglasses that people will actually want to wear in public.
Google didn’t say when these smart glasses might reach people’s faces, so it could be a while before we’re walking around with translating specs.