Google translates hand gestures to speech with sign language AI

Google has announced a new machine learning model for tracking hands and recognising gestures, giving once soundless sign language a voice

There are thousands of spoken languages the world over, each with its own subtleties and nuances. Developing AI to interpret and translate them is difficult, and even the most sophisticated translation tools struggle to do so correctly. Sentences become jumbled, meanings are misread, and colloquialisms are mostly lost on machines.

However, there has been a breakthrough in perceiving and translating the language of the human hand.

Google’s new “Real-Time Hand Tracking” technology perceives hand movements and gestures, allowing for direct, on-device translation of sign language to speech.

The tech giant has annotated around 30,000 real-world images of hands performing a variety of gestures and shapes with 21 3D keypoints, or coordinates.

Using this data, Google trained a machine learning model with a mixed training schema that combines rendered synthetic hand images with real-world photographs, teaching it to predict the keypoint coordinates together with a “hand presence” classification.
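To make that concrete, the per-hand output described above amounts to 21 (x, y, z) coordinates plus a confidence score that a hand is actually in frame. The sketch below illustrates such an output as a plain data structure; the names `HandPrediction`, `hand_present` and `is_confident` are hypothetical, not Google’s actual API.

```python
from dataclasses import dataclass
from typing import List, Tuple

# One 3D keypoint: x and y normalised to the image, z as relative depth.
Keypoint = Tuple[float, float, float]

@dataclass
class HandPrediction:
    """Illustrative per-hand output: 21 keypoints plus a presence score."""
    keypoints: List[Keypoint]   # exactly 21 (x, y, z) coordinates
    hand_present: float         # confidence that a hand is in the crop

def is_confident(pred: HandPrediction, threshold: float = 0.5) -> bool:
    """Keep a prediction only if the model believes a hand is present."""
    return pred.hand_present >= threshold and len(pred.keypoints) == 21
```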


In a blog post, Google said: “The ability to perceive the shape and motion of hands can be a vital component in improving the user experience across a variety of technological domains and platforms.”

The research and development of this machine learning algorithm could create numerous possibilities, not least for sign language understanding.

Google has not developed a stand-alone app for its algorithm but has published it as open source, allowing other developers to integrate it into their own tech, a move welcomed by campaigners.
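The open-source release ships through Google’s MediaPipe framework, which exposes the hand-tracking pipeline to developers in a few lines of code. The snippet below is a minimal sketch of pulling the 21 landmarks from a single image; it assumes the `mediapipe` and `opencv-python` packages are installed and that `hand.jpg` is a stand-in for your own input.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# Static-image mode runs detection on every frame rather than tracking.
with mp_hands.Hands(static_image_mode=True,
                    max_num_hands=2,
                    min_detection_confidence=0.5) as hands:
    image = cv2.imread("hand.jpg")  # hypothetical input file
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

    if results.multi_hand_landmarks:
        for hand_landmarks in results.multi_hand_landmarks:
            # Each detected hand carries the 21 keypoints mentioned above,
            # with x/y normalised to the image and z as relative depth.
            for idx, lm in enumerate(hand_landmarks.landmark):
                print(idx, lm.x, lm.y, lm.z)
```

Switching `static_image_mode` to `False` lets the same pipeline track hands across video frames, which is what makes real-time gesture and sign recognition feasible on a phone.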

There are also potential uses for virtual reality control and digital-overlay augmented reality, or even gesture control functions in driverless vehicles and smart devices. 

Assistive tech is making strides as more and more entrepreneurs and engineers enter the market ahead of the consumer curve, with wearables and IoT devices aimed at making the lives of those with assistive needs a little easier.

Luke Conrad

Technology & Marketing Enthusiast
