“Building a more helpful Google for everyone”
This year’s Google I/O 2019 connected with me on more than one level. Before I elaborate on that, here is an excerpt from the keynote presentation made by Google’s Trystan Upstill.
Trystan: “Like many people, I watch videos without sound when I am on the go; with captions, I can still keep up even if I am in a crowded space or sitting in a meeting. So, for me, they are super helpful. But for the nearly 500 million people who are deaf or hard of hearing, captions are critical. Today, lots of mobile content embeds audio, from videos to voice messages and everything in between. Without captions, this content is nowhere near as accessible.”
I am one of those 500 million individuals Trystan mentioned, and I connected with every word he said. There are so many podcasts, YouTube videos and other things on my watch list that I cannot enjoy because of the lack of closed captions. But things are now changing, with Google emerging as a champion of the cause of the disabled. At last night’s event, Google dedicated an entire segment to the differently abled, and by the end of it, I felt a surge of positivity like never before. Wondering why I say so? Stay with me and read on.
In the not-too-distant past, I reviewed Google’s Live Transcribe app for the hearing impaired. I had a major gripe back then: the app needed Google’s Cloud Speech API and a working internet connection to function properly. The app now supports over 70 languages and dialects, making it accessible to a majority of the deaf community. At the event, Google CEO Sundar Pichai announced that the company is expanding the scope of the app and what it can do with several new features, including Live Caption, Live Relay and Project Euphonia. And I must admit, I can see myself using at least the first two extensively in my day-to-day life.
Live Caption is one feature I can easily imagine using every single day. Google claims that Live Caption can make any content, no matter where it originates, more accessible to the hearing impaired. Frankly speaking, this is like a dream come true for me. It is now possible for me to watch and “hear” all those podcasts, movies, YouTube videos and other content I mentioned earlier. Live Caption shows a closed-caption bubble on the display, which transcribes everything being said on screen in real time. Whether you are playing a movie on your smartphone, watching a video someone sent you, listening to a podcast or doing quite literally anything else, you will now be able to read and understand the dialogue or conversation in real time on your screen. All you have to do is press the volume button, tap the Live Caption button and move the caption bubble to a position on the display that is comfortable for you.
When I reviewed the Live Transcribe app, I had wished for some way to integrate the feature into phone calls, so that hearing-impaired folks could at least handle basic calls. It seems the coding gods at Google heard my prayers, because Live Relay is exactly what I asked for. In essence, Live Relay uses on-device speech recognition and speech-to-text to let the smartphone listen and speak on behalf of the user while they type. The AI-backed Smart Reply and Smart Compose come to the fore here, allowing users to type fast enough to keep up with an ongoing call. The best thing about Live Relay is that it runs completely on-device, keeping your calls totally private. I never thought I would be able to answer a call and actually “converse” on the phone, but it seems like a possibility now, and I look forward to using this feature.
At the event, Google also announced Project Euphonia for those with speech impairments. Notably, a majority of people with hearing problems also have some degree of speech impairment, and Project Euphonia will help them convey exactly what they are trying to say. Beyond the hearing impaired, the project is also useful for people whose speech has been affected by a stroke or ALS. To this end, Google has partnered with institutions like the ALS Therapy Development Institute and the ALS Residence Initiative to transcribe words spoken by people with speech difficulties. However, Pichai mentioned that to make this a reality, the company needs more data, and Google has called on people with speech impairments to submit voice samples.
Google ended the accessibility segment by noting that it still has a long way to go. However, as part of the target audience, I personally feel that hearing-impaired people couldn’t have asked for more. More than the features themselves, I am delighted that one of the largest technology companies on Earth is working to improve the quality of life of the disabled. All I can say at this point is: Godspeed, Google!