Google Adds New Features to Android Phones to Help 900 Million Users with Hearing Loss

Google is rolling out two new features to Android phone users that could help around 900 million people across the globe. Yes, you read that right. The two features are Android applications called Live Transcribe and Sound Amplifier, and they work exactly as their names suggest. The move is aimed at people with hearing loss, of whom there will be around 900 million worldwide by 2055, according to a report from the WHO (World Health Organisation).

Talking about the applications, Live Transcribe takes real-world speech and turns it into real-time captions using the phone's microphone. Sound Amplifier, on the other hand, filters, augments, and amplifies the sounds in the environment around the user. It boosts quiet sounds without over-boosting loud ones, and it can be customised with toggles and sliders to minimise background distractions and reduce noise.
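Google has not published Live Transcribe's code, but as a rough illustration of the idea described above (microphone in, captions out), here is a minimal sketch built on Android's public SpeechRecognizer API. The CaptionActivity class name and bare TextView layout are placeholders of our own, and the sketch assumes the RECORD_AUDIO permission has already been granted; the real app relies on Google's cloud-based speech recognition rather than this simple API.

```kotlin
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer
import android.widget.TextView
import androidx.appcompat.app.AppCompatActivity

// Illustrative only: a bare-bones live-captioning screen, not Google's Live Transcribe.
class CaptionActivity : AppCompatActivity() {
    private lateinit var recognizer: SpeechRecognizer
    private lateinit var captionView: TextView

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        captionView = TextView(this)
        setContentView(captionView)

        // Create a recognizer that listens through the phone's microphone
        // (assumes RECORD_AUDIO permission was already granted).
        recognizer = SpeechRecognizer.createSpeechRecognizer(this)
        recognizer.setRecognitionListener(object : RecognitionListener {
            // Show interim hypotheses as they arrive so the caption updates live.
            override fun onPartialResults(partialResults: Bundle?) {
                val text = partialResults
                    ?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                    ?.firstOrNull() ?: return
                captionView.text = text
            }
            override fun onResults(results: Bundle?) = onPartialResults(results)
            // Remaining callbacks are not needed for this sketch.
            override fun onReadyForSpeech(params: Bundle?) {}
            override fun onBeginningOfSpeech() {}
            override fun onRmsChanged(rmsdB: Float) {}
            override fun onBufferReceived(buffer: ByteArray?) {}
            override fun onEndOfSpeech() {}
            override fun onError(error: Int) {}
            override fun onEvent(eventType: Int, params: Bundle?) {}
        })

        // Request partial (streaming) results so captions appear in real time.
        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
            putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
        }
        recognizer.startListening(intent)
    }

    override fun onDestroy() {
        recognizer.destroy()
        super.onDestroy()
    }
}
```

Requesting partial results is what makes the captions feel "real time": the on-screen text keeps updating while the speaker is still talking instead of waiting for them to finish.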

Starting today, the Sound Amplifier application is available from the Google Play Store, and the Live Transcribe app is rolling out in a limited beta. Both apps come pre-installed on Google's Pixel 3 devices, and users can sign up (via the Android blog post) to be notified when they are more widely available. In the blog post, Google showed how the apps can be used and highlighted the efforts of Dimitri Kanevsky, a research scientist.

He has been working on speech recognition and communications technology for the past 30 years. Dimitri, who has been deaf since early childhood, has used his work to improve the accessibility of the technology he relies on, including CART. His teammate Chet Gnegy worked with the company's Accessibility team to build a tool that could reduce the effort Dimitri spends preparing for meetings.

Dimitri used to carry multiple costly devices with him, and seeing this, the team decided to use cloud-based automatic speech recognition to display spoken words on the screen, and a prototype was built. The result is Live Transcribe, which turns real-world speech into real-time captions via the phone's microphone, making it easier for deaf people to communicate. Comment in the section below if you have more queries, and stay tuned to PhoneRadar for more.

Source
