Alexa has learned to understand and respond to sign language, but it wasn't Amazon that came up with this brilliant new functionality.
The future seems centered on the idea of using voice activation for every product. But as voice activation grows in popularity, what does that mean for people who are deaf or cannot speak? Do all these advancements help them, or could they actually leave them behind?
One software developer was pondering these exact questions when he wrote code to help Amazon's Alexa assistant understand sign language commands. Abhishek Singh explains his code in a new video, showing how easy it is to accommodate those who can't hear or speak with a little innovative thinking.
"If there was ever a space to watch it's Assistive Technology and how it will shape and support disabled and non-disabled people to communicate better together. I look forward to seeing how this develops over the course of my life. 'Alexa' being taught sign https://t.co/1x7vS0S6u9" — Elaine Birkett (@birkett_elaine), August 2, 2018
In the demonstration video, the Amazon Alexa is connected to a laptop. The laptop's webcam lets the software decipher what Singh is signing. He also shares how he came up with the idea as a thought experiment, realizing that for voice activation to be inclusive, the design has to accommodate more than just voice. He even created a database so the software would understand what the various signs meant.
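Conceptually, the pipeline is straightforward: a classifier labels signs from webcam frames, a database maps those labels to words, and the assembled text command is then spoken aloud (or typed) to Alexa. Below is a minimal sketch in Python; the `classify_frame` stub and the toy `SIGN_DATABASE` are illustrative assumptions, not Singh's actual code (his demo used a browser-based machine-learning model):

```python
# Hypothetical sketch of the sign-to-Alexa pipeline described above.

SIGN_DATABASE = {  # toy mapping of recognized sign labels to words
    "SIGN_WHAT": "what",
    "SIGN_WEATHER": "weather",
    "SIGN_TODAY": "today",
}

def classify_frame(frame):
    # Placeholder: a real implementation would run a trained image
    # classifier on the webcam frame and return a sign label.
    # In this stub, "frames" are already labels.
    return frame

def signs_to_command(frames):
    """Translate a sequence of webcam frames into a spoken command."""
    words = []
    for frame in frames:
        label = classify_frame(frame)
        if label in SIGN_DATABASE:
            words.append(SIGN_DATABASE[label])
    return "Alexa, " + " ".join(words)

if __name__ == "__main__":
    # The resulting text would then be fed to a text-to-speech engine
    # so Alexa can hear it, and Alexa's spoken reply transcribed back
    # to on-screen text for the user.
    print(signs_to_command(["SIGN_WHAT", "SIGN_WEATHER", "SIGN_TODAY"]))
```

The key design point is the round trip: signs become synthesized speech going in, and Alexa's spoken answer becomes text coming back, so the whole exchange stays accessible.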
Although it is fantastic that Alexa can read and understand sign language, it needs to be said that, as of right now, this is only a proof of concept. Singh created the video and tested his theory to prove that it could work. There are a few glitches, but now, at the very least, there is proof that these devices can be made more inclusive toward deaf and non-speaking people, thanks to Singh's code.
Ironically, on the same day the video was released, Amazon announced an upcoming update that will allow users of the screen-equipped Echo Show to interact with the virtual assistant without using voice commands. Users simply tap the screen to access Alexa and give her commands. It is not quite as interactive as Singh's idea, but it is a step in the right direction.
"Sign-language hack lets Amazon Alexa respond to gestures #AI #AIio #BigData #ML #NLU #IoT https://t.co/0nz1JVVpuE pic.twitter.com/eyKgdgbMjy" — @terence_mills, retweeted by Data Talent (@datatalentrec), July 25, 2018
Maybe Amazon should sit down with Abhishek Singh. He seems to have come up with a genius way for Alexa to read and understand sign language!