Chatbots and their potential to help people with disabilities

Categories: Artificial Intelligence

What’s new in the world of chatbots?

Chatbots represent one of the biggest topics in technology today: natural language processing. Natural language processing can already interpret text structures and follow a conversation based on the predefined topics a chatbot can handle, and sentiment analysis can recognize whether a comment is positive or not. What is relatively new, however, is adding short-term memory to a neural network so that it can make more complex inferences.

A great example comes from a 2015 research paper called “Memory Networks” from Facebook AI Research, which showed how a model could answer more complex questions based on a short story:

Bilbo travelled to the cave. Gollum dropped the ring there. Bilbo took the ring.

Bilbo went back to the Shire. Bilbo left the ring there. Frodo got the ring.

Frodo journeyed to Mount-Doom. Frodo dropped the ring there. Sauron died.

Frodo went back to the Shire. Bilbo travelled to the Grey-havens. The End.

Where is the ring? Answer: Mount-Doom

Where is Bilbo now? Answer: Grey-havens

Where is Frodo now? Answer: Shire
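
To make the idea concrete, here is a toy Python sketch that keeps a small “memory” of where each person and object was last seen while reading the story, and then answers the questions from that memory. It is a hand-written illustration of the kind of inference involved, not the neural Memory Network described in the paper.

```python
import re

def read_story(sentences):
    """Track the last known location of every person and object mentioned."""
    locations = {}   # entity -> last known place
    carrying = {}    # person -> object currently carried
    for s in sentences:
        moved = re.match(r"(\w+) (?:travelled|journeyed|went back) to (?:the )?([\w-]+)", s)
        took = re.match(r"(\w+) (?:took|got) the (\w+)", s)
        dropped = re.match(r"(\w+) (?:dropped|left) the (\w+)", s)
        if moved:
            person, place = moved.groups()
            locations[person] = place
            if person in carrying:                 # carried objects move too
                locations[carrying[person]] = place
        elif took:
            person, obj = took.groups()
            carrying[person] = obj
            place = locations.get(person) or locations.get(obj)
            if place:                              # person and object share a place
                locations[person] = locations[obj] = place
        elif dropped:
            person, obj = dropped.groups()
            carrying.pop(person, None)
            place = locations.get(person) or locations.get(obj)
            if place:
                locations[obj] = place
    return locations

story = [
    "Bilbo travelled to the cave", "Gollum dropped the ring there", "Bilbo took the ring",
    "Bilbo went back to the Shire", "Bilbo left the ring there", "Frodo got the ring",
    "Frodo journeyed to Mount-Doom", "Frodo dropped the ring there",
    "Frodo went back to the Shire", "Bilbo travelled to the Grey-havens",
]

where = read_story(story)
print(where["ring"], where["Bilbo"], where["Frodo"])   # Mount-Doom Grey-havens Shire
```

The point of the neural approach is that it learns this kind of tracking from examples instead of relying on hand-written rules like the ones above.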

The latest research in 2016 has shown that you can give an image to the algorithm and it will be capable of answering questions about that image. However, implementation in this case will take years: although the research is promising, it still needs to be tested in more real-world scenarios.

What’s changing?

The biggest change is the ease of creating your own chatbot. Companies like Google, Amazon, and IBM are offering services that process and analyze text for you, which gives enterprises an easier route to building their own chatbots. For example, I have used Google’s language processing service: it provided a full grammatical analysis of the text, and I could also choose between two further options, one for sentiment analysis and another that recognizes sentences. More platforms are being developed that will help organizations create chatbots with less effort; they will only need to provide example conversations and enable some options, which means organizations will be able to deploy their own chatbots rapidly.

On the end-user side there is competition between these companies through applications like Alexa, Siri, and Google Assistant, which was announced a month ago and is available only in a few languages. As we can see, chatbots are changing how we interact with our phones and with other devices and applications.
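
As an illustration of how simple these services are to call, here is a minimal sketch of a sentiment-analysis request against Google’s Cloud Natural Language REST API. It assumes you have an API key (YOUR_API_KEY is a placeholder), and the endpoint and field names follow the public v1 documentation, so they may change over time.

```python
import requests

API_KEY = "YOUR_API_KEY"      # placeholder credential
URL = "https://language.googleapis.com/v1/documents:analyzeSentiment"

def sentiment(text):
    body = {
        "document": {"type": "PLAIN_TEXT", "content": text},
        "encodingType": "UTF8",
    }
    resp = requests.post(URL, params={"key": API_KEY}, json=body)
    resp.raise_for_status()
    result = resp.json()["documentSentiment"]
    # score runs from -1.0 (negative) to +1.0 (positive);
    # magnitude reflects how strong the emotion is overall
    return result["score"], result["magnitude"]

score, magnitude = sentiment("I love how quickly this chatbot answers my questions.")
print(score, magnitude)
```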

What’s likely to happen in the next year?

We’re going to see more and more companies implementing natural language processing for text, because chatbots create a close connection between them and their customers. This connection comes from quick responses and ease of use. Based on our experience, the biggest challenge for companies will be ensuring a great user experience, because the technology is still maturing (albeit at a rapid pace).

Can chatbots help people with disabilities?

Interestingly, we believe we’ll see more advances in specific areas that will have a dramatic impact on people’s lives, such as interpreting sign language and translating it into text or voice. Chatbots will also be used to help visually impaired people understand their surrounding environment.

Companies are already using chatbots to help people make payments. Many disabled people have difficulty making payments with standard credit or debit cards, so a better solution needs to be found. For instance, Amazon’s Alexa makes it easy to purchase items, and it also has partnerships with financial institutions such as Capital One, so you can check your account balance or pay your credit card bill by voice. This type of functionality will become increasingly prevalent. Facebook Messenger also already offers payment functionality in its chatbots. For example, imagine you are blind and want to first check your bank balance and then purchase a new braille book: this becomes much easier with this new technology.
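
As a rough illustration of what sits behind such a conversation, here is a toy sketch of how a banking chatbot might map a spoken or typed request to an action. The intents, example phrases, and backend functions (get_balance, pay_credit_card_bill) are hypothetical stand-ins, not a real banking or Alexa API.

```python
def get_balance(user):
    return 1234.56                      # placeholder for a real account lookup

def pay_credit_card_bill(user, amount):
    return f"Paid ${amount:.2f} towards {user}'s credit card."

INTENTS = {
    "check_balance": ["balance", "how much money"],
    "pay_bill": ["pay my bill", "pay credit card"],
}

def detect_intent(utterance):
    """Match the user's words against known phrases for each intent."""
    text = utterance.lower()
    for intent, phrases in INTENTS.items():
        if any(p in text for p in phrases):
            return intent
    return "unknown"

def handle(user, utterance):
    intent = detect_intent(utterance)
    if intent == "check_balance":
        return f"Your balance is ${get_balance(user):.2f}."
    if intent == "pay_bill":
        return pay_credit_card_bill(user, 50.00)
    return "Sorry, I didn't understand that."

print(handle("alex", "What's my account balance?"))
print(handle("alex", "Please pay my bill"))
```

A production assistant would of course replace the keyword matching with a trained intent classifier and connect to the bank’s real systems, but the conversational flow stays the same.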

There has also been some fascinating recent research into the potential for algorithms to recognize sign language in real time, with a chatbot then reading out the results. This would mean that people who communicate in sign language could easily talk with those who don’t understand it.
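
As a sketch of what such a pipeline could look like, the snippet below captures webcam frames, passes them to a sign classifier, and speaks the recognized words aloud. OpenCV (cv2) and pyttsx3 are real capture and text-to-speech libraries, but classify_frame is a hypothetical placeholder for a trained recognition model.

```python
import cv2          # webcam capture (OpenCV)
import pyttsx3      # offline text-to-speech

def classify_frame(frame):
    """Hypothetical placeholder: a real system would run a trained
    sign-recognition model on the frame and return the detected word."""
    return None

def run():
    camera = cv2.VideoCapture(0)        # default webcam
    speaker = pyttsx3.init()
    try:
        while True:
            ok, frame = camera.read()
            if not ok:
                break
            word = classify_frame(frame)
            if word:                     # speak each recognized sign aloud
                speaker.say(word)
                speaker.runAndWait()
            cv2.imshow("signs", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):   # press 'q' to quit
                break
    finally:
        camera.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    run()
```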