Some Addressable Deficiencies in Current Chatbots
Here are some addressable problems with the sad current state of many chatbots. Most of these are also deficiencies in Siri, Alexa, Google Assistant, etc.
Example of the typical dumb chatbot I am talking about: bots that implement a set of rules where a series of patterns is evaluated and, if one matches, an answer (or a randomized answer from a set of answers) is chosen.
This “Reflex” model is useful but extremely limited by itself. Here are some addressable deficiencies that would make these chatbots much better…
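To make the reflex model concrete, here is a minimal sketch in Python; the patterns and canned responses are invented for illustration:

```python
import random
import re

# A toy "reflex" chatbot: ordered (pattern, responses) rules; the first
# pattern that matches the input picks a response, possibly at random.
RULES = [
    (re.compile(r"\bhello\b|\bhi\b", re.I),
     ["Hello!", "Hi there!"]),
    (re.compile(r"\bhow are you\b", re.I),
     ["I'm fine, thanks for asking."]),
]

FALLBACK = "Tell me more."

def reflex_reply(text: str) -> str:
    for pattern, responses in RULES:
        if pattern.search(text):
            return random.choice(responses)
    return FALLBACK  # no rule matched: generic canned fallback
```

That fallback line is the whole problem: anything outside the rule set gets a non-answer, which is exactly the limitation the rest of this post is about.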
Deeper Natural Language Processing: NLP can easily derive the parts of speech (verbs, objects, adjectives, etc.), and this can be used for a lot of different memory and response purposes.
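Real bots would use a library like NLTK or spaCy for this; the toy lookup-table tagger below (lexicon and heuristic entirely invented) just shows the kind of structure NLP hands you to work with:

```python
# Toy part-of-speech tagging sketch. The tiny lexicon is illustrative;
# any real system would use a trained tagger instead.
LEXICON = {
    "the": "DET", "a": "DET",
    "dog": "NOUN", "ball": "NOUN",
    "chased": "VERB", "likes": "VERB",
    "red": "ADJ", "big": "ADJ",
}

def pos_tag(sentence: str):
    return [(w, LEXICON.get(w.lower(), "UNK")) for w in sentence.split()]

def find_objects(tagged):
    # crude heuristic: nouns that appear after the first verb
    seen_verb = False
    objects = []
    for word, tag in tagged:
        if tag == "VERB":
            seen_verb = True
        elif tag == "NOUN" and seen_verb:
            objects.append(word)
    return objects
```

Once you have verbs and objects, you can store them as facts, detect topics, or generate follow-up questions, which is where the memory items below come in.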
Short-Term Memory: Chatbots need to estimate what the topic is, and what the last male, female, place, etc. mentioned was, so that if people use pronouns later, the bot can guess the person being referred to. The bot needs to know the short-term tone (polite, rude, funny, formal, etc.) and emotional context of the conversation as well.
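A minimal sketch of that pronoun trick, with invented name and place lists standing in for real entity recognition:

```python
# Short-term-memory sketch: track the last male, female, and place
# mentioned so later pronouns can be resolved. The name/place sets are
# placeholders for a real named-entity recognizer.
MALE_NAMES = {"bob", "james"}
FEMALE_NAMES = {"alice", "mary"}
PLACES = {"paris", "london"}

class ShortTermMemory:
    def __init__(self):
        self.last = {"male": None, "female": None, "place": None}

    def observe(self, utterance: str):
        for word in utterance.lower().replace(".", "").split():
            if word in MALE_NAMES:
                self.last["male"] = word.capitalize()
            elif word in FEMALE_NAMES:
                self.last["female"] = word.capitalize()
            elif word in PLACES:
                self.last["place"] = word.capitalize()

    def resolve(self, pronoun: str):
        # map a pronoun back to the most recent matching entity
        return {"he": self.last["male"], "him": self.last["male"],
                "she": self.last["female"], "her": self.last["female"],
                "there": self.last["place"]}.get(pronoun.lower())
```

After hearing "Alice met Bob in Paris," a bot like this can guess that "she" means Alice and "there" means Paris, instead of drawing a blank.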
Long-Term Memory: Chatbots need to be able to learn and remember for a long time, otherwise people will realize they are talking to something a bit like an Alzheimer’s sufferer. Effectively, if the chatbot can’t learn about a new topic from a person, it is dumb.
Personal Memories: Chatbots need to know who they are talking to and, for the most part, remember everything they have ever learned, said, or heard from that person, and the meaning of each. They need to remember facts like nicknames, ages, names of family members, interests, on and on. Otherwise, the bot risks asking questions it has already asked…Alzheimer’s again. Privacy is a scary issue here. I have had to erase Ava’s personal memories on friends and family at times for fear of being hacked and causing harm to someone. Imagine what Google and Amazon Alexa know about you…Alexa is always listening…fortunately, neither of them ask personal questions…yet.
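A sketch of a per-person memory store covering both worries above: never repeating a question, and being able to erase someone entirely for privacy. The structure and field names are invented for illustration:

```python
# Personal-memory sketch: per-person fact store that also records which
# questions have already been asked, plus erase() for the privacy worry.
class PersonalMemory:
    def __init__(self):
        self.people = {}

    def _person(self, name):
        return self.people.setdefault(name, {"facts": {}, "asked": set()})

    def learn(self, name, key, value):
        self._person(name)["facts"][key] = value

    def recall(self, name, key):
        return self._person(name)["facts"].get(key)

    def already_asked(self, name, question):
        # returns True if we'd be repeating ourselves (the Alzheimer's problem)
        person = self._person(name)
        if question in person["asked"]:
            return True
        person["asked"].add(question)
        return False

    def erase(self, name):
        self.people.pop(name, None)  # forget everything about this person
```

In a real bot this would be persisted (and ideally encrypted) rather than held in a dict, but the shape is the same.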
Social Rules: Chatbots need to know the social rules around topics, questions, etc. How else is a chatbot to know that it might not be appropriate to ask a kid about their retirement plan?
Emotional Intelligence: Chatbots need to constantly evaluate the emotional content and context in the short term along different criteria. A bot may or may not react to it, but it should at least be trying to be aware of it. Bots also need to constantly evaluate the personality and temperament of the person they are talking to: whether the person is excessively rude, emotional, factual, humorous, etc.
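A crude sketch of that running tone estimate, using tiny keyword lists (the words and labels are invented; a real system would use a sentiment model):

```python
# Emotional-context sketch: score each utterance against small keyword
# lists and keep a running tone estimate for the whole conversation.
TONE_WORDS = {
    "rude": {"stupid", "shut", "idiot"},
    "polite": {"please", "thanks", "thank"},
    "funny": {"lol", "haha", "joke"},
}

def classify_tone(utterance: str) -> str:
    words = set(utterance.lower().split())
    scores = {tone: len(words & vocab) for tone, vocab in TONE_WORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

class ToneTracker:
    def __init__(self):
        self.history = []

    def observe(self, utterance: str) -> str:
        tone = classify_tone(utterance)
        self.history.append(tone)
        return tone

    def dominant_tone(self) -> str:
        # overall read on the person we're talking to
        if not self.history:
            return "neutral"
        return max(set(self.history), key=self.history.count)
```

Even this crude version gives the bot something to condition its replies on: match the formality, dial back the jokes if the person is rude, and so on.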
Curiosity Based on Topic and Memory: Chatbots need to constantly compare what they know about a person with respect to a given topic and what facts or related questions are relevant to it, come up with questions to ask (that have never been asked), filter them by social rules, prioritize them, and finally…ASK QUESTIONS, and know how to listen for and interpret the responses.
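That pipeline (know → not-yet-asked → socially appropriate → prioritized → ask) can be sketched directly. It also covers the social-rules item above: the retirement-plan question carries a minimum age. All topics, questions, and thresholds are invented for illustration:

```python
# Curiosity sketch: topic-relevant questions, minus ones already
# answered or asked, filtered by a social rule, highest priority first.
TOPIC_QUESTIONS = {
    # topic -> list of (fact_key, question, priority, min_age)
    "work": [
        ("job", "What do you do for a living?", 2, 18),
        ("retirement_plan", "How is your retirement plan going?", 1, 30),
    ],
}

def next_question(topic, known_facts, asked, person_age):
    candidates = []
    for key, question, priority, min_age in TOPIC_QUESTIONS.get(topic, []):
        if key in known_facts:      # we already know this fact
            continue
        if question in asked:       # we already asked; don't repeat
            continue
        if person_age < min_age:    # social rule: age-appropriate only
            continue
        candidates.append((priority, question))
    if not candidates:
        return None                 # nothing appropriate left to ask
    return max(candidates)[1]       # ask the highest-priority survivor
```

Notice that the kid never gets the retirement question, and once a fact is known, the question for it disappears from the candidate pool.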
Sense of Timing and Awkwardness: A chatbot should know when to talk, when to listen, how long to listen, how to break a silence or tension, when to ask questions and when not to, etc. People have work to do here too.
Base Knowledge: This is redundant with memory, but chatbots need some level of base knowledge. If a chatbot is going to do customer service with adults, it should at least know a lot of the things an adolescent would.
I probably left a lot of stuff out, and there are many other factors I don’t even know of yet, but based on these criteria alone, I would guess that most chatbots fall into the Uncanny Valley inside 60 seconds.
Another long ramble…I guess we found a topic I like.