Bot essentials 7: Chatbot NLP aspects – the deep dive 1

Now that we understand the essentials of the NLU mechanisms that a bot applies, let’s turn our attention to the Natural Language Processing (NLP) elements.

NLP is a set of algorithms that helps machines understand, interpret and process human language, whether in the form of text or speech.

Users will ask the bot arbitrary questions to throw it off track. The bot addresses this with a default answer, something to the effect of: “I’m sorry, I did not understand the question. Would you mind rephrasing it, or contact our help-desk at…”
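As a minimal sketch of this fallback pattern (the `classify_intent` function and the confidence threshold here are illustrative assumptions, not part of any specific framework):

```python
# A minimal fallback sketch. classify_intent is a hypothetical stand-in
# for whatever NLU component scores the user's utterance.
FALLBACK = ("I'm sorry, I did not understand the question. "
            "Would you mind rephrasing it, or contact our help-desk?")

CONFIDENCE_THRESHOLD = 0.6  # tune against real traffic

def respond(utterance, classify_intent, answers):
    """Return a canned answer, or the fallback for low-confidence intents."""
    intent, confidence = classify_intent(utterance)
    if confidence < CONFIDENCE_THRESHOLD or intent not in answers:
        return FALLBACK  # arbitrary / off-track question
    return answers[intent]
```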

NLP helps you “train” your bot and make it more intelligent

That basically means adding to its repository of responses based on the real-world questions that users ask. NLP helps you deliver a higher level of interaction and engagement with your users and further reduce transfers to humans. The need for humans will always remain – as a “Talk to an expert” path – even as bots become more advanced in their speed and accuracy of understanding text and speech.
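One rough sketch of that training loop (the file name and record format are assumptions for illustration): log every utterance that triggered the fallback, then review those questions and fold them back into the bot’s repository of responses.

```python
import json
from datetime import datetime, timezone

UNHANDLED_LOG = "unhandled_utterances.jsonl"  # assumed log location

def log_unhandled(utterance):
    """Record a question the bot failed to answer, for later review."""
    record = {"utterance": utterance,
              "timestamp": datetime.now(timezone.utc).isoformat()}
    with open(UNHANDLED_LOG, "a") as f:
        f.write(json.dumps(record) + "\n")

def unhandled_questions(path=UNHANDLED_LOG):
    """Yield logged questions so a human can map them to new responses."""
    with open(path) as f:
        for line in f:
            yield json.loads(line)["utterance"]
```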

At the broadest level, the utilisation of NLP, and the problems it fits, have evolved over time. We are still progressing through Generation 1 and 2 bots: simple Q&A bots, or bots that integrate with back-end platforms to transact.

Essentials

Evolution 1 

At the very beginning, we used NLP models for spam filtering, and we still do. We then started using NLP algorithms in another area: part-of-speech tagging. It was felt that these problems could be solved with simple interpretive models. The need for more complex deep learning algorithms arose because those basic interpretive algorithms only took us so far.
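As a concrete illustration of part-of-speech tagging, here is a short sketch using the NLTK library (it assumes the relevant NLTK data packages can be downloaded; any comparable toolkit would do):

```python
import nltk

# One-time downloads of the tokenizer and tagger data.
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

sentence = "The bot answers simple questions about your order."
tokens = nltk.word_tokenize(sentence)
print(nltk.pos_tag(tokens))
# e.g. [('The', 'DT'), ('bot', 'NN'), ('answers', 'VBZ'), ...]
```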

The main issue with interpreting language through these simplified models was the extent to which our daily communication relies on colloquialisms, contextual sensitivity, political correctness, beating around the bush, idioms, and sayings interlaced with oxymorons and irony. Why should our conversation with a bot be any different?

Evolution 2

Our base FAQ algorithms, used for scoring words and extracting context and intent, were inadequate; they would not make the bot more intelligent. The average summation of a bag-of-words model, using simple vector arithmetic, will only take us so far along the accuracy curve. To improve further, we try n-grams, or shingles. An n-gram model is a statistical model for predicting the next item in a sequence of words. This n-gram approach is what was described in the earlier posts, where word scoring and an average summation produced a sentence vector to compare against pre-stored Q&A formats. But an n-gram approach struggles with the dimensionality of interpretations, and even Markov models were inadequate to cater to this level of vagary.
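To make the n-gram idea concrete, here is a minimal sketch that extracts word bigrams from a token sequence and counts them; those counts are the raw material for the statistical next-word prediction described above.

```python
from collections import Counter

def ngrams(tokens, n):
    """Return the list of n-grams (as tuples) over a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "where is my order where is my refund".split()
bigrams = ngrams(tokens, 2)
print(Counter(bigrams).most_common(3))
# [(('where', 'is'), 2), (('is', 'my'), 2), (('my', 'order'), 1)]
```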

Evolution 3

The main issue with that approach was the linearity of words and the vector science behind it. With distinct words encoded as one-hot vectors, all vectors representing single words are equidistant from one another; they carry no semantic information. Migrating from the bag-of-words approach to a word-embedding model helps increase the accuracy of the response: such a model allows words to have similar vectors when they are used in the same contexts.
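A small sketch of the contrast, using toy numbers rather than a trained model: one-hot vectors make every pair of distinct words equally distant, while learned embeddings can place words that appear in similar contexts close together.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# One-hot encodings: every pair of distinct words has similarity 0.
one_hot = {"invoice": np.array([1.0, 0.0, 0.0]),
           "bill":    np.array([0.0, 1.0, 0.0]),
           "banana":  np.array([0.0, 0.0, 1.0])}
print(cosine(one_hot["invoice"], one_hot["bill"]))    # 0.0
print(cosine(one_hot["invoice"], one_hot["banana"]))  # 0.0

# Toy embeddings: related words get similar vectors (values invented).
embed = {"invoice": np.array([0.9, 0.1, 0.0]),
         "bill":    np.array([0.8, 0.2, 0.1]),
         "banana":  np.array([0.0, 0.1, 0.9])}
print(cosine(embed["invoice"], embed["bill"]))    # high, ~0.98
print(cosine(embed["invoice"], embed["banana"]))  # low, ~0.01
```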

We will talk about further evolution in the NLP space, and how bots leverage it to gain better accuracy, in the next post.

Want to build your own business bot? Get in touch with the experts!

Schedule a demo today