Bot essentials 10: The NLU deep dive – Entities and Intent

NLU aspect

Getting the NLU aspect right is key to designing a good conversation experience for a bot. If your bot cannot understand a sentence or the underlying intent, it will fall back to a default response, leading to a very frustrating experience for the user.

The elements of Natural Language Understanding (NLU) are the next step in our journey of understanding how bots work. The NLU algorithm works on three basic concepts of language construction:

–    Entities

–    Intents

–    Context

Entity extraction

Entity extraction is essential to understanding the construction of a sentence and the meaning behind it. Entities are mentions in the sentence of the things on which an action needs to be performed. For example, in "How much is in my checking account?", the entity is "checking account", which gets extracted by NLU techniques. We will talk about stemming and lemmatisation in future blogs. At this point of the journey, let's just assume that there are algorithms that will extract entities from the sentences provided to them.
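To make this concrete, here is a minimal, dictionary-based sketch of entity extraction in Python. The lexicon and the `extract_entities` helper are invented for illustration; a real NLU engine would use trained models rather than a hand-built lookup table.

```python
import re

# Hypothetical lexicon mapping surface phrases to entity types.
ENTITY_LEXICON = {
    "checking account": "account_type",
    "savings account": "account_type",
    "italian restaurant": "restaurant_type",
    "pizzeria": "restaurant_type",
}

def extract_entities(sentence: str) -> list[tuple[str, str]]:
    """Return (phrase, entity_type) pairs found in the sentence."""
    found = []
    lowered = sentence.lower()
    for phrase, entity_type in ENTITY_LEXICON.items():
        # \b ensures we match whole phrases, not fragments of other words.
        if re.search(r"\b" + re.escape(phrase) + r"\b", lowered):
            found.append((phrase, entity_type))
    return found

print(extract_entities("How much is in my checking account?"))
# [('checking account', 'account_type')]
```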

The intent here is to know the balance.

So the entity is "checking account" and the intent is getting the balance. Figuring out the intent is one of the most important aspects of NLU. Analysing words alone only gets us so far. Consider these two sentences:

–    I need to make a reservation at an Italian restaurant

–    I need to book a table at the pizzeria

These two sentences mean the same thing but are constructed very differently. The intent behind both is the same: "booking a table". How do we get an intelligence engine to understand the "intent" of a sentence?

Read about Bot Essentials 11 – The NLU deep dive – A trained NLU system

Enter word semantics, statistics and word vectors. Each word is represented as a vector, an array of numbers. As is the nature of vectors, they have magnitude and direction. You can measure the distance between two words using their vectors.

Similar words have vectors close to each other. The distance between the vectors for "bicycle" and "motorbike" is smaller than the distance between "bicycle" and "horse".
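As a rough illustration, the snippet below compares toy word vectors using cosine similarity. The three-dimensional vectors are made up for this example; real embeddings such as word2vec or GloVe have hundreds of dimensions, but the idea of measuring closeness is the same.

```python
import numpy as np

# Toy 3-dimensional vectors invented for illustration; real word
# embeddings have hundreds of dimensions.
vectors = {
    "bicycle":   np.array([0.9, 0.8, 0.1]),
    "motorbike": np.array([0.85, 0.75, 0.2]),
    "horse":     np.array([0.2, 0.9, 0.8]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Higher values mean the two vectors point in more similar directions."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(vectors["bicycle"], vectors["motorbike"]))  # ~0.99, very similar
print(cosine_similarity(vectors["bicycle"], vectors["horse"]))      # ~0.66, less similar
```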

Word vectors

Word vectors are good at comparing words, but how do we then solve for whole sentences? That is relatively simple, since you can perform arithmetic operations on vectors. You can derive a sentence vector by averaging the word vectors, but in our experience this does not work well in most cases. It is difficult to average words to get the aggregate meaning of a sentence; at best it is a rough aggregation.
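For completeness, here is a minimal sketch of the averaging approach just described, assuming you already have a dictionary of word vectors (the `word_vectors` argument is hypothetical):

```python
import numpy as np

def sentence_vector(sentence: str, word_vectors: dict[str, np.ndarray]) -> np.ndarray:
    """Average the vectors of the words we have embeddings for."""
    tokens = sentence.lower().split()
    known = [word_vectors[t] for t in tokens if t in word_vectors]
    if not known:
        # No known words: fall back to a zero vector of the right size.
        dim = len(next(iter(word_vectors.values())))
        return np.zeros(dim)
    return np.mean(known, axis=0)
```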

Now that we have sentence vectors, along with their distances from other, similar sentences, we can use semantics to find the sentences that most closely match what the user is asking the machine. The machine thus understands and learns by matching similar words and groupings, using the science of statistics, probability and vectors.
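Putting the pieces together, a simple intent matcher can compare the user's sentence vector against the vectors of example sentences for each intent and pick the closest one. The intent names, example utterances and the `embed` parameter below are assumptions made for the sake of this sketch:

```python
import numpy as np

# Hypothetical example utterances per intent; a real bot would have many more.
INTENT_EXAMPLES = {
    "book_table": [
        "I need to make a reservation at an Italian restaurant",
        "I need to book a table at the pizzeria",
    ],
    "check_balance": [
        "How much is in my checking account?",
        "What is my account balance?",
    ],
}

def classify_intent(query: str, embed, examples=INTENT_EXAMPLES) -> str:
    """Return the intent whose example sentence is closest to the query.

    `embed` is any function mapping a sentence to a numpy vector, for
    example the averaged-word-vector `sentence_vector` sketched above.
    """
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    query_vec = embed(query)
    best_intent, best_score = None, -1.0
    for intent, sentences in examples.items():
        for sentence in sentences:
            score = cosine(query_vec, embed(sentence))
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent
```

With reasonable embeddings, a query like "Can I reserve a table for two at the Italian place?" should land closest to the "book_table" examples, even though it shares few exact words with them.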

Get in touch with us to know more about Engati and chatbot technology!