Migrating from NLC and Dialog

For those of you who have been using NLC, you may be asking “Why bother to migrate?”. One advantage is that Conversation lets you download the questions you have put in.

Migrating is painless: import your NLC training CSV into the Intents section and you are done. You don’t need to touch the Dialog section. Then, when calling the service, make sure you have alternate_intents enabled in your request JSON.

Example:

{
  "alternate_intents":true,
  "input":{
    "text":"Hello World"
  }
}

You will get your intents back as a JSON array, with a maximum of ten intents returned.

"intents":[
    {"intent":"conditions","confidence":0.5939187876312475},   
    {"intent":"temperature","confidence":0.4060812123687526}
]

Migrating from Dialog

This is where things get trickier. Dialog and Conversation are very different systems. Dialog had no machine learning, but it had extremely complex NLP which Conversation does not yet fully mimic.

Here are the main areas of Dialog and how they compare to Conversation.

Folders

Conversation does not have folders in the way Dialog does. To mimic one, create a node with a condition of “false” and use the output section to name your folder. Tree traversal will skip the node (whereas Dialog would traverse into a folder), but you can still use Continue from to jump into it.

[Image: conv0210-1]
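
As a rough sketch, a folder node might look something like this in the workspace JSON (the node name and output text are just placeholders):

{
  "dialog_node": "folder_payments",
  "conditions": "false",
  "output": {
    "text": "Payments folder"
  }
}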

Output Node

In Conversation, output and input are part of the same node. You can chain nodes together to construct multiple outputs, similar to Dialog, by setting Continue from to point at the next node’s output, and so on. You can also generate multiple random or sequential responses, as Dialog did. You can read more about this in the advanced output documentation.
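
For example, assuming the advanced output format described in that documentation, a node that picks one of several responses at random might look like this (the greetings are just placeholders, and selection_policy can also be sequential):

{
  "output": {
    "text": {
      "values": [
        "Hello there.",
        "Hi, how can I help?",
        "Good day to you."
      ],
      "selection_policy": "random"
    }
  }
}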

Get User Input

[Image: conv0210-2] This icon is similar to Get User Input. However, Conversation does not currently have Dynamic Node Resolution (DNR) functionality, so once the tree has been traversed it returns to the root, not back to the last Get User Input.

Search Node

Conversation does not currently have this functionality. You can mimic it at the application layer by passing back the ID of a previously visited node, to jump back to an earlier part of the tree.
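
One possible sketch of that approach is to capture the system context from an earlier response and send it back on a later turn, so the conversation resumes from that node. The node ID below is made up, and the exact format of the system context has changed between releases, so treat this as illustrative only:

{
  "input": { "text": "jump back" },
  "context": {
    "system": {
      "dialog_stack": [ { "dialog_node": "node_5_1467832978407" } ],
      "dialog_turn_counter": 3,
      "dialog_request_counter": 3
    }
  }
}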

Default Node

At the root level of the tree an “Anything else” node is automatically created. For branches of the tree, you create a node with a condition of “true”. These default nodes are more important in Conversation: if no conditional node fires, the conversation falls back to the root to find an answer.
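
For example, a branch-level default node is simply a node whose condition is “true” (the name and text here are illustrative):

{
  "dialog_node": "payments_default",
  "conditions": "true",
  "output": {
    "text": "Sorry, I didn't catch that. Can you rephrase your question?"
  }
}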

Input Node

As mentioned earlier, input and output are merged into one node. The variations that exist in Dialog do not exist in Conversation. To emulate them you can build multiple regular expressions, but get into the habit of using intents and entities instead. Intents use machine learning to match questions they have never seen before.
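
For example, a node condition can run a regular expression against the raw input. This particular pattern is only an illustration:

input.text.matches('(?i)(hi|hello|hey)( there)?')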

Goto Node

Conversation uses “Continue from”, which is very similar. I detail how it works in “Understanding how a Conversation flows”.

Profile Check

This is handled in the condition section of the Conversation node, by checking context variables.
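
For example, where a Dialog profile check would test a profile variable, a Conversation node condition can test a context variable. The variable name and value here are just examples:

$user_type == 'premium'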

Concepts

Conversation does not have concepts. Intents will learn related terms from what they are trained on. Conversation entities can be used in a similar way to concepts, but get used to relying on intents where you can.
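
As a sketch, an entity with synonyms can play a similar role to a concept. The entity and values below are purely illustrative:

{
  "entity": "transport",
  "values": [
    { "value": "car", "synonyms": [ "automobile", "motor", "vehicle" ] },
    { "value": "train", "synonyms": [ "rail", "railway" ] }
  ]
}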

Function Node

Conversation does not have this functionality as it is stateless.

Random Node

Conversation does not have this functionality, but you can mimic it. First create a folder containing the nodes you want to hit at random. Give each node a condition against a context variable, checking whether it falls within a certain range. Then, in your firing node, create something like the following.

{
  "output": {
    "text": "Finding random response."
  },
  "context": {
    "random": "<? T(java.lang.Math).random() * 5.0 ?>"
  }
}

This will give you a floating-point number between 0 and 5 to check against, as follows.

[Image: conv0210-3]
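
Each node in the folder then checks a range of that number in its condition, for example something like the following (depending on how the expression result is stored, you may need to convert it to a number first):

$random >= 0 && $random < 1
$random >= 1 && $random < 2
$random >= 2 && $random < 3
$random >= 3 && $random < 4
$random >= 4 && $random < 5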

Here is a sample conversation script demonstrating it.

Dialog Entities

Entities in Dialog can be quite complex. For example, you can have nested entities, concepts and regular expressions, as well as system entities which can recognise dates, locations, times and so on. Conversation doesn’t have this functionality yet.

Tags

Conversation does not have this functionality. You can mimic it using entities, but this is not recommended as they will be used as part of the training. Another option is to have the application layer intercept the text and replace the constants itself.

Auto-Learn

Conversation does not have auto-learn functionality. You would need to mimic this at the application layer.

 
