Conversing in three dimensions.

There is one feature of Conversation that many people don’t even factor in when creating a conversational system. Let’s look at the Standard plan to spell it out.

  • Unlimited API queries/month
  • Up to 20 workspaces
  • Up to 2000 intents
  • Shared public cloud

Yep, you have 20 workspaces to play with! Most people starting off just use them for development, testing and production. But there is so much more. To put it in context:

[Figure: complexity of conversation]

Functional Actions

For those new to Conversation, the first experience is normally the car demo. This is a good example of functional actions. In this case you know your application, and you want your end user to interact with it, so the user is normally given prompts (conversational or visual) that let them refer to your user interface.

These offer the least resistance to build. The user is taught the names of the interface elements by using them, and is unlikely to deviate from that language. In fact, if they do use non-domain language, it is more likely a fault of your user interface.

Questions & Answers

This is where you have collected questions from the end user, to determine what answer or action needs to be taken.

Often the end user does not understand the domain language of the documentation or business, so training on their language helps make the system better for them.

Process Flows

This is where you need to converse with the user, collecting more information to drive toward meeting their needs.

Multiple Workspaces

Most see this as just creating separate workspaces for development, testing and production. But using them as part of your overall architecture can dramatically increase the functionality and accuracy of the system.

Two main patterns that have been used are off-topic and drill down.

Off-Topic / Chit-Chat.

In this model we have a primary workspace which stores the main intents, as well as an intent for off-topic/chit-chat. Once that intent is detected, a second call is made out to the related workspace (sketched below).

[Sequence diagram: off-topic/chit-chat pattern (uml0117)]
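To make the routing concrete, here is a minimal sketch of the off-topic pattern. It calls the Conversation v1 /message REST endpoint directly via requests; the workspace IDs, credentials and the OFFTOPIC_INTENTS names are placeholders for illustration, not values from the service.

```python
# Minimal sketch of the off-topic / chit-chat pattern.
# Workspace IDs, credentials and intent names below are illustrative assumptions.
import requests

CONVERSATION_URL = "https://gateway.watsonplatform.net/conversation/api/v1"
VERSION = "2017-05-26"                      # API version date (adjust to your service)
CREDENTIALS = ("username", "password")      # service credentials (placeholder)

MAIN_WORKSPACE = "main-workspace-id"            # primary intents
CHITCHAT_WORKSPACE = "chitchat-workspace-id"    # off-topic / chit-chat intents
OFFTOPIC_INTENTS = {"off_topic", "chitchat"}    # intents that trigger the second call

def call_workspace(workspace_id, text, context=None):
    """Send one utterance to a workspace and return the response JSON."""
    url = "{}/workspaces/{}/message".format(CONVERSATION_URL, workspace_id)
    payload = {"input": {"text": text}, "context": context or {}}
    resp = requests.post(url, json=payload, params={"version": VERSION},
                         auth=CREDENTIALS)
    resp.raise_for_status()
    return resp.json()

def answer(text, context=None):
    # The first call always goes to the primary workspace.
    primary = call_workspace(MAIN_WORKSPACE, text, context)
    intents = primary.get("intents", [])
    top_intent = intents[0]["intent"] if intents else None

    # Only when the off-topic/chit-chat intent fires do we pay for a second call.
    if top_intent in OFFTOPIC_INTENTS:
        secondary = call_workspace(CHITCHAT_WORKSPACE, text)
        return secondary["output"]["text"]
    return primary["output"]["text"]

print(answer("Tell me a joke"))
```

The point of the pattern is visible in the last function: most turns cost one API call, and the second workspace is only billed for the occasional off-topic turn.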

From a pricing point of view this works well if you are calling out to a subject area that is asked about infrequently. If the user will often ask these questions, though, then the drill-down method is a better solution.

Drill Down.

This model is for when the user asks a question that opens up a broader area of follow-up questions. For example, when you enter a bank you may ask the information desk about mortgages, and they will direct you to another desk to go into more detail on your questions.

[Sequence diagram: drill-down pattern (uml0117a)]

For this to work well, you need a clear separation of what each workspace does, so that an off-topic intent is triggered in the secondary workspace to pass control back to the main workspace.

When planning your model, look for common processes versus entities. The example above might not be a good one to separate by pet, as the pets will share common questions with only the entity changing. But you could separate by purchasing, accessories, and so on.

As long as the conversation does not switch topics often, costs are kept down.
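A minimal sketch of the drill-down routing follows, reusing the call_workspace helper and MAIN_WORKSPACE from the earlier example. The specialist workspace IDs and the "off_topic" intent name are assumptions for illustration: the application tracks which workspace currently owns the conversation and hands control back when the specialist reports off-topic.

```python
# Minimal sketch of drill-down routing, reusing call_workspace() and
# MAIN_WORKSPACE from the previous example. Workspace IDs and the "off_topic"
# intent name are illustrative assumptions.
DRILLDOWN_WORKSPACES = {
    "mortgages": "mortgage-workspace-id",      # deeper mortgage questions
    "purchasing": "purchasing-workspace-id",   # deeper purchasing questions
}

class Router(object):
    def __init__(self):
        self.active_workspace = MAIN_WORKSPACE  # who currently owns the conversation
        self.context = {}                       # dialog context for that workspace

    def answer(self, text):
        response = call_workspace(self.active_workspace, text, self.context)
        intents = response.get("intents", [])
        top_intent = intents[0]["intent"] if intents else None

        if self.active_workspace == MAIN_WORKSPACE:
            # The main workspace detects a topic a specialist workspace handles:
            # hand the conversation over (the "walk to the mortgage desk" step).
            if top_intent in DRILLDOWN_WORKSPACES:
                self.active_workspace = DRILLDOWN_WORKSPACES[top_intent]
                response = call_workspace(self.active_workspace, text)
        elif top_intent == "off_topic":
            # The specialist signals the question is outside its area:
            # hand control back to the main workspace.
            self.active_workspace = MAIN_WORKSPACE
            response = call_workspace(MAIN_WORKSPACE, text)

        self.context = response.get("context", {})
        return response["output"]["text"]
```

Note that a turn only costs two API calls when the topic actually switches, which is why infrequent topic switching keeps costs down.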

Multiple Workspace Calls.

This is not a recommended model, as your costs go way up. It was originally used when there was a 500-intent limit per workspace (as in NLC).

[Sequence diagram: multiple workspace calls (uml0117b)]

If money is no object, then this model works where you have more than 2,000 intents, or a number of intents that share similar patterns but still need to be distinguished from each other.

You need to factor in whether your Conversation service is returning relative or absolute confidences. If relative, then confidences are only comparable within a workspace and not across workspaces.
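As a rough sketch, the multi-call model sends every utterance to every workspace and picks the top intent on confidence, again reusing call_workspace. The workspace IDs are placeholders, and the comparison across workspaces is only meaningful if your service version returns absolute confidences.

```python
# Minimal sketch of the multiple-workspace-call model, reusing call_workspace().
# Every utterance costs one API call per workspace. Workspace IDs are placeholders,
# and comparing confidences across workspaces assumes they are absolute.
ALL_WORKSPACES = ["workspace-id-a", "workspace-id-b", "workspace-id-c"]

def best_response(text):
    best = None
    for workspace_id in ALL_WORKSPACES:
        response = call_workspace(workspace_id, text)
        intents = response.get("intents", [])
        confidence = intents[0]["confidence"] if intents else 0.0
        if best is None or confidence > best[0]:
            best = (confidence, workspace_id, response)
    return best  # (confidence, winning workspace, full response)
```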

If you do have numerous intents, it may be easier and better to use a different solution, such as the Discovery service.

 

5 thoughts on “Conversing in three dimensions.”

  1. Hello, I am successfully using the Conversation service. But now the requirement is to put a math equation in the Watson response. How do I do that? I also tried typing with an external mathematics keyboard, but since my TTS service does not understand equations, I don’t get a response. Is there a way to use LaTeX here? Also, is there a way to put a function in my JSON Watson response?
    Thanks so much

    • TeX Exchange does not know much about IBM Watson, so it is getting a little difficult. Anyway, I’ll post if I have any progress to report. Thanks

  2. I updated your post comments there. Hopefully someone can help you. If this is a rendering issue, the Conversation test interface will likely not render it for you. If you are rendering on a web page, then MathML might help. https://www.w3.org/Math/

  3. In this blog you indirectly answered one of the questions I had about setting up Conversation: how to do the design work. For small conversations, coding directly in WC is acceptable, but what about when it gets bigger and bigger? Think about interfacing with other APIs and more logic in the applications. I had planned to use sequence diagrams for it, and I am happy to see them being used by you in this blog.
