Recently I have been working with the Azure QnA Maker cognitive service and was integrating it with a bot built using Bot Framework SDK v4. There were clearly two choices as an implementation approach:
- Implement it as a regular workflow in the bot
- Implement it as a bot middleware
This article assumes that you understand the Bot Framework and are aware of a few essential concepts such as bot services, configuration, and middleware. If you are new to it, it is highly recommended that you go through the documentation and learn those concepts before proceeding further.
In short, middleware is a component which sits between the channel adapter and the bot; it can see and process each request sent to the bot and any response the bot sends back to the user.
In a nutshell, the channel adapter is the component responsible for converting channel data into a JSON format that the Bot Framework can understand.
As we can see in the image above, middleware components sit one after another in the execution pipeline: they run in registration order before a message reaches the bot, and then execute in reverse order when the bot sends a response back to the user. Hence the registration sequence of each middleware matters.
One practical example could be a middleware that logs every activity flowing in and out of the bot.
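A logging middleware of that kind might look like the minimal sketch below. This is only a conceptual illustration, not code from the original article: the class name and the console output are assumptions, and the `IMiddleware`/`NextDelegate` types come from the `Microsoft.Bot.Builder` package.

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;

// Hypothetical example: a middleware that observes every turn, both before
// and after the bot (and any downstream middleware) handles it.
public class ActivityLoggerMiddleware : IMiddleware
{
    public async Task OnTurnAsync(ITurnContext turnContext, NextDelegate next,
        CancellationToken cancellationToken = default(CancellationToken))
    {
        // Runs on the way in, before the bot sees the activity.
        System.Console.WriteLine($"Incoming activity: {turnContext.Activity.Type}");

        // Pass control to the next component in the pipeline.
        await next(cancellationToken);

        // Runs on the way out, after the bot has processed the turn.
        System.Console.WriteLine("Turn completed.");
    }
}
```

Everything before `await next(...)` runs in registration order; everything after it runs in reverse order, which is exactly the pipeline behavior described above.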
Now let’s get to the point and see how we can integrate the QnA Maker cognitive service as middleware. Why does it make sense to create it as middleware?
Well, one of the reasons for this decision was that we first wanted to try to answer any user message by searching the QnA knowledge base; if nothing is fetched, the bot's core logic can take over the message and run business rules on it.
You can go through the process of creating a QnA Maker service in the Azure portal here and learn how to train it by feeding it data; your sources can be a public URL of your documentation, TSV files, or Excel sheets containing FAQs.
To work with the QnA service and use it inside the bot, you will need four things:
- Your knowledge base Id
- QnA service endpoint key / access key
- Azure web app host URL (gets provisioned when the QnA service is created)
- NuGet - Microsoft.Bot.Builder.Ai.QnA - https://www.nuget.org/packages/Microsoft.Bot.Builder.AI.QnA/
All of these can be part of your .bot file and can be initialized at bot startup in Startup.cs. Below is a sample .bot file with the QnA service configuration.
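A .bot file with a QnA service entry could look roughly like the fragment below. All names, IDs, and keys are placeholders, and the exact set of fields follows the .bot configuration schema used by the SDK v4 samples:

```json
{
  "name": "MyQnABot",
  "services": [
    {
      "type": "qna",
      "name": "QnAService",
      "id": "1",
      "kbId": "<your-knowledge-base-id>",
      "endpointKey": "<your-qna-endpoint-key>",
      "hostname": "https://<your-app-name>.azurewebsites.net/qnamaker"
    }
  ]
}
```

The `kbId`, `endpointKey`, and `hostname` values map to the three pieces of QnA service information listed above.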
Please note that the code above is just for conceptual reference and is adapted from the available samples here. You can find the definition of BotServices.cs at the same link.
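The way the samples wire the .bot file into the bot at startup can be sketched roughly as follows. This is an assumption-laden sketch, not the article's original code: `BotConfiguration` and `QnAMakerService` come from the `Microsoft.Bot.Configuration` package, `QnAMaker` and `QnAMakerEndpoint` from `Microsoft.Bot.Builder.AI.QnA`, and the file path and variable names are placeholders.

```csharp
using System.Collections.Generic;
using Microsoft.Bot.Builder.AI.QnA;
using Microsoft.Bot.Configuration;

// Inside Startup.cs: load the .bot file and build a QnAMaker instance
// for each "qna" service entry it contains.
var botConfig = BotConfiguration.Load(@".\MyQnABot.bot");
var qnaServices = new Dictionary<string, QnAMaker>();

foreach (var service in botConfig.Services)
{
    if (service is QnAMakerService qna)
    {
        var endpoint = new QnAMakerEndpoint
        {
            KnowledgeBaseId = qna.KbId,
            EndpointKey = qna.EndpointKey,
            Host = qna.Hostname,
        };
        qnaServices.Add(qna.Name, new QnAMaker(endpoint));
    }
}
```

The resulting dictionary is typically held in a `BotServices`-style class (as in the linked samples) and handed to the bot or middleware via dependency injection.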
Now that we have configured the bot with the QnA Maker service, let's move on to the middleware.
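The middleware described in the following paragraphs could look something like this minimal sketch. The class name, message-type check, and 50% threshold are assumptions based on the description below, built on the `QnAMaker.GetAnswersAsync` API from `Microsoft.Bot.Builder.AI.QnA`:

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.AI.QnA;
using Microsoft.Bot.Schema;

// Hypothetical sketch: query QnA Maker on every incoming message and, if a
// confident answer comes back, reply directly and short-circuit the pipeline.
public class QnAMiddleware : IMiddleware
{
    private readonly QnAMaker _qnaMaker;

    public QnAMiddleware(QnAMaker qnaMaker)
    {
        _qnaMaker = qnaMaker;
    }

    public async Task OnTurnAsync(ITurnContext turnContext, NextDelegate next,
        CancellationToken cancellationToken = default(CancellationToken))
    {
        if (turnContext.Activity.Type == ActivityTypes.Message)
        {
            var results = await _qnaMaker.GetAnswersAsync(turnContext);

            // Only accept the top answer if its score clears the (configurable)
            // 50% confidence threshold.
            if (results != null && results.Length > 0 && results[0].Score > 0.5)
            {
                await turnContext.SendActivityAsync(results[0].Answer,
                    cancellationToken: cancellationToken);
                return; // short-circuit: skip the rest of the pipeline
            }
        }

        // No QnA answer found: let the remaining middleware and the bot run.
        await next(cancellationToken);
    }
}
```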
As you can see, the source code is quite self-explanatory: all we are doing is implementing the OnTurnAsync method of the IMiddleware interface and calling the QnA service to fetch results.
Note that this implementation has been kept simple for reference, but you can customize it, e.g. by processing the retrieved response before handing it over to the user, logging it, etc.
Also note that it currently checks the scores of the results retrieved from the QnA service and picks the top response. You can further configure the 50% threshold value as per your need.
In this sample, if we receive any response from the QnA service, we end the turn there and hand the response directly back to the user, without invoking the components further down the pipeline.
Now the last part is to make the bot aware of this middleware by registering it in Startup.cs.
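The registration could be sketched like this fragment from `ConfigureServices` in Startup.cs. This is an assumption based on the standard SDK v4 ASP.NET Core integration, not the article's original code; `MyQnABot`, `qnaService`, and the credential variables are placeholders:

```csharp
// Inside Startup.cs / ConfigureServices: register the bot and add the
// QnA middleware to the adapter's pipeline.
services.AddBot<MyQnABot>(options =>
{
    options.CredentialProvider =
        new SimpleCredentialProvider(appId, appPassword);

    // Registration order defines the order in which middleware
    // components run in the pipeline.
    options.Middleware.Add(new QnAMiddleware(qnaService));
});
```

Because registration order matters (as discussed earlier), the QnA middleware should be added wherever in the sequence you want the knowledge-base lookup to happen relative to your other middleware.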
And that’s it: your QnA service is now set up using a bot middleware.
Hope this helps someone.