Omnichannel for Customer Service – Use Machine Learning to Route Cases

Microsoft’s Omnichannel for Customer Service can make use of the Unified Routing option to intelligently route items to agents. In this post I will show how I configured the ability to route cases to agents using machine learning.

Commonly organisations will want to ensure they offer the best possible customer service in an efficient and timely manner. This often involves ensuring any cases are assigned to agents that have the right skills to deal with them. Typically this involves creating some form of bespoke way to categorise cases. Those categories can then be matched with agent skills. This is all great in theory, but often the cases will be incorrectly categorised. Or the process of categorising the cases will involve a manual step to triage each case prior to allocating it to an appropriate agent.

Wouldn’t it be better if we could use artificial intelligence to “read” the case and allocate it to the best agent based on the actual case detail? And, over a period of time, train and re-train the model to improve the routing logic and adapt to evolving case issues?

With unified routing we can now create intelligent machine learning models that can be trained to use AI to determine the skills needed to resolve a customer query. And this information can then be used to intelligently route the cases without needing a complex and inaccurate categorisation process. Plus we can train and re-train these models to improve the results. (The training can be based on additional case data or by importing data from other sources.)

In this post I will describe the steps I followed to configure the intelligent routing. The steps involved are:

  1. Create skills
  2. Allocate skills to agents
  3. Create intelligent skill finder model
  4. Create and approve training data
  5. Train the model
  6. Configure work classification to use Machine Learning model

Note:
I am assuming here that you have already configured record routing for cases in Omnichannel for Customer Service. I am also assuming that you are using the newer “Omnichannel Admin Center” and not the original (now legacy) Omnichannel Admin app.

As I describe these steps I will also try to make the occasional comment on any tips I have as a result of me testing out this process.

I’m sorry if this blog post is a little long … but it does include quite a lot of information. However if you follow each step in turn I hope you will agree the process is actually quite straightforward! I am also going to explain exactly how I created my initial test data using an Excel import. (An approach you might find useful for your initial setup!)

Step One – Create Skills

The idea here is to intelligently work out what skills are needed to resolve my cases. So my first step is to ensure I have created some skills. And then I will need to assign these to my resources.

Note:
If you are already using Omnichannel for Customer Service you may have the skills already defined. But I mention them here as ensuring you have them configured is obviously a key step in implementing intelligent routing.

To define your skills open your Omnichannel Admin Center. Then in advanced settings you should find the “user attributes” option.

With user attributes you will find a skills option. Click manage.

Below you can see that I have defined a number of characteristics. In my example I mainly have technical skills like “JavaScript” or “C#.Net”. Obviously you’ll create skills which are relevant to your organisation.
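If you have a long list of skills to set up, you could also create them programmatically. Skills are rows in the standard Dataverse “characteristic” table, so a minimal sketch using the Dataverse Web API might look like the following. (The org URL is a placeholder and the OAuth authentication step is omitted; this just builds the request rather than sending it.)

```python
# Sketch: creating skill records via the Dataverse Web API.
# Skills are rows in the standard "characteristic" table.
# The org URL is a placeholder; a real call also needs an OAuth bearer token.
import json

def build_skill_create_request(org_url: str, skill_name: str) -> tuple[str, str]:
    """Return the POST endpoint and JSON body for creating one skill."""
    endpoint = f"{org_url}/api/data/v9.2/characteristics"
    body = json.dumps({"name": skill_name})
    return endpoint, body

# Build (but don't send) create requests for a couple of example skills.
for skill in ["JavaScript", "C#.Net"]:
    endpoint, body = build_skill_create_request(
        "https://yourorg.crm.dynamics.com", skill)
    print(endpoint, body)
```

You would POST each body to its endpoint with your usual Dataverse authentication in place.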

TIP:
You should consider the terms skill and characteristic as interchangeable. Microsoft’s naming conventions are always 100% consistent!

Step Two – Allocate skills to agents

Now we have skills we need to say which agents have which skills. Again in the “Omnichannel admin center” navigate to the users option.

Open your resource and in the omnichannel tab you should find a skills configuration record. (Add a new one if the resource doesn’t already have one!)

Opening or creating your skills configuration will open the bookable resource record associated with this resource.

Note:
This might seem “complex” but the reason is Field Service, Unified Scheduling, Omnichannel and Project Service all share the same concepts of bookable resources and characteristics (aka skills).

For our purposes, once you have loaded your bookable resource click the Omnichannel tab and add skills as needed.

Notice that each of my skills has a proficiency rating value. This means different agents can have different levels of proficiency in each skill. The Microsoft guide does suggest that you try to use the same rating model for all of the skills to be used. I guess this is to ensure proficiency across multiple skills can be effectively ranked. You can see I have consistently used a rating model of “Familiar” to “Expert” for my skills.

Tip:
The use of proficiency is optional. In many situations it could be sufficient to say an agent does (or doesn’t) have a particular ability.

Step Three – Create intelligent skill finder model

Next we need to start to create our model to define which skills might be used to resolve which cases (or other “conversations”!). We define our skill finder models in the intelligent skill finder option. This can be found in your “Omnichannel admin center”. (See the user attributes option in the advanced settings area.)

Below you can see that I created a skill finder model. I gave it a name and also said which attributes will be used. In my example I wanted to “read” the description. (I could obviously have selected case title, description and even other fields if required!)

I can also define a date range to fetch records for. This will be used when I am creating data based on historic cases to train my model. (Although in my example, because I was using a test system, I didn’t initially have enough data … so later I will explain how you can also import training data from a spreadsheet.)

Tip: You may need to understand the data criteria field! I believe it may default to “Issue (Case)”. This would include cases which have been created during web chat conversations. However in my particular scenario I am using record routing. Meaning my cases will be created in (say) a customer portal and routed within Omnichannel. So to analyse cases that were routed I changed this relationship to “Routed Item(Case)”. Meaning in the first drop down you are selecting the relationship and table, and in the second drop down you pick the column.

Filters! Initially I had no filters as I wanted to include all of the data available to me. (Which wasn’t much!) But later I decided that it might be a good idea to only include resolved cases when training my model. As cancelled cases (for example) should probably be ignored.

When you add a filter, if you simply select “Add row” then the columns you will filter on will relate to the conversation record associated with the cases. (So the msdyn_ocliveworkitem table.) This might be useful, for example, if you want to focus on conversations from one particular channel or queue.

But in my scenario I wanted to filter on cases. So I selected “Add related entity”. I then selected “Router record (Case)” and added a row to only look for cases with a status of “Problem Solved”. You’ll notice that the filter selection includes loads of potential related entities!

Tip:
I think this hints at how complex you could get with the attribute selection and the filters. I could have (for example) filtered cases and added a second relationship to some custom entity.

Step Four – Create and approve training data

Once your model has been saved you can use the “Load training data” option to create some training data. In my example (initially) I found that most of my test data was skipped. (I guess its quality is pretty poor, and I also commonly don’t actually resolve my test cases.)

Each time I use the “Load training data” option I can see the number of records that were processed and skipped. I ran the load a couple of times gradually increasing my date range. This did increase the amount of data that was considered but even when some records weren’t skipped the quality was pretty poor. (This was simply a reflection on the quality of my test data!)

I guess my problem is going to be pretty common. To be able to create a model which works I needed to create a minimum of 50 valid records. My amazingly poor test data gave me just three!

Tip:
I actually found I needed much more than 50 records before my model started to return meaningful results! The minimum data volume is important but you also need multiple examples for each skill. I would suggest you need an absolute minimum of 10 examples per skill. If you have 10 skills then you are really going to need 100+ records.

If you have poor quality test data, or maybe this is a new Dynamics 365 instance, then all is not lost. As you can import data to be trained from an Excel spreadsheet (or two!). As I assume many people will use this approach to generate their initial test data, I will explain the steps I followed below.

Note: I will give some additional information on the “Load Training data” option later in this post. As you will find it useful when re-training your model.

Sorry but there are quite a few steps involved in creating your data in Excel and importing it. But the process is actually quite straightforward!

I first created two spreadsheets. The first needed to be called “msdyn_ocsitrainingdata.csv”; this contained details of my model and the input data. It also contained a unique record name that will link to my second spreadsheet. My second spreadsheet was called “msdyn_ocsitdskill.csv”. This contained a row for each skill used in connection with my input data.

Tip: Sometimes I created two or more rows in “msdyn_ocsitdskill.csv” for each reference. As the agent might have needed multiple skills to resolve a particular case.

Below you can see my “msdyn_ocsitrainingdata.csv” spreadsheet. Notice the column names of “Skill finder model”, “Training record name” and “Input Data”. You should use the same column names in your spreadsheet!

Below you can see my “msdyn_ocsitdskill.csv” spreadsheet. Notice the column names of “Training record name”, “Characteristic mapping” and “Characteristic”. You should use the same column names in your spreadsheet.

My characteristics had names which matched the skills I created previously in step one.

Tip:
I put the same value into the “Characteristic mapping” and “Characteristic” columns. To be honest I wasn’t 100% clear on the difference between these columns. But this approach worked for me!


Now I’d created my sample data in these two spreadsheets I wanted to import them together. To do this I created a “zip” file that contained both of my spreadsheets. Below you can see that I have selected both csv files in file explorer. I then used the “Send to” option and selected “Compressed (zipped) folder”.
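If you are generating a lot of sample rows, scripting the two CSV files and the zip can save some time. The sketch below uses the column headers described above; the model name, record names, input text and skill values are just placeholder examples.

```python
# Sketch: generating the two training-data CSV files and zipping them for
# import. Column headers match the ones described in this post; all of the
# row values are placeholder examples.
import csv
import zipfile

with open("msdyn_ocsitrainingdata.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["Skill finder model", "Training record name", "Input Data"])
    writer.writerow(["Case skill finder", "TRAIN-0001",
                     "Our chat widget stopped loading on the portal"])

with open("msdyn_ocsitdskill.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["Training record name", "Characteristic mapping",
                     "Characteristic"])
    # One row per skill - add extra rows when a case needed multiple skills.
    writer.writerow(["TRAIN-0001", "Omnichannel", "Omnichannel"])

# Zip both files together, ready for the "Import Excel" option.
with zipfile.ZipFile("training-data.zip", "w") as z:
    z.write("msdyn_ocsitrainingdata.csv")
    z.write("msdyn_ocsitdskill.csv")

print("created training-data.zip")
```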

Now I had my data I needed to import it into my model. So back in the Omnichannel admin center I used the “Import Excel” option. This can be found in the “Training data” tab of your skill finder model. (As shown below.)

Hopefully you have imported data into Dynamics 365 from Excel before! As we are going to use the standard Excel import approach. But don’t worry if you haven’t as I explain all of the steps below. First a dialog will appear and you can choose a file. This will be the zip file you just created.

After clicking next you should see a summary page something like the one shown below. Here you will see that I am going to import both spreadsheets from my zip file together.

Clicking next again lets you select the data map that will define how the columns in the spreadsheet map to columns in my training data. Luckily for us, all we need to do at this stage is stick with the default approach. So we just accept the “Default (Automatic Mapping)” option.

After clicking next again (I know, next, next, next!!) we can now map the fields. The system is going to try to map all of the fields for us. But unfortunately I found that the training record was missing a mapping. This is denoted by the yellow icon containing an exclamation mark.

Clicking on the training record allowed me to add the missing mapping. Under optional fields I needed to manually map the field “Input Data” to one called “Input data”! After which I finally had a green tick against all of the record types I was about to import.

After clicking next (again) I get a summary of what is about to happen. And I also get the chance to click next again!

Finally we can submit our import. Notice that I have “allow duplicates” set to “No”. We don’t want any duplicates!

You might need to know that the import happens in the background. So you are going to have to wait for the data to import. Mine only took a couple of minutes, but obviously the length of time will vary depending on the size of your spreadsheet!

Step Five – Train the model

We are now ready to train our model. (A process you will end up repeating as you improve your data.)

Initially all of your training data will be in a status of pending. So our first task is to use the “Approve all” option.

Tip:
You can open pending items and edit the skills on them manually if you want. I’m not 100% sure how often you will do that in a real-world scenario. As if your data volumes are very large I’m going to guess we won’t edit each one individually! But having said that I have found the ability to edit data really useful when I have been completing my tests using relatively small data volumes.

Once I have approved all of my data I’m ready to train (or re-train) my model. Simply click the “Train model” option.

On the training history tab you will be able to see that your training is in progress. This is going to take a few minutes or maybe longer depending on your data volumes.

Eventually your training will finish and you can see the status change to “Training completed”. We are now ready to publish our model. For this you select the row showing as complete and a “Publish model” option will appear. Clicking this will make your model ready to use.

One tip I have is that you can keep importing more data and re-training your model. This might be an iterative approach to gradually refine your model over time.

Step Six – Configure work classification to use Machine Learning model

Now I have a model defined I changed my workstream to classify cases based on the intelligence I’d added into my model.

Below you can see my workstream for routing case records within the Omnichannel admin center. Sorry but in this post I am not going to attempt to explain everything about record routing! I am making an assumption here that you already have case routing defined and “simply” need to add in intelligent routing. Maybe as a replacement to an existing manual classification.

Above you can see that under work classification I have added one rule to use machine learning classification. When you add a new classification you will be prompted if you’d like to add a manual or machine learning classification. You can see below that when I did this I simply gave my ruleset a name and selected my model. (Meaning you could have multiple models if needed!)

Next you will be prompted to define your input and output attributes.

The output is filled in for us! As the output of this classification is going to be a skill from our machine learning model.

With the input attributes, I selected the “Add attribute” option. I then selected my case table and the description field.

Tip:
If I had based my model on case description and title I would have needed to add multiple attributes here. But as I have simply used description then just one attribute was required.

So now I am ready to use machine learning to route my cases! Or I will be in about 15 minutes’ time. As you need to be aware that changes in your omnichannel configuration often take up to 15 minutes to apply.

Tip:
There is one more option on your workstream you may want to “tweak”. Towards the bottom of the page you will see a section called “Work distribution”. In this area we can define if we are looking for an exact skills match or the closest match. I tested mine using “Exact Match”. But in a real-world scenario you will probably need to ensure the matching algorithm is set to closest match. As we want to find an agent with the closest skills match to the ones suggested by our intelligent routing.

Results

So to test my routing I created a case and waited for it to be routed to my agent. (Or not!) The description I used is shown below. I’d imported loads of data that mentioned various Omnichannel features and mapped those to a skill of “Omnichannel”. But importantly the description I entered had never come up before. And it also did not include the word “Omnichannel”. So any simple routing method would be unlikely to spot that this complex question needed to be routed to one of my agents skilled with the Omnichannel product.

Guess what!! In my test the case I created stayed in open work items and wasn’t directly assigned to an agent. That was the perfect result I was looking for!!!


If you recall earlier …. my work distribution model was set to exact match on skills. I only had one agent available and that agent had no omnichannel skills. Meaning that in this example my case will stay in the open work items area until someone manually assigns it or an agent with the correct skills logs in. (Hopefully this scenario also demonstrates why you might want to opt for “closest match” to ensure records are always assigned to someone.)

But how can I prove which skill was actually assigned by the machine learning? Well there are probably a few options open to me, including using the diagnostics feature or maybe by adding the skills control to my case record. But I used another approach, that I will show you below ….

Whilst testing I simply ran an advanced find to locate any conversations created recently. (As shown below.)
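The same check could be scripted. Conversations live in the msdyn_ocliveworkitem table (mentioned earlier in the filters discussion), so a Web API query for the most recent conversations might be built like this. The org URL is a placeholder, authentication is omitted, and the selected column names are assumptions to verify against your environment.

```python
# Sketch: building a Dataverse Web API query for recently created conversation
# (msdyn_ocliveworkitem) records, as an alternative to advanced find.
# The org URL is a placeholder; column names are assumptions to verify.
from urllib.parse import urlencode

def build_recent_conversations_url(org_url: str, top: int = 10) -> str:
    """OData query URL for the most recently created conversations."""
    params = urlencode({
        "$select": "msdyn_title,createdon",
        "$orderby": "createdon desc",
        "$top": str(top),
    })
    return f"{org_url}/api/data/v9.2/msdyn_ocliveworkitems?{params}"

print(build_recent_conversations_url("https://yourorg.crm.dynamics.com"))
```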


When I opened the conversation record you can see that the skill or skills returned by the machine learning have been added to the conversation record. (Which in turn is linked to my case!) In my example my model had found the Omnichannel skill and given it a rating of 73%. Meaning the likelihood that this question related to omnichannel was pretty high.


Re-Training the Model Over Time

I am not going to lie! The first few times I created cases the skills returned were not at all what I expected. This was simply because my data volumes were way too low for the machine learning model to draw any meaningful results. When I ran my first test I had just 51 rows in my training data. (The absolute minimum is 50!) But it wasn’t until I reached 250 rows that I started to notice my results were getting more and more reliable.

From a testing point of view that meant I had to generate quite a lot of data to be able to see this feature work. But in the real world even my 250 rows will be a tiny data sample. I’d expect thousands of cases to be included maybe even tens of thousands.

This is where you will want to train and re-train your model over and over again. Below you can see that I filtered my data using a small date range. And then I can use the “Load training data” option to look for any newly routed cases.

This will generate me more data that I can train and publish. Meaning the more I use this system the better the results should get!

Tip:
It is possible to add the skills control to your case form. (You can find out how in the Microsoft documentation here!) That might open up the ability for your agents to amend the required skills on cases as they resolve them. A process that would enhance your data and result in better and better routing.

I hope this long and detailed post has given you a comprehensive insight into how to configure intelligent routing within Microsoft’s Omnichannel for Customer Service / Unified Routing. If I’m honest I found the process of creating my initial test data quite involved but once complete I am really impressed with how this operates. Top job Microsoft, thanks.
