MB-200: Microsoft Power Platform + Dynamics 365 Core – Import Data

I am creating a series of blog posts that collectively are designed to help anyone preparing for the Microsoft Power Platform + Dynamics 365 Core exam (aka MB-200). In this post I will look at concepts around importing data.

There is a section of the MB-200 skills measured statement which covers managing entities and data; within this you will find a definition of what needs to be covered for the import/export of data. In this post I will try to touch on as many of the topics mentioned as possible …

I have covered field translations in a previous post, you can read that here.

Import Methods

There are several data import methods available, including:

  • Data Import Wizard – from the web application; perfect for simple situations.
  • Programmatically using the SDK – a flexible approach, but one that requires development effort.
  • 3rd-party tools – several third-party tools support data migration (such as Scribe or KingswaySoft).
  • Configuration Migration tool – a tool from Microsoft which is ideal for moving reference data from one instance to another.

Note: The configuration migration tool isn’t specifically referenced in the skills measured statement! But you can find out more about it here.


Data imports commonly happen at the start of a project, when data from your old systems is migrated into your new Common Data Service. When planning this initial data import there are multiple things to consider:

The volume of legacy data to import – Consider options including:

  • Should you import all legacy data?
  • Or should you apply a cut-off to limit data volumes?
    • Do you really need 10 years of sales history?
    • Will the last 12 months be sufficient?
  • Import no data! It is not uncommon to start with a “clean sheet of paper”.

The effort involved in designing the import routines – In my experience it is very common for the time required to import data to be underestimated (or, on some projects, missed as a task completely!). The design, development and test effort to create the import jobs can be significant and shouldn’t be overlooked.

The time required to conduct the import – With extremely large databases the actual import can literally take days! This might be uncommon, but regardless you need to plan when the legacy data will be imported and allocate enough time to complete the data migration tasks.

Data cleanse – Most (if not all) legacy databases will contain inaccurate or redundant data. The purchase of a new solution might be the ideal opportunity to cleanse some (or all) of this data, but again don’t underestimate the effort! In many scenarios the cleanse process can start whilst the data still resides in the legacy application. I like to compare a data cleanse to moving house: you can clear out all your “rubbish” before you move – all of those books you’ve bought and haven’t read and never will read! Or you can pack everything up, shift it to the new house and trash stuff once in your new home.

Storage requirements – Don’t forget you will need to consider the expected final size of your database. (CDS storage space is not free!)
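To make the cut-off idea concrete, here is a minimal Python sketch that keeps only rows from roughly the last 12 months of a legacy extract. The row layout, the date column position and the 30-day month approximation are all assumptions for illustration, not part of any platform tooling.

```python
from datetime import date, timedelta

def within_cutoff(rows, date_index, months=12, today=None):
    """Keep only rows whose ISO date column falls within the last
    `months` months (using a rough 30-day month)."""
    today = today or date.today()
    cutoff = today - timedelta(days=months * 30)
    return [row for row in rows if date.fromisoformat(row[date_index]) >= cutoff]

orders = [
    ["Order A", "2023-06-01"],
    ["Order B", "2015-01-15"],  # ten-year-old history we may not need
]
print(within_cutoff(orders, 1, today=date(2023, 12, 31)))
# [['Order A', '2023-06-01']]
```

In practice the cut-off rule would come from the business (for example, only open opportunities plus the last financial year), but the filtering step itself is usually this simple.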

Data Import Wizard

The data import wizard is specifically mentioned in the skills measured. It is a common and simple method to import data, so let’s look at how it works.

Often you will import one entity at a time, but it is also common to need to import related data. In this example I am going to import accounts and contacts, with the contacts related to the accounts and vice versa, as a contact can be related to a company and an account can have a primary contact.

Tip: As part of your revision you might want to first test importing just one entity from a csv file. In this example I am deliberately being “clever” to demonstrate how related datasets can be imported.

First of all, I have created a couple of comma-separated spreadsheets and a zip file, as shown below.

My zip file is simply the two spreadsheets (CSV files) combined into a single zip file. (I used the “send to” option to create a zip file, as shown below.)
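If you prefer to script this preparation step, the same bundle can be produced with Python’s standard library. This is just a sketch: the file names and sample rows are assumptions mirroring the walkthrough, not real data.

```python
import csv
import zipfile

# Two small sample files mirroring the walkthrough (data is made up).
with open("accounts.csv", "w", newline="") as f:
    csv.writer(f).writerows([
        ["Account Name", "Business Phone", "Primary Contact"],
        ["Contoso", "0123 456789", "Jane Smith"],
    ])
with open("contacts.csv", "w", newline="") as f:
    csv.writer(f).writerows([
        ["First Name", "Last Name", "Account Name"],
        ["Jane", "Smith", "Contoso"],
    ])

# Combine both CSV files into a single zip, as the "send to" option does.
with zipfile.ZipFile("import.zip", "w") as z:
    z.write("accounts.csv")
    z.write("contacts.csv")

print(zipfile.ZipFile("import.zip").namelist())
# ['accounts.csv', 'contacts.csv']
```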

My Account CSV file looked like this. I simply entered account name, business phone and primary contact name. Obviously in the real world you would have many more columns.

One limitation of this approach is that the data will be imported exactly as shown. If you need to do any transformation on your data this should be done in the spreadsheet prior to import or alternatively done programmatically. (Using SDK or 3rd party tools.)
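As a sketch of the sort of transformation you might script before handing the file to the wizard, the snippet below normalises phone numbers in the source data. The formatting rule here is purely illustrative; your own cleansing rules will differ.

```python
import re

def normalise_phone(value):
    """Keep only digits and '+' signs, dropping spaces and punctuation."""
    return re.sub(r"[^\d+]", "", value)

print(normalise_phone("(0123) 456-789"))  # 0123456789
print(normalise_phone("+44 123 456"))     # +44123456
```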

My contact spreadsheet looked like this. Notice that the name on the contact is split into first name and last name columns, mirroring the way the data is held in the CRM contact entity. Also notice that I have added the related account name to link the contact to the account.
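If your legacy data holds a single combined name column, splitting it into first/last is another pre-import transformation you might script. This is a naive sketch, assuming the first word is the first name; real-world names (titles, single names, middle names) need more care.

```python
def split_name(full_name):
    """Naively split a combined name: the first word becomes the first
    name, everything after it the last name."""
    first, _, last = full_name.strip().partition(" ")
    return first, last

print(split_name("Jane Smith"))        # ('Jane', 'Smith')
print(split_name("Anne Marie Jones"))  # ('Anne', 'Marie Jones')
```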

Having prepared the data, we are now ready to use the import wizard. For this, navigate to data management in settings within the Power Platform admin center. (As shown below.)

Tip: Below the “Data import wizard” option we have “Data maps”; if you have previously saved a data map you can see those here. (Data maps are useful if you need to repeatedly import the same dataset with the same mappings.)

Then I select the data import wizard option. In the imports option you will see details of any previous imports, including the number of records successfully loaded and the number of errors. From this screen you can select the import option to start the data import wizard.

Firstly, the wizard will prompt you for the name of the file to import. Note that I have selected the zip file mentioned earlier.

Notice that the supported file types are xml, csv, txt, xlsx and zip, with a zip file needing to contain one or more of the preceding file types.

Having selected the file, click next to progress through the wizard. You will now be presented with the file or files you are about to import.

Optionally you can control the delimiter settings. Valid field delimiters include comma, colon, semi-colon and tab, and text fields can have a data delimiter of double quote, single quote or none. (In my case I created the csv file using Excel and the defaults were just fine.)

Clicking next will now allow me to select a data map. If this import has happened before, you may have created a data map. (FYI: a data map simply contains metadata describing which columns in the source file map to which entities and columns in Dynamics 365.)

Also, some default maps exist to help with importing data from Business Contact Manager (an Outlook add-in).

In my example I selected the default option of automatic mapping.

Clicking next allows me to map the record types. Here we are going to map each file in the import to an entity in Dynamics 365. Below you can see that I linked my accounts file to the account entity and my contacts file to the contact entity.

Clicking next will then allow me to map all of the fields on each entity. The required fields are listed first, followed by the optional fields. You don’t have to map every field in the source file; I have often been given columns of data in the source file that I simply don’t want to import!

You may notice that some fields are automatically mapped. This is because an exact match was found between the column heading in the source file and a Dynamics 365 field. In my data there wasn’t a field called “Business Phone” on the account entity so that couldn’t be mapped automatically. This is easily resolved by manually selecting another field, such as “Main Phone”.
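The automatic mapping behaviour described here boils down to an exact (case-insensitive) name match. As a toy illustration of the idea only, with made-up column and field lists and no claim about the wizard’s actual internals:

```python
def auto_map(source_columns, target_fields):
    """Map each source column to a target field only when the names match
    exactly (ignoring case); anything else is left for manual mapping."""
    lookup = {field.lower(): field for field in target_fields}
    mapping, unmapped = {}, []
    for column in source_columns:
        field = lookup.get(column.lower())
        if field is not None:
            mapping[column] = field
        else:
            unmapped.append(column)
    return mapping, unmapped

print(auto_map(["Account Name", "Business Phone"], ["Account Name", "Main Phone"]))
# ({'Account Name': 'Account Name'}, ['Business Phone'])
```

This is why "Business Phone" stays unmapped until you point it at "Main Phone" yourself: no exact name match exists.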

Having mapped all of the fields for account, I turned my attention to the contact entity. Here, see how the related account field in my source has not been automatically mapped. This one is a little different, as the account field is a lookup.

See how in my example I have linked the related account field in my source to customer name in the contact entity, customer name being a lookup onto account or contact. When I do this I have opted to match to the account using the Account Name field.

With all my fields mapped I click next. On the next screen I am simply presented with a summary of the import. Assuming you are happy with everything just click next.

Now before starting the import I have a few options that govern how it operates.

Firstly, I can allow duplicates or not. In my example, to avoid any errors, I said “yes” to allow duplicates. In a real-world scenario you are likely to opt not to allow duplicates! This does hint at a limitation of the data import wizard: if a duplicate record is found, the row in the import will error. If you actually want to update the existing record when this happens, you’d again need to resort to programming options.
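One way to sidestep duplicate errors is to de-duplicate the file before importing it. Here is a rough sketch keyed on account name; the matching rule is an assumption for illustration, and real CRM duplicate detection rules can be far richer.

```python
def drop_duplicates(rows, key_index=0):
    """Keep the first row seen for each key value (case-insensitive)."""
    seen, kept = set(), []
    for row in rows:
        key = row[key_index].strip().lower()
        if key not in seen:
            seen.add(key)
            kept.append(row)
    return kept

accounts = [["Contoso", "0123"], ["CONTOSO", "0999"], ["Fabrikam", "0456"]]
print(drop_duplicates(accounts))
# [['Contoso', '0123'], ['Fabrikam', '0456']]
```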

Next we select the owner of the records. Notice that all of the records will be owned by the same named user. Alternatively, I would need an owner column in my import file containing the owner information.

If this data import is to be repeated, you can save the mappings as a data map for later re-use. One example of when I have done this is when purchasing lead data from a 3rd party; it is likely that you may need to import another set of leads in the same format, so saving a data map will save you time next time around.

Finally clicking submit will start your import.

The imports happen in the background and may take a few minutes (or longer) to complete.

As your import progresses you can refresh the screen and see the status change from “Submitted”, to “Transforming”, to “Parsing” and finally to “Completed”. Assuming your import goes really well, you will see that the total processed and successes columns contain the same row count.

However, if you are unlucky and you see rows with errors or partial failures, you may need to review these failures.

Below I have simulated an error by importing the same file again, but this time I told it not to allow duplicates, meaning all the rows fail as duplicates! By opening the import and looking at the failures option I can see which rows errored and why.

There is a really useful feature to create an export file containing just the rows which failed to import. Say a small percentage of a large import file failed: you could export just the failed rows, correct the issues and re-import only those rows.
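Conceptually, what “Export Error Rows” gives you back is the source file filtered down to the failed rows. A minimal sketch of that idea, with illustrative row numbers and no claim about the feature’s internals:

```python
def failed_subset(rows, failed_row_numbers):
    """Return the header plus only the failed data rows (numbered from 1,
    counting from the first data row)."""
    header, data = rows[0], rows[1:]
    return [header] + [data[n - 1] for n in failed_row_numbers]

source = [["Account Name"], ["Contoso"], ["Fabrikam"], ["Adventure Works"]]
print(failed_subset(source, [2]))
# [['Account Name'], ['Fabrikam']]
```

The exported file keeps the original column layout, so once corrected it can go straight back through the wizard with the same data map.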

To access the option to export the failed rows, we use the related tab and then select failures. In here you will see the “Export Error Rows” option.

Hopefully you have found this post useful for your preparation for the MB-200 exam. As always, I suggest you get some hands-on experience! Maybe try importing various types of data and simulating errors etc. Enjoy!
