In the previous article, we got an introduction to importing our data into Cosmos DB using the Data Migration Tool and a sample SQL database server.
In this blog post, we will see how to migrate the data from SQL Server into Cosmos DB. The first step is to install the Data Migration Tool from the Microsoft Azure Cosmos DB documentation page.
Once downloaded, you'll see that there are two versions of the tool: a command-line version and a graphical user interface.
We will use the GUI to proceed, i.e. dtui.exe.
On the source information page, we choose where to import the data from; the available options are shown in the image below:
We then choose one of the given options, and the page updates accordingly. For example, if I choose SQL, it asks me for a connection string, a query, and a nesting separator.
Put in the connection string of your SQL Server instance and the query against your database. You can set the nesting separator to '.'; the tool will then look for a '.' in your column names and split each dotted name into nested properties in the resulting document.
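To make the nesting separator concrete, here is a small Python sketch of the behavior, not the tool's actual code: a flat row whose column aliases contain a '.' is folded into a nested JSON document. The query, column names, and values below are hypothetical.

```python
# Sketch of what the nesting separator does: column aliases containing '.'
# are folded into nested JSON properties. The row below is a hypothetical
# result of a query such as:
#   SELECT Name, City AS [Address.City], ZipCode AS [Address.ZipCode] FROM Customers

def nest_row(row, sep="."):
    """Fold dotted column names into nested dictionaries."""
    doc = {}
    for column, value in row.items():
        parts = column.split(sep)
        target = doc
        # Walk/create intermediate objects for every part before the last one.
        for part in parts[:-1]:
            target = target.setdefault(part, {})
        target[parts[-1]] = value
    return doc

flat = {"Name": "Jane", "Address.City": "Redmond", "Address.ZipCode": "98052"}
print(nest_row(flat))
# {'Name': 'Jane', 'Address': {'City': 'Redmond', 'ZipCode': '98052'}}
```

This is why aliasing your SQL columns with dotted names is useful: it lets you shape the document hierarchy directly from the query.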
Having done that, you move on to defining the target information.
In the Export To option, you will see three alternatives:
When exporting to the SQL API, you can use one of the Azure Cosmos DB options, or you can export to a JSON file instead. The JSON file gives you a chance to work on the data before pushing everything into the Cosmos DB account, which often makes it the better approach.
When choosing Azure Cosmos DB as the target, you'd usually want the bulk import option because it is very fast; however, only the sequential import supports partitioned collections, which is what we mostly want.
Next, we put in the connection string of the Cosmos DB SQL API account. We can obtain this connection string from the Azure portal, but it won't contain the database reference, so we need to append the database name to it.
Now paste it inside the connection string input and append the database name.
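The edit can be sketched in a couple of lines of Python; the account name, key placeholder, and database name below are hypothetical examples, not values from a real account.

```python
# The portal gives a connection string like the one below (key elided);
# the Database=... segment must be appended by hand before the tool accepts it.
portal_string = "AccountEndpoint=https://myaccount.documents.azure.com:443/;AccountKey=<your-key>;"
database_name = "mydatabase"

# Append the database reference to form the target connection string.
target_string = portal_string + f"Database={database_name};"
print(target_string)
```

The tool validates this string when you click the Verify button next to the input, so it is worth checking before moving on.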
Next, enter your collection name and the partition key (you can use zipcode for this), and keep the collection throughput at its default.
You also get the option to provide a value for the id field, which is not mandatory. If not given, the tool creates a unique GUID-based id for each document.
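The fallback behavior can be sketched as follows, assuming hypothetical document contents; the tool itself does this for you, this is just an illustration of GUID-based ids.

```python
import uuid

def ensure_id(doc):
    """Assign a GUID-based 'id' to a document that doesn't already have one."""
    if "id" not in doc:
        doc["id"] = str(uuid.uuid4())
    return doc

doc = ensure_id({"name": "Jane", "zipcode": "98052"})
print(doc["id"])  # a random GUID such as '3f2b8c1e-...'
```

Because the id must be unique per partition key value, letting the tool generate GUIDs is the safest default when your source rows have no natural key.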
With this done, you can go on to the next page and proceed without creating an error log file, if you don't want one.
Finally, you'll come to the summary page. Once you have reviewed everything, click Import and the process will start.
Once your import completes without any failures, you can go to the Data Explorer to check that the database 'mydatabase' has been created, the collection you specified has been imported, and the data is present inside the documents.
With that, you'll also be able to see the other resource properties, such as _rid, created for you inside each document.
In this blog post, we saw how to use the Data Migration Tool to get our data from a source into a Cosmos DB account. From installing the tool to seeing the data inside the Azure Cosmos DB account, we covered each step with a practical walkthrough.