How to – Export Dataverse (Dynamics 365) data to Azure SQL using Azure Data Factory pipeline template


Using the new Azure Data Factory pipeline template – Copy Dataverse data from Azure Data Lake to Azure SQL – we can now easily export the Dataverse data to Azure SQL Database.

Check other posts on Azure Data Factory

Select the Pipeline from template option inside Data Factory.

Search for Dataverse and select the Copy Dataverse data from Azure Data Lake to Azure SQL template

Let us specify the user inputs required by the template, i.e. the Azure SQL Database and the Data Lake Storage linked services.

First we have created the linked service for the Azure SQL Database.

We will use it to connect to the MyContacts table shown below.
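For reference, an Azure SQL linked service definition looks roughly like the following JSON (a sketch with placeholder values; in practice you may prefer Key Vault or managed identity authentication over an inline connection string):

```json
{
  "name": "AzureSqlDatabase1",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Server=tcp:<server>.database.windows.net,1433;Initial Catalog=<database>;User ID=<user>;Password=<password>;Encrypt=True;"
    }
  }
}
```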

Similarly, create a linked service to Azure Data Lake Storage Gen2, which holds our Dataverse data.

Get the URL from the container's properties (replace blob with dfs in the URL).

To get the storage account key, select Access keys >> Show keys and copy the key for the storage account.
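The resulting ADLS Gen2 linked service definition is roughly as follows (a sketch with placeholder names; note the dfs endpoint, and that ADF uses the AzureBlobFS type for Gen2 storage):

```json
{
  "name": "AzureDataLakeStorage1",
  "properties": {
    "type": "AzureBlobFS",
    "typeProperties": {
      "url": "https://<storageaccount>.dfs.core.windows.net",
      "accountKey": {
        "type": "SecureString",
        "value": "<storage-account-key>"
      }
    }
  }
}
```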

Here we have already configured Azure Synapse Link for Dataverse, which exports the Dataverse data to this storage account.

Now that we have defined the user inputs, select Use this template.

Navigate to the data flow created – DataverseToAzureSQL

Select our ADLS source and review and configure its properties.

Source Settings

Here the Inline dataset type is set to Common Data Model, and the linked service is the AzureDataLakeStorage1 we created earlier.

Source options

Specify the Dataverse folder as the Root Location.

Here we have specified the contact entity from our Data Lake Storage.


In the Projection tab, we have cleared the generated schema using Clear schema and selected Schema options >> Allow schema drift.

Enabling the Allow schema drift option will create the required columns in the destination Azure SQL table.



Data preview

As we have not turned on Debug mode, there is nothing to preview.

Now let us move to our Destination – SQL.


Here we have the AzureSQLTable dataset connected to the contact table in Azure SQL, with the Allow schema drift option checked.

Below is our AzureSQLTable Dataset connected to the MyContacts table.
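For reference, the dataset definition is roughly the following JSON (a sketch; the names are placeholders matching this walkthrough):

```json
{
  "name": "AzureSqlTable1",
  "properties": {
    "linkedServiceName": {
      "referenceName": "AzureSqlDatabase1",
      "type": "LinkedServiceReference"
    },
    "type": "AzureSqlTable",
    "typeProperties": {
      "schema": "dbo",
      "table": "MyContacts"
    }
  }
}
```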


Here we have selected Allow insert as the update method and Recreate table as the table action, as we want the destination table to be re-created dynamically based on the source.
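In the underlying data flow script, these sink settings translate to something like the following (a rough sketch, not the exact generated script; Sink1 and the incoming stream name are placeholders):

```
source1 sink(allowSchemaDrift: true,
    validateSchema: false,
    deletable: false,
    insertable: true,
    updateable: false,
    upsertable: false,
    recreate: true,
    format: 'table') ~> Sink1
```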


We have left the mapping set to Auto mapping.



Data preview

Let us Publish All our changes and Debug our pipeline.

Let us monitor our pipeline run.

We can see that the pipeline ran successfully, taking around 4 minutes.

We can see the contact data copied to our Azure SQL table successfully.

Here we covered the insert operation; in the next posts, we will see how to configure the update, upsert, and delete operations.

Also, check

Posts on Azure Data Factory

Posts on Azure Data Lake

Hope it helps!


Author: Nishant Rana

I love working in and sharing everything about Microsoft .NET technology!

11 thoughts on “How to – Export Dataverse (Dynamics 365) data to Azure SQL using Azure Data Factory pipeline template”

  1. Hi Nishant

    I’ve got stuck at setting root location in the data flow. Have you ever come across this error?
    —> ADLS Gen2 operation failed for: Operation returned an invalid status code ‘NotFound’. Account: ‘storageaccounttesting22’. FileSystem: ‘dataverse-athena05-unq106375223f9140d1b6cdbf6e35648’. ErrorCode: ‘FilesystemNotFound’. Message: ‘The specified filesystem does not exist.’. RequestId: ’11c52907-501f-0054-3906-99d261000000′. TimeStamp: ‘Tue, 24 Aug 2021 16:40:49 GMT’. Operation returned an invalid status code ‘NotFound’ Activity ID: 3c37de07-425b-4907-a419-738d5eed79ca

    Liked by 1 person

    1. Thanks for that, Nishant. I wish we could keep Data Export Service; the setup was so much easier, and I can't understand why they deprecated the service. Hopefully MS brings something more straightforward to set up.

      Liked by 1 person

  2. Thanks for your post. How can I get the labels of the choices from Dataverse? They are stored in Microsoft.Athena.TrickleFeedService/table-EntityMetadata.json. I tried to look up these labels but it doesn't work.
    Thanks in advance

    Liked by 1 person

  3. Good article. However, Data Export Service replicates data automatically upon update, delete or add events. Is that possible with ADF Pipeline too?


    1. Not that smooth, as far as I can see. If you look at the blog, they are recreating the table on copy (if users actively use Azure SQL for querying, they could experience downtime, I suppose? Nishant, can you throw some light on what the user experience would be in this scenario?)


  4. Hi Nishant, excellent post. We have an existing Azure SQL DB that we populate from Dataverse using DES. If we replace DES with a Data Factory pipeline, we would want to keep the existing tables and views in this SQL DB intact, only map the data entities from ADLS to the existing SQL DB tables, and do incremental updates. Is that something that is possible?

    Liked by 1 person

    1. Hi Omkar, thanks for your kind words. I have not tried it, but I think it should be possible. I will have to try it out and see; I am very new to this topic and had just documented the few things I had tried.


  5. This seems to be quite an expensive option. According to MS documentation, “You pay for the Data Flow cluster execution and debugging time per vCore-hour. The minimum cluster size to run a Data Flow is 8 vCores.”

