How to set up – Azure Synapse Link – Microsoft Dataverse


Azure Synapse Link (earlier known as the Export to Data Lake service) provides seamless integration of Dataverse with Azure Synapse Analytics, making it easy to run ad-hoc analysis using familiar T-SQL in Synapse Studio, build Power BI reports using the Azure Synapse Analytics connector, or use Apache Spark in Azure Synapse for analytics.

As a first step, we need to create the Azure Synapse Workspace.

Log in to the Azure Portal (https://portal.azure.com/) with the appropriate roles and create a Synapse workspace.

https://docs.microsoft.com/en-us/azure/synapse-analytics/get-started-create-workspace

Wait for the deployment to be complete. Below are the resources created as part of the deployment.

Log in to the maker portal (https://make.powerapps.com/) and select the appropriate environment.

Click on Azure Synapse Link, check the Connect to your Azure Synapse Analytics workspace (preview) option, and specify the storage account created in the previous step.

Here we have selected the Contact and System User tables for export.

On clicking Save, we got this error a couple of times, even though the account being used had the Owner role across the Azure resources. Trying to save 3-4 more times allowed the link to be created.

Clicking on Go to Azure Synapse Analytics workspace opens Azure Synapse Studio.

Within Synapse Studio we can see our Dataverse database and run queries against it.
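Besides running queries interactively in Synapse Studio, the same Dataverse tables can also be queried programmatically through the workspace's serverless SQL endpoint. Below is a minimal sketch assuming the Microsoft.Data.SqlClient NuGet package; the workspace name and database name are placeholders, and the query assumes the Contact table selected for export above.

// Minimal sketch – querying the Dataverse database exposed by Azure Synapse Link
// through the workspace's serverless SQL endpoint.
// NuGet: Microsoft.Data.SqlClient. Workspace / database names below are placeholders.
using System;
using Microsoft.Data.SqlClient;

class QueryDataverseViaSynapse
{
    static void Main()
    {
        var connectionString =
            "Server=<workspace-name>-ondemand.sql.azuresynapse.net;" +
            "Database=<dataverse-database>;" +               // the database created by the link
            "Authentication=Active Directory Interactive;";  // sign in with an Azure AD account

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // contact is one of the tables selected for export above
            using (var command = new SqlCommand("SELECT TOP 10 fullname FROM contact", connection))
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine(reader.IsDBNull(0) ? "(no name)" : reader.GetString(0));
                }
            }
        }
    }
}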

Similarly, we can create a Power BI report using the Azure Synapse Analytics (SQL DW) connector.

Copy the Serverless SQL endpoint of the Synapse workspace (shown on the workspace's Overview page in the Azure Portal).

Select the DirectQuery option.

https://azure.microsoft.com/en-gb/blog/power-your-business-applications-data-with-analytical-and-predictive-insights/

Hope it helps..


How to – improve data migration performance – SSIS & Azure Data Factory (Dataverse / Dynamics 365)


In one of our projects, we were executing SSIS packages (KingswaySoft’s SSIS Integration Toolkit for Dynamics 365) under the Azure-SSIS Integration Runtime in Azure Data Factory.

Check out –

Deploy and run SSIS Package in Azure Data Factory

Deploy and run SSIS Packages that use KingswaySoft’s SSIS Integration Toolkit on Azure Data Factory.

After trying out different combinations, we eventually settled on a batch size of 10 and a thread count of 15.

https://nishantrana.me/2021/06/08/data-migration-optimum-batch-size-and-threads-for-maximum-throughput-microsoft-dataverse-dynamics-365/

Also, we used multiplexing – running the CRM Destination Component under different application users.

To be precise, 4 application users in our case; this can be increased to get further improvement in throughput.

Also, based on the recommendation of our Microsoft FastTrack architect, we raised a Microsoft support ticket to increase the number of web servers allocated from 2 to 3.

Below were our findings:

The earlier run used a batch size of 100 and 20 threads, with 2 web servers allocated.

On updating the batch size to 10 and the thread count to 15, and with the number of servers allocated increased to 3, there was a huge performance gain.

Check the table below – 

The above table shows a sample run in the sandbox environment; during the final run in production, the number of servers allocated was increased to 6, giving a further improvement.

Also, check out the below blog post to understand the affinity cookie and its effect on performance, in case we are doing the migration using custom code –

https://markcarrington.dev/2021/05/26/improving-bulk-dataverse-performance-with-enableaffinitycookie/
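For migrations done through custom code, the same batch size / thread count tuning and the affinity cookie setting can be applied directly. Below is a rough sketch assuming the Dataverse ServiceClient (Microsoft.PowerPlatform.Dataverse.Client NuGet package) and its EnableAffinityCookie property; the connection string, entity, and record volume are placeholders, and this is not the KingswaySoft component itself, which exposes equivalent batch and thread settings in its own configuration.

// Rough sketch – parallel ExecuteMultiple batches against Dataverse with the
// affinity cookie disabled, so requests can spread across the allocated web servers.
// Connection string, entity, and volumes below are placeholders.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.PowerPlatform.Dataverse.Client;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

class ParallelDataverseLoad
{
    static void Main()
    {
        var connectionString =
            "AuthType=ClientSecret;Url=https://<org>.crm.dynamics.com;" +
            "ClientId=<app-id>;ClientSecret=<secret>";

        const int batchSize = 10;   // records per ExecuteMultipleRequest
        const int threadCount = 15; // parallel batches

        // sample payload – 10,000 contact records
        var records = Enumerable.Range(0, 10000)
            .Select(i => new Entity("contact") { ["lastname"] = $"Load test {i}" })
            .ToList();

        // split the records into batches of batchSize
        var batches = records
            .Select((entity, index) => new { entity, index })
            .GroupBy(x => x.index / batchSize, x => x.entity)
            .Select(g => g.ToList())
            .ToList();

        Parallel.ForEach(
            batches,
            new ParallelOptions { MaxDegreeOfParallelism = threadCount },
            // one connection per worker thread
            () =>
            {
                var service = new ServiceClient(connectionString);
                service.EnableAffinityCookie = false; // do not stick to a single web server
                return service;
            },
            // submit one batch per iteration
            (batch, loopState, service) =>
            {
                var request = new ExecuteMultipleRequest
                {
                    Settings = new ExecuteMultipleSettings { ContinueOnError = true, ReturnResponses = false },
                    Requests = new OrganizationRequestCollection()
                };

                foreach (var entity in batch)
                    request.Requests.Add(new CreateRequest { Target = entity });

                service.Execute(request);
                return service;
            },
            // clean up the per-thread connection
            service => service.Dispose());
    }
}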

Hope it helps..


Azure Synapse Link / Export to Data Lake Service – Performance (initial sync)


Recently we configured the Export to Data Lake service for one of our projects.

Just sharing the performance we got during the initial sync.

Entity           Count
Contact          2,362,581
Custom entity    1,613,554

The sync started at 11:47 AM and was completed around 4:50 PM – around 5 hours, i.e. 300 minutes.

Let us consider the total records synced as roughly 3,975,000 (1,613,446 + 2,362,581 + 2 = 3,976,029).

So the performance here comes to approximately:

  • 795,000 records per hour (3,975,000 / 5)
  • 13,250 records per minute (795,000 / 60)
  • around 220 records per second (13,250 / 60)

Of course, it will vary depending on the specific environment, the table / entity type, the attributes in it, etc., but it gives us a rough idea.

Hope it helps..


Using AzCopy to sync the local data with Azure Storage


Using the sync command of AzCopy, we can keep local data synchronized with Azure Blob Storage.

https://docs.microsoft.com/en-us/azure/storage/common/storage-ref-azcopy-sync

Suppose below is our storage account named – samplestorageaccountcrm

having the container named – mycrmfilescontainer inside it as shown below.

Below is how the URL for the container will look –

https://samplestorageaccountcrm.blob.core.windows.net/mycrmfilescontainer

i.e. format –

https://[storagename].blob.core.windows.net/[containername]

Let us generate the SAS token for the storage account with the appropriate permissions.

Navigate to the Shared access signature option in the left navigation, specify the permissions, and click on Generate SAS and connection string.

Copy the generated SAS token and append it to the URL.

https://samplestorageaccountcrm.blob.core.windows.net/mycrmfilescontainer?sv=2020-02-10&ss=bfqt&srt=sco&sp=rwdlacupx&se=2021-03-09T02:50:19Z&st=2021-03-08T18:50:19Z&spr=https&sig=OKydecj8kMBzi%2Ff4dwutlHbIvYimQv9FGPQmKwott5w%3D

Now we are ready to run the AzCopy command to sync the contents of the local folder (C:\Intel in this example) with the container.
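The command takes the local folder as the source and the container URL with the SAS token appended as the destination. Based on the values used in this post, it looks like this –

azcopy sync "C:\Intel" "https://samplestorageaccountcrm.blob.core.windows.net/mycrmfilescontainer?sv=2020-02-10&ss=bfqt&srt=sco&sp=rwdlacupx&se=2021-03-09T02:50:19Z&st=2021-03-08T18:50:19Z&spr=https&sig=OKydecj8kMBzi%2Ff4dwutlHbIvYimQv9FGPQmKwott5w%3D"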

On executing the command within PowerShell, AzCopy first scans the files at the source, then the files at the destination, and copies the files from the source that are not present in the destination.

Sample Run:-

We can see both the files uploaded in the container.

Now, suppose we try to run the same command from a batch (.bat) file.

https://www.windowscentral.com/how-create-and-run-batch-file-windows-10

We might encounter the below error – “Server failed to authorize the request. Make sure the value of the Authorization header is formed correctly including the signature”.

This is because of the special characters within the SAS token (the signature part), which need to be escaped.

https://bornsql.ca/blog/using-azcopy-with-batch-files-and-task-scheduler/

https://www.robvanderwoude.com/escapechars.php

Here the special characters within the sig parameter are replaced with the appropriate escape sequences.

E.g. “%” with “%%”

Updating the .bat file with the escaped command allows us to run it successfully.

@ECHO OFF

"D:\azcopy_windows_amd64_10.9.0\azcopy.exe" sync "C:\Intel" "https://samplestorageaccountcrm.blob.core.windows.net/mycrmfilescontainer?sv=2020-02-10&ss=bfqt&srt=sco&sp=rwdlacupx&se=2021-03-09T02:50:19Z&st=2021-03-08T18:50:19Z&spr=https&sig=OKydecj8kMBzi%%2Ff4dwutlHbIvYimQv9FGPQmKwott5w%%3D"

PAUSE

Next, we can schedule the batch file to run using the Windows Task Scheduler.

https://stackoverflow.com/questions/4437701/run-a-batch-file-with-windows-task-scheduler

Hope it helps..


Disable Security Defaults while logging into Power Platform / Dynamics 365


Security Defaults provide preconfigured security settings such as multi-factor authentication (MFA) for all users, blocking legacy authentication protocols, etc.

Any tenant created on or after 22nd October 2019 will have this setting enabled by default.

    

An organization with complex security requirements could disable the security defaults and consider using Conditional Access instead.

Use Azure AD Conditional Access to block user access by device platform (Dynamics 365)

Use Azure AD Conditional Access to block access by country (Dynamics 365)

To disable Security Defaults, log in to the Azure Portal –

https://portal.azure.com

Navigate to Azure Active Directory > Properties

https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Properties


Under Manage Security defaults, toggle the setting to No and Save.


Hope it helps..


Upload file to Azure Blob Storage using BlobClient class – C#


Let us see a simple example of uploading a file to Azure Blob Storage from a desktop application (C#).

Below is our Storage account and the container to which we will upload the files from the local drive.

Get the connection string for the storage account from the Access keys area.

Next – create a console application or Windows Forms application project and add the following NuGet package

Azure.Storage.Blobs

Sample code –

var filePath = @"D:\Sample.xlsx";

var connectionString = "DefaultEndpointsProtocol=https;" +
    "AccountName=samplestorageaccountcrm;" +
    "AccountKey=aXX+FXLGqkT9yGOFQEfqPEKoW8oJZEX+kQQTW+kwU2AAcLNhVzpElaKqkzF18OLNd1pCy2NEniTMLTwwDoiv4Q==;" +
    "EndpointSuffix=core.windows.net";

// initialize BlobClient
Azure.Storage.Blobs.BlobClient blobClient = new Azure.Storage.Blobs.BlobClient(
    connectionString: connectionString,
    blobContainerName: "mycrmfilescontainer",
    blobName: "sampleBlobFileTest");

// upload the file
blobClient.Upload(filePath);
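Note that Upload(filePath) will throw if a blob with the same name already exists in the container; the Upload(filePath, overwrite: true) overload can be used to replace the existing blob instead.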

The uploaded file – 


Also check out – https://nishantrana.me/2020/11/25/use-azcopy-to-transfer-files-from-local-drive-to-azure-blog-storage/

Hope it helps..
