Advanced configuration settings – Azure Synapse Link / Export to Data Lake service (Dataverse / Dynamics 365)


The Export to Data Lake service now has some Advanced configuration settings available.

To learn more about the Export to Data Lake service, refer to:

https://nishantrana.me/2020/12/10/posts-on-azure-data-lake/

The new settings allow us to configure how the Dataverse / CRM table data is written to Azure Data Lake.

  • In-Place update or upsert (default)
  • Append Only

With in-place update, the default setting, the file contains the full data set: any update in the source updates the corresponding row in the synced CSV file (the data partition), and any record deleted in the source removes the row from the data partition. With Append Only, a new row is added for both updates and deletes.

For huge volumes of data, Microsoft recommends opting for Append Only mode. This mode is also preferable when an organization wants to incrementally review the changed data.

The other option is to define the data partition strategy.

  • By Month (default)
  • By Year

With this option, the generated files are partitioned either by year or, more granularly, by month, and this can be specified on a per-table basis.

Microsoft recommends the monthly partition if the data volume is high.

Now, let us see it in action.

For the Lead table, we haven't selected the advanced configuration settings option and are going with the defaults:

  • Append Only – No
  • Partition Strategy – Month

For Contact, we have enabled the advanced configuration settings and opted for a Partition Strategy of Year.

For Account, we have set Append Only to true, for which the Partition Strategy option is disabled and fixed to Year.

The final configuration –

Within the container inside the Storage Account, we can see corresponding folders created per table/entity along with model.json as shown below.

Let us explore the Lead folder –

We can see 2 CSV files created in the format YYYY-MM.csv, i.e. with the month part in the name, because we had specified the Partition Strategy as Month, the default value.

For Contact and Account, the Partition Strategy was Year, so the files are generated in the format YYYY.csv.
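For illustration, the container contents would look roughly like this (the year/month values are examples only; the actual file names depend on when the records were created or modified, and additional metadata files may also be present):

model.json
lead/
    2021-02.csv
    2021-03.csv
contact/
    2021.csv
account/
    2021.csv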

Let us update one of the lead records by appending ‘Updated’ to the last name field.

After the successful sync, we can see the record updated in the .csv file / data partition.

The same is the case with the contact record.

Now let us update an account record; it has Append Only specified as Yes.

Here we update the Account Name field from Litware to Litware Updated.

After the sync, we can see a new row appended with the updated record, along with the original record.

Let us delete the same account record.

As expected, since this is Append Only mode, we can see a new row added for the Litware record.

We have 2 additional rows apart from the original row: one created for the update and the other for the delete action.
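Conceptually, the account partition now holds three rows for the record, something like the following (heavily simplified – the actual CSV contains all the synced columns plus the metadata the service writes to distinguish updates and deletes):

Litware – the original row
Litware Updated – the row appended for the update
Litware Updated – the row appended for the delete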

The Export to Data Lake service is Microsoft’s recommended way of synchronizing Dataverse data with external storage, and we can see Microsoft continuously investing in it and adding enhancements.

Get all the details here –

https://docs.microsoft.com/en-us/powerapps/maker/data-platform/export-to-data-lake#data-partition-strategy

Hope it helps..


Write batch size, data integration unit, and degree of copy parallelism in Azure Data Factory for Dynamics CRM / 365 Dataset


Let us take a simple example where we are moving contact records (.CSV) stored in Azure File Share to Dataverse or Dynamics 365 (UPSERT).

The CSV file has 50,000 sample contact records (generated using https://extendsclass.com/csv-generator.html) stored in Azure File Storage.

Another option for generating sample data:

https://nishantrana.me/2020/05/26/using-data-spawner-component-ssis-to-generate-sample-data-in-dynamics-365/

The Source in our Data Factory pipeline.

The Sink is our Dynamics 365 / Dataverse sandbox environment; here we are using the Upsert write behavior.

For the Sink, the default Write batch size is 10.

Max concurrent connections specifies the upper limit of concurrent connections that can be established to the data store during the activity run.

Below is our Mapping configuration

The Settings tab of the copy activity allows us to specify the Data integration unit and the Degree of copy parallelism.

Data integration unit represents the amount of power (a combination of CPU, memory, and network resource allocation) used for the copy execution.

Degree of copy parallelism specifies the maximum number of parallel threads/connections used to read from the source and write to the sink.
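For reference, here is roughly where these settings sit if you look at the copy activity JSON (a simplified sketch only – the activity name is made up, and the actual source/sink types depend on the connectors used, for example the Dataverse / Common Data Service sink type rather than DynamicsSink):

{
    "name": "CopyContactsToDataverse",
    "type": "Copy",
    "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "sink": {
            "type": "DynamicsSink",
            "writeBehavior": "upsert",
            "writeBatchSize": 10
        },
        "dataIntegrationUnits": 4,
        "parallelCopies": 10
    }
}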

Let us run the pipeline with the default values.

  • Write Batch Size (Sink) – 10
  • Degree of copy parallelism – 10
  • Data integration unit – Auto (4)

The results – it took around 58 minutes to create 50K contact records.

We then ran the pipeline a few more times, specifying different batch sizes and degrees of copy parallelism.

We kept Max concurrent connections blank and the Data integration unit as Auto (during our testing, even when setting it to higher values, the used DIU value was always 4).

Below are the results we got –

Write Batch Size | Degree of copy parallelism | Data Integration Unit (Auto) | Total Time (Minutes)
100  | 8  | 4 | 35
100  | 16 | 4 | 29
1000 | 32 | 4 | 35

250  | 8  | 4 | 35
250  | 16 | 4 | 25
250  | 32 | 4 | 55

500  | 8  | 4 | 38
500  | 16 | 4 | 29
500  | 32 | 4 | 28

750  | 8  | 4 | 37
750  | 16 | 4 | 25
750  | 32 | 4 | 17

999  | 8  | 4 | 36
999  | 16 | 4 | 30
999  | 32 | 4 | 20

The results show that increasing the batch size and the degree of copy parallelism improved performance in our scenario.

Ideally, we should run a few tests with different combinations before settling on a specific configuration, as the optimal values could vary.

On trying to set the batch size to more than 1000, we would get the below error, because the ExecuteMultiple request used behind the scenes allows a maximum of 1,000 requests per batch –

ExecuteMultiple Request batch size exceeds the maximum batch size allowed.

Also refer –

Optimizing Data Migration – https://community.dynamics.com/crm/b/crminthefield/posts/optimizing-data-migration-integration-with-power-platform

Using Data Factory with Dynamics 365 – https://nishantrana.me/2020/10/21/posts-on-azure-data-factory/

Optimum batch size with SSIS – https://nishantrana.me/2018/06/04/optimum-batch-size-while-using-ssis-integration-toolkit-for-microsoft-dynamics-365/

Hope it helps..


How to – Use AzCopy to sync the local data with Azure Storage


Using the sync command of AzCopy, we can keep local data synchronized with Azure Blob Storage.

https://docs.microsoft.com/en-us/azure/storage/common/storage-ref-azcopy-sync

Suppose below is our storage account named samplestorageaccountcrm, having a container named mycrmfilescontainer inside it, as shown below.

Below is how the URL for the container looks:

https://samplestorageaccountcrm.blob.core.windows.net/mycrmfilescontainer

i.e. format –

https://[storagename].blob.core.windows.net/[containername]

Let us generate the SAS token for the Storage account with the appropriate permissions..

Navigate to the Shared access signature navigation link, specify the permissions, and click on Generate SAS and connection string.

Copy the generated SAS token and append it to the URL.

https://samplestorageaccountcrm.blob.core.windows.net/mycrmfilescontainer?sv=2020-02-10&ss=bfqt&srt=sco&sp=rwdlacupx&se=2021-03-09T02:50:19Z&st=2021-03-08T18:50:19Z&spr=https&sig=OKydecj8kMBzi%2Ff4dwutlHbIvYimQv9FGPQmKwott5w%3D

Now we are ready to run the AzCopy command to sync the contents of the local folder (C:\Intel in this example) with the container.

On executing the command within PowerShell, it first scans the files at the source, then the files at the destination, and copies the files from the source that are not present in the destination.
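For reference, the command takes the local folder as the source and the container URL with the SAS token appended as the destination (the same values used in the batch file later in this post):

azcopy sync "C:\Intel" "https://samplestorageaccountcrm.blob.core.windows.net/mycrmfilescontainer?sv=2020-02-10&ss=bfqt&srt=sco&sp=rwdlacupx&se=2021-03-09T02:50:19Z&st=2021-03-08T18:50:19Z&spr=https&sig=OKydecj8kMBzi%2Ff4dwutlHbIvYimQv9FGPQmKwott5w%3D"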

Sample Run:-

We can see both the files uploaded in the container.

Now, suppose we try to run the same command as a batch (.bat) file.

https://www.windowscentral.com/how-create-and-run-batch-file-windows-10

We might encounter the below error – “Server failed to authorize the request. Make sure the value of the Authorization header is formed correctly including the signature”.

This is because of the special characters within the SAS token – specifically the signature (sig) part – which need to be escaped in a batch file.

https://bornsql.ca/blog/using-azcopy-with-batch-files-and-task-scheduler/

https://www.robvanderwoude.com/escapechars.php

Here the special characters within the sig value are replaced with the appropriate escape sequences, e.g. “%” with “%%”.

Updating the .bat file with the escaped command allows us to run it successfully.

@ECHO OFF

"D:\azcopy_windows_amd64_10.9.0\azcopy.exe" sync "C:\Intel" "https://samplestorageaccountcrm.blob.core.windows.net/mycrmfilescontainer?sv=2020-02-10&ss=bfqt&srt=sco&sp=rwdlacupx&se=2021-03-09T02:50:19Z&st=2021-03-08T18:50:19Z&spr=https&sig=OKydecj8kMBzi%%2Ff4dwutlHbIvYimQv9FGPQmKwott5w%%3D"

PAUSE

Next, we can run the batch file within the task scheduler.

https://stackoverflow.com/questions/4437701/run-a-batch-file-with-windows-task-scheduler

Hope it helps..


Disable Security Defaults while logging in to Power Platform / Dynamics 365


Security Defaults provide preconfigured security settings such as multi-factor authentication (MFA) for all users, blocking legacy authentication protocols, etc.

Any tenant created on or after 22nd October 2019 will have this setting enabled by default.

    

An organization with complex security requirements could disable Security Defaults and consider using Conditional Access instead, for example:

Use Azure AD Conditional Access to block user access by device platform (Dynamics 365)

Use Azure AD Conditional Access to block access by country (Dynamics 365)

To disable Security Defaults, log in to the Azure Portal

https://portal.azure.com

Navigate to Azure Active Directory > Properties

https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Properties


Toggle the Security defaults setting to No and Save.


Hope it helps..


How to – Upload file to Azure Blob Storage using BlobClient class – C#


Let us see a simple example to upload the file to Azure Blob Storage through a desktop application (C#).

Below is our Storage account and the container to which we will upload the files from the local drive.

Get the connection string for the storage account from the Access keys area.

Next, create a console application or Windows Forms application project and add the following NuGet package:

Azure.Storage.Blobs
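For example, it can be added from the Package Manager Console or the .NET CLI:

Install-Package Azure.Storage.Blobs

dotnet add package Azure.Storage.Blobs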

Sample code (included at the end of this post) –

The uploaded file – 


Also check out – https://nishantrana.me/2020/11/25/use-azcopy-to-transfer-files-from-local-drive-to-azure-blog-storage/

Hope it helps..

var filePath = @"D:\Sample.xlsx";

var connectionString = "DefaultEndpointsProtocol=https;" +
    "AccountName=samplestorageaccountcrm;" +
    "AccountKey=aXX+FXLGqkT9yGOFQEfqPEKoW8oJZEX+kQQTW+kwU2AAcLNhVzpElaKqkzF18OLNd1pCy2NEniTMLTwwDoiv4Q==;" +
    "EndpointSuffix=core.windows.net";

// initialize the BlobClient with the connection string, container name, and blob name
Azure.Storage.Blobs.BlobClient blobClient = new Azure.Storage.Blobs.BlobClient(
    connectionString: connectionString,
    blobContainerName: "mycrmfilescontainer",
    blobName: "sampleBlobFileTest");

// upload the local file to the blob
blobClient.Upload(filePath);

How to – Manage Dynamics 365 Web API with Azure API Management


Azure API Management is an Azure service to create consistent API gateways for secure, scalable access for back-end applications and services.

Azure API Management consists of 3 main components

  • API Gateway

  • Azure Portal for administration

  • Developer Portal for API documentation

Each API inside Azure API Management contains a reference to the back-end service that implements the API and its operations.

Let us start by creating the Azure API Management resource –

Login to Azure Portal

https://portal.azure.com/

Search for API Management

Provide the appropriate details. (Here we have selected the Developer tier.)

After validation is passed, review and click on Create.

It will take around 30 minutes for the deployment to finish.

After the deployment is successful, we can navigate to it and can find the Gateway URL and Developer portal URL as shown below.

Here we will start with a Blank API.

Specify the display name and name, and for the Web service URL, specify the URL of the Dynamics 365 Web API.

Click on +Add operation to add a new operation to the API.

Specify the URL as shown below to fetch all the contacts from Dynamics 365.

The URL of the operation
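As an illustration only (placeholder values – the exact URLs depend on the environment, the Web API version, and the API URL suffix chosen while creating the API):

Web service URL – https://<org>.crm.dynamics.com/api/data/v9.1
Operation – GET /contacts
Resulting gateway URL – https://<apim-instance>.azure-api.net/<api-url-suffix>/contacts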

Right now, we get a 401 error, as expected, because we have not passed the token expected by the Web API.

Now, for the token part of calling the Dynamics 365 Web API, register an application in Azure AD, create a new application user, and assign the appropriate security roles to it.

https://docs.microsoft.com/en-us/powerapps/developer/data-platform/authenticate-oauth#connect-as-an-app

Here we will define a send-request policy to generate the token and pass it in the Authorization header of the Dynamics 365 Web API request.

Select the GET operation, navigate to the Design tab, and open the policy code editor for inbound processing.

Add the send-request and set-header policies to generate and set the bearer token.

Specify the OAuth token endpoint URL, and the client ID and client secret of the registered application.

(Copy the policy code from the end of this post.)

Save the changes, and let us test the API.

We can see the results as expected.

Other things that can be done include associating the API with Products, specifying Subscriptions and Security, and enabling Application Insights, Azure Monitor, etc.

Specify policies –

https://docs.microsoft.com/en-us/azure/api-management/api-management-policies

References

https://transform365.blog/2020/03/29/azure-api-management-and-dynamics-365-web-api/

https://app.pluralsight.com/library/courses/microsoft-azure-developer-implement-api-management/table-of-contents

Hope it helps..

<policies>
  <inbound>
    <base />
    <send-request mode="new" response-variable-name="bearerToken" timeout="20" ignore-error="true">
      <set-url>https://login.microsoftonline.com/89a735bf-2d85-4a5b-a74a-59656af50f2e/oauth2/token</set-url>
      <set-method>POST</set-method>
      <set-header name="Content-Type" exists-action="override">
        <value>application/x-www-form-urlencoded</value>
      </set-header>
      <set-body>@{ return "client_id=510b66c9-4841-4d3d-8e95-150779adcb3e&resource=https://gcrm.crm.dynamics.com&client_secret=t~6DU7Ma4GZjh.M0Xf7eCizy.E~ME4zy_3&grant_type=client_credentials"; }</set-body>
    </send-request>
    <set-header name="Authorization" exists-action="override">
      <value>
        @("Bearer " + (String)((IResponse)context.Variables["bearerToken"]).Body.As<JObject>()["access_token"])</value>
    </set-header>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>