Optimize Delete operation – Dataverse / Dynamics 365


Recently we had to delete records for one of our entities, so we tried out different combinations of batch size and number of threads, using 25,000 records as a sample, to find the optimum setting.

Below is our sample SSIS package (using the KingswaySoft Dynamics 365 Toolkit). It retrieves the GUIDs of 25,000 records (Contact table / entity) and then distributes them equally among 3 different CRM Destination Components, each running under a different user (CRM Connection Manager).

How to – improve data migration performance – SSIS & Azure Data Factory (Dataverse / Dynamics 365)

Below is our Premium Derived Column component, where we have added a new column with the expression IncrementValue().

In the Conditional Split component, we then use this new column to distribute the output across the three CRM Destination Components, each using a different CRM Connection Manager running under a different application user.
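
Purely as an illustration of the same round-robin idea outside SSIS (the GUID values and the function below are made up for the sketch), an incrementing counter modulo the number of destinations decides which connection – and therefore which application user – each record is sent to:

```python
# Minimal sketch of the IncrementValue() + Conditional Split routing:
# counter % destinations picks one of the parallel connections for each record.
def route(records, destinations=3):
    buckets = {i: [] for i in range(destinations)}
    for counter, record in enumerate(records, start=1):
        buckets[counter % destinations].append(record)
    return buckets

guids = [f"guid-{n}" for n in range(25000)]   # stand-ins for the Contact GUIDs
buckets = route(guids)
print({destination: len(ids) for destination, ids in buckets.items()})
```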

We first started with a batch size of 10 and 20 threads, followed by different combinations after that.

Below were our findings ->

Records Count | Batch Size | Threads | Parallel Users | Elapsed Time
25000         | 10         | 20      | 3              | 00:15:58.578
25000         | 10         | 15      | 3              | 00:14:43.734
25000         | 10         | 10      | 3              | 00:16:06.438
25000         | 10         | 5       | 3              | 00:23:52.094
25000         | 10         | 15      | 2              | 00:18:55.012
25000         | 10         | 15      | 1              | 00:39:15.828
25000         | 20         | 30      | 1              | 00:39:12.781

As we can see, a batch size of 10 with around 15 threads gave us the best performance. However, every environment and its conditions will be different, so we should try out different combinations before finalizing.
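
To make the settings concrete, here is a rough sketch (not KingswaySoft internals) of what "batch size 10 with 15 threads" means mechanically; delete_batch() is a hypothetical placeholder for whatever bulk-delete call is actually issued (for example an ExecuteMultiple or Web API $batch request):

```python
# Chunk the GUIDs into batches of 10 and delete the batches on 15 worker threads.
from concurrent.futures import ThreadPoolExecutor

BATCH_SIZE = 10
THREADS = 15

def chunks(items, size):
    for i in range(0, len(items), size):
        yield items[i:i + size]

def delete_batch(batch):
    # hypothetical placeholder: issue one bulk-delete request for this batch
    return len(batch)

guids = [f"guid-{n}" for n in range(25000)]   # stand-ins for record ids

with ThreadPoolExecutor(max_workers=THREADS) as pool:
    deleted = sum(pool.map(delete_batch, chunks(guids, BATCH_SIZE)))

print(f"Deleted {deleted} records")
```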

SSIS and Microsoft Dynamics 365

Hope it helps..


Interactive login option in CDS/CRM Connection Manager in KingswaySoft Dynamics 365 Integration Toolkit


With the new release, the CDS/CRM Connection Manager adds a new Interactive Login option for the OAuth Authentication Type.

Interactive login allows the user to log in using their own account details (to establish the connection with CRM) without the need to register an application in Azure Active Directory.

This is supposed to be used only during design time.

Enter User Name and the CDS/CRM URL and click on Test Connection.

The login screen pops up, where we can enter the credentials and sign in.

We’d receive the Test connection succeeded message.

Now we are ready to use the CRM Connection.

Now when we run the package from within Visual Studio (SSDT), it will again prompt us to enter the credentials.

The other option is to use the OAuth authentication type with Password, along with the default Client App ID and Redirect URI, as described here –

https://docs.microsoft.com/en-us/powerapps/developer/data-platform/xrm-tooling/use-connection-strings-xrm-tooling-connect
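
Based on the linked Microsoft documentation, such a connection string looks roughly like the following; the AppId and RedirectUri are the sample values published there, and the user name, password and URL are placeholders for your own environment:

```
AuthType=OAuth;Username=jsmith@contoso.onmicrosoft.com;Password=passcode;Url=https://contosotest.crm.dynamics.com;AppId=51f81489-12ee-4a9e-aaae-a2591f45987d;RedirectUri=app://58145B91-0C36-4500-8554-080854F2AC97;LoginPrompt=Auto
```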

Hope it helps..


How to – improve data migration performance – SSIS & Azure Data Factory (Dataverse / Dynamics 365)


In one of our projects, we were executing SSIS packages (KingswaySoft's Dynamics 365 SSIS Integration Toolkit) under the Azure-SSIS Integration Runtime in Azure Data Factory.

Check out –

Deploy and run SSIS Package in Azure Data Factory

Deploy and run SSIS Packages that use KingswaySoft’s SSIS Integration Toolkit on Azure Data Factory.

After trying out different combinations, we eventually settled with batch size as 10 and thread as 15.

https://nishantrana.me/2021/06/08/data-migration-optimum-batch-size-and-threads-for-maximum-throughput-microsoft-dataverse-dynamics-365/

Also, we used multiplexing – running the CRM Destination Components under different application users.

To be precise, 4 in our case, and we can increase this number to get a further improvement in throughput.

Also, based on the recommendation of our Microsoft FastTrack Architect, we raised a Microsoft support ticket to increase the number of web servers allocated from 2 to 3.

Below were our findings.

The earlier run used a batch size of 100 and 20 threads, with 2 servers allocated.

After updating the batch size to 10 and the threads to 15, and with the number of servers allocated increased to 3, there was a huge performance gain.

Check the table below – 

The above table is from a sample run in the sandbox environment; during the final run in production, the number of servers allocated was increased to 6, giving a further improvement.

Also, check out the below blog post to understand the affinity cookie and its effect on performance, in case we are doing the migration using custom code –

https://markcarrington.dev/2021/05/26/improving-bulk-dataverse-performance-with-enableaffinitycookie/
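
The linked post covers the .NET ServiceClient, where this is a property (EnableAffinityCookie). Purely as an illustration of the same idea for custom migration code that calls the Web API directly with Python's requests library (an assumption, not something from the post), refusing the affinity cookie keeps consecutive requests from being pinned to a single web server:

```python
# Illustration only: refuse all cookies (including the ARR affinity cookie)
# so consecutive Web API requests can be balanced across the web servers.
import http.cookiejar
import requests

session = requests.Session()
session.cookies.set_policy(http.cookiejar.DefaultCookiePolicy(allowed_domains=[]))
# subsequent session.get()/post() calls are no longer tied to one server
```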

Hope it helps..


Data Migration – Optimum batch size and threads for maximum throughput – Microsoft Dataverse (Dynamics 365)


For one of our projects, we were trying to figure out the optimum batch size and threads while using the CDS/CRM Destination component of KingswaySoft.

https://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-365/help-manual/crm/destination

Now, with the Service Protection API limits in place, the first point to consider for maximum throughput is user multiplexing.

Nicely explained here –

https://powerplatform.se/fast-data-management-in-a-limited-cds-world/

We are using the Conditional Split component to divide the load among the different CDS Destination Components, which run under different application users (using different CDS Connection Managers).

In the below data flow, we have 5 different CDS Destination Components, each using a different Connection Manager configured with a different user.

Here we took 50,000 records as a sample for the Contact Create operation, with 3 and then 5 CDS Destination Components.

Below were our findings –

As suggested in the article, a batch size of 10 and 16 threads seem to give the best performance.

The trick here is to use as many threads as possible running under different users.

The CDS Destination component of KingswaySoft handles server-side throttling automatically; it follows the server's retry instruction and waits before trying the same request again.
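
If you are calling the Web API yourself instead of going through the toolkit, the same behaviour can be sketched as below (assuming the requests library; the helper is illustrative, not KingswaySoft code): on HTTP 429 the response carries a Retry-After header telling you how long to wait before resending the same request.

```python
# Honour service protection throttling: wait for Retry-After on 429, then retry.
import time
import requests

def send_with_retry(session, method, url, max_attempts=5, **kwargs):
    for _ in range(max_attempts):
        response = session.request(method, url, **kwargs)
        if response.status_code != 429:
            return response
        wait_seconds = float(response.headers.get("Retry-After", "5"))
        time.sleep(wait_seconds)  # wait as instructed before retrying the same request
    return response
```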

On trying with 500 as the batch size, we got the following error.

[Dynamics CRM Destination [18]] Error: An error occurred with the following error message: KingswaySoft.IntegrationToolkit.DynamicsCrm.CrmServiceException: CRM service call returned an error: The operation has timed out (Error Type / Reason: Timeout, Detailed Message: The operation has timed out) (SSIS Integration Toolkit for Microsoft Dynamics 365, v20.2.0.3083 – DtsDebugHost, 15.0.2000.128)KingswaySoft.IntegrationToolkit.DynamicsCrm.WebAPI.WebApiServiceException : The operation has timed out (Error Type / Reason: Timeout, Detailed Message: The operation has timed out)System.Net.WebException (Status Reason: Timeout): The operation has timed out”.

So basically try out the different combinations to get the maximum throughput.

Also check –

https://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-365/help-manual/crm/destination

I think the post below will still be relevant, although it is for CRM on-premises –

https://nishantrana.me/2018/12/19/optimum-batch-size-and-thread-while-deleting-records-using-ssis-integration-toolkit-for-microsoft-dynamics-365/

Batch Size and Degree of Copy Parallelism for Azure Data Factory and Microsoft Dataverse

https://nishantrana.me/2021/05/25/write-batch-size-data-integration-unit-and-degree-of-copy-parallelism-in-azure-data-factory-for-dynamics-crm-365-dataset/

More articles on KingswaySoft and Dynamics 365 / Dataverse

https://nishantrana.me/2018/11/26/ssis-and-microsoft-dynamics-365/

Hope it helps..


Retrieving security role privileges changes (audit) using KingswaySoft’s Dynamics 365 SSIS Integration Toolkit


We can specify Source Type as AuditLogs within the CDS Source Component Editor of KingswaySoft to fetch the Audit information.

Get the details here – https://nishantrana.me/2018/10/08/using-kingswaysofts-cds-crm-source-component-to-get-audit-information-in-dynamics-365-ce-ssis/

The Source Type AuditLogs includes the Output type Audit Details (Role Privileges), which can be used to fetch audit information related to security roles.

Audit Details (Role Privileges) contains 4 columns, which we have mapped to columns in Excel.

  • AuditId
  • OldRolePrivileges
  • NewRolePrivileges
  • InvalidRolePrivileges

Let us execute the package

The output –

Here, the 1st row is when a new security role was created, the 2nd row is when an existing security role was updated, and the 3rd row is when a security role was deleted.

To get the complete information, we need to combine this output (using the id, i.e. auditid – the first column) with the Primary output.

The columns of the Primary Output

After mapping, let us run the package with the Primary Output type.

Below we can see the complete details after combining both outputs using the auditid column.
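
Assuming both outputs were landed as flat files with the column names shown above (the file names here are hypothetical), the join on the audit id can be sketched with a few lines of pandas:

```python
# Combine the Role Privileges details with the Primary audit output on the audit id.
import pandas as pd

primary = pd.read_csv("audit_primary.csv")                # Primary output (auditid, ...)
role_details = pd.read_csv("audit_role_privileges.csv")   # AuditId, Old/New/InvalidRolePrivileges

combined = primary.merge(role_details, left_on="auditid", right_on="AuditId", how="inner")
combined.to_csv("audit_combined.csv", index=False)
```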

Note – to get the Audit Details (Role Privileges) output, we need to use SOAP as the Service Endpoint type.

Also refer –

http://mscrmshop.blogspot.com/2016/06/auditing-security-roles-in-crm.html

http://www.kingswaysoft.com/blog/2019/10/16/Extracting-Audit-Logs-for-Multiple-CRM-Entities

https://nishantrana.me/2018/11/26/ssis-and-microsoft-dynamics-365/

https://community.dynamics.com/crm/b/mscrmcustomization/posts/ms-crm-audit-database-table-details

Hope it helps..


Historical Data Migration – Created On and Modified On in Dynamics 365


Very insightful article by Debajit!

Just to summarize –

If we are using the SDK from an external application to set values for created on and modified on, we can use overriddencreatedon to set created on. Modified on will be ignored.

When setting values in a Pre-Create plugin, the values specified for both created on and modified on will be set on the record.
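
As a rough sketch of the external-application case (a direct Dataverse Web API call with Python's requests library; the URL and token below are placeholders and token acquisition is omitted), the historical date goes into overriddencreatedon, which then becomes the record's created on:

```python
# Create a contact with a historical "created on" via overriddencreatedon.
import requests

org_url = "https://yourorg.crm.dynamics.com"   # placeholder environment URL
access_token = "<access token>"                # placeholder; acquire via Azure AD / MSAL

headers = {
    "Authorization": f"Bearer {access_token}",
    "Content-Type": "application/json",
}
payload = {
    "firstname": "Historic",
    "lastname": "Contact",
    "overriddencreatedon": "2015-03-01T00:00:00Z",  # becomes created on; modified on is ignored
}
requests.post(f"{org_url}/api/data/v9.2/contacts", json=payload, headers=headers)
```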