Historical Data Migration – Created On and Modified On in Dynamics 365


Very insightful article by Debajit!

Just to summarize –

If we are using the SDK from an external application to set values for Created On and Modified On – we can use overriddencreatedon for Created On; any value specified for Modified On will be ignored.
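
A minimal sketch of the SDK approach, assuming the Microsoft.Xrm.Tooling.Connector package (the connection string, class name, and field values are placeholders):

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Tooling.Connector;

class Program
{
    static void Main()
    {
        var service = new CrmServiceClient("<your connection string>");

        var contact = new Entity("contact");
        contact["lastname"] = "Historical Contact";

        // The platform stamps Created On (createdon) with this value and
        // keeps it in Record Created On (overriddencreatedon).
        contact["overriddencreatedon"] = new DateTime(2010, 1, 1);

        // Ignored when set on create from an external application.
        contact["modifiedon"] = new DateTime(2010, 1, 2);

        service.Create(contact);
    }
}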

Setting values in a Pre-Create plugin – values specified for both Created On and Modified On will be set on the record.
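
And a minimal sketch of the plugin approach – an IPlugin registered on the Create message in the pre-operation stage (the class name and dates are placeholders):

using System;
using Microsoft.Xrm.Sdk;

public class SetHistoricalDatesPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));

        if (context.InputParameters.Contains("Target"))
        {
            var target = (Entity)context.InputParameters["Target"];

            // In the pre-create stage, both values are honored by the platform.
            target["createdon"] = new DateTime(2010, 1, 1);
            target["modifiedon"] = new DateTime(2010, 1, 2);
        }
    }
}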

Azure Data Lake Storage Component in KingswaySoft – SSIS


Download and install the SSIS Productivity Pack

https://www.kingswaysoft.com/products/ssis-productivity-pack/download/

Drag the Azure Data Lake Storage Source component into the data flow

Double-click the component and click New to specify the connection

Provide the connection details and test the connection

  • It supports both Gen 1 and Gen 2

  • Supports multiple authentication modes

Inside the Azure Data Lake Storage Source component, we have specified our CSV file:

  • All Contact.csv

Item Selection Mode:

  • Selected Item: Retrieves only the item specified by Source Item Path.
  • Selected Level: Retrieves the selected item and all immediate files and folders under the path specified by the Source Item Path option.
  • Selected Level (Files only): Retrieves the selected item and all immediate files under the folder as specified by the Source Item Path option.
  • Recursive: Retrieves the selected item (specified by the Source Item Path option) and all sub items recursively.
  • Recursive (Files only): Retrieves items the same as the Recursive mode but only returns files.

The page size refers to how many records to retrieve per service call

The Columns page shows all the available attributes from the object specified on the General page

We have used the Script Component as the destination to read the value of each of the above columns, as sketched below.
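
A minimal sketch of such a Script Component, assuming the CSV exposes FirstName, LastName, and Email columns (hypothetical names – the generated Input0Buffer properties come from the columns selected on the Columns page):

using System;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

public class ScriptMain : UserComponent
{
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        bool fireAgain = true;

        // Log each incoming row so the column values can be inspected
        // in the package execution results.
        ComponentMetaData.FireInformation(0, "DataLakeCsvDemo",
            string.Format("{0} {1} - {2}", Row.FirstName, Row.LastName, Row.Email),
            string.Empty, 0, ref fireAgain);
    }
}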

Get all the details here

https://www.kingswaysoft.com/products/ssis-productivity-pack/help-manual/cloud-storage

Hope it helps..

User Multiplexing – SSIS (KingswaySoft) + Dynamics 365 CE / CDS / Dataverse


Must-read article on managing API limits

Thanks – Gustaf Westerlund

Transfer files from local drive to Azure Blob using Azure Blob Upload Task – SSIS


Similar to the Premium File Transfer Task, the Azure Blob Upload Task can be used to easily transfer files from a local drive to Azure Blob Storage.

https://docs.microsoft.com/en-us/sql/integration-services/control-flow/azure-blob-upload-task

Let us take a simple example to see it in action.

Here we will pick the folder Customer, along with its subfolders and the files inside them, and transfer them to the Azure Blob container.

Create a new SSIS Package and drag the Azure Blob Upload Task to the control flow designer.

Double-click the task and specify the following values as shown below

  • AzureStorageConnection – specify the SSIS connection manager for Azure Storage.
  • Blob Container – the name of the existing blob container.
  • Local Directory – the local directory containing the files to be uploaded.
  • Search Recursively – specify whether to search for files within sub-directories.
  • File Name – specify the name pattern for the files to be selected.
  • Time Range From/To – pick only files modified within that range.

Let us execute the package.

We can see the content transferred successfully to Azure Blob Storage

 

Also, check out –

Using Azure Blob Storage component with Dynamics 365

https://nishantrana.me/2020/10/20/using-kingswaysoft-azure-blob-storage-component-with-dynamics-365/

Hope it helps..

Transfer files from local drive to Azure Blob using Premium File Transfer Task – SSIS


The Premium File Transfer Task component of KingswaySoft can be used to easily transfer files from a local drive to Azure Blob Storage.

https://www.kingswaysoft.com/products/ssis-productivity-pack/help-manual/premium-file-pack/premium-file-transfer-task

Let us take a simple example to see it in action.

Here we will pick the folder Customers, along with its subfolders and the files inside them, and transfer its content to the Azure Blob container.

Create a new SSIS Package and drag the Premium File Transfer Task to the control flow designer.

Double click the task and specify the following Source Properties

  • Action – Send Files (the other options are Delete Files, Create Directory, and Remove Directory).
  • Check the option – Include Subdirectories
  • Connection Manager – Local File
  • Directory Path – specify the folder

Similarly, for the Destination, we can specify the Azure Blob Storage connection manager and the directory path as shown below

Note – Make sure the connection has already been added for it to be available inside the Connection Manager option of the Premium File Transfer Task. The other connection types supported are FTPS, SFTP, Amazon S3, Azure Data Lake Storage, Box, Dropbox, Hadoop, OneDrive, and SharePoint.

Let us run the package.

We can see the content transferred successfully to Azure Blob Storage

 

 

Also, check out –

Using Azure Blob Storage component with Dynamics 365

https://nishantrana.me/2020/10/20/using-kingswaysoft-azure-blob-storage-component-with-dynamics-365/

https://nishantrana.me/2020/10/16/ssis-kingswaysoft-and-dynamics-365/

Hope it helps..

Changes in CDS / CRM Destination Component – SSIS Integration Toolkit for Dynamics 365


With the version 20.2 (November 2020) release, a few updates have been made to KingswaySoft's CDS/CRM Destination component – some are label changes and others are metadata changes – that one should be aware of before updating.

Check this post on SSIS and Dynamics 365

https://nishantrana.me/2020/10/16/ssis-kingswaysoft-and-dynamics-365/

  • Label changes.
  • CrmRecordId renamed to SavedRecordId.
  • CrmErrorMessage renamed to ErrorMessage.

Thus, any package saved with the new version will not work with the older version, and it is recommended to take a full backup of the package before updating to the new version.

Hope it helps..