Fixed – Cannot convert between unicode and non-unicode string data types error in SSIS


While using the OLE DB Source component in SSIS, after updating its SQL Command we started getting the "Cannot convert between unicode and non-unicode string data types" error for the "Phone" column in our table.

The fix is to open the Advanced Editor for the OLE DB Source component

In the Input and Output Properties tab, under Output Columns, change the Data Type of the Phone column to a non-unicode string.

Changing the Data Type from Unicode string [DT_WSTR] to string [DT_STR] fixed the issue.
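If editing the Advanced Editor is not convenient, a Derived Column transformation can perform the equivalent conversion explicitly. A minimal sketch of the cast expression, assuming the Phone column arrives as Unicode (DT_WSTR) and assuming a target length of 50 and code page 1252 (adjust both to match the destination column):

(DT_STR,50,1252)Phone

The resulting non-unicode column can then be mapped to the destination in place of the original Phone column.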

Hope it helps..


How to – convert a blank string to Null for DateTime field – SSIS


Recently while writing an SSIS package, we were getting the below exception.

System.Exception: Error: Failed to set field 'birthdate' value to '' for entity 'contact' (field type: DateTime, value type: String). System.FormatException: String was not recognized as a valid DateTime. (SSIS Integration Toolkit for Microsoft Dynamics 365, v10.2.0.6972 – DtsDebugHost, v14.0.3025.39)

We were getting the exception because the source table had the birth date field as a string with a blank value instead of null, while in the destination table the corresponding mapped field was of type DateTime.

The fix is to add a Derived Column transformation with the following expression, which converts the blank string to null:

TRIM(ColumnName) == "" ? (DT_STR,4,1252)NULL(DT_STR,4,1252) : ColumnName
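Since the destination field is a DateTime, a variant of the same expression that returns a date type instead of a string can also be used. A sketch, assuming the source column is named birthdate and its non-blank values are valid date strings:

TRIM(birthdate) == "" ? NULL(DT_DBTIMESTAMP) : (DT_DBTIMESTAMP)birthdate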


Hope it helps..


Fixed – The process cannot access the file .ispac because it is being used by another process error in Visual Studio (SSIS).


Recently, while developing SSIS packages, Visual Studio (SSDT) crashed.

After restarting Visual Studio and trying to execute the package, we got the below error.

System.IO.IOException: The process cannot access the file 'c:\folder\ssisproject.ispac' because it is being used by another process.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.File.Delete(String path)
at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.IncrementalBuildThroughObj(IOutputWindow outputWindow)
at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.BuildIncremental(IOutputWindow outputWindow)

The .ispac file is the Integration Services project deployment file, containing the packages and the parameters.

The fix is to open Task Manager and end the DtsDebugHost.exe task.


DtsDebugHost.exe is the SSIS Debug runtime process that executes the package.

Hope it helps..


Using Conditional Split Transformation in SSIS


Recently we had a requirement to upsert customers' spend-related information from a SQL DB to our CRM Online.

We had the spend captured per brand in the DB, as shown below.

And in our CRM, we had one custom entity with fields corresponding to each of these brands, as shown below,

i.e. one section for each brand, and inside that section five fields corresponding to the columns in the SQL table.

The GCR column of the table was the Alternate Key column in our custom entity which we used for Upsert.

Now, to make sure we update only the corresponding brand's fields in CRM, we can make use of the Conditional Split transformation component.

Just after the OLE DB Source, add a Conditional Split component.

Create one output per brand, i.e. one for each distinct Brand value in the source table, as shown below.
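Each Conditional Split output is just an equality check on the Brand column. A sketch of the output conditions, using hypothetical brand values (replace them with the distinct Brand values from your source table):

Output Brand A – Brand == "Brand A"
Output Brand B – Brand == "Brand B"
Output Brand C – Brand == "Brand C"

Rows that match none of the conditions fall through to the default output, which can be left unused or redirected for review.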

Now add a CRM Destination component for each output and bind it to the corresponding Conditional Split output.

And inside the Columns section, map only the fields of the selected brand.

The finished package:

Hope it helps..

Optimum Batch Size while using SSIS Integration toolkit for Microsoft Dynamics 365


Hi,

Related post – optimum batch size and thread while deleting records using SSIS Integration Toolkit for Microsoft Dynamics 365:

https://nishantrana.me/2018/12/19/optimum-batch-size-and-thread-while-deleting-records-using-ssis-integration-toolkit-for-microsoft-dynamics-365/

Recently, while working on a data integration task, we were evaluating the performance of KingswaySoft's SSIS Integration Toolkit for Dynamics 365 from within Visual Studio by trying out different combinations of batch size and thread count.

The environment was CRM Online, with the maximum number of concurrent ExecuteMultiple requests set to 20.

By default, this limit is set to 2 for online, but we can raise a support ticket with Microsoft to get it increased. However, we need to be very careful here, as a large number of ExecuteMultiple requests running in parallel might degrade performance.

https://msdn.microsoft.com/en-us/library/microsoft.xrm.sdk.deployment.throttlesettings.executemultipleperorgmaxconnectionsperserver.aspx

Below are the two settings that we played around with in the CRM Destination component – the batch size and the number of threads.


These are the results we got with different batch sizes and thread counts.

  • Message Used– Upsert.
  • Total Number of Records – 20000.


The results clearly show that a batch size of 100 gives the best result.

And this is what is recommended by KingswaySoft as well.


https://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-365/faq

Hope it helps..


Fixed – CRM service call returned an error: The server is busy and the request was not completed. Try again later while inserting records using CRM Destination Component of KingswaySoft’s SSIS Integration Toolkit for Dynamics 365


While running one of our packages, we got the below errors for a few of the records.

“CRM service call returned an error: CRM service call returned an error: The server is busy and the request was not completed. Try again later. (Error Code: -2147180543, Detail Message: The server is busy and the request was not completed. Try again later.)”

“CRM service call returned an error: CRM service call returned an error: The request channel timed out while waiting for a reply after 00:01:59.7991956. Increase the timeout value passed to the call to Request or increase the SendTimeout value on the Binding. The time allotted to this operation may have been a portion of a longer timeout. (Error Type / Reason: Timeout)”

One way to get around this error is to increase the Timeout (secs) in the Service Timeout Setting of the CRM Connection Manager.

We updated it to 600 seconds, and the number of failed records reduced drastically.

Also make sure that the following option is checked:

Hope it helps..
