In the previous post, we saw how we can use the SuppressDuplicateDetection parameter to throw errors in case of duplicates when creating or updating a record.
Here, instead of relying on an exception, we can make use of the RetrieveDuplicatesRequest message to detect duplicates proactively.
Let us take the “Leads with the same e-mail address” duplicate detection rule to see it in action.
We can see three lead records already existing in the system with the same email address.
Below is our code, which tries to create a new lead record with the same email address and uses RetrieveDuplicatesRequest to check for duplicates first.
Below we can see that not only do we get the total number of duplicates found, but also the details of the duplicate records.
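The flow described above can be sketched as follows. This is a minimal sketch, assuming an already-initialized IOrganizationService named service; the entity field values and email address are illustrative.

```csharp
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// The record we intend to create (illustrative values)
var lead = new Entity("lead");
lead["lastname"] = "Test Lead";
lead["emailaddress1"] = "sample@example.com";

// Ask the platform to evaluate the published duplicate detection rules
// for this record before we actually create it
var retrieveDuplicatesRequest = new RetrieveDuplicatesRequest
{
    BusinessEntity = lead,
    MatchingEntityName = "lead",
    PagingInfo = new PagingInfo { PageNumber = 1, Count = 50 }
};

var response =
    (RetrieveDuplicatesResponse)service.Execute(retrieveDuplicatesRequest);

// Total number of duplicates found
Console.WriteLine($"Duplicates found: {response.DuplicateCollection.Entities.Count}");

// Details of each duplicate record
foreach (var duplicate in response.DuplicateCollection.Entities)
{
    Console.WriteLine($"Id: {duplicate.Id}");
}
```

Because the duplicates are returned as an EntityCollection, we can decide in code whether to proceed with the create, merge the records, or surface the matches to the user.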
We can make use of the SuppressDuplicateDetection optional parameter of the request if we want the configured duplicate detection rules to run and throw an exception while creating or updating the record.
We will use the below out-of-the-box duplicate detection rule, which checks for leads having the same email address.
Below is our code, and we can see the lead records with the same email address getting created without any exception, since duplicate detection rules are not triggered for SDK calls by default.
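A minimal sketch of such a create call, assuming an initialized IOrganizationService named service; the field values are illustrative.

```csharp
using Microsoft.Xrm.Sdk;

var lead = new Entity("lead");
lead["lastname"] = "Test Lead";
lead["emailaddress1"] = "sample@example.com";

// Succeeds even though a duplicate exists, because duplicate detection
// rules are not triggered for plain SDK create calls
Guid leadId = service.Create(lead);
```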
Now we have updated the code to use the SuppressDuplicateDetection optional parameter, setting it to false.
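The updated call might look like the following sketch, again assuming an initialized IOrganizationService named service and illustrative field values.

```csharp
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

var lead = new Entity("lead");
lead["lastname"] = "Test Lead";
lead["emailaddress1"] = "sample@example.com";

var createRequest = new CreateRequest { Target = lead };

// false = do NOT suppress duplicate detection, i.e. run the published rules
createRequest.Parameters["SuppressDuplicateDetection"] = false;

// Throws a FaultException<OrganizationServiceFault> if a duplicate is found
service.Execute(createRequest);
```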
As expected, this time we get the exception:
“The record was not created or updated because a duplicate of the current record already exists.”
With the preferred solution (preview) feature, we can now specify a solution to which all changes made outside the context of an unmanaged solution are automatically added.
To enable it, navigate to Settings >> Features >> Preferred solution (Preview)
Inside the Maker Portal, we can now see a new section asking us to specify the preferred solution; the Common Data Services Default Solution is set as the preferred solution by default.
We can click on Manage to specify any of the existing unmanaged solutions as the preferred solution or to create a new solution.
Here we have set one of the solutions as the preferred solution.
Now let us update an existing table outside the context of the solution. Here we are updating its form by rearranging some of the fields in it.
We have moved the Fax field to be the last field in that General Information section and have published the changes.
Back in our preferred solution, we can see the form we updated, automatically added.
Similarly, any other changes, i.e. any solution components added or updated outside the context of an unmanaged solution, will be added to the preferred solution (apart from the Default Solution, which continues to hold all solution components).
Also, other users/makers can specify their preferred solution.
To add cloud flows or canvas apps created outside the context of a solution to the preferred solution, we can enable the below features.
Here we have created a sample flow and a canvas app outside the solution.
We can convert our basic queue to an advanced queue by setting the field "Is Omnichannel Queue" or "Automatic work distribution" (schema name msdyn_isomnichannelqueue) to Yes.
We would usually do this to use an existing basic queue with Unified Routing.
We can find that option in the Conflicts tab of the Queue form. In case you cannot find the Conflicts tab, you can add the field to the form.
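The same conversion can be sketched programmatically. This is a minimal sketch, assuming an initialized IOrganizationService named service and the Guid of the basic queue in a variable named queueId.

```csharp
using Microsoft.Xrm.Sdk;

// Flip the "Is Omnichannel Queue" flag on the existing queue record.
// Note: this conversion cannot be reverted.
var queue = new Entity("queue", queueId);
queue["msdyn_isomnichannelqueue"] = true;

service.Update(queue);
```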
Here we need to be careful while converting a basic queue to an advanced queue, as we cannot revert it. If we try to revert, we will get the below error –
Recently we were getting the below notifications in our Visual Studio 2019.
The way we managed to fix it was by disabling the following option: "Use 64-bit process for code analysis".
Navigate to Tools >> Options >> Text Editor >> C# >> Advanced
The "Use 64-bit process for code analysis" option in Visual Studio allows the code analysis tools to run in a 64-bit process instead of a 32-bit process. This option is relevant when performing static code analysis, and we had recently installed a Visual Studio extension that does exactly that, so it may well be a compatibility issue with that extension.