Analytics capabilities of Kupp Code Analytics


In the previous posts, we covered the Overview and Key Features of Kupp Code Analytics, the installation and setup process of the extension, and its IntelliSense capabilities.

Here we'll have a quick look at the analytics capabilities of the extension.

To enable it, inside Visual Studio, navigate to Tools >> Options >> Kupp Code Analytics >> General, or Extensions >> Kupp Code Analytics >> Analytics >> Configure Analytics.

Set “Enable C# Code Analyzer” to “True”. Visual Studio needs to be restarted for the change to take effect.

To run the analytics, select Extensions >> Kupp Code Analytics >> Analytics >> Run.

Below we can see the results of running the analytics on our sample plugin class.

Let us see the code analysis rules one by one.

PCA001: The IPlugin interface should not be used directly.

The suggestion is to use a custom base class instead to handle the call delegation, using the context information of the plugin.
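
As a rough sketch of this pattern (the class and method names below are illustrative, not the extension's own template), a base class can implement IPlugin once and delegate to the derived plugins:

```csharp
// A sketch only: a custom base class handles the call delegation,
// resolving the plugin's context information in one place.
using System;
using Microsoft.Xrm.Sdk;

public abstract class PluginBase : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        // Resolve the context information once, in the base class.
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // Delegate the call to the derived plugin.
        ExecutePlugin(context, service);
    }

    protected abstract void ExecutePlugin(IPluginExecutionContext context, IOrganizationService service);
}

// Concrete plugins derive from the base class instead of implementing IPlugin directly.
public class MyPlugin : PluginBase
{
    protected override void ExecutePlugin(IPluginExecutionContext context, IOrganizationService service)
    {
        // Plugin logic goes here.
    }
}
```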

PCA002: Plugins should be stateless. Remove all instance properties and values.

Plugins are instantiated on a per-request basis and handle specific execution contexts. The plugin instances are short-lived and should not be assumed to persist across multiple requests. Also, if multiple plugin instances execute concurrently, especially in the case of bulk operations, shared instance state could lead to concurrency issues.
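
A hedged before/after sketch, reusing the hypothetical PluginBase from above; instance state is exactly what the rule flags:

```csharp
using Microsoft.Xrm.Sdk;

// Anti-pattern: per-request state stored on the instance.
public class BadPlugin : PluginBase
{
    private Entity _target; // instance state may be shared across concurrent executions

    protected override void ExecutePlugin(IPluginExecutionContext context, IOrganizationService service)
    {
        _target = (Entity)context.InputParameters["Target"];
    }
}

// Stateless version: all per-request state lives in local variables.
public class GoodPlugin : PluginBase
{
    protected override void ExecutePlugin(IPluginExecutionContext context, IOrganizationService service)
    {
        var target = (Entity)context.InputParameters["Target"];
        // work with the local variable only
    }
}
```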

PA002: Attribute collection should only include the changed attributes. Create a new entity for the update.

While updating a table (entity) record, create a new Entity instance and include only those attributes that have changed.
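
A minimal sketch of this, assuming an IOrganizationService instance named service and a record GUID named accountId:

```csharp
// Create a fresh Entity carrying only the changed attributes,
// instead of calling Update on the fully retrieved record.
using Microsoft.Xrm.Sdk;

var update = new Entity("account", accountId); // accountId: Guid of the record (assumed)
update["name"] = "Updated name";               // include only what actually changed
service.Update(update);                        // service: IOrganizationService (assumed)
```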

PA001: Specify the required columns instead of retrieving all columns.

Specify only the required attributes to be retrieved instead of all columns, as retrieving everything impacts performance.
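
A minimal sketch, again assuming service and accountId as above:

```csharp
// Request only the columns the code actually uses, not every column.
using Microsoft.Xrm.Sdk.Query;

var account = service.Retrieve(
    "account",
    accountId,                               // accountId: Guid of the record (assumed)
    new ColumnSet("name", "emailaddress1")); // instead of new ColumnSet(true)
```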

EBEA002: Late bound entities should be replaced with early bound entities.

SVA001: Entity Logicalname ‘myTable’ doesn’t exist.

It is suggested to use early bound entities, as they provide type safety and IntelliSense support, decreasing the likelihood of runtime errors (such as the nonexistent logical name flagged by SVA001 above), and can enhance productivity through auto-completion and context-aware suggestions.
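
A short sketch of the difference, assuming an early bound Account class generated with pac modelbuilder or CrmSvcUtil:

```csharp
// Late bound: table and attribute names are plain strings, so typos only surface at runtime.
using Microsoft.Xrm.Sdk;

var lateBound = new Entity("account");
lateBound["name"] = "Contoso";

// Early bound (assumes a generated Account class): compile-time checked, with IntelliSense.
var earlyBound = new Account { Name = "Contoso" };
service.Create(earlyBound); // service: IOrganizationService (assumed)
```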

These were a few examples that show the key capabilities of the extension. For the complete list, please refer to the product documentation.

Hope it helps..


Entity Id must be specified for Operation – Dataverse / Dynamics 365


We might get the below error while performing the update operation.

This could be because we have not specified the GUID of the record to be updated, or, when using an alternate key to update the record, we have not specified the KeyAttributes property.

We either need to specify the Id property or the KeyAttributes property, or pass the key in the Entity's constructor.
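
A minimal sketch of these options, using placeholder values (the account table and the accountnumber alternate key here are illustrative):

```csharp
using Microsoft.Xrm.Sdk;

// Option 1: set the Id property.
var entity = new Entity("account");
entity.Id = recordId;                             // recordId: Guid of the record to update (assumed)

// Option 2: set the KeyAttributes property (alternate key).
var byKey = new Entity("account");
byKey.KeyAttributes["accountnumber"] = "ACC-001"; // hypothetical alternate key

// Option 3: pass the key in the Entity constructor.
var viaCtor = new Entity("account", recordId);
var viaAltKey = new Entity("account", "accountnumber", "ACC-001");

service.Update(entity);                           // service: IOrganizationService (assumed)
```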

Hope it helps..


Error requesting token from the Authentication context – General ADAL Error (Dataverse / Dynamics 365)


We might get the below error while connecting to the Dataverse Web API using the client ID and client secret.

AADSTS700016: Application with identifier ‘6d8ff73a-27ef-443c-b524-d8b69ae87580’ was not found in the directory ‘w72tk’. This can happen if the application has not been installed by the administrator of the tenant or consented to by any user in the tenant. You may have sent your authentication request to the wrong tenant.

This could be because we have specified the wrong client ID / app ID.

Check the correct app ID in the Azure portal; we can also refer to the corresponding application user created for it in Dataverse.
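
For reference, a minimal connection sketch, assuming the Dataverse ServiceClient (Microsoft.PowerPlatform.Dataverse.Client) and placeholder values; the ClientId must match the app registration's Application (client) ID:

```csharp
using System;
using Microsoft.PowerPlatform.Dataverse.Client;

var connectionString =
    "AuthType=ClientSecret;" +
    "Url=https://yourorg.crm.dynamics.com;" +          // environment URL (placeholder)
    "ClientId=00000000-0000-0000-0000-000000000000;" + // app registration's Application (client) ID
    "ClientSecret=your-client-secret";                 // client secret (placeholder)

using var serviceClient = new ServiceClient(connectionString);
Console.WriteLine(serviceClient.IsReady);              // true when authentication succeeded
```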

Hope it helps..


Free up storage space – Dataverse / Dynamics 365


Recently, for setting up a new environment, we created a new sandbox environment and copied the production environment to it. Next, we had to reduce the storage space occupied by this new environment.

We followed the below steps:

  • Bulk Deletion Job – Delete Email Messages having Attachments older than 1 month.
  • Bulk Deletion Job – Delete Notes having Attachments older than 1 month.
  • Delete System Jobs with Status as Succeeded (a programmatic sketch follows this list).
  • Delete Process Sessions with Status as Complete.
  • For Logs, we deleted Audit Logs.
  • Similarly, we can also delete Plugin Trace Log records.
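
These jobs were created through the admin UI; as a rough sketch, a similar bulk deletion job (here, the succeeded system jobs step) can also be created programmatically via the BulkDeleteRequest message, assuming an IOrganizationService instance named service:

```csharp
using System;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Query;

var query = new QueryExpression("asyncoperation");                      // System Job table
query.Criteria.AddCondition("statecode", ConditionOperator.Equal, 3);   // Completed
query.Criteria.AddCondition("statuscode", ConditionOperator.Equal, 30); // Succeeded

var request = new BulkDeleteRequest
{
    JobName = "Delete succeeded system jobs",
    QuerySet = new[] { query },
    StartDateTime = DateTime.Now,
    RecurrencePattern = string.Empty, // run once
    SendEmailNotification = false,
    ToRecipients = Array.Empty<Guid>(),
    CCRecipients = Array.Empty<Guid>()
};

service.Execute(request); // service: IOrganizationService (assumed)
```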

These steps allowed us to decrease the Database Usage from 110 GB to 65 GB.

And File Usage from 315 GB to 16 GB.


Get all the details here –

https://learn.microsoft.com/en-us/power-platform/admin/free-storage-space

Hope it helps..


Use Copilot to create a journey – Dynamics 365 Marketing


We can make use of Copilot to create a journey for us, using everyday conversational language.

To enable it, navigate to

Settings >> Overview >> Feature Switches >> Journey (Copilot)

Let us see it in action, by creating a new journey record.

We get the option to select predefined examples to start with.

Here we have selected the last example “When a contact submits a marketing form, assign a phone call….”

On selecting it, we are presented with the option to specify the Trigger.


We have selected the existing Marketing Form Submitted trigger record here.

After specifying the trigger, we get the option to specify the marketing form (or leave it empty to run the journey for all form submissions), as well as the audience type, which can be either Contact or Lead.

Clicking on Submit gives the option to review and then eventually Create Journey.

Clicking on Create Journey generates the journey for us.

We can review the journey, add any further content required, modify it, etc.

For example, we need to specify the follow-up email to be sent before we can save and publish the journey.

Once we are done with defining the journey, we can publish it.

Get more details here.

Hope it helps..


Using xMultiple along with User Multiplexing for improved performance – KingswaySoft SSIS Integration Toolkit (Dataverse / Dynamics 365)


Let us continue from our previous post, where we observed performance improvements by using User Multiplexing.

Now let us try making use of the xMultiple feature (CreateMultiple, UpdateMultiple, and UpsertMultiple messages) of the CRM / CDS Destination Component.
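
For context, here is a minimal sketch of what an xMultiple batch maps to under the hood, the Dataverse CreateMultiple message, assuming an IOrganizationService instance named service and a hypothetical custom table new_mytable:

```csharp
// CreateMultiple creates a batch of records of one table in a single request,
// which is what the destination component leverages when xMultiple kicks in.
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

var targets = new EntityCollection { EntityName = "new_mytable" }; // hypothetical custom table
for (var i = 0; i < 100; i++)                                      // one batch of 100 rows
{
    var row = new Entity("new_mytable");
    row["new_name"] = $"Record {i}";
    targets.Entities.Add(row);
}

var response = (CreateMultipleResponse)service.Execute(
    new CreateMultipleRequest { Targets = targets });              // service: IOrganizationService (assumed)
Console.WriteLine($"{response.Ids.Length} records created");
```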

We have updated the Batch Size to 100 to trigger the xMultiple messages.

However, this time we got a service throttling error, and the run took around 17:45 minutes.

Let us try decreasing the batch size to 50 (still enough to trigger xMultiple), keeping the thread count the same at 20, and User Multiplexing with 5 application users.

No throttling warning this time, and the run took around 10:42 minutes.

Now let us try the same setup for a custom table instead of a standard table.

Here we have run our package to create 20K records, with User Multiplexing (5 application users), Batch Size 10, and 20 Threads, for our custom table named My Table.

It took around 3:04 minutes.

Let us increase the batch size to 100 to get xMultiple enabled.

It took 1:06 minutes.

Let us set the batch size to 500.

It took around 42 seconds.

And with a batch size of 1000, it took around 1:04 minutes.

We can see huge performance improvements using xMultiple when it comes to a custom table.

So I think, to get performance improvements for the standard table, we could stick with Batch Size 10, Threads 10-20, and increase the number of users (Multiplexing).

But for the custom table, we could increase the batch size to either 100 or 500 to make use of xMultiple along with Multiplexing.

Hope it helps.

