How to – Access Dataverse environment storage using SAS Token


We can access our environment’s storage using a SAS token. 

Download Azure Storage Explorer – https://azure.microsoft.com/features/storage-explorer/#overview

Select the Connect to Azure resources option.

Select ADLS Gen2 container or directory for the Azure storage type.

Select the Shared access signature URL for the connection

The SAS URL has the format https://<ContainerURL>/CDS?<SASToken>, so we need the ContainerURL and the SASToken.

To get the Container URL, use the below endpoint –

https://<EnvironmentURL>/api/data/v9.1/datalakefolders

In our case – https://mycrm2022.crm.dynamics.com/api/data/v9.1/datalakefolders

In the response, containerendpoint = https://aeth2b4c5907361d483c9641.dfs.core.windows.net/aeth-b986098f-1315-45cc-b8e6-61c5a1a1631a

Copy the value of containerendpoint – this is our ContainerURL.
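If you prefer to script this step, below is a minimal Python sketch of the same call using requests; it assumes you have already acquired a bearer token for the environment (for example via MSAL), which is not shown here.

```python
import requests

ENVIRONMENT_URL = "https://mycrm2022.crm.dynamics.com"  # your environment URL
ACCESS_TOKEN = "<bearer-token>"  # placeholder - acquire via MSAL / Azure AD

# Query the datalakefolders endpoint of the Dataverse Web API
resp = requests.get(
    f"{ENVIRONMENT_URL}/api/data/v9.1/datalakefolders",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
             "Accept": "application/json"},
)
resp.raise_for_status()

# Each record in "value" exposes a containerendpoint property (the ContainerURL)
for folder in resp.json()["value"]:
    print(folder["containerendpoint"])
```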


Navigate to the below URL, replacing the Environment URL and Container URL values –

https://<EnvironmentURL>/api/data/v9.1/RetrieveAnalyticsStoreAccess(Url=@a,ResourceType='Folder',Permissions='Read,List')?@a='<ContainerURL>/CDS'

i.e.

https://mycrm2022.crm.dynamics.com/api/data/v9.1/RetrieveAnalyticsStoreAccess(Url=@a,ResourceType='Folder',Permissions='Read,List')?@a=%27https://aeth2b4c5907361d483c9641.dfs.core.windows.net/aeth-b986098f-1315-45cc-b8e6-61c5a1a1631a/CDS%27

This returns the SAS token (note that the SAS token expires after 1 hour).
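This step can be scripted as well; a rough Python sketch, under the same bearer-token assumption as before (the JSON response carries the SAS token):

```python
import requests

ENVIRONMENT_URL = "https://mycrm2022.crm.dynamics.com"
CONTAINER_URL = ("https://aeth2b4c5907361d483c9641.dfs.core.windows.net/"
                 "aeth-b986098f-1315-45cc-b8e6-61c5a1a1631a")
ACCESS_TOKEN = "<bearer-token>"  # placeholder - acquire via MSAL / Azure AD

# Call the RetrieveAnalyticsStoreAccess function with Read,List permissions
url = (f"{ENVIRONMENT_URL}/api/data/v9.1/RetrieveAnalyticsStoreAccess("
       "Url=@a,ResourceType='Folder',Permissions='Read,List')"
       f"?@a='{CONTAINER_URL}/CDS'")
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                                  "Accept": "application/json"})
resp.raise_for_status()
print(resp.json())  # the response contains the SAS token (valid for ~1 hour)
```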


Construct the SAS URL, which we will use in Azure Storage Explorer's connection info –

https://<ContainerURL>/CDS?<SASToken>

https://aeth2b4c5907361d483c9641.dfs.core.windows.net/aeth-b986098f-1315-45cc-b8e6-61c5a1a1631a/CDS?skoid=3c1e8701-292f-478f-8f42-df7e9cc692c9&sktid=975f013f-7f24-47e8-a7d3-abc4752bf346&skt=2022-11-04T19%3A02%3A44Z&ske=2022-11-04T20%3A18%3A44Z&sks=b&skv=2020-08-04&sv=2020-08-04&se=2022-11-04T20%3A08%3A44Z&sr=d&sp=rl&sdd=1&sig=kKHHJLbb4gjAtTiQCbiuwQ8QrInY33zGEY1FvpxCLK%03D0

Now we have access to the storage.

Here we have opened the contact folder just as an example.


We only have read-only access; any other operation like delete, rename, upload, etc. will fail.
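The SAS token can also be used outside Storage Explorer, directly against the ADLS Gen2 REST API. A small sketch (account, filesystem, and token values are the placeholders from above) that lists the paths under the CDS folder:

```python
import requests

ACCOUNT_URL = "https://aeth2b4c5907361d483c9641.dfs.core.windows.net"
FILESYSTEM = "aeth-b986098f-1315-45cc-b8e6-61c5a1a1631a"
SAS_TOKEN = "skoid=...&sig=..."  # the token returned above (elided)

# "Path - List" operation of the ADLS Gen2 REST API, authorized via the SAS
# query parameters appended to the URL
list_url = (f"{ACCOUNT_URL}/{FILESYSTEM}?resource=filesystem"
            f"&directory=CDS&recursive=false&{SAS_TOKEN}")
resp = requests.get(list_url)
resp.raise_for_status()

for path in resp.json()["paths"]:
    print(path["name"])

# Because the token only carries read/list permissions (sp=rl), a write
# attempt such as DELETE on any of these paths returns a 403 error.
```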


Get all the details here – https://learn.microsoft.com/en-us/power-platform/admin/storage-sas-token

Hope it helps..


How to – quickly find Tenant Id https://www.whatismytenantid.com/


Recently we had to find a tenant id. Usually, we would log in to the Azure Portal and get the tenant id from Azure Active Directory.

However, in this case, we only had the user’s email address.

So we used https://www.whatismytenantid.com/ to help us here.

Just enter the domain name, and click on Find my tenant ID.

The result –

The corresponding PowerShell script – https://craigporteous.com/azure-tenant-id-powershell/
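Under the hood, such lookups can be done against Azure AD's public OpenID Connect discovery endpoint, which embeds the tenant id in the endpoints it returns. A rough Python equivalent (the domain is a placeholder):

```python
import re
import requests

domain = "contoso.com"  # the domain part of the user's email address

resp = requests.get(
    f"https://login.microsoftonline.com/{domain}/.well-known/openid-configuration"
)
resp.raise_for_status()

# token_endpoint looks like https://login.microsoftonline.com/<tenant-id>/oauth2/token
match = re.search(r"microsoftonline\.com/([0-9a-f\-]{36})",
                  resp.json()["token_endpoint"])
print(match.group(1) if match else "tenant id not found")
```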

Hope it helps..


Can’t delete billing profile. This is the only billing profile in the billing account – Azure (Trial)


Recently, our Azure subscription using free credit expired, and we wanted to remove the card details associated with the billing.

If we try to Detach it (and the card is being used as the default payment method), it will give us the below message –

Cost Management >> Payment Methods >> Detach

“There are active or disabled Azure subscriptions on this billing profile. To detach this payment method, cancel and then delete all such subscriptions.”

And if we try to delete the billing profile –

Cost Management >> Billing profiles


We would get the below message


So first we need to delete the subscription here.

Note that we can only delete the free trial subscription 3 days after canceling it, so we would have to try deleting it again after 3 days.


Trying after 3 days gives us the option to delete the subscription.


But as we have resources associated with it, we will have to delete those resources first.


Navigate to All resources and delete the resources first.



After deleting the resources, deleting the subscription will be successful

Let us now try to detach the card again.

This time the message about existing Azure subscriptions was gone, as we had deleted them; however, we were still not able to detach, and it asked us to wait until the end of the billing period.


Instead of waiting till the end of the billing period, we next tried the below option –

https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/cancel-azure-subscription#how-do-i-delete-my-azure-account


i.e., deleting the Azure AD account (through another admin user’s account).



After a successful delete, we can see all the options disabled.


So basically, the fastest way to remove your personal information from the Azure Free Trial is to cancel the subscription, delete it, and then DELETE the account/user.

Hope it helps..


How to – set up Azure Synapse Link – Microsoft Dataverse


Azure Synapse Link (earlier known as the Export to Data Lake service) provides seamless integration of Dataverse with Azure Synapse Analytics, making it easy for users to do ad-hoc analysis using familiar T-SQL in Synapse Studio, build Power BI reports using the Azure Synapse Analytics connector, or use Apache Spark in Azure Synapse for analytics.

As a first step, we need to create the Azure Synapse Workspace.

Login to Azure Portal (https://portal.azure.com/ ) with appropriate roles and create a Synapse workspace.

https://docs.microsoft.com/en-us/azure/synapse-analytics/get-started-create-workspace

Wait for the deployment to be complete. Below are the resources created as part of the deployment.

Login to the maker portal (https://make.powerapps.com/ ) and select the appropriate environment

Click on Azure Synapse Link, check the Connect to your Azure Synapse Analytics workspace (preview) option, and specify the storage account created in the previous step.

Here we have selected the Contact and System User tables for export.

On clicking Save, we got this error a couple of times, even though the account being used was Owner across the Azure resources. Trying Save 3-4 more times allowed the link to be created.

Clicking on Go to Azure Synapse Analytics workspace opens Azure Synapse Studio.

Within Synapse Studio, we can see our Dataverse database and run queries.
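The same database can also be queried from outside Synapse Studio, through the workspace's serverless SQL endpoint. A minimal pyodbc sketch, with placeholder names and assuming ODBC Driver 17 for SQL Server with Azure AD interactive authentication:

```python
import pyodbc

# Connect to the serverless (on-demand) SQL endpoint of the Synapse workspace
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=<workspace-name>-ondemand.sql.azuresynapse.net;"
    "Database=<dataverse-database-name>;"
    "Authentication=ActiveDirectoryInteractive;"  # prompts for Azure AD sign-in
)

# Query one of the exported tables, e.g. contact
for row in conn.cursor().execute("SELECT TOP 10 fullname FROM contact"):
    print(row.fullname)
```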

Similarly, we can create a Power BI report with the Azure Synapse Analytics (SQL DW) connector.

Copy the Serverless SQL endpoint of the Synapse workspace.

Select the Direct Query option.

https://azure.microsoft.com/en-gb/blog/power-your-business-applications-data-with-analytical-and-predictive-insights/

Hope it helps..


How to – improve data migration performance – SSIS & Azure Data Factory (Dataverse / Dynamics 365)


In one of our projects, we were executing SSIS Packages (KingswaySoft’s Dynamics 365 SSIS Integration Toolkit) under Azure-SSIS Integration Runtime in Azure Data Factory.

Check out –

Deploy and run SSIS Package in Azure Data Factory

Deploy and run SSIS Packages that use KingswaySoft’s SSIS Integration Toolkit on Azure Data Factory.

After trying out different combinations, we eventually settled on a batch size of 10 and a thread count of 15.

https://nishantrana.me/2021/06/08/data-migration-optimum-batch-size-and-threads-for-maximum-throughput-microsoft-dataverse-dynamics-365/

Also, we used multiplexing – running the CRM Destination Component under different application users.

To be precise, 4 in our case; we can increase this number to get a further improvement in throughput.

Also, based on the recommendation of our Microsoft FastTrack architect, we raised a Microsoft ticket to increase the number of web servers allocated from 2 to 3.

Below were our findings –

The earlier run used a batch size of 100 and 20 threads, with 2 servers allocated.

On updating the batch size to 10 and threads to 15, with the number of servers allocated increased to 3, there was a huge performance gain.

Check the table below – 

The above table is a sample run in the sandbox environment; during the final run in production, the number of servers allocated was increased to 6, giving a further improvement.

Also, check out the below blog post to understand the affinity cookie and its effect on performance, in case we are doing the migration using custom code –

https://markcarrington.dev/2021/05/26/improving-bulk-dataverse-performance-with-enableaffinitycookie/
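The linked post covers the Dataverse SDK's ServiceClient.EnableAffinityCookie switch; the idea is that the ARRAffinity cookie pins every request from a client to a single web server, so dropping it lets parallel requests spread across all allocated servers. A rough illustration of the same idea over the Web API in Python (endpoint and token are placeholders):

```python
from concurrent.futures import ThreadPoolExecutor
from http import cookiejar

import requests

class BlockAllCookies(cookiejar.CookiePolicy):
    """Refuse every cookie, so ARRAffinity is never stored or replayed."""
    return_ok = set_ok = domain_return_ok = path_return_ok = \
        lambda self, *args, **kwargs: False
    netscape = True
    rfc2965 = hide_cookie2 = False

ENVIRONMENT_URL = "https://mycrm2022.crm.dynamics.com"
ACCESS_TOKEN = "<bearer-token>"  # placeholder - acquire via MSAL / Azure AD

session = requests.Session()
session.cookies.set_policy(BlockAllCookies())
session.headers["Authorization"] = f"Bearer {ACCESS_TOKEN}"

def create_contact(record):
    resp = session.post(f"{ENVIRONMENT_URL}/api/data/v9.1/contacts", json=record)
    resp.raise_for_status()

# 15 threads, mirroring the thread count that worked best for us
records = [{"lastname": f"Test {i}"} for i in range(100)]
with ThreadPoolExecutor(max_workers=15) as pool:
    list(pool.map(create_contact, records))
```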

Hope it helps..


Azure Synapse Link / Export to Data Lake Service – Performance (initial sync)


Recently we configured the Export to Data Lake service for one of our projects.

Just sharing the performance we got during the initial sync.

Entity          Count
Contact         2,362,581
Custom entity   1,613,554

The sync started at 11:47 AM and completed around 4:50 PM – around 5 hours, i.e. 300 minutes.

Let us consider the total records synced as roughly 3,975,000 (1,613,446 + 2,362,581 + 2 = 3,976,029).

So the performance here comes down to

  • 795,000 records per hour
  • 13,250 records per minute
  • ~220 records per second
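The arithmetic behind those rates, as a quick check:

```python
total_records = 3_975_000  # approximate total synced
hours = 5

print(total_records / hours)           # 795000.0 records per hour
print(total_records / (hours * 60))    # 13250.0 records per minute
print(total_records / (hours * 3600))  # ~220.8 records per second
```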

Of course, this will vary depending on the specific environment, the table / entity type, the attributes in it, etc., but it gives us a rough idea.

Hope it helps..
