How to – Access Dataverse environment storage using SAS Token


We can access our environment’s storage using a SAS token. 

Download Azure Storage Explorer – https://azure.microsoft.com/features/storage-explorer/#overview

Select the Connect to Azure resources option.

Select ADLS Gen2 container or directory for the Azure storage type.

Select the Shared access signature URL for the connection

To build the SAS URL – https://<ContainerURL>/CDS?<SASToken> – we need the ContainerURL and the SASToken.

To get the ContainerURL, use the below link:

https://<EnvironmentURL>/api/data/v9.1/datalakefolders

In our case – https://mycrm2022.crm.dynamics.com/api/data/v9.1/datalakefolders

The containerendpoint = https://aeth2b4c5907361d483c9641.dfs.core.windows.net/aeth-b986098f-1315-45cc-b8e6-61c5a1a1631a

Copy the value of the containerendpoint attribute – this is the ContainerURL.


Next, navigate to the below URL, replacing the Environment URL and Container URL values:

<a href="https:///api/data/v9.1/RetrieveAnalyticsStoreAccess(Url=@a,ResourceType=’Folder&#8217;,Permissions=’Read,List’)?@a=’/CDS”>https://&lt;EnvironmentURL>/api/data/v9.1/RetrieveAnalyticsStoreAccess(Url=@a,ResourceType=’Folder’,Permissions=’Read,List’)?@a='<ContainerURL>/CDS’

i.e.

https://mycrm2022.crm.dynamics.com/api/data/v9.1/RetrieveAnalyticsStoreAccess(Url=@a,ResourceType='Folder',Permissions='Read,List')?@a=%27https://aeth2b4c5907361d483c9641.dfs.core.windows.net/aeth-b986098f-1315-45cc-b8e6-61c5a1a1631a/CDS%27

This returns the SAS token (note that the SAS token expires after 1 hour).
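For reference, both Web API calls can also be scripted. Below is a minimal Python sketch using the requests library; it assumes you already hold a valid OAuth bearer token for the environment (e.g. acquired via MSAL), and the exact property carrying the SAS token in the RetrieveAnalyticsStoreAccess response should be confirmed by inspecting the JSON.

```python
# Minimal sketch – assumes a valid bearer token for the environment.
import requests

env_url = "https://mycrm2022.crm.dynamics.com"  # your environment URL
headers = {"Authorization": "Bearer <access-token>"}

# 1. Get the ContainerURL from the datalakefolders collection
folders = requests.get(
    f"{env_url}/api/data/v9.1/datalakefolders", headers=headers
).json()
container_url = folders["value"][0]["containerendpoint"]

# 2. Request a short-lived SAS token on the CDS folder (Read, List)
access = requests.get(
    f"{env_url}/api/data/v9.1/RetrieveAnalyticsStoreAccess"
    "(Url=@a,ResourceType='Folder',Permissions='Read,List')"
    f"?@a='{container_url}/CDS'",
    headers=headers,
).json()
print(access)  # inspect the response for the SAS token property
```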


Now construct the SAS URL, which we will use in Azure Storage Explorer's connection info:

https://<ContainerURL>/CDS?<SASToken>

https://aeth2b4c5907361d483c9641.dfs.core.windows.net/aeth-b986098f-1315-45cc-b8e6-61c5a1a1631a/CDS?skoid=3c1e8701-292f-478f-8f42-df7e9cc692c9&sktid=975f013f-7f24-47e8-a7d3-abc4752bf346&skt=2022-11-04T19%3A02%3A44Z&ske=2022-11-04T20%3A18%3A44Z&sks=b&skv=2020-08-04&sv=2020-08-04&se=2022-11-04T20%3A08%3A44Z&sr=d&sp=rl&sdd=1&sig=kKHHJLbb4gjAtTiQCbiuwQ8QrInY33zGEY1FvpxCLK%3D

Now we have access to the storage.

Here we have opened the contact folder as an example.


We only have read-only access; any other operation, such as delete, rename, or upload, will fail.
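The same SAS token also works outside Storage Explorer, with the Azure Storage SDK. A hedged sketch using the azure-storage-file-datalake package (the account, container, and token values are the placeholders from this post):

```python
# Sketch: list files under the CDS folder using the directory-scoped SAS.
from azure.storage.filedatalake import FileSystemClient

fs = FileSystemClient(
    account_url="https://aeth2b4c5907361d483c9641.dfs.core.windows.net",
    file_system_name="aeth-b986098f-1315-45cc-b8e6-61c5a1a1631a",
    credential="<SASToken>",  # the token obtained above (valid ~1 hour)
)

# Read and List succeed...
for path in fs.get_paths(path="CDS/contact"):
    print(path.name)

# ...while anything else fails, as the SAS grants only Read and List, e.g.
# fs.get_directory_client("CDS/contact").create_sub_directory("test")
# raises an HttpResponseError (403 AuthorizationPermissionMismatch).
```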


Get all the details here – https://learn.microsoft.com/en-us/power-platform/admin/storage-sas-token

Hope it helps..


Fixed – Azure Synapse Link for Dataverse not starting


Recently, while trying to configure “Azure Synapse Link for Dataverse” for one of our environments, we faced an issue where it got stuck as shown below.

Also, there were no files/folders created inside the Storage Account.

Posts on Azure Data Lake

The accounts were properly set up – https://docs.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-synapse#prerequisites

Here we had initially selected only the contact table for sync, and it had around 2.5 million records.

To fix this, we unlinked (using the Unlink option) and created the link again, and this time we selected a table having very few records, e.g. the country table, as shown below.

This time the sync started immediately.

Then, through the Manage tables option, we added our contact table as well, and it also started syncing properly as shown below.

Hope it helps..


Fixed – Authorization failed. The client with object id does not have authorization to perform action ‘Microsoft.Authorization/roleAssignments/write’ over scope ‘/subscriptions’ while configuring Azure Synapse Link for Dataverse


Recently while configuring Azure Synapse Link for Dataverse, for exporting data to Azure Data Lake we got the below error –

{"code":"AuthorizationFailed","message":"The client 'abc' with object id 'd56d5fbb-0d46-4814-afaa-e429e5f252c8' does not have authorization to perform action 'Microsoft.Authorization/roleAssignments/write' over scope '/subscriptions/30ed4d5c-4377-4df1-a341-8f801a7943ad/resourceGroups/RG/providers/Microsoft.Storage/storageAccounts/saazuredatalakecrm/providers/Microsoft.Authorization/roleAssignments/2eb81813-3b38-4b2e-bc14-f649263b5fcf' or the scope is invalid. If access was recently granted, please refresh your credentials."}


As well as the below error –


As the message suggests, the error occurred because the user account used did not have the appropriate role(s) assigned.

The user needs to have the Owner as well as the Storage Blob Data Contributor role on the Azure Data Lake Storage Gen2 account.
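If you prefer to script the fix, both role assignments can be created with the azure-mgmt-authorization SDK. A minimal sketch; the subscription, resource group, storage account, and object id below are placeholders, and the role GUIDs are the well-known built-in role definition IDs (verify them in your tenant):

```python
# Sketch: assign Owner and Storage Blob Data Contributor on the storage account.
import uuid

from azure.identity import AzureCliCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"
scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

# Well-known built-in role definition GUIDs
roles = {
    "Owner": "8e3af657-a8ff-443c-a75c-2fe8c4bcb635",
    "Storage Blob Data Contributor": "ba92f5b4-2d11-453d-a403-e96b0029c9fe",
}

client = AuthorizationManagementClient(AzureCliCredential(), subscription_id)

for role_name, role_guid in roles.items():
    client.role_assignments.create(
        scope=scope,
        role_assignment_name=str(uuid.uuid4()),  # each assignment needs a new GUID
        parameters=RoleAssignmentCreateParameters(
            role_definition_id=f"/subscriptions/{subscription_id}"
            f"/providers/Microsoft.Authorization/roleDefinitions/{role_guid}",
            principal_id="<user-object-id>",  # object id of the configuring user
        ),
    )
```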

Also check –

https://nishantrana.me/2020/09/07/error-access-to-the-resource-is-forbidden-while-trying-to-connect-to-azure-data-lake-storage-gen2-using-power-bi-desktop/

https://nishantrana.me/2021/06/24/fixed-authorizationfailed-the-client-with-object-id-does-not-have-authorization-to-perform-action-microsoft-authorization-roleassignments-write-over-scope-storageaccou/

https://docs.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-synapse#prerequisites

Hope it helps..


How to – Export Dataverse (Dynamics 365) data to Azure SQL using Azure Data Factory pipeline template


[Visual Guide to Azure Data Factory - https://acloudguru.com/blog/engineering/a-visual-guide-to-azure-data-factory]

Using the new Azure Data Factory pipeline template – Copy Dataverse data from Azure Data Lake to Azure SQL – we can now easily export the Dataverse data to Azure SQL Database.

https://docs.microsoft.com/en-us/power-platform-release-plan/2021wave1/data-platform/export-dataverse-data-azure-sql-database

Check other posts on Azure Data Factory

Select the Pipeline from template option inside the Data Factory.

Search for Dataverse and select the Copy Dataverse data from Azure Data Lake to Azure SQL template

Let us specify the User Inputs required by the template – i.e. Azure SQL Database and the Data Lake Storage.

First we have created the linked service for the Azure SQL Database.

We’d use it to connect to the below table MyContacts.

Similarly, create a linked service to Azure Data Lake Storage Gen2, which holds our Dataverse data.

Get the URL from the container's properties (replace blob with dfs in the URL).

To get the storage account key, select Access Keys >> Show Keys >> Copy the Key for the Storage Account.
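For reference, the JSON definition behind such an ADLS Gen2 linked service looks roughly like the below (the account name and key are placeholders; in practice, prefer referencing the key from Azure Key Vault):

```json
{
  "name": "AzureDataLakeStorage1",
  "properties": {
    "type": "AzureBlobFS",
    "typeProperties": {
      "url": "https://<storageaccount>.dfs.core.windows.net",
      "accountKey": {
        "type": "SecureString",
        "value": "<storage-account-key>"
      }
    }
  }
}
```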

Here we have already configured Azure Synapse Link for Dataverse

https://nishantrana.me/2020/09/07/export-data-from-common-data-service-to-azure-data-lake-storage-gen2/

Now as we have defined the User Inputs, select Use this template.

Navigate to the data flow created – DataverseToAzureSQL

Select our source ADLS and check and configure its properties.

Source Settings

Here we have the Inline dataset type set to Common Data Model and the Linked service is the AzureDataLakeStorage1 we created earlier.

Source Option

Specify the Dataverse folder for the Root Location.

Here we have specified the contact entity from our Data Lake Storage.

Projection

In the projection, we have cleared the generated schema using Clear Schema and selected Schema options >> Allow schema drift.

Enabling the Allow schema drift option will create the required columns in the destination Azure SQL table.

Optimize

Inspect

Data preview

As we have not turned on Debug mode, there is nothing to preview.

Now let us move to our Destination – SQL.

Sink

Here we have the AzureSQLTable dataset connected to the contact table in Azure SQL and have checked the Allow schema drift option.


Below is our AzureSQLTable Dataset connected to the MyContacts table.

Settings

Here we have selected Allow insert as the Update method and Recreate table as the Table action, as we want the destination table to be re-created dynamically based on the source.

Mapping

We have left it set to Auto mapping.

Optimize

Inspect

Data preview

Let us Publish All our changes and Debug our pipeline.

Let us monitor our pipeline run.

We can see that the pipeline ran successfully and took around 4 minutes.

We can see the contact data copied to our Azure SQL database successfully.
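As a quick sanity check outside the portal, we can count the copied rows. A minimal pyodbc sketch (server, database, and credentials are placeholders):

```python
# Sketch: verify the row count in the destination Azure SQL table.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<database>;"
    "UID=<user>;PWD=<password>;Encrypt=yes;"
)
count = conn.execute("SELECT COUNT(*) FROM dbo.MyContacts").fetchone()[0]
print(f"Rows copied to MyContacts: {count}")
```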

So here we covered the insert operation; in the next posts, we'll see how we can configure the update, upsert, and delete operations.

Also, check

Posts on Azure Data Factory

Posts on Azure Data Lake

Hope it helps..

Fixed – AuthorizationFailed. The client with object id does not have authorization to perform action ‘Microsoft.Authorization/roleAssignments/write’ over scope ‘storageaccount’ – Azure Data Lake


While configuring the Azure Synapse Link / Export to Data Lake service, we were getting the below error for one of the users.

{"code":"AuthorizationFailed","message":"The client 'nishantr@pmaurua105.onmicrosoft.com' with object id 'd56d5fbb-0d46-4814-afaa-e429e5f252c8' does not have authorization to perform action 'Microsoft.Authorization/roleAssignments/write' over scope '/subscriptions/30ed4d5c-4377-4df1-a341-8f801a7943ad/resourceGroups/RG/providers/Microsoft.Storage/storageAccounts/saazuredatalakecrm/providers/Microsoft.Authorization/roleAssignments/2eb81813-3b38-4b2e-bc14-f649263b5fcf' or the scope is invalid. If access was recently granted, please refresh your credentials."}

The current Role assignments of the user were as below.

To resolve it, we need to give the Owner role on the storage account to the user configuring the Export to Data Lake service (apart from the System Administrator role within Dynamics 365).

https://docs.microsoft.com/en-us/powerapps/maker/data-platform/export-to-data-lake#prerequisites

After assigning the Owner Role (and removing the other roles), it worked successfully.

Check other posts on Azure Data Lake/ Azure Synapse Link

Posts on Azure Data Lake

Hope it helps..


How to setup – Azure Synapse Link – Microsoft Dataverse


Azure Synapse Link (earlier known as the Export to Data Lake service) provides seamless integration of Dataverse with Azure Synapse Analytics, making it easy for users to do ad-hoc analysis using familiar T-SQL with Synapse Studio, build Power BI reports using the Azure Synapse Analytics connector, or use Apache Spark in Azure Synapse for analytics.

As a first step, we need to create the Azure Synapse Workspace.

Log in to the Azure Portal (https://portal.azure.com/) with appropriate roles and create a Synapse workspace.

https://docs.microsoft.com/en-us/azure/synapse-analytics/get-started-create-workspace

Wait for the deployment to be complete. Below are the resources created as part of the deployment.

Log in to the maker portal (https://make.powerapps.com/) and select the appropriate environment.

Click on Azure Synapse Link, check the Connect to your Azure Synapse Analytics workspace (preview) option, and specify the storage account created in the previous step.

Here we have selected the Contact and System User table for export.

On clicking Save, we got this error a couple of times, even though the account being used was an Owner across the Azure resources. Trying to save 3-4 more times allowed the link to be created.

Clicking on Go to Azure Synapse Analytics workspace opens Azure Synapse Studio.

Within Synapse Studio, we can see our Dataverse database and run queries.
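The same lake database can also be queried outside Synapse Studio through the workspace's serverless SQL endpoint. A hedged pyodbc sketch (the workspace and database names are placeholders for your environment):

```python
# Sketch: query the Dataverse lake database via the serverless SQL endpoint.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<workspace>-ondemand.sql.azuresynapse.net;"
    "DATABASE=<dataverse-database>;"
    "Authentication=ActiveDirectoryInteractive;Encrypt=yes;"
)
for row in conn.execute("SELECT TOP 5 fullname FROM dbo.contact"):
    print(row.fullname)
```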

Similarly, we can create a Power BI report with the Azure Synapse Analytics (SQL DW) connector.

Copy the Serverless SQL endpoint of the Synapse workspace.

Select the Direct Query option.

https://azure.microsoft.com/en-gb/blog/power-your-business-applications-data-with-analytical-and-predictive-insights/

Hope it helps..
