Use CrmServiceClient to execute web request against Web API – Dynamics 365


In the previous post, we saw how to use CrmServiceClient to connect to CDS using the OAuth authentication type and execute requests through the Organization.svc service:

https://nishantrana.me/2020/11/09/sample-code-to-connect-to-cds-dynamics-365-ce-using-oauth/

Here we will extend the same example to execute web requests using the Web API.

  • Create a contact record with first name and last name populated

using Microsoft.Xrm.Tooling.Connector;
using Newtonsoft.Json.Linq;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;

namespace SampleConsoleApp
{
    class Program
    {
        static void Main(string[] args)
        {
            string connectionString = "AuthType=OAuth;" +
                "Username=[username]@[domain].onmicrosoft.com;" +
                "Password=[password];" +
                "Url=https://[orgname].crm.dynamics.com;" +
                "AppId=51f81489-12ee-4a9e-aaae-a2591f45987d;" +
                "RedirectUri=app://58145B91-0C36-4500-8554-080854F2AC97;" +
                "LoginPrompt=Auto";

            CrmServiceClient svc = new CrmServiceClient(connectionString);

            // specify the OData headers
            Dictionary<string, List<string>> odataHeaders = new Dictionary<string, List<string>>
            {
                { "Accept", new List<string>() { "application/json" } },
                { "OData-MaxVersion", new List<string>() { "4.0" } },
                { "OData-Version", new List<string>() { "4.0" } }
            };

            if (svc.IsReady)
            {
                // create a contact record with firstname and lastname populated
                dynamic contact = new JObject();
                contact.firstname = "Meeska";
                contact.lastname = "Mooska";
                string jsonContact = Newtonsoft.Json.JsonConvert.SerializeObject(contact);

                // create the contact record
                // Parameters - HttpMethod, Query String, Body, Custom Headers, Content Type
                HttpResponseMessage httpResponse = svc.ExecuteCrmWebRequest(
                    HttpMethod.Post,
                    "contacts",
                    jsonContact,
                    odataHeaders,
                    "application/json");

                if (httpResponse.IsSuccessStatusCode)
                {
                    // the URI of the newly created record is returned in the OData-EntityId header
                    var contactUri = httpResponse.Headers.GetValues("OData-EntityId").FirstOrDefault();
                    Console.WriteLine("Contact URI: {0}", contactUri);
                }
                else
                {
                    Console.WriteLine(httpResponse.ReasonPhrase);
                }
            }
        }
    }
}

  • Retrieve the first name and last name of all contacts

// retrieve first name and last name of all the contact records
HttpResponseMessage httpResponse = svc.ExecuteCrmWebRequest(
    HttpMethod.Get,
    "contacts?$select=firstname,lastname",
    string.Empty,
    odataHeaders,
    "application/json");

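The Web API wraps the result set in a value array in the response body. Below is a minimal sketch of reading it, reusing the svc and odataHeaders objects from the sample above:

if (httpResponse.IsSuccessStatusCode)
{
    // the Web API returns the matching records inside a "value" array
    JObject result = JObject.Parse(httpResponse.Content.ReadAsStringAsync().Result);

    foreach (var record in result["value"])
    {
        Console.WriteLine("{0} {1}", record["firstname"], record["lastname"]);
    }
}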

Get the details here – ExecuteCrmWebRequest
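ExecuteCrmWebRequest can be used with other verbs as well. For example, here is a minimal sketch of updating the contact created earlier via PATCH – the contact GUID is a placeholder, and the verb is constructed manually because the .NET Framework's HttpMethod class has no built-in Patch property:

// update the lastname of the existing contact record
dynamic update = new JObject();
update.lastname = "Mouse";

HttpResponseMessage updateResponse = svc.ExecuteCrmWebRequest(
    new HttpMethod("PATCH"),
    "contacts([contact guid])",
    Newtonsoft.Json.JsonConvert.SerializeObject(update),
    odataHeaders,
    "application/json");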

Hope it helps..

Power Automate vs Azure Logic Apps


Power Automate is built on top of Azure Logic Apps. Below is a comparison of the two:

| Power Automate | Azure Logic Apps |
| --- | --- |
| A SaaS service for workflow automation across several different apps and SaaS services. | A PaaS service for workflow automation across several different apps, SaaS services, and IaaS services for enterprise integration. |
| Suited for self-service and simple integration scenarios. | Suited for complex/advanced integration scenarios. |
| Targeted at business users, citizen developers, developers, and IT pros. | Targeted at developers and IT pros. |
| Browser-based designer and mobile app UI only. | In-browser designer as well as Visual Studio. |
| Office 365 service / license / subscription. | Azure service / license / subscription. |
| Flow-specific connectors – see Power Automate Premium Connectors. | Logic Apps-specific connectors – SAP, IBM MQ, IoT, Liquid; see Logic Apps Connectors. |
| Flows are specific to an environment. | No environment concept; each logic app is an independent entity. |
| Pay by run. | Pay by action run and by connector run. |
| Supports button flows (see Create a button flow) and modern approvals (https://nishantrana.me/2020/08/31/approvals-power-automate-dynamics-365/). | |
| A flow can be extended as a logic app (see Export flow and deploy to Logic Apps). | |
| Flows are designed and tested in a non-production environment and then promoted to production; solutions make this possible, with connectors requiring reconfiguration. Connection references can be considered here: https://flow.microsoft.com/en-us/blog/move-flows-across-environments-without-resetting-connections/ | Has ALM possibilities (see Automate deployment of Azure Logic Apps). |
| Admin experience through the Power Platform Admin Center. | Admin experience through the Azure Portal. |

Use Power BI to analyze the CDS data in Azure Data Lake Storage Gen2


In the previous post, we saw how to export CDS data to Azure Data Lake Storage Gen2.

Here we’ll see how to build Power BI reports using that data.

Open Power BI Desktop and click on Get data.

Select Azure > Azure Data Lake Storage Gen2 and click on Connect.

To get the container URL, log in to the Azure portal, navigate to the container, click on Properties, and copy the URL.

Then replace the blob part of the copied URL with dfs. The URL follows the format below (substitute your own account name and container name):

https://accountname.dfs.core.windows.net/containername/

For example, https://accountname.blob.core.windows.net/containername/ becomes https://accountname.dfs.core.windows.net/containername/.


Select the CDM Folder View (beta)

Expand the CDM folder and select the entity.

If you get the ‘We don’t support the option HierarchicalNavigation’ error, refer to:

https://nishantrana.me/2020/09/08/error-we-dont-support-the-option-hierarchicalnavigation-parameter-name-hierarchicalnavigation-when-trying-to-load-table-in-power-bi-desktop-using-azure-data-lake-storage-gen-2-cdm-fo/

Once connected, we can create our Power BI report.

Check the below post for creating a Power BI report with Dynamics 365 data as the source:

https://nishantrana.me/2018/11/24/power-bi-and-microsoft-dynamics-365/

Hope it helps..

Error – We don’t support the option ‘HierarchicalNavigation’. Parameter name: HierarchicalNavigation when trying to load table in Power BI Desktop using Azure Data Lake Storage Gen 2 CDM Folder view (beta)


While trying to connect to a table within Azure Data Lake Storage Gen2 through the CDM Folder View, we got the ‘We don’t support the option HierarchicalNavigation’ error.

Users have reported this issue with the August 2020 Update of Power BI Desktop.

As suggested in the forums, downgrading to the June 2020 Update fixed the issue for us.

Check out Export CDS data to Azure Data Lake Storage Gen2

Hope it helps..

Error – Access to the resource is forbidden while trying to connect to Azure Data Lake Storage Gen2 using Power BI Desktop


While trying to connect to Azure Data Lake Storage Gen2 through Power BI Desktop, we got the ‘Access to the resource is forbidden’ error.

This came as a surprise because the user already had the Owner role assigned on the container.

It turned out we needed to assign the Storage Blob Data Reader role to the user.
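For reference, the role can also be assigned through the Azure CLI; a sketch, assuming the CLI is installed and with placeholder names:

az role assignment create \
  --role "Storage Blob Data Reader" \
  --assignee [user]@[domain].onmicrosoft.com \
  --scope "/subscriptions/[subscription-id]/resourceGroups/[resource-group]/providers/Microsoft.Storage/storageAccounts/[accountname]"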

After assigning the role we were able to connect successfully.

Hope it helps..

Export data from Common Data Service to Azure Data Lake Storage Gen2


Azure Data Lake Storage Gen2 can be described as a large repository for structured and unstructured data, built on top of Azure Blob storage, that is secure (data encrypted at rest), manageable, scalable, cost-effective, and easy to integrate with.

  • Export to Data Lake allows for continuous replication of CDS entities to Data Lake Storage Gen2 – an initial write followed by incremental writes – which can be consumed by Power BI, Azure Data Factory, Azure Databricks, and Azure Machine Learning.
  • It replicates standard and custom entities that have change tracking enabled, along with their create, update, and delete operations.
  • Any changes in data and metadata are pushed automatically, without the need to set any refresh intervals.

Let us first create a general-purpose v2 storage account, which gives access to all of the Azure storage services like blobs, files, etc.

The storage account must be in the same Azure AD tenant as the CDS environment.

Log in to the Azure Portal (with an admin account):

https://portal.azure.com/

Search for Storage Accounts.

Here we have used a trial subscription to create the storage account.

Leave the Account kind, Replication, and Blob access settings as the default values while creating the storage account.

Before selecting Review + Create, navigate to the Advanced tab and enable the Hierarchical Namespace.

After validation completes successfully, click on Create to create the storage account.
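Alternatively, the same kind of account can be created through the Azure CLI; a sketch with placeholder names:

az storage account create \
  --name [accountname] \
  --resource-group [resource-group] \
  --location [location] \
  --sku Standard_LRS \
  --kind StorageV2 \
  --enable-hierarchical-namespace true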


With the storage account created successfully, navigate to PowerApps and select the option Export to data lake.

Select New link to data lake

Specify the storage account created earlier.

Select the entities to be exported to the data lake. Enable change tracking for the entities, as only entities with change tracking enabled will be exported (see the sketch below for enabling it programmatically).
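Change tracking can be enabled from the entity's settings in the customization UI, or programmatically through the SDK metadata messages. A minimal sketch, assuming the svc connection from the first section and a hypothetical entity new_customentity:

using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

// retrieve the metadata of the entity (hypothetical entity name)
RetrieveEntityRequest retrieveRequest = new RetrieveEntityRequest
{
    LogicalName = "new_customentity",
    EntityFilters = EntityFilters.Entity
};
var retrieveResponse = (RetrieveEntityResponse)svc.Execute(retrieveRequest);

// flip the change tracking flag and push the updated metadata back
EntityMetadata entityMetadata = retrieveResponse.EntityMetadata;
entityMetadata.ChangeTrackingEnabled = true;

svc.Execute(new UpdateEntityRequest { Entity = entityMetadata });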

Clicking on Save will link the CDS environment with the Azure Data Lake Storage account.

It will create a file system in the Azure storage account, with a folder for each entity selected, and will start the initial sync.

We can use the Manage entities option to add or remove the linked entities.

Inside Azure Portal, we can navigate to the storage account and select the Storage Explorer.

Expand the commondataservice-environmentName-org-Id container to view the details.

The CSV files will contain the data.

Here, model.json is the metadata file in the CDM folder; it describes the data in the folders, their metadata, and their location.
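For illustration, a trimmed sketch of what such a model.json might contain (the values below are hypothetical; the actual file follows the Common Data Model folder format):

{
  "name": "cds",
  "version": "1.0",
  "entities": [
    {
      "$type": "LocalEntity",
      "name": "contact",
      "attributes": [
        { "name": "contactid", "dataType": "guid" },
        { "name": "firstname", "dataType": "string" },
        { "name": "lastname", "dataType": "string" }
      ],
      "partitions": [
        {
          "name": "contact",
          "location": "https://accountname.dfs.core.windows.net/containername/contact/contact.csv"
        }
      ]
    }
  ]
}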

More details –

https://www.bluegranite.com/blog/10-things-to-know-about-azure-data-lake-storage-gen2

Hope it helps.