Power Automate: Self-reference is not supported when updating the value of variable


Ajit Patra

We often come across requirements in which we want to add a certain value to the same variable, e.g. for an integer variable, x = x + 5, or for a string variable, test = test + "additional".

We were trying to achieve the same thing using Power Automate. However, while doing this using the Set variable action, we got an error saying "Self-reference is not supported when updating the value of variable".

Below is what we were trying to do inside the Apply to each action; the expression is: add(variables('Service Delivery Type Total Amount'), items('Apply_to_each_Service_Delivery_Type')?['dxc_amount'])

As this is not possible using Power Automate at the moment, we achieved it using another variable of the same type in the middle, as shown below:

And inside the Apply to each action, we added the following Set variable actions:

Expression for the above action: add(variables('Dummy Service Delivery Type Amount'), items('Apply_to_each_Service_Delivery_Type')?['dxc_amount'])
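Put differently, the two Set variable actions inside the loop look like this (a sketch reconstructed from the expressions above; the second action simply copies the accumulated value back into the dummy variable):

Set variable – Service Delivery Type Total Amount
Value: add(variables('Dummy Service Delivery Type Amount'), items('Apply_to_each_Service_Delivery_Type')?['dxc_amount'])

Set variable – Dummy Service Delivery Type Amount
Value: variables('Service Delivery Type Total Amount')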

Hope it helps !!

View original post

Use custom browser protocol to launch desktop applications from Dynamics 365


Recently we had a requirement to launch a .NET desktop application installed on the user's machine from within Dynamics 365, and also pass parameters to it.

One of the ways of implementing this is by using a custom URL protocol.

Modern browsers and operating systems allow us to implement a custom URL protocol and register an external application to handle it. So when a user clicks on a link that uses that custom URL protocol, the browser will open the application that is registered.

Check a few examples here

https://docs.microsoft.com/en-us/windows/uwp/launch-resume/launch-app-with-uri

So, let us start and register our Custom URL Protocol.

Below is the source code for creating the new registry key.


using Microsoft.Win32;

var applicationPath = @"C:\MyParamApp\MyApp\bin\Debug\MyApp.exe";
// Register the protocol under HKEY_CURRENT_USER\Software\Classes
var keyTest = Registry.CurrentUser.OpenSubKey("Software", true).OpenSubKey("Classes", true);
RegistryKey key = keyTest.CreateSubKey("OpenAppP");
// The "URL Protocol" value marks this key as a protocol handler
key.SetValue("URL Protocol", "OpenAppP");
// %1 is replaced with the full URL that was clicked, which is how the parameter reaches the app
key.CreateSubKey(@"shell\open\command").SetValue("", "\"" + applicationPath + "\" \"%1\"");

Here OpenAppP is the protocol name, and applicationPath holds the path of the application that we want to execute.

In the shell\open\command subkey, we have specified %1 for passing the parameter.

If we do not want to pass the parameter then we do not need to append it.

On running the above code, we can see the key added with the value specified

Below is the source code of our console application; basically, it reads the parameter and writes it to the output window.
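The original code appears as a screenshot in the post; a minimal sketch of such a console app, assuming it simply echoes the argument passed through the protocol, could look like this:

using System;

class Program
{
    static void Main(string[] args)
    {
        // The browser passes the full protocol URL (e.g. OpenAppP:someId)
        // as the first command-line argument
        if (args.Length > 0)
            Console.WriteLine("Parameter received: " + args[0]);
        else
            Console.WriteLine("No parameter received.");

        // Keep the console window open so the output can be seen
        Console.ReadLine();
    }
}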

Below is our HTML page, which we will call from a button on a form in Dynamics 365 and which will in turn open MyApp.exe.

Here we have published the page in Azure.

On load, the HTML page will retrieve the id query parameter and append it to the link for the custom protocol:

OpenAppP:id
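The page itself is shown as a screenshot; a minimal sketch of its script, with the alert text assumed, could be:

// Hypothetical onload handler: reads the id query parameter passed by the
// CRM ribbon button and launches the custom protocol link.
window.onload = function () {
    var id = new URLSearchParams(window.location.search).get("id");
    alert("Opening console app for record: " + id);
    // Navigating to the custom protocol URL prompts the browser to hand over
    // to the registered application (MyApp.exe)
    window.location.href = "OpenAppP:" + id;
};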

Back in CRM, we have a custom ribbon button named Open Console App, added with the following definition: a Url command action with a CrmParameter in it to pass the id.
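The command definition would look roughly like this (a sketch; the command id and the Azure page URL are placeholders, and FirstPrimaryItemId supplies the record id):

<CommandDefinition Id="new.incident.Command.OpenConsoleApp">
  <EnableRules />
  <DisplayRules />
  <Actions>
    <!-- Opens the page published in Azure, passing the record id
         as the "id" query string parameter -->
    <Url Address="https://myapp.azurewebsites.net/openapp.html" PassParams="false">
      <CrmParameter Name="id" Value="FirstPrimaryItemId" />
    </Url>
  </Actions>
</CommandDefinition>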

Clicking on the button, as expected, opens the alert along with opening the console app with the parameter passed to it.

Get more details

https://www.vdoo.com/blog/exploiting-custom-protocol-handlers-in-windows

Hope it helps..

 

App Switcher, Back option and other changes in Dynamics 365 – 2020 Release Wave 2


Below are some of the updates in the product as part of 2020 Release Wave 2.

  • To switch between the Dynamics 365 apps, we need to click on the down arrow icon next to Dynamics 365.

With 2020 Release Wave 2, we can click on the app name itself, which opens up a pop-up window listing all the apps.

We can search, create a new app, assign roles, filter etc.

Earlier, to access the same, we had to navigate to Advanced Settings → Apps.

  • Next is the change in the breadcrumb navigation.

It now seems to be replaced with the back button functionality.

  • Also, the User Account menu has been updated. Below is how it looked earlier,

and now in 2020 Release Wave 2

View account takes the user to the My Account page in Office 365.

Check other blog posts on Release 2020 Wave 2

Hope it helps..

Azure AD – How to create your own SAML-based application using new Azure Portal


tsmatz

With the new Azure Portal (https://portal.azure.com/), Azure AD provides very flexible SAML-based configuration, but some folks ask me where to do that.
In this post, I show you the answer to this question using some SAML-based federation sample code in PHP and Node.js.

Note: For the settings using the Azure Classic Portal (Azure Management Portal), see my previous posts "Azure AD Web SSO with PHP (Japanese)" and "Azure AD Web SSO with Node.js (Japanese)".

Settings with Azure Portal

First of all, I'll show you how the SAML settings page is improved in the new Azure Portal.

When you want to register your own SAML-based application, select "Azure Active Directory" in the Azure Portal, click the "Enterprise applications" menu, and push the "add" button.
You can select from a lot of pre-defined (registered) applications (like Salesforce, Google, etc.), but here you click the "Non-gallery application" link on top…

View original post 1,843 more words

Queue Item details change in Dynamics 365 – 2020 Release Wave 2


Earlier, the details for a Queue Item used to open in a new window, thus losing the context.

2020 Release Wave 2 provides an immersive experience for the agents, as the queue item details open in the context of the parent window.

Small but nice update for the agents.

Check other blog posts on Release 2020 Wave 2

Refer to the below articles to understand more about Queues:

MB2-714 (Microsoft Dynamics CRM 2016 Customer Service) : Queue Management

https://www.itaintboring.com/dynamics-crm/working-with-queues-in-dynamics/

https://carldesouza.com/releasing-queue-item-behavior-in-dynamics-365/

Hope it helps..

#PowerBI – External Tool – Open Power BI Report Builder – part 1


Erik Svensen - Blog about Power BI, Power Apps, Power Query

As you may have noticed, I have blogged twice about the awesome new feature in Power BI Desktop where we can build our own external tool buttons.

Here are links to the previous posts:

  • Analyze in Excel (link)
  • Open in Tableau (link)

One perhaps forgotten member of the Power BI family is Power BI Report Builder, aka Paginated reports, and even though it requires a Premium capacity or Power BI Embedded A-SKU to publish/share reports, I thought it might be interesting to see if we could link the local pbix file to the Report Builder.

I will write a part 2 where the external tool will also support desktop files connected to an Azure Analysis Services server or a Power BI dataset.

You can download the free Power BI Report Builder from here.

How to…

View original post 369 more words

Using SQL Server Management Studio to deploy and run SSIS package in Azure Data Factory


In our previous post, we created the SSIS Catalog (SSISDB) in Azure and deployed the SSIS package using SSDT.

Supported versions of SSDT (SQL Server Data Tools) for deploying SSIS packages to Azure:

  • For Visual Studio 2017, version 15.3 or later.
  • For Visual Studio 2015, version 17.2 or later.

In this post, we’d use SSMS to deploy the packages in Azure.

Connect to the Azure SQL Server

Expand the Integration Services Catalog, right-click the Projects folder, and select the Deploy Project option.

Enter the source details in the deployment wizard

Select the option SSIS in Azure Data Factory

Select the existing or create a new folder for the project

Click on Deploy after successful validation and review.

Here, in our case, it failed with the below message:

There is no available node. Please check node status on the monitoring page of the ADF portal and ensure that at least one node is in running state and try again. (Microsoft SQL Server, Error: 50000)

The error is because the Azure-SSIS Integration Runtime is in the Stopped status.

Navigate to your Azure Data Factory instance and start the runtime.
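Alternatively, the runtime can be started with Azure PowerShell (a sketch using the Az.DataFactory module; the resource names below are placeholders):

Start-AzDataFactoryV2IntegrationRuntime -ResourceGroupName "MyResourceGroup" `
    -DataFactoryName "MyDataFactory" -Name "Azure-SSIS-IR" -Force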

After around 10 minutes or so, the service should be up and running.

This time deployment is successful.

We can see the packages available within the pipeline.

Hope it helps..

Deploy and run SSIS Integration Toolkit for Dynamics 365 on Azure Data Lake (KingswaySoft)


In the previous post, we saw how to deploy and run SSIS packages on the cloud.

Here we take it one step further and will deploy and run SSIS packages that use KingswaySoft's SSIS Integration Toolkit components.

Here we will need an Azure subscription, where we will host the SSISDB, followed by provisioning an Azure-SSIS Integration Runtime instance.

We will also need the Azure Blob Storage account along with Azure Storage Explorer to upload the installation files of the SSIS Integration Toolkit.

Let us first start by creating an Azure SQL Server instance.

We have specified the below details.

Next, create the database inside the server.

Now with Azure SQL Server and Database created, the next step is to create the Storage account.

With the Azure Storage created, now let us connect to Azure using the Azure Storage Explorer.

Create a new blob container in the Azure Storage account created.

For the blob container created, right-click and select Get Shared Access Signature

Specify the expiry time along with Write permission; this is for logging purposes when the Azure-SSIS IR is being provisioned.

Copy the URL (it will be used in the PowerShell script later)

Now let us get the installation files and programs from the KingswaySoft Shared Blob Container, which we’d place in the blob container we just created.

https://kingswaysoftgeneral.blob.core.windows.net/ssis-integration-toolkit-ultimate?st=2019-07-04T16%3A10%3A25Z&se=2059-07-05T16%3A10%3A00Z&sp=rl&sv=2018-03-28&sr=c&sig=LAGvouFpkZHEk%2BH8%2B0pK%2FDNg7B3jPUf%2FJ91%2BJ%2FEeKg0%3D

Right-click Storage Accounts and select Connect to Azure Storage

Select Use a shared access signature (SAS) URI

Paste the KingswaySoft blob container URL.

We can see the below contents added to the blob container.

Select all and copy all the files.

Paste it in the blob container we had created earlier.

With things now set up, let us get the PowerShell script (Initializations.ps1) that provisions the Azure-SSIS Integration Runtime and update it.

Specify the appropriate values and run the script (it requires Azure PowerShell).

Also, make sure to update the firewall rules to allow the client to connect.

Update the PowerShell Script appropriately

We can check the status as shown below.

In parallel, we can see our Azure Data Factory created with the integration runtime, which is in Starting status.

After a few minutes, we will have the integration runtime up and running.

Below is our SSIS Package that we would be deploying to the cloud.

It uses the Data Spawner component to generate test data for contacts and the CDS Destination component to create those records inside CDS.

Right-click the integration project and select Deploy

Specify connection details along with Path

After successful deployment, let us create a new pipeline inside the Azure Data Factory.

Drag and drop the Execute SSIS Package and click on the Settings tab.

Connect to the package deployed followed by Validate and Debug to test the pipeline.

The pipeline will be in Queued status

After successful execution, navigate to our Dynamics 365 Sales Hub.

We can see 10 contact records created by the SSIS Package.

Hope it helps..

Deploy and run SSIS package in Azure Data Factory


Before an SSIS package can be deployed to Azure Data Factory, we need to provision the Azure-SSIS Integration Runtime (IR) in Azure Data Factory.

In the previous posts, we had created an Azure Data Factory instance and used Azure SQL Database as the source.

Within Azure Data Factory in the Let’s get started page, select Configure SSIS Integration.

Specify the appropriate values for the integration runtime.

Select Create SSIS Catalog option to deploy packages in SSISDB, provide Azure SQL Database server endpoint, and the admin credentials to connect.

Test the connection.

Specify advanced settings as appropriate.

This starts the creation of Azure-SSIS Integration Runtime.

Meanwhile below is our SSIS package that we would be deploying to Azure Data Factory.

It extracts a text file named contacts.txt from the blob source and loads it into the destination blob storage.

Right-click the project and select Deploy. (Deploying an individual package is not supported right now.)

Select SSIS in Azure Data Factory.

Specify Server name and credentials and connect.

Click on Browse.

Create a new folder or select an existing folder and click on Ok

Once the validation is successful, click on Deploy and start the deployment.

After successful deployment, create a new pipeline in the Azure Data Factory, and drag the Execute SSIS Package activity

Connect to the package deployed.

Click on debug to trigger and test the pipeline.

On the successful run, we can see the contacts.txt file extracted from mycontainer1 and loaded to mycontainer2.

Hope it helps..

D365 CE: Get Logged in User’s Security Roles using JavaScript


Ajit Patra

Many times we come across requirements such as showing/hiding ribbon buttons based on the logged-in user's security role.

Earlier, we used to get the security roles of the logged-in user on the client side using Xrm.Utility.getGlobalContext().userSettings.securityRoles, which used to return an array of GUID values of the security roles.

Now that it's deprecated, we can use Xrm.Utility.getGlobalContext().userSettings.roles, which returns a collection of objects with the GUID and name of each security role that is assigned to the user directly or through teams.

Below is the script which checks if the user has certain security roles based on names and hides the ribbon button if the user doesn’t have any of those security roles:

SAB.ShowHideReopenButton = function () {
    // Collection of security roles (direct or team-inherited) for the current user
    var roles = Xrm.Utility.getGlobalContext().userSettings.roles;

    if (roles === null) return false;

    // Show the button only if the user has one of these roles
    var hasRole = false;
    roles.forEach(function (item) {
        if (item.name.toLowerCase() === "cs manager" || item.name.toLowerCase() === "cs administrator") {
            hasRole = true;
        }
    });

    return hasRole;
};

View original post 4 more words

Use Azure Data Factory V2 to load data into Dynamics 365


Let us take a simple example where we will set up an Azure Data Factory instance and use Copy data activity to move data from the Azure SQL database to Dynamics 365.

Login to Azure Portal.

https://portal.azure.com

Search for Data factories

Create a new data factory instance

Once the deployment is successful, click on Go to resource

Inside the data factory click on Author & Monitor

Click on Author in the left navigation

Create a new Pipeline

And drag the Copy data activity to it

Go to the Source tab, and create a new dataset.

Below is our Azure SQL database with the contacts table, which will be our source here.


Select Azure SQL Database as the source dataset.


Create a new linked service to specify the connection properties.


Specify the details to connect to the Azure SQL Database.


We have selected the contacts table here.


Similarly, let us define a new dataset for the sink, which will connect to our Dynamics 365 instance.



Select the Dynamics data set and specify the linked service.

Specify the details of the Dynamics 365 instance to connect to.

We have selected contact entity as the destination.

Within the Mapping tab, we can specify the fields to be mapped.

Below is how we have specified the mapping.

Click on Validate and after successful validation, click on Debug to run the pipeline.

Within the Output window, we can see the status.

After the successful run, we can see the contact records created inside Dynamics 365.

We can specify a trigger for the pipeline as shown below.

Publish All will publish the changes to the data factory.

Hope it helps..

Failed to get response from server error while trying to connect to Dynamics 365 using linked services – Azure Data Factory


Recently, while trying to connect to a Dynamics 365 dataset through a linked service, we got the below error.

It seems like a product issue, so the workaround is:

Open Azure Data Factory in a new incognito or InPrivate window.

Or

Cancel and do not select the certificate while testing the connection.

Ignoring the certificate fixed the issue for us.

Hope it helps..

Use Power BI to analyze the CDS data in Azure Data Lake Storage Gen2


In the previous post, we saw how to export CDS data to Azure Data Lake Storage Gen2.

Here we’d see how to write Power BI reports using that data.

Open Power BI Desktop and click on Get data.

Select Azure > Azure Data Lake Storage Gen2 and click on Connect.

To get the container URL, log in to the Azure portal, navigate to the container, click on Properties, and copy the URL.

Replace the blob part in the copied URL with dfs

Below is the format of the URL.

https://accountname.dfs.core.windows.net/containername/

Replace the account name and the container name with your own.

In case you get the below error, refer to:

https://nishantrana.me/2020/09/07/error-access-to-the-resource-is-forbidden-while-trying-to-connect-to-azure-data-lake-storage-gen2-using-power-bi-desktop/

Select the CDM Folder View (beta)

Expand the CDM folder and select the entity.

In case you get the below error, refer to:

https://nishantrana.me/2020/09/08/error-we-dont-support-the-option-hierarchicalnavigation-parameter-name-hierarchicalnavigation-when-trying-to-load-table-in-power-bi-desktop-using-azure-data-lake-storage-gen-2-cdm-fo/

Once connected we can then create our Power BI report as shown below.

Check the below posts for creating a Power BI report with Dynamics 365 data as the source

https://nishantrana.me/2018/11/24/power-bi-and-microsoft-dynamics-365/

Hope it helps..

Error – We don’t support the option ‘HierarchicalNavigation’. Parameter name: HierarchicalNavigation when trying to load table in Power BI Desktop using Azure Data Lake Storage Gen 2 CDM Folder view (beta)


While trying to connect to a table within Azure Data Lake Storage Gen2 through the CDM folder view,

we got the below error

Users have reported this issue with the August 2020 Update of Power BI Desktop.

As suggested in the forums, downgrading to June 2020 Update fixed the issue for us.

Check out Export CDS data to Azure Data Lake Storage Gen2

Hope it helps..

Error – Access to the resource is forbidden while trying to connect to Azure Data Lake Storage Gen2 using Power BI Desktop


While trying to connect to Azure Data Lake Storage Gen2 through Power BI Desktop we got the below error

It came as a surprise because the user had the Owner role assigned on the container.

It turned out we need to assign the Storage Blob Data Reader role to the user.

After assigning the role we were able to connect successfully.

Hope it helps..

Export data from Common Data Service to Azure Data Lake Storage Gen2


Azure Data Lake Storage Gen2 can be described as a large repository of data, structured or unstructured, built on top of Azure Blob storage, that is secure (data encrypted at rest), manageable, scalable, cost-effective, and easy to integrate with.

  • Export to Data Lake allows for continuous replication of CDS entities to Data Lake Storage Gen2, involving an initial write followed by incremental writes, which can be consumed by Power BI, Azure Data Factory, Azure Databricks, and Azure Machine Learning.
  • It replicates standard and custom entities that have change tracking enabled, covering create, update, and delete operations.
  • Any changes in data and metadata are pushed automatically, without the need to set any refresh intervals.

Let us first create a general-purpose V2 storage account to access all of the Azure storage services like blobs, files, etc.

The storage account must be in the same Azure AD tenant.

Login to Azure Portal (with admin account)

https://portal.azure.com/

Search for Storage Accounts.

Here we have used a trial to create the storage account.

Leave the Account kind, Replication, and Blob access tier as the default values while creating the storage account.

Before selecting Review + Create, navigate to the Advanced tab and enable the Hierarchical Namespace.

After validation is done and is successful, click on Create to create the storage account.


With the storage account created successfully, navigate to Power Apps and select the option Export to data lake.

Select New link to data lake

Specify the storage account created earlier.

Select the entities to be exported to the data lake, and enable change tracking for them, as only entities with change tracking enabled will be exported.

Clicking on save will link the CDS environment with Azure data lake storage.

It will create the file system in the Azure storage account, having a folder for each entity selected, and will start the initial sync.

We can use the Manage entities option to add or remove the linked entities.

Inside Azure Portal, we can navigate to the storage account and select the Storage Explorer.

Expand the commondataservice-environmentName-org-Id container to view the details.

The CSV files will contain the data.

Here model.json is the metadata file in the CDM folder that describes the data in the folders, the metadata, and the location.

More details-

https://www.bluegranite.com/blog/10-things-to-know-about-azure-data-lake-storage-gen2

Hope it helps.

How to create a unified profile (Golden Customer Record) using Dynamics 365 Customer Insights – Part 1


Virendra Agrawal's Blog

I recently had an opportunity to work on Dynamics 365 Customer Insights for a retail client who was envisioning consolidating their customers' information to achieve a true customer 360-degree view.

The main objective was to remove the silos of data that represent customer purchases, payments, website visits, marketing & social interactions, and customer service requests.

With its pre-built AI models, Customer Insights was able to help ingest massive amounts of data from separate systems and leverage matching strategies to build a unique profile and showcase a customer's total lifetime value, churn risk, and more.

In this series of posts, I would like to shed some light on how we can use Dynamics 365 CI to ingest this customer data from multiple sources and set the matching rules to create a unified customer profile, or a golden customer record.

We’ll see how we can connect Customer Insights with various data…

View original post 918 more words

Use Monitor to troubleshoot forms in the model-driven app (preview) – Dynamics 365


The new Monitor feature added for model-driven apps can be used for troubleshooting issues with form-related events.

We have two ways to access the Monitor option.

Select the model-driven app inside Power Apps and select Monitor in the command bar.

The other option is to add the parameter &monitor=true to the end of the URL and select the Monitor option.
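For example (a hypothetical URL; the org name and app id are placeholders):

https://yourorg.crm.dynamics.com/main.aspx?appid=00000000-0000-0000-0000-000000000000&monitor=true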

This will open the monitoring session in a new tab.

Click on the Play model-driven app.

This opens the app connected to the monitor session.

We can see any action performed being tracked.

We can filter the Category column to check specific events related to the form.

Select the row to get the details populated on the right side.

Get all the details below

https://docs.microsoft.com/en-us/powerapps/maker/model-driven-apps/monitor-form-checker

Hope it helps..

Power Platform Dataflows


Append and Merge to combine data from multiple data sources in Power Platform dataflows

Let us continue with our previous post where we loaded the data from on-prem SQL DB to CDS using dataflows. https://nishantrana.me/2020/07/07/load-data-from-sql-on-premise-to-cds-common-data-service-using-power-platform-dataflows-in-power-apps/ Now suppose we have another table (or any other data source) having the contact details, which we would like to append/merge along with our previous data source. For simplicity, here we have created a …

Load data from SQL On-Premise to CDS (Common Data Service) using Power Platform dataflows

Let us continue our previous post where we created a connection to the below On-Premise SQL Server Database using an On-premises data gateway. https://nishantrana.me/2020/07/06/configuring-on-premises-data-gateway-to-connect-tosql-server-on-premise-data-source-power-platform/ Here we will use the Power Platform dataflows to load contact entity in CDS using that same on-prem table. Sign in to Power Apps https://make.powerapps.com/home Navigate to Data – Dataflows and …

Configuring On-Premises data gateway to connect to SQL Server on-premise data source – Power Platform

Let us take a simple example to understand the steps to be performed for configuring an On-premises data gateway. We have below database in our on-premise SQL Server which we connect to using On-Premise data gateway. Below are the steps we need to perform to configure it – Login to Power Apps and navigate to …

Email experience now available in Dynamics 365 Mobile App – 2020 Release Wave 2


With 2020 Release Wave 2, users can now finally compose, edit, and send emails from the Dynamics 365 Mobile App.

Check other blog posts on Release 2020 Wave 2

Now mobile users can –

  • Compose and send email from Dynamics 365 Mobile app

Navigate to Activities menu from the home page.

We can see the Email entity added there.

Clicking on it allows us to compose an email from within the app.

We have the full email editor formatting available.

We also have the functionality of inserting templates, signatures, knowledge articles, etc. available.

We can also create email records from within the Timeline section.

Check the other blog posts on Dynamics 365 Mobile App

Hope it helps..

ARC and SLA Migration tool (Preview) in Dynamics 365 – 2020 Release Wave 2


With 2020 Release Wave 1, a new experience was added for administrators, which uses Power Automate for defining rules, conditions, and actions for SLAs and Automatic Record Creation and Update rules.

https://nishantrana.me/2020/02/26/automatic-record-creation-and-update-rules-enhancements-in-dynamics-365-customer-service-2020-release-wave-1/

Now 2020 Release Wave 2 provides a tool to migrate the existing rules and SLAs from the classic app to the Customer Service Hub, which uses Power Automate.

Other blog posts on 2020 Release Wave 2

Say, for example, we have the below rule created for Task in the classic or legacy experience.

For the rule item, for simplicity, we have kept the condition as the Description field containing data.

Now let us open the Customer Service Hub, and navigate to Service Management Area – ARC and SLA Migration tool

There we can see the one rule that we had created listed.


Let us click on the Start Migration button to start the migration.

Let us select the category and click on Next.

The next step is the pre-migration check, which shows as passed.

The Rules and items to migrate step gives the option to select the rules to migrate.

The next screen gives the option to review the selection before starting the migration.

Clicking on Start migration starts the process.

Below we can see the migration completed.

Once completed, we are presented with the below summary screen

The new rule has the suffix migrated added to its name.

We can open the rule to check its definition. (and activate it)

As expected, we have our rule updated to use Power Automate.


Below is the rule created in legacy experience.

Note: –

Earlier we got the below error message, which was quickly resolved by Microsoft after it was reported to them.

message=’msdyn_migrationtracker’ entity doesn’t contain attribute with Name = ‘correlationid’ and NameMapping = ‘Logical’. MetadataCacheDetails: ProviderType=Dynamic, StandardCache=True, IsLoadedInStagedContext = False, Timestamp=3625460, MinActiveRowVersion=3625460, MetadataInstanceId=37958858

Dynamics 365 2020 Release Wave 2 – https://docs.microsoft.com/en-us/dynamics365-release-plan/2020wave2/

Hope it helps..

Log Canvas Power App telemetry data in Azure Application Insights | Power Apps


D365 Demystified

Here's how you can register your Canvas Power App in Azure Application Insights and log telemetry data into Azure.

Some basic info about what you can see in Application Insights:

  1. Count of Users who used the app
  2. Events logged, Sessions logged
  3. Device info
  4. Region info

It’s quite simple to set it up! Let’s take a look –

Registering in Application Insights in Azure

First, make sure you do have an Azure Subscription. Let’s look at how you can register an Application Insight record.

  1. Look for Application Insights in Azure in the search bar
  2. Then, among other records, you can register a new one which will identify with your Canvas Power App
  3. Review all that you entered and move ahead
  4. It’ll be deployed pretty quickly within a few minutes unlike some heavy Azure resources
  5. Upon completion, you can navigate to the resource and see the details


View original post 374 more words

Approvals – Power Automate & Dynamics 365



Use markdown to format approval emails – Power Automate and Dynamics 365

Let us update our previous flow, to use markdown to format the approval email. Markdown is the lightweight mark-up language for adding formatting elements to the plain text. Refer the Markdown cheat sheet We'd update the Details property of Start and wait for an approval action. Below we have added some sample text that uses …

Custom Responses in Approvals – Power Automate and Dynamics 365

Let us update our previous flow to use Approval Type – Custom Responses, using which we can define our custom response options (instead of limiting ourselves to Approve and Reject) Here we have updated the Start and wait for an approval action's Approval Type from Approve / Reject – First to respond to Custom Responses …

Approval/Reject Type – Everyone must approve – Power Automate and Dynamics 365

Let us update our previous flow from approval/reject type – First to respond to Everyone must approve type. For First to respond, either Approval or rejection by any of the approver completes the request. In case of Everyone must approve, if any of the approvers rejects the request is considered rejected, for the request to …

Use markdown to format approval emails – Power Automate and Dynamics 365


Let us update our previous flow, to use markdown to format the approval email.

Markdown is a lightweight markup language for adding formatting elements to plain text.

Refer to the Markdown cheat sheet.

We'd update the Details property of the Start and wait for an approval action.

Below we have added some sample text that uses markdown syntax.
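For illustration, the sample text could be something along these lines (hypothetical content; the original post shows it as a screenshot):

# Approval Request
**Case:** CAS-01213
*Please review the details below before responding.*

- Customer: Contoso
- Priority: High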

Below is how it renders in Outlook Web Access (OWA).

And inside the Power Automate Approval Center.

Below we have used markdown syntax for defining the table.
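A markdown table is defined along these lines (sample values, not the ones from the original screenshot):

| Case | Customer | Priority |
| --- | --- | --- |
| CAS-01213 | Contoso | High |
| CAS-01214 | Fabrikam | Low |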

Inside OWA.

Inside the Power Automate approval center.

And within the Power Automate mobile app

Kindly refer to the below table, which lists the inconsistencies among different clients:

https://docs.microsoft.com/en-us/power-automate/approvals-markdown-support#client-support

Also check out –

Power Automate Approvals – Markdown or HTML?

Using Markdown in Microsoft Flow approval actions

Hope it helps..

Custom Responses in Approvals – Power Automate and Dynamics 365


Let us update our previous flow to use Approval Type – Custom Responses, using which we can define our custom response options (instead of limiting ourselves to Approve and Reject)

Here we have updated the Start and wait for an approval action’s Approval Type from

Approve / Reject – First to respond

to

Custom Responses – Wait for one response.

We have defined the below custom responses – Accept, Reject, and Need more details.

We have updated the Condition to check for Need more details.
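Under the hood, the condition compares the approval action's outcome output with the custom response text, along these lines (a sketch; the action name here matches the default Start and wait for an approval action):

equals(body('Start_and_wait_for_an_approval')?['outcome'], 'Need more details')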

Let us save, check, and run the flow.

The approvers are presented with the custom responses

Here, among the users to whom the request is assigned, the flow will complete as soon as any one of the approvers responds; it will not wait for all the approvers to respond.

Now let us update it to use Custom Responses – Wait for all responses.

Here the flow will wait for responses from all the approvers before moving to the next action.

Lastly, if we define more than five responses, Outlook and OWA have a limitation of showing only the first five responses in the actionable message, as shown below.

Within the Power Automate approval center and Power Automate mobile application we do not have this limitation.

Power Automate approval center:

Power Automate mobile application:

Hope it helps..

Parallel Approvals – Power Automate and Dynamics 365


Let us update our previous flow to include parallel approvals.

The difference between Approve / Reject Type – Everyone must approve and parallel approvals is that with parallel approvals we can wait for responses from all the approvers, be it approve or reject.

In the case of Everyone must approve, if any of the approvers rejects, the request is considered rejected and the flow will not wait for the other approvers' responses; for the request to be considered approved, all the approvers need to approve it.

Let us add a new action Add a parallel branch after Apply to each.

Let us keep only test user 1 in the first branch and in the other branch add a Start and wait for an approval action.

We have updated both to be of approval type First to respond and set the Assigned to property to test user 1 and test user 2 for the respective branches.

Also, as we want to evaluate the responses from both the approvers before taking further action, we have deleted the condition action from the first branch.

Let us add a new Condition action with two AND conditions, i.e. the first user approves and the second user rejects, corresponding to the respective Start and wait for an approval actions.

Here, if the condition is fulfilled, we update the description of the case record.

Let us save, check, and trigger the flow.

After test user 1 has approved, we can still see the flow waiting for user 2 to respond.

The flow completes successfully after the response is received from both the approvers. Here user 2 rejects the request.

As expected, we can see the description field updated as defined in the update the record action.

Hope it helps..

Attachments in Approval – Power Automate and Dynamics 365


Let us update our previous sequential flow to include the file attachments from the notes of the case record as part of the approval.

First, we need to add an Initialize variable action and define an Array variable to store all the attachment details.

Next, we will use the List records action to fetch all the notes records associated with that particular case record.

Next, we will use an Apply to each action to loop through all the notes and populate the attachments array variable defined in the first step, as sketched below.
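For each note, the value appended to the array would be along these lines (a sketch; filename and documentbody are the note (annotation) attributes, while the name/content property names are an assumption based on the approval action's attachment schema):

{
  "name": "@{items('Apply_to_each')?['filename']}",
  "content": "@{items('Apply_to_each')?['documentbody']}"
}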

Now, as the last step, we'd specify the attachments array variable in the attachments field inside the Start and wait for an approval action.

Now we are ready to save and test our flow.

Here our flow is waiting for approval from the user.

The approver receives the notification and can now review the attachments before responding.

Hope it helps..

Approval/Reject Type – Everyone must approve – Power Automate and Dynamics 365


Let us update our previous flow from the approval/reject type First to respond to the Everyone must approve type.

For First to respond, either approval or rejection by any one of the approvers completes the request.

In the case of Everyone must approve, if any of the approvers rejects, the request is considered rejected; for the request to be considered approved, all the approvers need to approve it.

We have updated the approval type from First to respond to Everyone must approve.

In the case of Approve / Reject – Everyone must approve:

  • All the assigned users must approve for the request to be approved.
  • If any of the assigned users rejects, the request will be considered rejected.

Let us run the flow and test it.

We can see our flow waiting for approvals

We can see all three approvers getting the approval request.

Let us reject it as one of the approvers.

It completes the flow without waiting for responses from other approvers.

The other approvers will see the below message.

Similarly, as expected, it will wait for all the approvers to approve before moving to the next action.

We need to make sure we specify the same value, as shown in the Outputs above, in the condition action.
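For example, the Approve branch condition can compare the approval action's outcome output with the literal text Approve, along these lines (a sketch; the action name matches the default Start and wait for an approval action):

equals(body('Start_and_wait_for_an_approval')?['outcome'], 'Approve')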

Hope it helps..