Using the DataSourceInfo and RecordInfo functions to check permissions for a Dataverse table or record in a Canvas App


The DataSourceInfo function can be used to check table / entity level permissions.

We can check for the Create, Read, Edit, and Delete permissions and accordingly disable, hide, or show the corresponding create, edit, and delete buttons for the user.

  • DataSourceInfo.CreatePermission
  • DataSourceInfo.DeletePermission
  • DataSourceInfo.EditPermission
  • DataSourceInfo.ReadPermission

The DataSourceInfo function can also be used to obtain information about a particular column of the data source, such as Display Name, Max Length, Max Value, Min Value, and Required – as shown in the sketch after the list below.

https://www.inogic.com/blog/2020/11/how-to-use-datasourceinfo-in-canvas-app/

  • DataSourceInfo.DisplayName
  • DataSourceInfo.MaxLength
  • DataSourceInfo.MaxValue
  • DataSourceInfo.MinValue
  • DataSourceInfo.Required
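For column-level information, DataSourceInfo takes the data source, the information type, and the column's logical name as a text string. A minimal sketch against the Cases data source used later in this post ("title" as the logical name of the Case Title column is an assumption):

// Display name of the Case Title column ("title" assumed to be its logical name)
DataSourceInfo(Cases, DataSourceInfo.DisplayName, "title")

// Maximum length allowed for the same column
DataSourceInfo(Cases, DataSourceInfo.MaxLength, "title")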


Similarly, the RecordInfo function can be used to get information about an individual record of a Dataverse data source.

We can check for the Read, Edit, and Delete permissions, and use them to hide or disable controls – as shown in the sketch after this list.

  • RecordInfo.ReadPermission
  • RecordInfo.EditPermission
  • RecordInfo.DeletePermission
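Besides hiding a control, the same check can drive its DisplayMode property – a minimal sketch that disables (rather than hides) an Edit button when the user lacks edit rights on the selected record (Gallery1 is the gallery used in the example below):

// DisplayMode of the Edit button
If(RecordInfo(Gallery1.Selected, RecordInfo.EditPermission), DisplayMode.Edit, DisplayMode.Disabled)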

Hide / Show controls based on Security Role in Canvas App

Here, our sample Canvas App has a button and a gallery (Gallery1) with Cases as the data source. We have set the Visible property of the following controls:

1. Button – Visible:

If(DataSourceInfo(Cases, DataSourceInfo.ReadPermission), true, false)

2. Edit – Visible:

If(RecordInfo(Gallery1.Selected, RecordInfo.EditPermission), true, false)

3. Delete – Visible:

If(RecordInfo(Gallery1.Selected, RecordInfo.DeletePermission), true, false)

Now, to test it, we have assigned the below custom security role to another user – Test User 1 (along with the Basic User security role) –

i.e. Read, Write, and Delete permission at the User level on the Case table.

We have also shared the 'Average order shipment time' case record with Test User 1, assigning him the Write permission.

Now, when Test User 1 opens the Canvas App, the first button will be visible, as he has the Read permission on the Case table.

In the gallery, only the one shared record will be visible, along with the Edit button on it.

Let us now share one more case record with Test User 1, this time with only the Read permission.

We can see the record displayed for Test User 1.

It shows the Edit button for the new record as well, although we shared that record with only the Read permission.

We have the following formula for the Edit button’s OnSelect

Patch(Cases, LookUp(Cases, 'Case Number' = lblCaseNumber.Text), {'Case Title': "Sample Title"})

So clicking on Edit for the newly shared record 'Complete overhaul required' gives the below permission error –

Now, pressing the same Edit button for the 'Average order shipment time' record will update the record (setting its title to "Sample Title"), as that record was shared with Test User 1 with the Write permission.
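To avoid surfacing the raw permission error, the Patch call in OnSelect could be guarded with the same RecordInfo check – a minimal sketch (the notification text is our own):

If(
    RecordInfo(LookUp(Cases, 'Case Number' = lblCaseNumber.Text), RecordInfo.EditPermission),
    // The user has write access on the record – perform the update
    Patch(Cases, LookUp(Cases, 'Case Number' = lblCaseNumber.Text), {'Case Title': "Sample Title"}),
    // Otherwise show a friendly message instead of the permission error
    Notify("You do not have write permission on this record", NotificationType.Error)
)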

Get all the details here –

https://docs.microsoft.com/en-us/powerapps/maker/canvas-apps/functions/function-datasourceinfo

https://docs.microsoft.com/en-us/powerapps/maker/canvas-apps/functions/function-recordinfo

Hope it helps..


Delete the current (active) partition in Audit – Dynamics 365


We recently exceeded the log storage capacity for our Power Platform environment.

We can check the same at Resources >> Capacity inside the Power Platform admin center.

One of the sandbox environments had the highest log usage.

https://docs.microsoft.com/en-gb/power-platform/admin/legacy-capacity-storage#capacity-page-details

We deleted the plug-in trace logs and all the audit partitions except the active/current one, which the system doesn't allow us to delete.

If we try deleting the current active partition, we will get the below error –

Microsoft.Crm.CrmException: You cannot delete audit data in the partitions that are currently in use, or delete the partitions that are created for storing future audit data.

The partitions are created on a quarterly basis each year –

1 Jan – 1 Apr, 1 Apr – 1 Jul, 1 Jul – 1 Oct, 1 Oct – 1 Jan

Even after deleting the plug-in trace logs and the partitions, we didn't see any change in the storage capacity usage. So we raised a Microsoft Support ticket and were informed by the team that they could delete the active partition for us, and that it could take around 3 days. As it was a sandbox environment and we had no use for the audit data, we went ahead, and the Support team deleted the active partition for us. (There was also an issue in our data center with regard to the recalculation of storage, so it took a few more days for the change to reflect on the Capacity page of the Power Platform admin center.)

This way we were able to reclaim some of the log storage.

So, basically, we just need to raise a Microsoft Support ticket and request the same.

More on Audit

Also check the new process for deleting audit logs –

https://docs.microsoft.com/en-gb/power-platform/admin/free-storage-space#method-10-delete-audit-logs---new-process

Select audit logs to delete.

Hope it helps..

Audit Entity / Table – Few key points (Dynamics 365 / Power Apps)


In an earlier post, we looked at different ways of extracting Audit History data.

Extract Audit History – https://nishantrana.me/2021/05/17/how-to-export-the-audit-history-values-from-dynamics-365/

We can use the Microsoft 365 Security and Compliance Center for the same, though it is limited to Production environments only.

https://docs.microsoft.com/en-us/power-platform/admin/enable-use-comprehensive-auditing#requirements

Then we have the Audit History Extractor tool, and we can also write SSIS packages or custom code.

Let us just revisit some of the key points with regards to the Audit entity.

  • Do we have the Audit entity available for Advanced Find? >> No.

  • Do we have it available inside Report Wizard? >> No.

  • Can we write an SSRS report against the Audit entity using the TDS endpoint?

Let us create the Data Source.

Select the authentication as Active Directory Password Authentication for the TDS endpoint.

Enter the database name manually.

The final connection string >>

Data Source=orgname.crm.dynamics.com;Initial Catalog=orgname.crm.dynamics.com;Encrypt=True;TrustServerCertificate=False;Authentication="Active Directory Password"

Within SQL4CDS, the following kind of query against the audit table works –
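(The screenshot of the query is not reproduced here; below is a representative query of that kind – an assumption, not the exact query from the post – using standard columns of the Dataverse audit table.)

-- Fetch the most recent audit entries
SELECT TOP 10 createdon, operation, action, objectid, userid
FROM audit
ORDER BY createdon DESC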

However, the same query doesn’t work inside SSRS.

It will give the below error message >>

Table audit is not available for reports

  • Do we have the Audit entity in the Power BI Dataverse connector? >> No

However, we can use the OData endpoint to create the report against the Audit entity.

https://www.365knowledge.com/2019/03/06/dynamics-365-user-access-report-with-fetchxml-and-power-bi/

  • Do we have the Audit entity available in Azure Synapse Link (Export to Data Lake)? >> No

How to set up Azure Synapse Link >>

https://nishantrana.me/2021/06/16/how-to-setup-azure-synapse-link-microsoft-dataverse/

  • Can't we write a FetchXML-based SSRS report?

Writing a FetchXML-based report would be challenging because of the way the information is saved – the audited changes are stored as attribute masks and delimited change data rather than as one column per field.

Check the below articles to understand how the audit table stores the information.

https://mahadeomatre.blogspot.com/2015/02/ms-crm-audit-database-table-details.html

http://makdns.blogspot.com/2014/06/dynamics-crm-audit-entity.html

  • How about a SQL-based SSRS report if we are on-premise and have direct SQL access?

Refer to the below article that provides the steps to do so.

http://makdns.blogspot.com/2014/06/dynamic-crm-20112013-audit-report-in.html

Hope it helps..


Step by step – LinkedIn Sales Navigator integration with Dynamics 365 Sales


To enable the LinkedIn Sales Navigator Integration with Dynamics 365 Sales, navigate to App Settings >> General Settings >> LinkedIn Integration and click on Enable LinkedIn Integration

We can do the same from System Settings >> Business Management >> LinkedIn Sales Navigator

Click on Continue to install the LinkedIn Sales Navigator.

The LinkedIn Sales Navigator solution will add the LinkedIn Sales Navigator controls to the default forms of Lead, Opportunity, Contact, and Account.

The controls are the LinkedIn Sales Navigator Lead control and the LinkedIn Sales Navigator Account control, along with their corresponding lookup controls.

These controls can be added to any other entity as well.

https://docs.microsoft.com/en-us/dynamics365/linkedin/integrate-sales-navigator#unified-interface-apps-sales-navigator-controls-for-the-unified-interface

The installation process will start.

The installation should complete in 10-15 minutes.

Click on Go to Configuration, select LinkedIn Sales Navigator, and enable it.

Toggle it to Yes and save the settings, either here or within Sales Hub >> App Settings.

The installation process has installed the below LinkedIn solutions.

We can see the LinkedIn Sales Navigator tab added to the Lead, Opportunity, Contact, and Account forms.

To configure it further, click on Sign in as a different user.

Enter your LinkedIn credentials and start the free trial of Sales Navigator Team.

After setting up the trial we can see the details populated in the LinkedIn Sales Navigator tab.

The sample contact record –

The sample lead record –

The LinkedIn controls for the member and account profile on the form are bound to the Last Name and Company Name fields (single line of text fields) on the Lead form.

Similarly, we have lookup-specific LinkedIn controls.

Here we have added the LinkedIn Lead Lookup control on the Opportunity form for the contact lookup.

The result –

Clicking Match will associate the profile with the record.

Now back in our LinkedIn Sales Navigator, navigate to the Admin page

Click on Connect to CRM to configure the integration.

 

We have selected the Sandbox environment here

Enter the domain name of the Dynamics 365 Sales organization and log in.

Provide the requested permissions

Step 1 of the wizard will provide the details of the connection.

Click on Configure Settings

  • Data Imported from CRM (CRM >> LinkedIn)

We have the option to import/sync Leads, Contacts, and Account records from CRM to LinkedIn Sales Navigator.

 

  • Data exported to CRM (LinkedIn >> CRM)

We can export messages, InMails, calls, and notes from LinkedIn to CRM.

For sandbox users, these steps require an exact email match between the LinkedIn Account Center user profile and the CRM user profile.

Advanced Features


Data Validation – https://business.linkedin.com/sales-solutions/sales-navigator-customer-hub/resources/data-validation-dynamics

The Log to CRM option gets enabled after we have connected LinkedIn Sales Navigator to CRM.

Notes –

Messages –

Back inside the contact record, we can see them added to the timeline.

We can also send InMail messages to the contact

Back inside our LinkedIn Member Profile, we can message and connect with the profile populated from within the Contact form.

Sample Account Record shows the News and Recommended Leads

News –

Here we saw the basic steps we need to perform to integrate Dynamics 365 Sales with LinkedIn Sales Navigator and the features provided.

To sum up –

  • LinkedIn controls will be added to Dynamics 365 Sales for the member profile, the account profile, and InMail.
  • We can import Leads, Contacts, and Account records from CRM to LinkedIn Sales Navigator.
  • We can export messages, InMails, calls, and notes from LinkedIn to CRM.

Get all the details here –

https://docs.microsoft.com/en-us/dynamics365/linkedin/integrate-sales-navigator

Hope it helps..


Mind map – Dynamics 365 Field Service


Prepared a mind map on Dynamics 365 Field Service for quick reference.

Access it here (recommended)

https://gitmind.com/app/doc/ac83047944

Download the .png format

Dynamics 365 Field Service

Refer to the content here – https://nishantrana.me/2021/08/17/quick-reference-dynamics-365-field-service/


How to – Export Dataverse (Dynamics 365) data to Azure SQL using Azure Data Factory pipeline template


[Visual Guide to Azure Data Factory - https://acloudguru.com/blog/engineering/a-visual-guide-to-azure-data-factory]

Using the new Azure Data Factory pipeline template – Copy Dataverse data from Azure Data Lake to Azure SQL – we can now easily export the Dataverse data to Azure SQL Database.

https://docs.microsoft.com/en-us/power-platform-release-plan/2021wave1/data-platform/export-dataverse-data-azure-sql-database

Check other posts on Azure Data Factory

Select the Pipeline from template option inside the Data Factory.

Search for Dataverse and select the Copy Dataverse data from Azure Data Lake to Azure SQL template

Let us specify the User Inputs required by the template – i.e. Azure SQL Database and the Data Lake Storage.

First we have created the linked service for the Azure SQL Database.

We’d use it to connect to the below table MyContacts.

Similarly, create a linked service to Azure Data Lake Storage Gen2, which holds our Dataverse data.

Get the URL from the container's properties (replace blob with dfs in the URL) –
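For example, for a hypothetical storage account named mydatalake and container named dataverse-environment, the URL changes as follows –

https://mydatalake.blob.core.windows.net/dataverse-environment >> https://mydatalake.dfs.core.windows.net/dataverse-environment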

To get the storage account key, select Access Keys >> Show Keys >> Copy the Key for the Storage Account.

Here we have already configured Azure Synapse Link for Dataverse

https://nishantrana.me/2020/09/07/export-data-from-common-data-service-to-azure-data-lake-storage-gen2/

Now that we have defined the User Inputs, select Use this template.

Navigate to the data flow created – DataverseToAzureSQL

Select our source ADLS and check and configure its properties.

Source Settings

Here the Inline dataset type is set to Common Data Model, and the linked service is the AzureDataLakeStorage1 we created earlier.

Source Options

Specify the Dataverse folder for the Root Location.

Here we have specified the contact entity from our Data Lake Storage.

Projection

In the Projection tab, we have cleared the generated schema using Clear Schema and selected Schema options >> Allow schema drift.

Enabling the Allow schema drift option will create the required columns in the destination Azure SQL table.

Optimize

Inspect

Data preview

As we have not turned on Debug mode, there is nothing to preview

Now let us move to our Destination – SQL.

Sink

Here we have the AzureSQLTable dataset connected to the contact table in Azure SQL and have checked the Allow schema drift option.


Below is our AzureSQLTable Dataset connected to the MyContacts table.

Settings

Here we have selected Allow insert as the Update method and Recreate table as the Table action, as we want the destination table to be re-created dynamically based on the source.

Mapping

We have left it set to Auto mapping.

Optimize

Inspect

Data preview

Let us Publish All our changes and Debug our pipeline.

Let us monitor our pipeline run.

We can see that the pipeline has run successfully, taking around 4 minutes.

We can see the contact data copied to our Azure SQL table successfully.

So here we covered the insert operation; in the next posts, we'll see how we can configure the update, upsert, and delete operations.

Also, check

Posts on Azure Data Factory

Posts on Azure Data Lake

Hope it helps..