
[Tool Showcase – PowerMakerAI] Talk to Your CRM Like a Teammate — Meet PowerMakerAI’s Context-Aware Chatbot


You’ve probably used ChatGPT or Gemini to generate code, write emails, or even debug errors. But what if you could do the same for your Dynamics 365 CRM?

No plugins. No clicking around. No SDK calls.

Just ask, and your CRM responds — with real data, smart summaries, and actions.

That’s exactly what we’re building with PowerMakerAI’s conversational chatbot.


🧠 What It Actually Does

This isn’t just a chatbot bolted on top of CRM. It’s deeply CRM-aware — meaning it knows your metadata, understands your entities and attributes, and can use that to:

  • Create new records (like leads, contacts, opportunities)
  • Update or delete existing data
  • Fetch filtered lists
  • Analyze plugin trace logs in conversation
  • Help troubleshoot issues based on real CRM behavior
  • Explain how your CRM is set up — from relationships to field types

In short: it’s like having a junior CRM dev who already knows your schema and listens carefully.


🛠 How It Works

Here’s a basic example.

You say:

“Show me all open opportunities from last week, owned by users in the Mumbai region.”

The bot:

  • Understands “opportunity” as an entity
  • Reads your metadata to confirm which fields match “open,” “last week,” and “region”
  • Constructs a real CRM query
  • Returns the results in a nice summary or even a table
  • Can export the results or help you take action — like closing them or assigning to someone else


Another example:

You say:

“Why is my lead conversion plugin failing for some records?”

The bot:

  • Checks for recent failures in your PluginTraceLog
  • Analyzes the logs using the same logic as our trace log analyzer
  • Gives you a plain-language explanation of what’s breaking
  • Suggests what you might fix in the plugin or data

🔁 It’s Not Static — It Talks Back

What makes this chatbot different is that it keeps asking the right follow-up questions:

“Do you want to filter this by owner?” “Should I show top 10 records or all?” “Would you like to update these now?”

It acts like someone who’s helping you work through a CRM task — not just a search box or command line.


🔍 Why This Matters

Most CRM tools still assume you:

  • Know the schema
  • Can build FetchXML queries or use Advanced Find
  • Understand what each plugin step is doing
  • Have time to jump between forms, logs, and docs

But most people just want to get something done or figure out what’s broken. That’s where natural language makes a real difference.


🧪 Use Cases We’re Seeing Already

  • Functional consultants using it to prepare data before demos
  • Junior devs using it to troubleshoot without writing code
  • Support teams asking why a record didn’t update
  • Architects mapping entity relationships without opening the solution

🎯 What It’s Doing Under the Hood

  • Reads your CRM metadata in real time (entities, fields, option sets, etc.)
  • Converts your prompt into a structured CRM operation
  • Performs the operation via Web API or analysis logic
  • Returns results + lets the LLM generate summaries or explanations

You don’t need to write any code or handle tokens — it does the heavy lifting quietly.


🔐 In Beta — and Free for Now

Right now, the chatbot is available to all beta users. It’s evolving fast, and we’re adding support for:

  • Multi-turn conversations that lead to actual CRM changes
  • Configurable actions (e.g. “always ask before saving”)
  • Metadata validation with real-time error checks
  • Relationship-aware queries

👉 [Try the Chatbot Now]

You can explore PowerMakerAI yourself here: https://powermakerai.com


We’re excited about this one. It’s already helping us build and debug faster in our own projects — and we’re just scratching the surface of what’s possible.

Let us know how you’d use this. Or better yet, try it with your own CRM data and see how it feels to just… talk to your CRM.

You can share your feedback at powermakerai@gmail.com.

Action ‘Update_a_record’ failed: An error has occurred. No resources were found when selecting for update – Fixing Cross-Company Update Issues in Finance & Operations Using Dataverse Virtual Entities (Power Automate)


Recently, while trying to update the Projects table in Finance & Operations using the Fin & Ops Apps actions in Power Automate, we ran into the below error:

An error has occurred. No resources were found when selecting for update.


After digging deeper, we realised the issue had nothing to do with the payload or field mappings. The root cause was that the default company of the connection user in Finance & Operations was different from the company of the record we were trying to update. The Fin & Ops connector always operates in the context of the user’s default legal entity, and unlike the “List items present in table” action, it does not offer any “Cross Company” option for update operations. The result is that Power Automate looks for the record in the wrong company and, naturally, F&O returns a “record not found” error.


In our scenario, we had dual-write and virtual entities enabled. When this is the case, Finance & Operations exposes many of its data entities as Dataverse Virtual Tables. These tables are essentially real-time proxies: Dataverse reads and writes directly into F&O while automatically handling the underlying company and key structure. So instead of updating the F&O Projects table directly, we switched our Power Automate logic to update the corresponding virtual entity in Dataverse. That simple change immediately resolved the issue. The update worked flawlessly across companies, and we didn’t have to worry about the user’s default company or any cross-company flags.



There are other approaches as well, depending on how your environment is set up. We could bypass the connector entirely and make raw OData PATCH calls to F&O, as long as we manually specify the full composite keys, including the DataAreaId. Another option is to build a small custom API or X++ service that accepts a legal entity parameter, executes a changeCompany call in F&O, and safely performs the update on the server side. Both approaches work, but they require more configuration, authentication handling, and careful error management. For most day-to-day automation scenarios, the virtual entity route remains the simplest and most reliable.
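To make the OData alternative concrete, here is a minimal sketch of composing such a PATCH URL. The host, entity set, and key fields are illustrative assumptions (real entity sets and keys vary per environment); the cross-company query option is F&O's way of letting a call target a legal entity other than the user's default.

```python
# Sketch only (not the post's code): build a cross-company OData resource
# URL for an F&O data entity with a composite key including dataAreaId.

def build_patch_url(base_url, entity_set, keys, cross_company=True):
    """Compose an OData URL with a composite key segment like
    (dataAreaId='AUST',ProjectID='P-REP-34')."""
    key_segment = ",".join(f"{name}='{value}'" for name, value in keys.items())
    url = f"{base_url}/data/{entity_set}({key_segment})"
    if cross_company:
        # Without this, F&O resolves the record only in the caller's
        # default legal entity - the very problem described above.
        url += "?cross-company=true"
    return url

url = build_patch_url(
    "https://contoso.operations.dynamics.com",  # hypothetical host
    "Projects",
    {"dataAreaId": "AUST", "ProjectID": "P-REP-34"},
)
print(url)
# https://contoso.operations.dynamics.com/data/Projects(dataAreaId='AUST',ProjectID='P-REP-34')?cross-company=true
```

The PATCH body and authentication (an Azure AD bearer token) would still need to be supplied by the caller; only the URL composition is shown here.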

In our case, switching to the Dataverse Projects (mserp) virtual table solved the issue immediately. The update completed without errors, even for multi-company data, and the flow became much cleaner.

A detailed explanation of this limitation can be found here, and was very helpful in troubleshooting:

Multiple legal entities issue with the Fin & Ops Apps connector in Power Automate

Hope it helps..


Fixing the “Only 1 of 2 keys provided for lookup, provide keys for dataAreaId, ProjectID / Not found” Error in Power Automate (Fin & Ops Apps)


Recently, while working with the Projects table from a Finance & Operations (F&O) environment, we ran into an error while using the Get a record action in Power Automate. (Incidentally, this was the first time we were using the Fin & Ops connector.)

The flow kept failing with the following message:

Only 1 of 2 keys provided for lookup, provide keys for dataAreaId, ProjectID.

This error appears when we try to fetch a record from an F&O table using only the ProjectID. But unlike normal Dataverse tables, F&O tables often come with composite keys.

Instead of passing just the ProjectID, we must pass the full composite key in the format:

dataAreaId,ProjectID

In our case, the correct value was:

AUST,P-REP-34
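As a rough illustration only (not the connector's actual implementation), the composite-key requirement behaves like the following check. The key names come from the error message; the comma-splitting is an assumption based on the format above.

```python
# Hypothetical sketch: mimic the connector's "all keys must be provided"
# check for a comma-separated composite key value.

def parse_composite_key(value, required_keys):
    parts = value.split(",")
    if len(parts) != len(required_keys):
        # Mirrors the wording of the connector error seen in the flow run.
        raise ValueError(
            f"Only {len(parts)} of {len(required_keys)} keys provided for "
            f"lookup, provide keys for {', '.join(required_keys)}."
        )
    return dict(zip(required_keys, parts))

print(parse_composite_key("AUST,P-REP-34", ["dataAreaId", "ProjectID"]))
# {'dataAreaId': 'AUST', 'ProjectID': 'P-REP-34'}
```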

With the composite key passed in this format, the Get a record action worked properly.

At one point, even after passing the correct composite key, we started getting a NotFound error:

Action ‘Get_a_record’ failed – NotFound


Nothing had changed in the inputs — it was just not working. The fix was surprisingly simple..

We deleted the Get a record step, re-added it, provided the same inputs again, and the error disappeared. It could be related to the connection reference not being updated properly in the background.

Hope it helps..


Fixed – Error occurred while loading document template / Error occurred while loading preview error in Dynamics 365


Recently, one of the users reported an error while trying to generate a PDF for a Quote record in Dynamics 365.

Initially, the Export to PDF option was showing a blank list of templates.


This happened because the user was missing a few essential privileges on the Document Template tables.

To fix the blank template list, we updated the user’s custom security role with the appropriate privileges on the following tables:

  • Document Template
  • Personal Document Template

After adding these, the templates started appearing in the “Export to PDF” dialog.

Even though the templates were now visible, the user still got an “Error occurred while loading preview” error when trying to preview or export the document.

This was due to one missing privilege in the Customization area of the security role.

We added the Document Generation privilege.


Once this privilege was granted, the preview and PDF generation started working as expected.

If we are unsure which privilege might be missing in similar situations, a quick way to find out is by using Developer Tools (F12) and monitoring the Network tab while reproducing the error. The failed request, such as ExportPdfDocument, usually reveals the missing privilege directly in its Response section (for example, missing prvDocumentGeneration privilege). This saves time and avoids trial and error when troubleshooting permission issues.
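To illustrate that tip, here is a tiny sketch of scraping privilege names out of a captured response body. The prv naming pattern is Dataverse's convention for privilege names; the parsing itself is our own illustration, not part of the product.

```python
import re

# Sketch: pull missing-privilege names (prv...) out of a failed request's
# response body captured from the browser's Network tab.

def missing_privileges(response_body):
    return sorted(set(re.findall(r"prv[A-Za-z]+", response_body)))

body = "Principal user is missing prvDocumentGeneration privilege"
print(missing_privileges(body))
# ['prvDocumentGeneration']
```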

Hope it helps..


How to Identify and Update Power Automate HTTP Request Trigger Flows Before November 2025


A few weeks back, while working on one of our Power Automate flows, we noticed a banner warning on the HTTP Request trigger step. Microsoft has announced that starting August 2025, all flows using HTTP or Teams Webhook triggers with logic.azure.com URLs will move to a new endpoint under environment.api.powerplatform.com. The old URLs will stop working after November 30, 2025.

The banner appears when the flow is opened in the designer.

When we saw this, we wanted to make sure no other flow across environments was using the old logic.azure.com–based URLs. Instead of checking each flow manually, we used SQL4CDS to quickly identify all such flows.

We ran the following query in SQL4CDS:

SELECT wf.name, wf.workflowid, wf.owneridname, wf.modifiedon
FROM workflow wf
WHERE wf.category = 5
AND LOWER(wf.clientdata) LIKE '%"type":"request","kind":"http"%'

This query returns all flows that have an HTTP Request trigger. It checks the clientdata column in the workflow table where the flow definition is stored as JSON and looks for the trigger type “Request” and kind “Http”.
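The same check can be sketched outside SQL4CDS in a few lines: parse the flow's clientdata JSON and look for a Request/Http trigger. The properties.definition.triggers path reflects how cloud flow definitions are stored, but treat the exact shape as an assumption to verify in your own environment.

```python
import json

# Sketch: given a workflow row's clientdata JSON, detect whether the flow
# uses an HTTP Request trigger (type "Request", kind "Http").

def has_http_request_trigger(clientdata):
    definition = json.loads(clientdata)
    triggers = (
        definition.get("properties", {})
        .get("definition", {})
        .get("triggers", {})
    )
    return any(
        t.get("type", "").lower() == "request"
        and t.get("kind", "").lower() == "http"
        for t in triggers.values()
    )

sample = json.dumps({
    "properties": {"definition": {"triggers": {
        "manual": {"type": "Request", "kind": "Http"}
    }}}
})
print(has_http_request_trigger(sample))
# True
```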

This helped us identify every flow that exposes an HTTP endpoint — typically used for integrations, webhooks, or form submissions from external systems. Once we had this list, we opened each flow and copied the new trigger URL from the message banner.

The same can be achieved using PowerShell. The following command lists all flows in a given environment that are part of Microsoft’s trigger URL migration:

Get-AdminFlowWithMigratingTriggerUrl -EnvironmentName


After copying the new URL, we updated our calling applications (for example, the website forms and marketing integrations) to replace the old endpoint with the new one. The new URLs are hosted under the Power Platform domain environment.api.powerplatform.com, replacing the older logic.azure.com endpoints.

To confirm that the requests were now reaching the new infrastructure, we captured the request headers after updating the URL. The x-ms-igw-external-uri header is the easiest and most reliable way to confirm that the flow is now routed through the new Power Platform ingress gateway.


If this header is present and points to environment.api.powerplatform.com, the flow has been successfully migrated. If you still see logic.azure.com under the Host or DISGUISED-HOST headers, that flow or calling application still needs to be updated.
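The manual check described above can be sketched as a small helper. The header names come from the post; the classification labels ("migrated", "needs update") are our own.

```python
# Sketch: decide from captured request headers whether a flow is routed
# through the new Power Platform ingress gateway or still on the old URL.

def migration_status(headers):
    h = {k.lower(): v for k, v in headers.items()}
    external = h.get("x-ms-igw-external-uri", "")
    if "environment.api.powerplatform.com" in external:
        return "migrated"
    if ("logic.azure.com" in h.get("host", "")
            or "logic.azure.com" in h.get("disguised-host", "")):
        return "needs update"
    return "unknown"

print(migration_status({
    "x-ms-igw-external-uri":
        "https://abc.environment.api.powerplatform.com/powerautomate"
}))
# migrated
```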

In our case, after replacing the URLs and testing, all flows were confirmed to be running on the new platform, and the integrations worked as expected.

So if you see this warning in your environment, you can either use the SQL4CDS query or the PowerShell command to locate such flows, update the calling systems with the new URL, and then verify by checking for the x-ms-igw-external-uri header in the request. That’s all you need to ensure your integrations continue to work smoothly past November 2025.

Hope it helps..


Using Parent Context in Dynamics 365 Plugins — Detecting System-Triggered Operations (Dataverse / Dynamics 365)


In this post, we’ll look at how we used the ParentContext property in Dynamics 365 plugins to determine if a plugin execution was triggered by another plugin and perform logic conditionally. This came up in two real-world cases — one where we needed to prevent duplicate Sales Leads created automatically through Event Management, and another where we wanted to match the correct Campaign when a Lead was updated through a Marketing Form submission.

In the first scenario, we had a plugin registered on the Pre-Create of Lead. We wanted to block duplicate Sales Leads only when they were created via CRM’s Event Plugin, not during manual Lead creation. To achieve this, we checked if the plugin execution had a ParentContext. When present, it meant the Lead creation was triggered by another process, not a user. We confirmed it was the system’s Event Plugin by checking the msevtmgt_originatingeventid field (this field is auto-populated for a Lead created through an event) and the execution depth. If the Lead was of Enquiry Type “Sales” and had an email address, we checked for duplicates and stopped the creation if one existed. This ensured duplicates were blocked only for system-triggered Leads.
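The plugin itself is C#; as a language-neutral sketch, the first scenario's decision logic looks roughly like this. The dictionaries below are a simplified stand-in for IPluginExecutionContext and the Target entity, with the field name taken from the post.

```python
# Simplified model of the pre-create check: treat a Lead create as
# system-triggered when a parent context exists and the event field is set.

def is_event_created_lead(context, target):
    # A ParentContext means another plugin/process triggered this create,
    # not a user working in the UI.
    if context.get("ParentContext") is None:
        return False
    # Event Management populates this field on Leads it creates.
    return target.get("msevtmgt_originatingeventid") is not None

ctx = {"ParentContext": {"MessageName": "Create"}, "Depth": 2}
lead = {"msevtmgt_originatingeventid": "guid-123",
        "emailaddress1": "someone@example.com"}
print(is_event_created_lead(ctx, lead))
# True
```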

The second case involved a plugin registered on the Update of Lead. Here, we needed to identify whether a Lead update was triggered by a Marketing Form submission (from the msdynmkt_marketingformsubmission table) and only then run our Campaign mapping logic. We used ParentContext to walk up the plugin chain and confirm the origin. Once verified, we called our logic to assign the correct Campaign based on the Region or Village. This made sure the Campaign logic only ran for Leads updated by Marketing automation, not for regular user edits.
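The walk up the chain in the second case can be sketched the same way: follow ParentContext links until the originating entity is found. Again this is a simplified stand-in for the C# context, with the table name taken from the post.

```python
# Simplified model: walk ParentContext links looking for an originating
# entity, as in the Marketing Form submission case.

def originated_from(context, entity_name):
    parent = context.get("ParentContext")
    while parent is not None:
        if parent.get("PrimaryEntityName") == entity_name:
            return True
        parent = parent.get("ParentContext")
    return False

ctx = {
    "PrimaryEntityName": "lead",
    "ParentContext": {"PrimaryEntityName": "msdynmkt_marketingformsubmission"},
}
print(originated_from(ctx, "msdynmkt_marketingformsubmission"))
# True
```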

In both cases, using ParentContext gave us precise control over when the plugin logic should run. It allowed us to differentiate between user actions and system-triggered updates, avoiding redundant execution and maintaining a cleaner automation flow.

Hope it helps..


Nishant Rana's Weblog

Everything related to Microsoft .NET Technology
