Featured

Recent Posts


Fixed – “The relative url contains invalid characters. Please use a different name. Valid relative url names cannot ends with the following strings: .aspx, .ashx, .asmx, .svc , cannot begin or end with a dot, cannot contain consecutive dots and cannot contain any of the following characters: ~ " # % & * : ? / \ { | }.” error while creating SharePoint Document Location – Dynamics 365 / Dataverse

We were using the below code to create a sharepointdocumentlocation record through a C# Console App. For folderName we were using the…



Advancing a Business Process Flow Stage Using a C# Console App (Dataverse / Dynamics 365)


In Dynamics 365, Business Process Flows are usually progressed by users through the UI. However, in scenarios like data migration, bulk remediation, or backend automation, we may need to move a BPF stage programmatically. Here we will cover one of the ways we can advance the Business Process Flow to the next stage using a C# console application, with a Case example used only as a reference.

Every Business Process Flow in Dataverse is backed by its own table, created when the process is published. The table name is derived from the process schema name and stores one record per entity instance participating in the flow.

Below is the table for the Phone To Case Process with schema name – phonetocaseprocess

We can get the stagename and the processstageid for the business process flow from the processstage table, passing the GUID of the business process flow.

SELECT processid,
       processidname,
       stagename,
       processstageid,
       stagecategoryname,
       *
FROM   processstage
WHERE  processid = '0FFBCDE4-61C1-4355-AA89-AA1D7B2B8792';

Regardless of the entity, two columns control stage movement: activestageid, which represents the current stage, and traversedpath, which stores a comma-separated list of all stage IDs the record has passed through. When moving a BPF programmatically, both values must be updated together to ensure the UI reflects the change correctly. The table will also include the column referring to the record it is associated with; in our example, it is incidentid.

The traversedpath value must be constructed as a comma-separated list of processstageid values, preserving the exact order in which stages are completed, with each newly reached stage appended to the end of the existing path.
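As a quick illustration (the stage GUIDs below are placeholders, not values from this environment), moving a case from the Research stage to the Resolve stage changes the two columns roughly like this:

activestageid : <research stage GUID>  ->  <resolve stage GUID>
traversedpath : <identify GUID>,<research GUID>  ->  <identify GUID>,<research GUID>,<resolve GUID>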

SELECT businessprocessflowinstanceid,      
       activestageid,
       activestageidname,
       traversedpath,
       incidentid,
       processid,
       processidname,
       *
FROM   phonetocaseprocess
WHERE  incidentid = '98c26cb0-ff9f-f011-b41c-7c1e52fd16bb';

At a high level, the process is always the same. We first identify the correct BPF table, then retrieve the BPF instance associated with the primary record. Next, we update the activestageid to point to the next stage and append that stage ID to the existing traversedpath. Finally, we persist the update back to Dataverse. Because this logic runs outside the UI, it bypasses stage validations and required-field enforcement, making it ideal for backend utilities but something that should be used carefully.

Below is our sample code that moves the case record from the Research stage to the Resolve stage.

Sample Code

using System;
using System.Linq;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Tooling.Connector;

class Program
{
    static void Main(string[] args)
    {
        Console.WriteLine("MoveCaseBpfToResolve started.");

        // CRM connection
        string connString = @"AuthType=OAuth;
        Username=abc.onmicrosoft.com;
        Password=xyz;
        Url=https://abc.crm.dynamics.com/;
        AppId=51f81489-12ee-4a9e-aaae-a2591f45987d;
        RedirectUri=app://58145b91-0c36-4500-8554-080854f2ac97/";

        var service = new CrmServiceClient(connString);

        // Schema name of the BPF table (Phone to Case Process)
        var bpfSchemaName = "phonetocaseprocess";

        // Case record and the target (Resolve) stage
        var caseId = "98c26cb0-ff9f-f011-b41c-7c1e52fd16bb";
        var resolveStageId = new Guid("356ecd08-43c3-4585-ae94-6053984bc0a9");

        // Query the BPF instance for the Case
        var query = new QueryExpression(bpfSchemaName)
        {
            ColumnSet = new ColumnSet("activestageid", "traversedpath")
        };
        query.Criteria.AddCondition("incidentid", ConditionOperator.Equal, caseId);

        var instances = service.RetrieveMultiple(query);

        if (!instances.Entities.Any())
        {
            Console.WriteLine("No BPF instance found for the Case. Exiting.");
            return;
        }

        var bpfInstance = instances.Entities.First();

        var updateBpf = new Entity(bpfSchemaName)
        {
            Id = bpfInstance.Id
        };

        // Set active stage to Resolve
        updateBpf["activestageid"] = new EntityReference("processstage", resolveStageId);

        // Append the Resolve stage to the existing traversed path
        var traversedPath = bpfInstance.GetAttributeValue<string>("traversedpath");
        updateBpf["traversedpath"] = $"{traversedPath},{resolveStageId}";

        service.Update(updateBpf);

        Console.WriteLine("BPF successfully moved to Resolve stage.");
        Console.WriteLine("MoveCaseBpfToResolve completed.");
    }
}

Result –

Hope it helps.


Using a Plugin to Generate Auto-Number Values for Legacy and Reopened Records in Dynamics 365 / Dataverse


In one of our recent Dynamics 365 / Dataverse projects, we ran into an issue with auto-number fields. We had configured an auto-number for the custom_id field on the Opportunity table. The format used a prefix of QU- followed by eight digits, resulting in IDs such as QU-00000133.


Everything functioned correctly for newly created records, and the field populated exactly as expected. However, during testing, we discovered a problem. When an Opportunity that had previously been closed was later reopened, the auto-number field did not populate as expected. The system did not treat the reopen action as a new record creation, so no auto-number was generated. Because the custom_id field was required for downstream integrations, the absence of a value became a breaking issue.

This happens because auto-numbers in Dataverse only trigger at creation time. A reopened record is simply updated, not recreated, so the auto-number mechanism never fires. This behavior is by design. Unfortunately, we had legacy records created long before auto-numbering was implemented, and when users reopened them for updates, those records still had no custom_id. The integration layer expects a value, so we needed a reliable way to populate one even after the original creation event had long passed.

To solve this, we implemented a plugin that checks whether the custom_id field is empty during update operations. If the value is missing, the plugin generates the next available QU number manually. The logic first retrieves the highest existing QU number in the system, extracts the numeric portion, increments it by one, and then applies the resulting value to the current record. Once the number is assigned, we also update the auto-number seed so that the built-in Dataverse auto-number engine continues from the correct sequence and avoids generating duplicates in the future.

The plugin was registered on the post-update of the Opportunity table, with statecode and statuscode as the filtering attributes.

And a PreImage with the following attributes – statecode, statuscode, and custom_id.
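For context, below is a minimal sketch of how the plugin entry point could wire things up and call the helper shown in the sample code. The class name, the PreImage alias ("PreImage"), and the _trace / _service fields are illustrative assumptions, not taken from the actual project.

using System;
using Microsoft.Xrm.Sdk;

// Minimal sketch of the plugin entry point; names here are illustrative assumptions
public class OpportunityCustomIdPlugin : IPlugin
{
    private ITracingService _trace;
    private IOrganizationService _service;

    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        _trace = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        _service = factory.CreateOrganizationService(context.UserId);

        // Registered on post-update of opportunity, filtered on statecode / statuscode
        if (context.MessageName != "Update" || context.PrimaryEntityName != "opportunity")
            return;

        // PreImage registered with statecode, statuscode and custom_id
        if (!context.PreEntityImages.Contains("PreImage"))
            return;

        EnsureCustomId(context.PrimaryEntityId, context.PreEntityImages["PreImage"]);
    }

    private void EnsureCustomId(Guid oppId, Entity preImage)
    {
        // see the sample code below
    }
}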


Sample Code –

private void EnsureCustomId(Guid oppId, Entity preImage)
{
    // Skip if the record already has a custom_id (from auto-numbering or a previous run)
    var customId = preImage.GetAttributeValue<string>("custom_id");
    if (!string.IsNullOrWhiteSpace(customId))
    {
        _trace.Trace("custom_id already populated, skipping.");
        return;
    }

    _trace.Trace("custom_id is NULL, generating new QU number.");

    // Retrieve the highest existing QU number in the system
    var query = new QueryExpression("opportunity")
    {
        ColumnSet = new ColumnSet("custom_id"),
        Criteria =
        {
            Conditions =
            {
                new ConditionExpression("custom_id", ConditionOperator.NotNull),
                new ConditionExpression("custom_id", ConditionOperator.Like, "QU%")
            }
        },
        Orders =
        {
            new OrderExpression("custom_id", OrderType.Descending)
        },
        TopCount = 1
    };

    var existing = _service.RetrieveMultiple(query).Entities.FirstOrDefault();

    // Extract the numeric portion of the highest QU number and increment it
    int next = 1;
    if (existing != null)
    {
        if (int.TryParse(existing.GetAttributeValue<string>("custom_id").Split('-').Last(), out int last))
            next = last + 1;
    }

    string newId = $"QU-{next:D8}";
    _trace.Trace($"Assigning new QU: {newId}");

    // Apply the generated value to the current record
    var update = new Entity("opportunity", oppId)
    {
        ["custom_id"] = newId
    };
    _service.Update(update);

    next = next + 1;

    // Move the auto-number seed past the manually assigned value
    // so the built-in auto-numbering does not generate a duplicate
    var request = new OrganizationRequest("SetAutoNumberSeed");
    request["EntityName"] = "opportunity";
    request["AttributeName"] = "custom_id";
    request["Value"] = (long)next;
    _service.Execute(request);

    _trace.Trace($"Auto number seed updated to {next}");
}

Updating the auto-number seed is an important part of this solution. Without adjusting the seed, Dataverse might attempt to generate a number that has already been created manually by our plugin. By synchronizing the seed after each assignment, we ensure that the system’s internal auto-number feature continues counting from the correct position. This prevents duplicate values and keeps both manual and automatic generation aligned.
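For reference, the same seed update can also be written with the typed request class from the Microsoft.Crm.Sdk.Messages namespace, assuming that assembly is referenced in the plugin project:

using Microsoft.Crm.Sdk.Messages;

// Typed equivalent of the OrganizationRequest("SetAutoNumberSeed") call above
var seedRequest = new SetAutoNumberSeedRequest
{
    EntityName = "opportunity",
    AttributeName = "custom_id",
    Value = next   // the next number the built-in auto-numbering should use
};
_service.Execute(seedRequest);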

With this logic in place, reopened Opportunities now receive valid QU numbers automatically. The integration processes no longer break due to missing identifiers. Users can reopen and update older records confidently, and the system maintains a clean and consistent numbering sequence. A small enhancement to our plugin resolved a significant data quality issue end-to-end.

Hope it helps..

[Tool Showcase – PowerMakerAI] Talk to Your CRM Like a Teammate — Meet PowerMakerAI’s Context-Aware Chatbot


You’ve probably used ChatGPT or Gemini to generate code, write emails, or even debug errors. But what if you could do the same for your Dynamics 365 CRM?

No plugins. No clicking around. No SDK calls.

Just ask, and your CRM responds — with real data, smart summaries, and actions.

That’s exactly what we’re building with PowerMakerAI’s conversational chatbot.


🧠 What It Actually Does

This isn’t just a chatbot bolted on top of CRM. It’s deeply CRM-aware — meaning it knows your metadata, understands your entities and attributes, and can use that to:

  • Create new records (like leads, contacts, opportunities)
  • Update or delete existing data
  • Fetch filtered lists
  • Analyze plugin trace logs in conversation
  • Help troubleshoot issues based on real CRM behavior
  • Explain how your CRM is set up — from relationships to field types

In short: it’s like having a junior CRM dev who already knows your schema and listens carefully.


🛠 How It Works

Here’s a basic example.

You say:

“Show me all open opportunities from last week, owned by users in the Mumbai region.”

The bot:

  • Understands “opportunity” as an entity
  • Reads your metadata to confirm which fields match “open,” “last week,” and “region”
  • Constructs a real CRM query
  • Returns the results in a nice summary or even a table
  • Can export the results or help you take action — like closing them or assigning to someone else


Another example:

You say:

“Why is my lead conversion plugin failing for some records?”

The bot:

  • Checks for recent failures in your PluginTraceLog
  • Analyzes the logs using the same logic as our trace log analyzer
  • Gives you a plain-language explanation of what’s breaking
  • Suggests what you might fix in the plugin or data

🔁 It’s Not Static — It Talks Back

What makes this chatbot different is that it keeps asking the right follow-up questions:

“Do you want to filter this by owner?” “Should I show top 10 records or all?” “Would you like to update these now?”

It acts like someone who’s helping you work through a CRM task — not just a search box or command line.


🔍 Why This Matters

Most CRM tools still assume you:

  • Know the schema
  • Can build FetchXML queries or use Advanced Find
  • Understand what each plugin step is doing
  • Have time to jump between forms, logs, and docs

But most people just want to get something done or figure out what’s broken. That’s where natural language makes a real difference.


🧪 Use Cases We’re Seeing Already

  • Functional consultants using it to prepare data before demos
  • Junior devs using it to troubleshoot without writing code
  • Support teams asking why a record didn’t update
  • Architects mapping entity relationships without opening the solution

🎯 What It’s Doing Under the Hood

  • Reads your CRM metadata in real time (entities, fields, option sets, etc.)
  • Converts your prompt into a structured CRM operation
  • Performs the operation via Web API or analysis logic
  • Returns results + lets the LLM generate summaries or explanations

You don’t need to write any code or handle tokens — it does the heavy lifting quietly.


🔐 In Beta — and Free for Now

Right now, the chatbot is available to all beta users. It’s evolving fast, and we’re adding support for:

  • Multi-turn conversations that lead to actual CRM changes
  • Configurable actions (e.g. “always ask before saving”)
  • Metadata validation with real-time error checks
  • Relationship-aware queries

👉 [Try the Chatbot Now]

You can explore PowerMakerAI yourself here: https://powermakerai.com


We’re excited about this one. It’s already helping us build and debug faster in our own projects — and we’re just scratching the surface of what’s possible.

Let us know how you’d use this. Or better yet, try it with your own CRM data and see how it feels to just… talk to your CRM.

You can share your feedback at – powermakerai@gmail.com

Action ‘Update_a_record’ failed: An error has occurred. No resources were found when selecting for update – Fixing Cross-Company Update Issues in Finance & Operations Using Dataverse Virtual Entities (Power Automate)


Recently, while trying to update the Projects table in Finance & Operations using the Fin & Ops Apps actions in Power Automate, we ran into the below error:

An error has occurred. No resources were found when selecting for update.


After digging deeper, we realised the issue had nothing to do with the payload or field mappings. The root cause was that the default company of the connection user in Finance & Operations was different from the company of the record we were trying to update. The Fin & Ops connector always operates in the context of the user’s default legal entity, and unlike the “List items present in table” action, it does not offer any “Cross Company” option for update operations. The result is that Power Automate looks for the record in the wrong company and, naturally, F&O returns a “record not found” error.


In our scenario, we had dual-write and virtual entities enabled. When this is the case, Finance & Operations exposes many of its data entities as Dataverse Virtual Tables. These tables are essentially real-time proxies: Dataverse reads and writes directly into F&O while automatically handling the underlying company and key structure. So instead of updating the F&O Projects table directly, we switched our Power Automate logic to update the corresponding virtual entity in Dataverse. That simple change immediately resolved the issue. The update worked flawlessly across companies, and we didn’t have to worry about the user’s default company or any cross-company flags.


This completed without errors, even for multi-company data.


There are other approaches as well, depending on how your environment is set up. We could bypass the connector entirely and make raw OData PATCH calls to F&O, as long as we manually specify the full composite keys, including the DataAreaId. Another option is to build a small custom API or X++ service that accepts a legal entity parameter, executes a changeCompany call in F&O, and safely performs the update on the server side. Both approaches work, but they require more configuration, authentication handling, and careful error management. For most day-to-day automation scenarios, the virtual entity route remains the simplest and most reliable.
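As a rough sketch of the first alternative (raw OData), the call could look something like the snippet below. The environment URL, the Projects entity set, the ProjectName field, and the bearer token are assumptions for illustration and need to be replaced with the actual values; the relevant parts are the full composite key (dataAreaId plus ProjectID) and the cross-company option.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class FnoODataPatchSample
{
    static async Task Main()
    {
        // Assumed values - replace with the actual environment URL and a token
        // acquired for the F&O resource (token acquisition not shown here)
        var fnoUrl = "https://yourorg.operations.dynamics.com";
        var accessToken = "<bearer token>";

        // Full composite key, including dataAreaId, plus the cross-company option
        // so the call is not limited to the user's default legal entity
        var requestUri =
            $"{fnoUrl}/data/Projects(dataAreaId='AUST',ProjectID='P-REP-34')?cross-company=true";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // Only the fields being updated go into the PATCH body
        // (ProjectName is an illustrative field name)
        var body = "{ \"ProjectName\": \"Updated via OData\" }";

        var request = new HttpRequestMessage(new HttpMethod("PATCH"), requestUri)
        {
            Content = new StringContent(body, Encoding.UTF8, "application/json")
        };

        var response = await client.SendAsync(request);
        Console.WriteLine($"{(int)response.StatusCode} {response.ReasonPhrase}");
    }
}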

In our case, switching to the Dataverse Projects (mserp) virtual table solved the issue immediately. The update completed without errors, even for multi-company data, and the flow became much cleaner.

A detailed explanation of this limitation can be found here, and was very helpful in troubleshooting:

Multiple legal entities issue with the Fin & Ops Apps connector in Power Automate

Hope it helps..


Fixing the “Only 1 of 2 keys provided for lookup, provide keys for dataAreaId, ProjectID / Not found” Error in Power Automate (Fin & Ops Apps)


Recently, while working with the Projects table from a Finance & Operations (F&O) environment, we ran into an error while using the Get a record action in Power Automate. (BTW, this was the first time we were using the Fin & Ops connector.)

The flow kept failing with the following message:

Only 1 of 2 keys provided for lookup, provide keys for dataAreaId, ProjectID.

This error appears when we try to fetch a record from an F&O table using only the ProjectID. Unlike normal Dataverse tables, F&O tables often come with composite keys.

Instead of passing just the ProjectID, we must pass the full composite key in the format:

dataAreaId,ProjectID

In our case, the correct value was:

AUST,P-REP-34

Below, we can see it working properly


At one point, even after passing the correct composite key, we started getting a NotFound error:

Action ‘Get_a_record’ failed – NotFound


Nothing had changed in the inputs; it was simply not working. The fix was surprisingly simple.

We deleted the Get a record step, re-added it, provided the same inputs again, and the error disappeared. It could be related to the connection reference not being updated properly in the background.

Hope it helps..

