Recently, we got the following error while booking a resource on our schedule board. We had earlier encountered the same error while trying to delete bookable resource booking records.
We could not figure out which workflow or SLA was responsible for the issue.
Eventually, turning on the Use Enhanced Background Processing (Preview) option in Field Service Settings >> Other fixed the issue for us.
Use Enhanced Background Processing (Preview)
In Field Service, several background processes handle tasks such as managing Agreements. By default, these run as classic workflows, but the preview option switches them to Power Automate flows instead. This can help in tricky cases, such as when the Agreement owner no longer has access to Dynamics 365. We could see this option already enabled in our other environments, and since this environment was a copy of one of them, that is likely why enabling it resolved the issue in our case.
Recently, I came across a very interesting tool called PowerMakerAI, developed by a fellow community member. I felt it was worth sharing here because it directly helps with one of the most common and time-consuming tasks: creating tables and fields in Dynamics 365.
Creating custom entities and fields in Microsoft Dynamics 365 is essential — but let’s be honest, it’s often tedious and time-consuming. Navigating through multiple forms, selecting field types, defining metadata, and ensuring consistency takes significant effort, especially for complex data models.
That’s where PowerMakerAI’s AI Entity Builder with Visualizer comes in.
🎯 What is the AI Entity Builder?
The AI Entity Builder is a natural language-powered tool that lets you create complete entity definitions by simply describing what you need — just like talking to a junior developer.
Instead of clicking through menus or filling out field properties manually, you can say:
“Create an entity called ‘Customer Feedback’ with fields for Name, Email, Rating (1-5), Comments, and Submitted On.”
PowerMakerAI understands your intent and instantly generates a complete schema, including:
Logical names
Display names
Field data types
Required levels
Primary field selection
Relationship suggestions (coming soon)
👁️ Visualizer: See What You’re Building — Instantly
One of the most powerful additions is the interactive schema visualizer.
Once your entity is generated, the visualizer shows you:
All attributes in a structured format
Field types (text, number, option set, date, etc.)
Required vs optional fields
Entity-level metadata
[Coming Soon] Relationships with other entities
You can drag, zoom, and explore your schema visually — no need to switch back and forth between the Dynamics UI and your schema documentation.
🧠 How It Works (Under the Hood)
PowerMakerAI uses a combination of:
Natural Language Understanding (NLU): To detect intent, field names, data types, and relationships.
CRM Schema Intelligence: Built-in logic that maps certain keywords to CRM field types (e.g., “status” = OptionSet, “submitted on” = DateTime).
Dynamic Validations: Prevents common issues like duplicate logical names or unsupported characters.
It then presents the results in human-readable formats.
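To picture the kind of keyword-to-type mapping described above, here is a purely illustrative Python sketch. It is not PowerMakerAI's actual code, just a toy version of the idea, and the keywords and type names in it are assumptions.

# Illustrative only: a naive keyword-to-field-type lookup of the kind described above.
KEYWORD_TO_TYPE = {
    "status": "Option Set",
    "email": "Single Line of Text (Email format)",
    "rating": "Whole Number",
    "submitted on": "Date and Time",
    "comments": "Multiple Lines of Text",
}

def guess_field_type(field_name: str) -> str:
    # Fall back to plain text when no keyword matches.
    name = field_name.lower()
    for keyword, crm_type in KEYWORD_TO_TYPE.items():
        if keyword in name:
            return crm_type
    return "Single Line of Text"

print(guess_field_type("Submitted On"))    # -> Date and Time
print(guess_field_type("Customer Notes"))  # -> Single Line of Text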
⚡ Benefits
🕒 Save Hours: No more manual field creation
💬 Speak Requirements: Great for functional consultants too
📦 Deployment-Ready: Export and import with minimal edits
🧩 Visual Confidence: Know what you’re building before you build it
👥 Perfect for Teams: Let non-technical users define schema, and devs just approve and deploy
🧪 Example Use Case
A marketing team wants a “Campaign Response” entity. Instead of submitting a spec document and waiting for dev time, they use PowerMakerAI:
“Create a Campaign Response entity with Contact, Campaign Name, Response Type (dropdown), Notes, and Created On.”
💡 In under 30 seconds, the entire entity is ready, visualized, and exportable.
🚧 Coming Soon
We’re just getting started. Upcoming features include:
🔁 Relationship suggestions (1:N, N:N)
✅ Auto-validation for solution compatibility
📤 One-click push to your CRM environment
🔄 Integration with PowerMakerAI’s chatbot for conversational updates
💬 Try It Yourself
You can explore PowerMakerAI yourself here: https://powermakerai.com. The creators are currently offering it free during beta.
When working with Date and Time fields in Dataverse, one of the most confusing parts is how values are stored in the backend vs. how they are displayed to users in different time zones.
For the Date and Time data type, the Format can be set to either Date Only or Date and Time. The Time zone adjustment (behavior) options depend on the format: Date and Time supports User Local and Time Zone Independent, while Date Only supports Date Only, User Local, and Time Zone Independent.
Below, we have created a few sample fields in Dataverse and tested their behavior.
Here is how they are stored in Dataverse and displayed in the UI:
Date and Time – User Local
Stored value: 2025-08-17T02:30:00.000Z
Formatted value in UI: 8/17/2025 8:00 AM
The value is stored in UTC, but when displayed, Dataverse converts it to the current user's time zone (in our case, IST). Suitable for meetings, appointments, or events where the time matters and should be relative to the user's location.
Date and Time – Time Zone Independent
Stored value: 2025-08-17T08:00:00.000Z
Formatted value in UI: 8/17/2025 8:00 AM
Even though the backend stores it as UTC, Dataverse does not adjust this value to the user's time zone; everyone sees the same clock time. Suitable for business deadlines, store opening/closing hours, or cutoff times that should be consistent across geographies.
Date Only – Date Only
Stored value: 2025-08-17
Formatted value in UI: 8/17/2025
Stored as a pure date with no UTC conversion and no midnight timestamp; it is just the calendar date. Suitable for birthdays, anniversaries, or personal dates where time zones should never matter.
Date Only – Time Zone Independent
Stored value: 2025-08-17T00:00:00.000Z
Formatted value in UI: 8/17/2025
This looks like a datetime in storage (midnight UTC), but Dataverse locks the display so all users see the same date. Suitable for contract signed dates, expiry dates, or compliance deadlines where everyone must agree on the calendar date. We might select this option over the Date Only behavior if we may need to add a time component in the future, want consistency for all users globally, or are integrating with an external API that expects a full date and time value.
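To make the two Date and Time behaviors concrete, here is a small Python sketch (not Dataverse code, just the same arithmetic) that reproduces the stored vs. displayed values above for an IST user; the New York comparison is only there to show how User Local output changes per viewer.

from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# User Local: stored in UTC, converted to the viewing user's time zone on display.
stored_user_local = datetime(2025, 8, 17, 2, 30, tzinfo=timezone.utc)
print(stored_user_local.astimezone(ZoneInfo("Asia/Kolkata")).strftime("%m/%d/%Y %I:%M %p"))
# -> 08/17/2025 08:00 AM for an IST user; a New York user would instead see 08/16/2025 10:30 PM

# Time Zone Independent: the stored clock time is shown unchanged to every user.
stored_tz_independent = datetime(2025, 8, 17, 8, 0)
print(stored_tz_independent.strftime("%m/%d/%Y %I:%M %p"))
# -> 08/17/2025 08:00 AM for everyone, regardless of time zone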
When we automate emails in Dataverse using Power Automate, we deal with something called Activity Party. It manages the participants of an email—whether they are To, CC, BCC, or the Sender. Normally, we use the partyid field to point to a Dataverse record like a Contact, Lead, User, or Queue.
Every participant in an email is stored as a row in the Activity Party table.
Key fields include:
partyid → Reference to the actual record (Contact, Account, User, Queue, etc.)
participationtypemask → Role (1 = Sender, 2 = To, 3 = CC, 4 = BCC)
addressused → The raw email address used
Normally, if a Contact or User is referenced in partyid, Dataverse automatically pulls their primary email.
But there are situations where this is not enough. That’s where addressused becomes important.
Multiple Email Addresses on a Record – A Contact, Lead, or User might have more than one email (work, personal, secondary). By default, Dataverse always uses the primary email field. But if we need to send an email to a specific alternate address, we can set it directly in addressused.
Unresolved Recipients – There are times when we need to send an email to someone who doesn’t exist in Dataverse at all—for example, an external consultant, new partner, or temporary vendor.
Recently, we had to send an email to a particular email address that was not stored as an actual record in CRM. Below is how we specified the email address in the Add a new row (Email) action of Power Automate.
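The exact values from our flow are not reproduced here, but the Activity Parties input of the Add a new row (Email) action accepts a JSON array along these lines; the GUID and the external address below are placeholders, with the sender bound through partyid and the external recipient passed only through addressused:

[
  {
    "participationtypemask": 1,
    "partyid@odata.bind": "systemusers(00000000-0000-0000-0000-000000000000)"
  },
  {
    "participationtypemask": 2,
    "addressused": "external.consultant@example.com"
  }
]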
Sometimes we need to find all the flows where a specific Dataverse field is used — maybe before renaming it, removing it, or just checking its usage. Manually opening each flow is slow, but we can do it in seconds with SQL 4 CDS.
For example, say we want to search for flows that use the field custom_mysamplefield.
For this, we can use the below query and run it in SQL 4 CDS (XrmToolBox).
SELECT wf.name, wf.workflowid, wf.clientdata
FROM workflow wf
WHERE wf.category = 5
AND LOWER(wf.clientdata) LIKE '%custom_mysamplefield%'
Here, the workflow table stores the details of both flows and classic workflows, category = 5 refers to cloud flows, and clientdata contains the flow's JSON definition.
In our Dataverse environment, we had a field named custom_sampledate configured as a DateOnly type with User Local behavior. At some point, we changed its behavior to Time Zone Independent, assuming it would prevent confusion across time zones.
At first glance, everything seemed fine. But over time, users in time zones like New Zealand (NZ) started reporting an issue: for older records created before the change, the dates were now showing up as one day earlier than what they had originally entered.
This was because, when we changed the field’s behavior from User Local to Time Zone Independent, Dataverse stopped interpreting the date based on the user’s local time zone. Instead, it began treating the stored value exactly as-is, which caused trouble for values that were originally entered as User Local, especially from users in forward time zones like NZ.
Dataverse stores DateOnly fields as a datetime behind the scenes, with the time part set to 00:00:00.000. The behavior setting (User Local vs. Time Zone Independent) affects how this raw value is interpreted and displayed. Before the behavior was changed, NZ users (UTC+13/UTC+12) were entering dates into a User Local field. Dataverse automatically converted their local midnight time to UTC when storing it.
For example:
A NZ user enters 2025-04-03
Dataverse stores it as 2025-04-02 11:00:00 UTC
Later, when the field behavior was switched to Time Zone Independent, that same stored value was no longer adjusted for the user's time zone. It was displayed as-is as 2025-04-02, which was one day earlier than intended.
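The shift is easy to reproduce; here is a minimal Python sketch, assuming the user is in the Pacific/Auckland time zone (NZDT, UTC+13, on that date):

from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# The NZ user enters 2025-04-03 while the field is still User Local.
entered_local = datetime(2025, 4, 3, tzinfo=ZoneInfo("Pacific/Auckland"))  # midnight local time
stored_utc = entered_local.astimezone(timezone.utc)
print(stored_utc)         # 2025-04-02 11:00:00+00:00 -> what Dataverse persisted

# After the switch to Time Zone Independent, the stored value is shown as-is:
print(stored_utc.date())  # 2025-04-02 -> one day earlier than the user intended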
However, the new records entered after the change didn’t show the issue, because after the behavior was set to Time Zone Independent, any newly entered values were saved and displayed exactly as the user typed them—without conversion.
To correct this mismatch without losing data, we followed the approach below (a SQL 4 CDS sketch of the copy steps is shown after the list):
Created a temporary DateOnly field named custom_sampledate_temp, set to Time Zone Independent behavior.
Copied all values from the existing custom_sampledate field into custom_sampledate_temp.
Deleted the original custom_sampledate field (after backups).
Recreated custom_sampledate with the same schema name, but set it back to User Local behavior.
Copied data back from the temp field into the new custom_sampledate field.
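For the two copy steps, one convenient option is a bulk update in SQL 4 CDS. The sketch below assumes the field lives on a hypothetical custom_myentity table; the same copy could equally be done through Power Automate or a console app.

-- Step 2: copy the original values into the temporary Time Zone Independent field
UPDATE custom_myentity
SET custom_sampledate_temp = custom_sampledate
WHERE custom_sampledate IS NOT NULL

-- Step 5: after recreating custom_sampledate as User Local, copy the values back
UPDATE custom_myentity
SET custom_sampledate = custom_sampledate_temp
WHERE custom_sampledate_temp IS NOT NULL

-- Note: verify how your SQL 4 CDS time zone settings treat these values on a few
-- test records before running the update at scale.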
Changing a DateOnly field from User Local to Time Zone Independent might look harmless, but it can cause subtle issues, especially across global teams. We need to be careful before making this change: apart from the UI, it could also require updating the JavaScript, cloud flows, and plugins that use the field so they continue to show the correct date.