Thursday, July 24, 2025

Understanding the 10 Linked Entity Limit in Dataverse FetchXML Queries (Dynamics 365 / CRM)

If you’ve worked with Microsoft Dataverse (previously part of Dynamics CRM and now the data platform for Dynamics 365 and Power Apps), you’re likely familiar with FetchXML—a powerful XML-based query language used to retrieve data from Dataverse tables. FetchXML is particularly useful for building reports, creating custom views, and performing complex data retrieval operations in Dynamics 365 and Power Apps. However, like any technology, it comes with certain constraints. One such limitation is that Dataverse enforces a maximum of 10 linked entities in a single FetchXML query. In this post, we’ll explore what this limitation means, why it exists, and how you can design your queries to work effectively within this constraint.

What is FetchXML?

Before diving into the limitation, let’s briefly recap what FetchXML is. FetchXML is a proprietary query language developed by Microsoft for Microsoft Dataverse. It allows developers and administrators to query data across multiple tables (similar to database tables) and their relationships. Unlike SQL, which is text-based, FetchXML uses an XML structure to define queries. This makes it easy to construct queries programmatically and integrate them into Dynamics 365 applications, Power Apps, and other Power Platform solutions.

For example, a simple FetchXML query might look like this:

<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="account">
    <attribute name="name" />
    <attribute name="accountnumber" />
  </entity>
</fetch>

This query retrieves the name and accountnumber attributes from the account entity. However, FetchXML’s real power comes from its ability to link multiple entities together to retrieve related data.

The 10 Linked Entities Limitation

When building more complex queries, you might need to retrieve data from several related entities. For instance, you might want to fetch data from an account, its related contacts, the opportunities tied to those contacts, and so on. Each of these relationships is represented as a linked entity in FetchXML.

However, Dataverse imposes a hard limit: a single FetchXML query can include no more than 10 linked entities. This means you cannot link more than 10 entities together in one query, no matter how they are nested or related.
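For context, every `<link-entity>` element counts toward that limit, no matter how deeply it is nested. Below is a minimal sketch with two linked entities; the relationship attributes shown are the standard `parentcustomerid` and `parentcontactid` lookups, so adjust them to your own schema:

```xml
<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="account">
    <attribute name="name" />
    <!-- linked entity 1 of 10 -->
    <link-entity name="contact" from="parentcustomerid" to="accountid" alias="c">
      <attribute name="fullname" />
      <!-- linked entity 2 of 10 -->
      <link-entity name="opportunity" from="parentcontactid" to="contactid" alias="o">
        <attribute name="name" />
      </link-entity>
    </link-entity>
  </entity>
</fetch>
```

Add nine more `<link-entity>` elements anywhere in this tree and the query exceeds the limit and is rejected.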

It’s important to note that this limitation is specific to FetchXML. Other query methods in Dataverse, such as the QueryExpression class in the SDK, do not have this same restriction. However, since FetchXML is widely used—especially for report building and user-defined views—understanding this limitation is crucial.

Why Does This Limitation Exist?

You might be wondering why Microsoft implemented this restriction. The primary reason is performance. Each linked entity in a FetchXML query adds complexity to the underlying database query. As the number of linked entities increases, so does the potential for slow query execution, which can impact the overall performance of the CRM system.

By capping the number of linked entities at 10, Dataverse ensures that queries remain efficient and don’t consume excessive resources. This is especially important in a multi-tenant cloud environment like Dynamics 365, where system performance must be balanced across many users and organizations.

Illustrating the Limitation with an Example

To better understand the limitation, let’s consider a practical scenario. Suppose you’re building a report that requires data from the following entities:

  • account (the primary entity)
  • contact (linked to account)
  • opportunity (linked to contact)
  • lead (linked to opportunity)
  • quote (linked to opportunity)
  • order (linked to quote)
  • invoice (linked to order)
  • product (linked to invoice)
  • pricelevel (linked to product)
  • territory (linked to account)
  • systemuser (linked to territory)

In this case, you’re trying to link 11 entities together in a single FetchXML query. However, since the limit is 10 linked entities, this query would fail. You’d receive an error indicating that the query exceeds the maximum number of linked entities allowed.

Working Within the Limitation

So, what can you do if your query requires more than 10 linked entities? Here are a few strategies to work within this constraint:

1. Break Down the Query

One approach is to break your large query into smaller, more manageable queries, each with fewer than 10 linked entities. You can then combine the results programmatically or in your reporting tool. For example, you might run one query to retrieve data from the first 10 entities and a second query to retrieve data from the remaining entities, then merge the results based on a common key.
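As a sketch of that merge step, here is one way to combine two result sets on a shared key in JavaScript (the field names below are hypothetical examples, not from any specific Dataverse query):

```javascript
// Merge two query result sets on a shared key. Rows from the second
// result set are folded into matching rows from the first.
function mergeByKey(primaryRows, secondaryRows, key) {
  // Index the second result set by the common key for O(1) lookups
  const lookup = new Map(secondaryRows.map(row => [row[key], row]));
  return primaryRows.map(row => ({ ...row, ...(lookup.get(row[key]) || {}) }));
}

// Hypothetical results from two separate FetchXML queries
const firstQuery = [{ accountid: "A1", name: "Contoso" }];
const secondQuery = [{ accountid: "A1", territoryname: "West" }];
const merged = mergeByKey(firstQuery, secondQuery, "accountid");
// merged[0] now holds fields from both queries
```

The same idea works in C#, SSRS, or Power BI: run each sub-query within the 10-link limit, then join the pieces on the shared key.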

2. Use Multiple Queries with Filters

Another option is to use multiple queries with filters based on the results of previous queries. For instance, you could first query the account and its related entities up to the 10th link, then use the results to filter a second query that retrieves data from the additional entities.
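One way to sketch that second, filtered query: collect the key values returned by the first query and inject them into an `in` condition of the next FetchXML. The attribute name and GUIDs below are illustrative:

```javascript
// Build an "in" condition from the key values returned by a first query,
// ready to splice into the <filter> of a follow-up FetchXML query.
function buildInFilter(attribute, ids) {
  const values = ids.map(id => `  <value>${id}</value>`).join("\n");
  return `<condition attribute="${attribute}" operator="in">\n${values}\n</condition>`;
}

const idsFromFirstQuery = ["guid-1", "guid-2"]; // hypothetical GUIDs
const condition = buildInFilter("accountid", idsFromFirstQuery);
// condition can now be placed inside the second query's <filter> element
```

Keep an eye on the number of values: very large `in` lists make the query string long, so for tens of thousands of keys, batching the second query is usually the better option.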

3. Leverage Other Query Methods

If FetchXML’s limitation is too restrictive for your scenario, consider using other query options available in Microsoft Dataverse. For example, the QueryExpression class in the Dataverse SDK for .NET allows developers to build complex queries programmatically and provides greater flexibility for data retrieval. However, this approach requires custom development and is typically used in plugins, custom integrations, or server-side code rather than in standard views or reports within Dynamics 365 or Power Apps.

4. Optimize Your Data Model

In some cases, requiring more than 10 linked entities may indicate that your Dataverse data model could be optimized. Review your table relationships and evaluate whether the number of joins required in your queries can be reduced. For example, restructuring relationships, using calculated or rollup columns, or introducing supporting tables can help simplify queries and improve performance.

Conclusion

The 10 linked entities limitation in Dataverse FetchXML queries is an important constraint to consider when designing complex data retrieval operations. Although it may appear restrictive, the limitation helps maintain platform performance, query efficiency, and overall system stability. By understanding why this constraint exists and designing solutions that work within it—such as simplifying queries, using alternative query approaches, or optimizing your data model—you can build scalable and efficient solutions on the Power Platform.

While FetchXML remains a powerful and widely used query language in Microsoft Dataverse, it is not the only option for retrieving data. Developers can also use tools such as the Dataverse SDK, Web API, or QueryExpression depending on the scenario. For many common use cases in Dynamics 365 and Power Apps, however, working within the 10 linked entities limit is both practical and effective.

Saturday, June 28, 2025

Bulk Updating Over 20,000 Records in Dataverse with ExecuteMultipleRequest: Step-by-Step Implementation

Mastering Dataverse Updates with ExecuteMultipleRequest: A Modern C# Guide

Recently, in a WhatsApp group I’m part of, someone reached out with a challenge: "With pagination, I want to update 20,511 records using Power Automate, but I am not able to do it more than 5,000. Anyone has any idea how to implement?" It’s a classic problem when working with Microsoft Dataverse—Power Automate, while great for many tasks, hits a wall with its pagination limits. In this case, the 5,000-record cap was stopping them cold, even with pagination tweaks.

Power Automate’s low-code simplicity is fantastic for small-to-medium workflows, but for bulk updates like this, it’s not the best fit. The pagination limit forces you into processing records in tiny batches, slowing things down and complicating the flow. For 20,511 records, that’s a lot of hoops to jump through—and a lot of time wasted.

My advice? Ditch Power Automate for this job and use a C# console application with ExecuteMultipleRequest. This Dataverse SDK method lets you bundle up to 1,000 operations into a single request, slashing the number of API calls and turbocharging performance. It’s a better choice because:

  • No Pagination Hassles: Unlike Power Automate, ExecuteMultipleRequest doesn’t choke on large datasets—you can process all 20,511 records without breaking them into 5,000-record chunks.
  • Faster Execution: Batching operations reduces server round trips, making it way quicker for big updates.
  • More Control: A console app lets you customize error handling, logging, and logic—things Power Automate struggles to match.

In this post, I’ll show you how to build that console app step-by-step, complete with code snippets and tips. Let’s get those records updated efficiently!

What You’ll Need

Before we start coding, gather these essentials:

  • Dataverse Environment: Access to a Dataverse instance with a table (e.g., account) containing records to update.
  • Azure AD App Registration: A registered app with client ID, client secret, and tenant ID, plus Dataverse permissions.
  • .NET Development Setup: Visual Studio or VS Code. Note that the CrmSdk packages below target .NET Framework (4.6.2 or later); for .NET Core/.NET 6+ projects, the newer Microsoft.PowerPlatform.Dataverse.Client package is the supported route.
  • NuGet Packages:
    • Microsoft.CrmSdk.CoreAssemblies
    • Microsoft.CrmSdk.XrmTooling.CoreAssembly
    • Microsoft.IdentityModel.Clients.ActiveDirectory

Step-by-Step: Building the Console App

Step 1: Create the Console Project

Fire up your terminal and create a new C# console app, then add the necessary NuGet packages.

dotnet new console -n DataverseBatchUpdater
cd DataverseBatchUpdater
dotnet add package Microsoft.CrmSdk.CoreAssemblies --version 9.0.2.34
dotnet add package Microsoft.CrmSdk.XrmTooling.CoreAssembly --version 9.1.0.68
dotnet add package Microsoft.IdentityModel.Clients.ActiveDirectory --version 5.3.0

Step 2: Set Up Authentication

To connect to Dataverse, configure authentication using your Azure AD app credentials. Open Program.cs and add these constants, replacing placeholders with your values.

private static readonly string dataverseUrl = "https://yourorg.crm.dynamics.com";
private static readonly string clientId = "your-client-id";
private static readonly string clientSecret = "your-client-secret";
private static readonly string tenantId = "your-tenant-id";
private static readonly string authority = $"https://login.microsoftonline.com/{tenantId}";

We’ll use these to authenticate via the Dataverse SDK’s CrmServiceClient.

Step 3: Authenticate with Dataverse

Add a method to establish a connection to Dataverse using the client credentials flow.

private static async Task<CrmServiceClient> GetCrmServiceClient()
{
    var authContext = new AuthenticationContext(authority);
    var clientCred = new ClientCredential(clientId, clientSecret);
    var authResult = await authContext.AcquireTokenAsync(dataverseUrl, clientCred);

    var connectionString = $"AuthType=ClientSecret;Url={dataverseUrl};ClientId={clientId};ClientSecret={clientSecret};RequireNewInstance=true";
    var serviceClient = new CrmServiceClient(connectionString);

    if (serviceClient.IsReady)
    {
        return serviceClient;
    }
    else
    {
        throw new Exception($"Connection failed: {serviceClient.LastCrmError}");
    }
}

Step 4: Prepare Records for Update

To use ExecuteMultipleRequest, we need a list of records to update. For this example, we’ll fetch accounts and modify their name field. Here’s how to query records (we’ll assume you have a pre-populated list or fetch them separately).

// Example: Fetch records (replace with your data source)
var accounts = new List<Entity>();
for (int i = 1; i <= 20511; i++)
{
    var account = new Entity("account") { Id = Guid.NewGuid() };
    account["name"] = $"Account_{i}";
    accounts.Add(account);
}

In practice, you’d retrieve these using RetrieveMultiple, but we’ll focus on the update logic here.

Step 5: Implement ExecuteMultipleRequest

Now, the star of the show: ExecuteMultipleRequest. We’ll process records in batches of 1,000 (the max allowed per request).

private static async Task UpdateRecordsAsync(IOrganizationService service, List<Entity> accounts)
{
    int batchSize = 1000;
    int totalUpdated = 0;

    for (int i = 0; i < accounts.Count; i += batchSize)
    {
        var requestCollection = new ExecuteMultipleRequest
        {
            Settings = new ExecuteMultipleSettings
            {
                ContinueOnError = true, // Keep going if one update fails
                ReturnResponses = true  // Get feedback on each operation
            },
            Requests = new OrganizationRequestCollection()
        };

        int endIndex = Math.Min(i + batchSize, accounts.Count);
        for (int j = i; j < endIndex; j++)
        {
            var entity = accounts[j];
            entity["name"] = $"{entity.GetAttributeValue<string>("name")}_Updated";
            var updateRequest = new UpdateRequest { Target = entity };
            requestCollection.Requests.Add(updateRequest);
        }

        var response = (ExecuteMultipleResponse)service.Execute(requestCollection);
        totalUpdated += requestCollection.Requests.Count;

        foreach (var resp in response.Responses)
        {
            if (resp.Fault != null)
            {
                Console.WriteLine($"Error at index {i + resp.RequestIndex}: {resp.Fault.Message}");
            }
        }

        Console.WriteLine($"Updated {requestCollection.Requests.Count} records. Total: {totalUpdated}");
        await Task.Delay(500); // Throttling prevention
    }

    Console.WriteLine($"Finished! Total records updated: {totalUpdated}");
}

Tip 1: Batch Size Matters

Stick to 1,000 records per batch—Dataverse’s upper limit for ExecuteMultipleRequest. Smaller batches work but slow you down unnecessarily.
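The arithmetic behind this tip can be sketched in a few lines (JavaScript purely for illustration; the record count is the 20,511 from the scenario above):

```javascript
// Number of ExecuteMultipleRequest calls needed for a given record count
// and batch size: one request per batch, rounded up.
function batchCount(totalRecords, batchSize) {
  return Math.ceil(totalRecords / batchSize);
}

batchCount(20511, 1000); // 21 requests at the 1,000-record maximum
batchCount(20511, 100);  // 206 requests at a needlessly small batch size
```

Ten times the batch size means roughly a tenth of the server round trips, which is where the speedup comes from.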

Step 6: Tie It All Together

Update Main to connect and run the update.

static async Task Main(string[] args)
{
    try
    {
        var service = await GetCrmServiceClient();
        Console.WriteLine("Connected to Dataverse!");

        // Replace with your record-fetching logic
        var accounts = new List<Entity>();
        for (int i = 1; i <= 20511; i++)
        {
            var account = new Entity("account") { Id = Guid.NewGuid() };
            account["name"] = $"Account_{i}";
            accounts.Add(account);
        }

        await UpdateRecordsAsync(service, accounts);
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Oops! {ex.Message}");
    }
}

Step 7: Run and Verify

Build and run your app, then check Dataverse to confirm the updates.

dotnet run

Look for console output like “Updated 1000 records” and verify in Dataverse that account names now end with "_Updated".
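If you prefer to verify with a query, here is a hedged FetchXML sketch that samples accounts carrying the suffix (note that `_` is itself a single-character wildcard inside a `like` condition, so this matches any character before `Updated`):

```xml
<fetch mapping="logical" count="10">
  <entity name="account">
    <attribute name="name" />
    <filter>
      <condition attribute="name" operator="like" value="%_Updated" />
    </filter>
  </entity>
</fetch>
```

Run it in the FetchXML Builder in XrmToolBox or via the Web API to spot-check a handful of updated rows.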

The Full Code

Here’s the complete C# console app. Replace placeholders with your Dataverse and Azure AD details.

using Microsoft.Crm.Sdk.Messages;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Tooling.Connector;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace DataverseBatchUpdater
{
    class Program
    {
        private static readonly string dataverseUrl = "https://yourorg.crm.dynamics.com";
        private static readonly string clientId = "your-client-id";
        private static readonly string clientSecret = "your-client-secret";
        private static readonly string tenantId = "your-tenant-id";
        private static readonly string authority = $"https://login.microsoftonline.com/{tenantId}";

        static async Task Main(string[] args)
        {
            try
            {
                var service = await GetCrmServiceClient();
                Console.WriteLine("Connected to Dataverse!");

                // Simulated record list (replace with your fetch logic)
                var accounts = new List<Entity>();
                for (int i = 1; i <= 20511; i++)
                {
                    var account = new Entity("account") { Id = Guid.NewGuid() };
                    account["name"] = $"Account_{i}";
                    accounts.Add(account);
                }

                await UpdateRecordsAsync(service, accounts);
            }
            catch (Exception ex)
            {
                Console.WriteLine($"Oops! {ex.Message}");
            }
        }

        private static async Task<CrmServiceClient> GetCrmServiceClient()
        {
            var authContext = new AuthenticationContext(authority);
            var clientCred = new ClientCredential(clientId, clientSecret);
            var authResult = await authContext.AcquireTokenAsync(dataverseUrl, clientCred);

            var connectionString = $"AuthType=ClientSecret;Url={dataverseUrl};ClientId={clientId};ClientSecret={clientSecret};RequireNewInstance=true";
            var serviceClient = new CrmServiceClient(connectionString);

            if (serviceClient.IsReady)
            {
                return serviceClient;
            }
            else
            {
                throw new Exception($"Connection failed: {serviceClient.LastCrmError}");
            }
        }

        private static async Task UpdateRecordsAsync(IOrganizationService service, List<Entity> accounts)
        {
            int batchSize = 1000;
            int totalUpdated = 0;

            for (int i = 0; i < accounts.Count; i += batchSize)
            {
                var requestCollection = new ExecuteMultipleRequest
                {
                    Settings = new ExecuteMultipleSettings
                    {
                        ContinueOnError = true,
                        ReturnResponses = true
                    },
                    Requests = new OrganizationRequestCollection()
                };

                int endIndex = Math.Min(i + batchSize, accounts.Count);
                for (int j = i; j < endIndex; j++)
                {
                    var entity = accounts[j];
                    entity["name"] = $"{entity.GetAttributeValue<string>("name")}_Updated";
                    var updateRequest = new UpdateRequest { Target = entity };
                    requestCollection.Requests.Add(updateRequest);
                }

                var response = (ExecuteMultipleResponse)service.Execute(requestCollection);
                totalUpdated += requestCollection.Requests.Count;

                foreach (var resp in response.Responses)
                {
                    if (resp.Fault != null)
                    {
                        Console.WriteLine($"Error at index {i + resp.RequestIndex}: {resp.Fault.Message}");
                    }
                }

                Console.WriteLine($"Updated {requestCollection.Requests.Count} records. Total: {totalUpdated}");
                await Task.Delay(500); // Avoid throttling
            }

            Console.WriteLine($"Finished! Total records updated: {totalUpdated}");
        }
    }
}

Pro Tips for Success

Tip 1: Maximize Batch Efficiency

Use the full 1,000-record capacity of ExecuteMultipleRequest to minimize requests and speed up execution.

Tip 2: Dodge Throttling Limits

Add a small delay (e.g., await Task.Delay(500)) between batches to avoid hitting Dataverse’s throttling limits.

Tip 3: Robust Error Handling

Log errors from response.Responses to a file or table for post-run analysis, ensuring no issues slip through.

Wrapping Up

With ExecuteMultipleRequest, updating large datasets in Dataverse becomes a breeze. This C# console app showcases how to harness batch processing for efficiency and reliability. Try it out, tweak it to your needs, and watch your Dataverse updates soar!

Friday, June 27, 2025

How to Retrieve More Than 5000 Records Using FetchXML and Power Automate

Power Automate: Paging Through 5000+ Records with FetchXML

When Dataverse limits you to 5,000 records per page, Power Automate gives you the tools to break past that limit—without writing C# or JavaScript. In this guide, you’ll create a Flow that loops through large datasets using FetchXML paging logic.

When Should You Use Power Automate for This?

  • You want to export large record sets to Excel, SharePoint, SQL, or external APIs
  • You want to schedule this data sync (e.g., nightly)
  • You don’t want to maintain plugin code or JS logic
  • You need a low-code way to pull 10k+, 50k+ records from Dataverse
💡 Pro Tip: Power Automate handles up to 100k rows using paging in loops. Just be sure to avoid infinite loops or huge memory loads!

Step 1: Create Your Base FetchXML

<fetch mapping="logical" count="5000">
  <entity name="account">
    <attribute name="accountid" />
    <attribute name="name" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>

Step 2: Build the Flow

  1. Create a Manual trigger or use a Scheduled trigger
  2. Initialize these variables:
    • pagingCookie (string, empty)
    • pageNumber (int, default 1)
    • moreRecords (boolean, true)
    • recordsArray (array, empty)
  3. Add a Do Until loop:
    • Condition: moreRecords is equal to false

Step 3: Build Dynamic FetchXML with Pagination

Use a Compose action inside the loop to generate this:

<fetch mapping="logical" count="5000" page="@{variables('pageNumber')}"@{if(empty(variables('pagingCookie')), '', concat(' paging-cookie="', variables('pagingCookie'), '"'))}>
  <entity name="account">
    <attribute name="accountid" />
    <attribute name="name" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
Tip: Use concat() with if(empty(...)) to add the paging-cookie attribute only when a cookie exists—if() expects a boolean condition, so test the variable with empty() rather than passing the string directly.

Step 4: Fetch and Merge Results

Inside the loop:

  • Add a “List rows using FetchXML” action (Dataverse connector)
  • Pass in the dynamic FetchXML from Compose
  • Loop over outputs('List_rows_using_FetchXML')?['body']?['value'] with an “Apply to each”, and inside it use “Append to array variable” to add items('Apply_to_each') to recordsArray (“Append to array variable” adds one item per call, so append the items, not the whole array)

Step 5: Handle Paging State

  • Update moreRecords using an expression like:
@if(contains(outputs('List_rows_using_FetchXML')?['body'], '@odata.nextLink'), true, false)
  • Set pagingCookie from @odata.nextLink by parsing the query string (use substring, split, or parse approach)
  • Set pageNumber to add(variables('pageNumber'), 1)
⚠️ Warning: PagingCookie is encoded in @odata.nextLink. You may need to decode or extract from the query string using split() or uriComponentToString().
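The extraction that warning describes can be sketched in plain JavaScript (the `fetchXmlPagingCookie` parameter name mirrors the pattern used elsewhere in this post and is an assumption about the link format):

```javascript
// Pull the paging cookie out of an @odata.nextLink-style URL by splitting
// the query string and URL-decoding the cookie value.
function extractPagingCookie(nextLink) {
  const query = nextLink.split("?")[1] || "";
  const part = query
    .split("&")
    .find(p => p.startsWith("fetchXmlPagingCookie="));
  // substring (not split("=")) so "=" characters inside the cookie survive
  return part
    ? decodeURIComponent(part.substring("fetchXmlPagingCookie=".length))
    : null;
}

// Hypothetical nextLink with an encoded cookie fragment
const link =
  "https://org.crm.dynamics.com/api/data/v9.2/accounts?fetchXmlPagingCookie=%3Ccookie%20page%3D%221%22%2F%3E";
const cookie = extractPagingCookie(link);
// cookie === '<cookie page="1"/>'
```

In the flow itself, the same logic maps onto split() and uriComponentToString() expressions.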

Step 6: Post-Loop Actions

Once the loop is done, your recordsArray contains ALL records. You can now:

  • Loop over them to create SharePoint rows
  • Insert into SQL or external APIs
  • Dump them into Excel via Create Table + Add a Row

Real-World Use Cases

  • Export nightly case updates to Excel
  • Push 100k record snapshots to Data Lake
  • Sync contacts to downstream email platform

Advanced Ideas

  • Convert flow into a Custom Connector wrapper with retry control
  • Trigger flow from a Model-Driven App button using HTTP Request trigger
  • Store the base FetchXML in Dataverse (e.g., an environment variable) for reuse, and enable Secure Inputs/Outputs on actions that handle sensitive data

How to Retrieve More Than 5000 Records Using FetchXML and Dynamics 365 Plugin

Server-Side FetchXML Paging Using a Dynamics 365 Plugin

If you're working server-side inside a Plugin or Custom Workflow Activity, you can bypass client-side paging limits and fully control how many records to pull using FetchXML and paging cookies.

When to Use a Plugin for This

  • Fetching large datasets for processing on the server
  • Validating or aggregating records during business logic
  • Keeping logic secure and non-reversible from the client
  • Integrating real-time, record-heavy automations

Step 1: Base FetchXML Structure

Start with this structure:

<fetch mapping='logical' count='5000'>
  <entity name='account'>
    <attribute name='accountid' />
    <attribute name='name' />
    <filter>
      <condition attribute='statecode' operator='eq' value='0' />
    </filter>
  </entity>
</fetch>
💡 Tip: Keep your base FetchXML free of paging logic—this will be dynamically injected.

Step 2: Plugin Implementation

This plugin fetches all active account records, 5000 at a time, using the PagingCookie and MoreRecords flags.

public class FetchAllAccountsPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        ITracingService tracer = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
        IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        IOrganizationService service = factory.CreateOrganizationService(context.UserId);

        string pagingCookie = null;
        int pageNumber = 1;
        int fetchCount = 5000;
        bool moreRecords = true;
        List<Entity> allAccounts = new List<Entity>();

        while (moreRecords)
        {
            string fetchXml = CreateFetchXml(pagingCookie, pageNumber, fetchCount);
            EntityCollection results = service.RetrieveMultiple(new FetchExpression(fetchXml));

            allAccounts.AddRange(results.Entities);
            tracer.Trace($"Fetched page {pageNumber}: {results.Entities.Count} records");

            moreRecords = results.MoreRecords;
            if (moreRecords)
            {
                pagingCookie = results.PagingCookie;
                pageNumber++;
            }
        }

        tracer.Trace($"✅ Total records fetched: {allAccounts.Count}");
    }

    private string CreateFetchXml(string pagingCookie, int pageNumber, int fetchCount)
    {
        return $@"
<fetch mapping='logical' page='{pageNumber}' count='{fetchCount}'{(pagingCookie != null ? $" paging-cookie='{System.Security.SecurityElement.Escape(pagingCookie)}'" : "")}>
  <entity name='account'>
    <attribute name='accountid' />
    <attribute name='name' />
    <filter>
      <condition attribute='statecode' operator='eq' value='0' />
    </filter>
  </entity>
</fetch>";
    }
}
⚠️ Watch out: Don’t forget to escape the pagingCookie! Use System.Security.SecurityElement.Escape() to avoid breaking XML syntax.

Bonus Tips

  • Use an async plugin or Custom API for long-running tasks
  • Consider streaming results or batching when working with >50,000 records
  • For external systems, push the results into Azure or a Data Export Service

Registration Ideas

This plugin can be triggered via:

  • A Custom API (for on-demand calls)
  • A Workflow or Power Automate Flow
  • Or bound to Entity Events like Create/Update (with filters)

How to Retrieve More Than 5000 Records Using FetchXML and JavaScript

Fetching Over 5000 Records with JavaScript and FetchXML

By default, FetchXML caps results at 5000 per page. But with JavaScript, you can paginate through results using the paging-cookie to retrieve all matching records. Let’s break down how.

Why Paging is Necessary

  • FetchXML will never return more than 5000 rows in a single page.
  • To get beyond 5K, we need to paginate using a paging-cookie from the server response.
  • This involves calling Xrm.WebApi.retrieveMultipleRecords recursively.

Step-by-Step Approach

1. Prepare Your Core FetchXML Body

Start with just the inner <entity> part—leave out the <fetch> wrapper.

<entity name="account">
  <attribute name="name" />
  <attribute name="accountnumber" />
  <filter>
    <condition attribute="statecode" operator="eq" value="0" />
  </filter>
</entity>

2. Use JavaScript to Retrieve All Records

This full function handles paging, cookie reuse, throttling (429 errors), and returns all your records as one big array.

function retrieveAllWithFetchXml(entityName, fetchBody, pageSize = 5000) {
  let allRecords = [];
  let pageNumber = 1;
  let pagingCookie = null;
  let retryCount = 0;

  function delay(ms) {
    return new Promise(res => setTimeout(res, ms));
  }

  function getPage() {
    const fetchXml = [
      `<fetch mapping="logical" page="${pageNumber}" count="${pageSize}"`,
      pagingCookie ? ` paging-cookie='${pagingCookie}'` : "",
      `>`,
        fetchBody,
      `</fetch>`
    ].join("");

    const query = `?fetchXml=${encodeURIComponent(fetchXml)}`;

    return Xrm.WebApi.retrieveMultipleRecords(entityName, query)
      .then(result => {
        allRecords = allRecords.concat(result.entities);

        if (result["@odata.nextLink"]) {
          const match = result["@odata.nextLink"]
            .match(/fetchXmlPagingCookie=([^&]+)/);
          if (!match) {
            // No recognizable paging cookie in the link; stop paging
            return allRecords;
          }
          pagingCookie = decodeURIComponent(match[1]);
          pageNumber++;
          retryCount = 0;
          return getPage();
        }

        return allRecords;
      })
      .catch(err => {
        if (err.status === 429 && retryCount < 5) {
          retryCount++;
          return delay(Math.pow(2, retryCount) * 500).then(getPage);
        }
        return Promise.reject(err);
      });
  }

  return getPage();
}

3. Sample Usage

const fetchBody = `
  <entity name="account">
    <attribute name="name" />
    <attribute name="accountnumber" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>`;

retrieveAllWithFetchXml("account", fetchBody)
  .then(records => console.log("Total:", records.length))
  .catch(err => console.error(err));
💡 Pro Tip: Always keep your FetchXML small enough to avoid URL length limits. For massive queries, send the FetchXML in a POST-based request (e.g., a $batch call) or use a server-side plugin.

⚠️ Watch out for:
  • Server throttling: implement retry logic
  • URL encoding: always encode your FetchXML before passing it into a GET call
  • Link-entities: keep your inner FetchXML clean and modular

Wrapping Up

With this approach, you’ve got full control over paging behavior, and can efficiently retrieve tens of thousands of records from a model-driven app or Dynamics 365 using just JavaScript.

Thursday, June 26, 2025

Moving BPF Stages Backward in Dynamics 365 with JavaScript

A step-by-step guide to dynamically control BPF stages using client-side JavaScript

Introduction

Business Process Flows (BPFs) in Microsoft Dynamics 365 and Dataverse guide users through predefined stages, but moving backward isn’t supported natively in the BPF designer. For example, moving from stage 2 to stage 1 based on a field value like "No" requires custom logic. JavaScript, executed as a Web Resource in a model-driven app form, provides a client-side solution to update the activestageid field. In this post, we’ll show you how to create a JavaScript Web Resource to move BPF stages dynamically.

Why This Matters: JavaScript enables real-time, client-side BPF stage control, offering immediate feedback to users without server-side dependencies.

The Requirement

For an account entity with a BPF named "Account Process Flow," we need to:

  • Monitor a choice field (new_decisionfield).
  • If "Yes," move from stage 2 to stage 3.
  • If "No," move back to stage 1.

We’ll use JavaScript to detect field changes and update the BPF instance’s activestageid using the Web API.

Step-by-Step Guide

Step 1: Gather Required Information

Before writing the JavaScript, collect these details:

  • BPF Stage IDs: Find the GUIDs for stage 1 and stage 3 in the processstages table using XrmToolBox or FetchXML. Example:
    • Stage 1: 12345678-1234-1234-1234-1234567890ab
    • Stage 3: 98765432-4321-4321-4321-9876543210ba
  • BPF Instance Table: Identify the logical name of the BPF table (e.g., accountprocessflow).
  • Field Values: Confirm the new_decisionfield OptionSet values (Yes = 100000000, No = 100000001).

Step 2: Create the JavaScript Web Resource

Create a JavaScript file to handle the onchange event of new_decisionfield and update the BPF stage:

var BPFStageMover = BPFStageMover || {};

BPFStageMover.moveBPFStage = function (executionContext) {
    var formContext = executionContext.getFormContext();

    // Get the decision field value
    var decisionField = formContext.getAttribute("new_decisionfield");
    if (!decisionField) {
        console.log("Decision field not found.");
        return;
    }

    var decisionValue = decisionField.getValue();
    if (decisionValue === null) {
        console.log("Decision field is null.");
        return;
    }

    // Define stage GUIDs (replace with actual stage IDs)
    var stage1Id = "12345678-1234-1234-1234-1234567890ab"; // Stage 1 GUID
    var stage3Id = "98765432-4321-4321-4321-9876543210ba"; // Stage 3 GUID

    // Determine new stage based on decision value
    var newStageId = decisionValue === 100000000 ? stage3Id : stage1Id; // Yes: 100000000, No: 100000001

    // Get the account ID
    var accountId = formContext.data.entity.getId().replace(/[{}]/g, "");

    // Retrieve the BPF instance
    // Lookup columns are selected/filtered with the _<name>_value form in the Web API
    Xrm.WebApi.retrieveMultipleRecords("accountprocessflow", `?$select=_activestageid_value&$filter=_regardingobjectid_value eq ${accountId}`)
        .then(
            function success(result) {
                if (result.entities.length === 0) {
                    console.log("No BPF instance found for account: " + accountId);
                    return;
                }

                var bpfInstance = result.entities[0];
                var currentStageId = bpfInstance["_activestageid_value"];

                // Only update if the stage is different
                if (currentStageId !== newStageId) {
                    var updateData = {
                        "activestageid@odata.bind": `/processstages(${newStageId})`
                    };

                    // Update the BPF instance
                    Xrm.WebApi.updateRecord("accountprocessflow", bpfInstance.accountprocessflowid, updateData)
                        .then(
                            function success() {
                                console.log("BPF stage updated to: " + newStageId);
                                formContext.data.refresh(true); // Refresh form to reflect stage change
                            },
                            function error(error) {
                                console.error("Error updating BPF stage: " + error.message);
                                Xrm.Navigation.openAlertDialog({ text: "Error updating BPF stage: " + error.message });
                            }
                        );
                } else {
                    console.log("No stage change needed.");
                }
            },
            function error(error) {
                console.error("Error retrieving BPF instance: " + error.message);
                Xrm.Navigation.openAlertDialog({ text: "Error retrieving BPF instance: " + error.message });
            }
        );
};

Step 3: Register the Web Resource

Add the JavaScript to your Dynamics 365 form:

  1. Upload Web Resource:
    • In your solution, create a Web Resource named new_moveBPFStage (Type: JavaScript).
    • Upload the moveBPFStage.js file.
  2. Add to Form:
    • Open the account form in the form editor.
    • Add new_moveBPFStage to the form libraries.
    • Register the BPFStageMover.moveBPFStage function on the onchange event of new_decisionfield, passing the execution context.
  3. Publish: Save and publish the form and solution.

Step 4: Test the JavaScript

Open an account record, change new_decisionfield to "Yes" or "No," and verify the BPF stage moves to stage 3 or stage 1. Use browser developer tools (F12) to check console logs for errors.

Pro Tip: Use the browser’s developer tools to debug JavaScript errors. Add console logs for key variables (e.g., decisionValue, currentStageId) to trace execution.

Key Notes

  • Replace GUIDs: Update stage GUIDs in the JavaScript with actual values from your processstages table.
  • BPF Table: Use the correct logical name for your BPF instance table (e.g., accountprocessflow).
  • Option Set Values: Adjust 100000000 (Yes) and 100000001 (No) to match your field’s values.
  • Permissions: Ensure the user has read/write access to the BPF instance table.

Conclusion

Using JavaScript in a Dynamics 365 form, you can dynamically move BPF stages backward or forward, overcoming the designer’s limitations. This client-side approach provides real-time feedback and is ideal for interactive user experiences. Try it out, and share your feedback or questions in the comments!

© 2025 Your Name. All rights reserved.

Moving BPF Stages Backward in Dynamics 365 with Power Automate

A step-by-step guide to dynamically control BPF stages using Power Automate

Introduction

Business Process Flows (BPFs) in Microsoft Dynamics 365 and Dataverse guide users through predefined stages, but moving backward isn’t supported natively in the BPF designer or connectors. For example, moving from stage 2 to stage 1 based on a field value like "No" requires custom logic. Power Automate offers a low-code solution to achieve this by updating the activestageid field. In this post, we’ll show you how to create a flow that moves a BPF stage backward or forward based on a field value.

Why This Matters: Power Automate enables non-developers to implement complex BPF logic, making it accessible and maintainable without writing code.

The Requirement

For an account entity with a BPF named "Account Process Flow," we need to:

  • Monitor a choice field (new_decisionfield).
  • If "Yes," move from stage 2 to stage 3.
  • If "No," move back to stage 1.

Since the BPF designer and standard connectors only support forward movement, we’ll use Power Automate to update the BPF instance’s activestageid.

Step-by-Step Guide

Step 1: Gather Required Information

Before building the flow, collect these details:

  • BPF Stage IDs: Find the GUIDs for stage 1 and stage 3 in the processstages table using XrmToolBox or FetchXML. Example:
    • Stage 1: 12345678-1234-1234-1234-1234567890ab
    • Stage 3: 98765432-4321-4321-4321-9876543210ba
  • BPF Instance Table: Identify the logical name (e.g., accountprocessflows).
  • Field Values: Confirm the new_decisionfield OptionSet values (Yes = 100000000, No = 100000001).

Step 2: Set Up Power Automate

Create a flow in Power Automate:

  1. Create a Flow: Go to make.powerautomate.com and create a new automated cloud flow.
  2. Add Trigger: Use "When a row is added, modified, or deleted" (Dataverse connector):
    • Table name: Accounts
    • Change type: Modified
    • Filter attributes: new_decisionfield
  3. Add Condition: Check if new_decisionfield equals 100000000 (Yes) or 100000001 (No).
  4. Retrieve BPF Instance: Add a "List rows" action:
    • Table name: accountprocessflows
    • Filter rows: _regardingobjectid_value eq @{triggerOutputs()?['body/accountid']}
    • Select columns: _activestageid_value,accountprocessflowid
  5. Update BPF Stage (lookup columns in "Update a row" take the entityset(GUID) form):
    • Yes Branch: Add "Update a row" action:
      • Table name: accountprocessflows
      • Row ID: @{first(body('List_rows')?['value'])['accountprocessflowid']}
      • Active Stage: processstages(98765432-4321-4321-4321-9876543210ba) (Stage 3)
    • No Branch: Add "Update a row" action:
      • Table name: accountprocessflows
      • Row ID: @{first(body('List_rows')?['value'])['accountprocessflowid']}
      • Active Stage: processstages(12345678-1234-1234-1234-1234567890ab) (Stage 1)

Step 3: Sample Flow Configuration

Below is a JSON representation of the Power Automate flow for reference (you can import this as a starting point):

{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Condition": {
        "actions": {
          "Yes_Update_Stage_3": {
            "type": "ApiConnection",
            "inputs": {
              "host": {
                "connection": {
                  "name": "@parameters('$connections')['dataverse']['connectionId']"
                }
              },
              "method": "patch",
              "path": "/v9.2/accountprocessflows(@{first(body('List_BPF_Instance')?['value'])['accountprocessflowid']})",
              "body": {
                "activestageid@odata.bind": "/processstages(98765432-4321-4321-4321-9876543210ba)"
              }
            }
          }
        },
        "else": {
          "actions": {
            "No_Update_Stage_1": {
              "type": "ApiConnection",
              "inputs": {
                "host": {
                  "connection": {
                    "name": "@parameters('$connections')['dataverse']['connectionId']"
                  }
                },
                "method": "patch",
                "path": "/v9.2/accountprocessflows(@{first(body('List_BPF_Instance')?['value'])['accountprocessflowid']})",
                "body": {
                  "activestageid@odata.bind": "/processstages(12345678-1234-1234-1234-1234567890ab)"
                }
              }
            }
          }
        },
        "expression": {
          "and": [
            {
              "equals": [
                "@triggerOutputs()?['body/new_decisionfield']",
                100000000
              ]
            }
          ]
        },
        "type": "If"
      },
      "List_BPF_Instance": {
        "type": "ApiConnection",
        "inputs": {
          "host": {
            "connection": {
              "name": "@parameters('$connections')['dataverse']['connectionId']"
            }
          },
          "method": "get",
          "path": "/v9.2/accountprocessflows",
          "queries": {
            "$filter": "_regardingobjectid_value eq @{triggerOutputs()?['body/accountid']}",
            "$select": "_activestageid_value,accountprocessflowid"
          }
        }
      }
    },
    "triggers": {
      "When_a_row_is_modified": {
        "type": "ApiConnection",
        "inputs": {
          "host": {
            "connection": {
              "name": "@parameters('$connections')['dataverse']['connectionId']"
            }
          },
          "method": "get",
          "path": "/v9.2/rows",
          "queries": {
            "table": "accounts",
            "changeType": "Modified",
            "filterAttributes": "new_decisionfield"
          }
        }
      }
    }
  }
}

Step 4: Test the Flow

Save and test the flow by updating new_decisionfield on an account record. Verify the BPF stage moves to stage 3 (Yes) or stage 1 (No). Check the flow’s run history for errors.

Pro Tip: Add an error-handling step (e.g., send an email if no BPF instance is found) to make the flow more robust. Use the flow’s run history to debug issues.

Key Notes

  • Replace GUIDs: Update stage GUIDs in the flow with actual values from your processstages table.
  • BPF Table: Use the correct logical name for your BPF instance table (e.g., accountprocessflows).
  • Option Set Values: Adjust 100000000 (Yes) and 100000001 (No) to match your field’s values.
  • Permissions: Ensure the flow’s connection has permissions to read and update accountprocessflows records.

Conclusion

Power Automate provides a low-code way to move BPF stages backward in Dynamics 365, overcoming the designer’s forward-only limitation. This solution is ideal for non-developers and offers flexibility for dynamic process control. Try it out, and share your feedback or questions in the comments!


Moving Business Process Flow Stages Backward in Dynamics 365 with a Plugin

A step-by-step guide to programmatically move BPF stages backward in Dataverse using a C# plugin

Introduction

In Microsoft Dynamics 365 and Dataverse, Business Process Flows (BPFs) guide users through predefined stages, typically moving forward. But what if you need to move backward? For example, from stage 2 to stage 1 based on a field value like "No"? The BPF designer doesn’t support backward movement natively, but a server-side plugin can make it happen. In this post, we’ll walk through how to create a plugin that moves a BPF stage backward when a condition is met, complete with code and setup instructions.

Why This Matters: Moving BPF stages backward allows dynamic process control, enabling scenarios like revisiting earlier steps based on user input, enhancing flexibility in workflows.

The Requirement

Imagine you’re working with an account entity and a BPF called "Account Process Flow." At stage 2, a choice field (new_decisionfield) determines the next step:

  • If the value is "Yes," move to stage 3.
  • If the value is "No," move back to stage 1.

Since connectors in Power Automate and the BPF designer only support forward movement, we’ll use a plugin to update the activestageid field of the BPF instance.

Step-by-Step Guide

Step 1: Gather Required Information

Before coding, collect these details:

  • BPF Stage IDs: Find the GUIDs for stage 1 and stage 3 in the processstages table using XrmToolBox or FetchXML. Example:
    • Stage 1: 12345678-1234-1234-1234-1234567890ab
    • Stage 3: 98765432-4321-4321-4321-9876543210ba
  • BPF Instance Table: Identify the logical name (e.g., accountprocessflow).
  • Field Name: Confirm the logical name of the choice field (e.g., new_decisionfield) and its option set values (e.g., Yes = 100000000, No = 100000001).

Step 2: Set Up the Development Environment

You’ll need:

  • Visual Studio (2019 or 2022).
  • The Microsoft.Xrm.Sdk assemblies (installed via NuGet below).
  • Plugin Registration Tool (via XrmToolBox or Power Platform CLI).

Create a Class Library (.NET Framework 4.6.2 or 4.7.2) and add the Microsoft.CrmSdk.CoreAssemblies NuGet package:

Install-Package Microsoft.CrmSdk.CoreAssemblies

Step 3: Write the Plugin

The plugin triggers on the Update message for the account entity, checks the new_decisionfield value, and updates the BPF’s activestageid. Here’s the code:

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using System;

namespace BPFStageMover
{
    public class MoveBPFStage : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            // Obtain the execution context
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);
            ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            try
            {
                // Check if the event is an update on the account entity
                if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)
                {
                    Entity target = (Entity)context.InputParameters["Target"];

                    // Check if new_decisionfield is in the target entity
                    if (!target.Contains("new_decisionfield"))
                    {
                        tracingService.Trace("new_decisionfield not updated, exiting plugin.");
                        return;
                    }

                    // Get the decision field value (OptionSetValue)
                    OptionSetValue decisionValue = target.GetAttributeValue<OptionSetValue>("new_decisionfield");
                    if (decisionValue == null)
                    {
                        tracingService.Trace("new_decisionfield is null, exiting plugin.");
                        return;
                    }

                    // Define stage GUIDs (replace with actual stage IDs from processstages table)
                    Guid stage1Id = new Guid("12345678-1234-1234-1234-1234567890ab"); // Stage 1 GUID
                    Guid stage3Id = new Guid("98765432-4321-4321-4321-9876543210ba"); // Stage 3 GUID

                    // Get the BPF instance for the account
                    Guid accountId = context.PrimaryEntityId;
                    QueryExpression query = new QueryExpression("accountprocessflow")
                    {
                        ColumnSet = new ColumnSet("activestageid"),
                        Criteria = new FilterExpression
                        {
                            Conditions =
                            {
                                new ConditionExpression("regardingobjectid", ConditionOperator.Equal, accountId)
                            }
                        }
                    };

                    EntityCollection bpfInstances = service.RetrieveMultiple(query);
                    if (bpfInstances.Entities.Count == 0)
                    {
                        tracingService.Trace("No BPF instance found for account {0}.", accountId);
                        return;
                    }

                    Entity bpfInstance = bpfInstances.Entities[0];
                    Guid currentStageId = bpfInstance.GetAttributeValue<EntityReference>("activestageid")?.Id ?? Guid.Empty;

                    // Only proceed if the current stage is stage 2 (optional: add stage 2 GUID check)
                    // Update activestageid based on decision_field value
                    Guid newStageId = decisionValue.Value == 100000000 ? stage3Id : stage1Id; // 100000000 = Yes, 100000001 = No

                    if (currentStageId != newStageId)
                    {
                        Entity bpfUpdate = new Entity("accountprocessflow")
                        {
                            Id = bpfInstance.Id,
                            ["activestageid"] = new EntityReference("processstage", newStageId)
                        };
                        service.Update(bpfUpdate);
                        tracingService.Trace("BPF stage updated to {0} for account {1}.", newStageId, accountId);
                    }
                    else
                    {
                        tracingService.Trace("No stage change needed.");
                    }
                }
            }
            catch (Exception ex)
            {
                tracingService.Trace("Error in MoveBPFStage plugin: {0}", ex.Message);
                throw new InvalidPluginExecutionException($"Plugin error: {ex.Message}");
            }
        }
    }
}

Step 4: Build and Register the Plugin

  1. Build: Compile the solution in Visual Studio to generate the DLL (e.g., BPFStageMover.dll).
  2. Register: Use the Plugin Registration Tool to:
    • Upload the DLL and select the MoveBPFStage class.
    • Register a step:
      • Message: Update
      • Primary Entity: account
      • Filtering Attributes: new_decisionfield
      • Stage: PostOperation, Synchronous

Step 5: Test the Plugin

Update the new_decisionfield on an account record and verify the BPF stage moves to stage 3 (Yes) or stage 1 (No). Check trace logs in Dataverse if issues arise.

Pro Tip: Enable plugin tracing in Dataverse to debug issues. Use XrmToolBox to view logs in the Plugin Trace Log table.

Key Notes

  • Replace GUIDs: Update the stage GUIDs in the code with actual values from your processstages table.
  • BPF Table: Use the correct logical name for your BPF instance table (e.g., accountprocessflow).
  • Option Set Values: Adjust 100000000 (Yes) and 100000001 (No) to match your field’s values.
  • Security: Ensure the plugin runs under a user with permissions to update BPF instances.

Conclusion

By using a plugin to update the activestageid, you can overcome the BPF designer’s forward-only limitation, enabling dynamic stage movement in Dynamics 365. This solution is robust, scalable, and perfect for scenarios requiring flexible process control. Try it out, and let us know in the comments if you have questions or need further customization!


Tuesday, June 24, 2025

Dynamics 365 / PowerApps Dataverse’s Web API

Leverage Dynamics 365 / PowerApps Dataverse’s RESTful interface for seamless integration, automation, and customization.

The Dataverse Web API is your key to unlocking the full potential of Microsoft Dynamics 365 and PowerApps. Built on the OData v4 protocol, this RESTful API lets you create, read, update, and delete data with ease. Whether you’re a seasoned developer, a low-code enthusiast, or an integration pro, this guide will walk you through everything you need to know about the Web API—complete with practical examples and real-world use cases. Let’s dive in!

What is the Dataverse Web API?

The Dataverse Web API is a RESTful service powered by the OData v4 protocol. It allows you to interact with Dataverse data and metadata using standard HTTP methods through endpoints like https://yourorg.crm.dynamics.com/api/data/v9.2/. From creating records to executing custom business logic, it’s a versatile tool for any data-driven solution.

Why OData v4?

The OData v4 protocol is a standardized way to build and consume data APIs. It offers:

  • RESTful Design: Uses HTTP verbs (GET, POST, PATCH, DELETE) and JSON for lightweight data exchange.
  • Rich Queries: Supports powerful query options like $select, $filter, and $expand.
  • Metadata: Exposes schema details via $metadata, enabling dynamic discovery.
  • Custom Actions: Allows invoking business logic beyond basic CRUD operations.

Pro Tip

Use the $metadata endpoint (/api/data/v9.2/$metadata) to explore your Dataverse schema and understand available entities and relationships.

Where Can You Use the Web API?

The Web API’s flexibility makes it a go-to tool across scenarios:

  • Server-Side Apps: Integrate with C#, Python, or Node.js for backend solutions.
  • Client-Side Scripts: Enhance Dynamics 365 forms with JavaScript.
  • Low-Code Platforms: Automate processes in Power Automate.
  • Integration Tools: Connect with Azure Data Factory, SSIS, or custom apps.
  • External Systems: Any HTTP-capable tool (e.g., Postman) can interact with Dataverse.

How to Use the Web API: Scenarios and Techniques

Let’s explore how the Web API fits into different development contexts, leveraging OData v4’s power.

1. Plugins

Plugins are .NET classes that run server-side in response to Dataverse events. While the IOrganizationService SDK is preferred, the Web API can be used for specific operations.

  • When to Use: For custom actions or functions only available via the Web API.
  • How: Use HttpClient with an Azure AD token.

Pro Tip

Avoid Web API calls in plugins unless necessary, as IOrganizationService is faster and more efficient.

2. Workflows

Classic workflows can include custom .NET activities that call the Web API, though this is rare.

  • When to Use: For niche scenarios not covered by built-in steps.
  • How: Similar to plugins, use HttpClient.

3. Power Automate

Power Automate simplifies Web API interactions with Dataverse connectors.

  • Standard Way: Use actions like “List rows” (OData queries behind the scenes).
  • Custom Way: Use the HTTP action for direct OData v4 queries.

Pro Tip

Use the “Perform an unbound action” step for custom actions to keep flows clean and maintainable.

4. JavaScript in Model-Driven Apps

JavaScript enhances Dynamics 365 forms using the Xrm.WebApi object, which wraps OData v4 calls.

  • Use Cases: Fetch data dynamically or update records.
  • How: Use Xrm.WebApi.retrieveMultipleRecords for queries.

5. PowerFX in Canvas Apps

PowerFX indirectly uses the Web API via Dataverse connectors.

  • How It Works: Functions like Patch or Filter translate to OData queries.
  • Custom Needs: Use Power Automate for direct API calls.

6. External Tools

Integration tools leverage OData v4 for seamless connectivity:

  • Azure Data Factory: Use the Dataverse connector for ETL.
  • SSIS: Call the Web API via script tasks or use the TDS endpoint.
  • Other Tools: Postman or PowerShell with Azure AD authentication.

Pro Tip

Test API calls in Postman to prototype OData queries before implementing them in code.

Calling the Web API: The OData v4 Way

To use the Web API, you need:

  1. Endpoint: https://yourorg.crm.dynamics.com/api/data/v9.2/.
  2. Authentication: Azure AD token in the Authorization: Bearer header.
  3. HTTP Methods:
    • GET: Retrieve data.
    • POST: Create records or call actions.
    • PATCH: Update records.
    • DELETE: Delete records.
  4. OData Queries: Use $select, $filter, $expand, etc.
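Putting these pieces together, a raw request looks like this (yourorg and the token are placeholders):

```http
GET https://yourorg.crm.dynamics.com/api/data/v9.2/accounts?$select=name&$top=3 HTTP/1.1
Authorization: Bearer <access-token>
OData-Version: 4.0
OData-MaxVersion: 4.0
Accept: application/json
```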

Code Examples

C# (Server-Side): Retrieve Accounts

Fetch account names and revenue with OData v4:

using System.Net.Http;
using System.Net.Http.Headers;
using Newtonsoft.Json.Linq;

var client = new HttpClient();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
var response = await client.GetAsync("https://yourorg.crm.dynamics.com/api/data/v9.2/accounts?$select=name,revenue");
if (response.IsSuccessStatusCode)
{
    var accounts = JObject.Parse(await response.Content.ReadAsStringAsync());
    foreach (var account in accounts["value"])
    {
        Console.WriteLine($"{account["name"]}: {account["revenue"]}");
    }
}

JavaScript (Model-Driven App): Fetch Top 3 Accounts

Use Xrm.WebApi for a simple OData query:

Xrm.WebApi.retrieveMultipleRecords("account", "?$select=name&$top=3").then(
    function success(result) {
        console.log(result);
    },
    function (error) {
        console.log(error.message);
    }
);

Power Automate: Call a Custom Action

Use the “Perform an unbound action” step with parameters like orderId.

OData Query Example: Retrieve Active Accounts

Here’s an OData v4 query and its response:

GET /api/data/v9.2/accounts?$select=name&$filter=statecode eq 0&$expand=primarycontactid($select=fullname)&$top=5

{
  "@odata.context": "...",
  "value": [
    {
      "name": "Contoso Inc.",
      "primarycontactid": {
        "fullname": "John Doe"
      }
    },
    {
      "name": "Fabrikam Ltd.",
      "primarycontactid": {
        "fullname": "Jane Smith"
      }
    }
  ]
}
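Client code typically walks the value array of such a response. A minimal sketch, using the sample payload above as a literal object:

```javascript
// The response body from the query above, as a plain object (abbreviated).
const response = {
  value: [
    { name: "Contoso Inc.", primarycontactid: { fullname: "John Doe" } },
    { name: "Fabrikam Ltd.", primarycontactid: { fullname: "Jane Smith" } }
  ]
};

// Each row carries the expanded primary contact inline, so no second query is needed.
const summary = response.value.map(
  (row) => `${row.name}: ${row.primarycontactid.fullname}`
);

console.log(summary);
// [ 'Contoso Inc.: John Doe', 'Fabrikam Ltd.: Jane Smith' ]
```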

Real-World Use Cases

The Web API shines in practical scenarios:

  1. E-Commerce Sync: A retailer uses POST requests to /salesorders to sync online orders, with $expand for customer details.
  2. Customer Portal: A web app updates profiles via PATCH to /contacts(guid).
  3. Reporting: Query Dataverse with OData filters for Power BI dashboards.
  4. Data Migration: Use Azure Data Factory to load legacy data into Dataverse.

Additional Resources

OData in Dynamics 365: How to Get the Data You Need, Fast

If you’ve worked with Microsoft Dynamics 365 (or Power Platform and Dataverse), you’ve likely encountered OData queries—used in Power Automate, Canvas Apps, JavaScript code, Power BI, and beyond. But what exactly is OData? How does it work? And how can you use it to build powerful, efficient solutions in your business apps?


1. What Is OData?

OData (Open Data Protocol) is a standardized REST API protocol originally developed by Microsoft and now maintained as an OASIS standard. It allows for querying and manipulating data using simple URLs and standard HTTP verbs (GET, POST, PATCH, DELETE). Dynamics 365 and Dataverse use OData v4 through their Web API.

https://<yourorg>.crm.dynamics.com/api/data/v9.2/

Key Features of OData:

  • RESTful architecture based on HTTP
  • Simple URL-based query syntax
  • Supports metadata discovery via $metadata endpoint
  • Returns data in JSON (or Atom XML)

2. Under the Hood

OData is built on:

  • HTTP Methods: For CRUD operations
  • URI query options: $filter, $select, $expand, $orderby, $top, $skip
  • Entity Data Model (EDM) via /api/data/v9.2/$metadata
  • JSON payloads for data exchange

Supported HTTP Verbs:

  • GET – Retrieve data
  • POST – Create new record
  • PATCH – Update existing record
  • DELETE – Delete record

3. How OData is Used in Dynamics 365

  • Power Automate – Filter records using “List Rows” with OData filters
  • Canvas Apps – Filter galleries and data tables using OData-based functions
  • JavaScript on Model-driven Apps – Make client-side API calls using OData Xrm.WebApi & fetch()
  • Power BI – Load Dataverse tables via the OData feed
  • Custom Integration – Use OData endpoints from external systems (e.g., Azure Functions, Node.js)

4. Common OData Queries


// Filter active accounts
/accounts?$filter=statecode eq 0

// Select specific columns
/accounts?$select=name,accountnumber

// Sort by creation date
/accounts?$orderby=createdon desc

// Top 10 records
/accounts?$top=10

// Date-based filter
/accounts?$filter=createdon ge 2025-05-01T00:00:00Z

// Text search
/contacts?$filter=contains(fullname,'John')

// Lookup filter (note the _value suffix; GUIDs are not quoted)
/contacts?$filter=_parentcustomerid_value eq <GUID>

// Expand lookup (parent contact)
/accounts?$expand=primarycontactid($select=fullname,emailaddress1)

5. Common Mistakes

  • Using == instead of eq, or && instead of and
  • Forgetting the single quotes around strings: name eq 'Contoso'
  • Using unsupported functions like substring
  • Using the wrong lookup name or omitting the _value suffix for lookups (e.g., _parentcustomerid_value)
  • Unsupported nested $expand (only one level is supported)
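One way to avoid the quoting and operator slips above is to assemble filters with a small helper instead of concatenating strings by hand. This is an illustrative sketch only; the helper names are ours, not part of Xrm.WebApi or any SDK:

```javascript
// Minimal OData filter helpers (illustrative only; not part of Xrm or any SDK).

// Strings are wrapped in single quotes; embedded quotes are doubled.
function odataString(value) {
  return "'" + String(value).replace(/'/g, "''") + "'";
}

// Lookup columns are filtered via _<name>_value, and GUIDs are not quoted.
function lookupEq(fieldLogicalName, guid) {
  return "_" + fieldLogicalName + "_value eq " + guid;
}

// Clauses are joined with the OData keyword `and` (not &&).
function andAll(clauses) {
  return clauses.join(" and ");
}

const filter = andAll([
  "statecode eq 0",
  "name eq " + odataString("Contoso"),
  lookupEq("parentcustomerid", "00000000-0000-0000-0000-000000000000")
]);

console.log(filter);
// statecode eq 0 and name eq 'Contoso' and _parentcustomerid_value eq 00000000-0000-0000-0000-000000000000
```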

6. Debugging Tips

  • Inspect /api/data/v9.2/$metadata for correct names
  • Test queries in Postman or browser
  • Use Dev Tools → Network for fetch/Xrm.WebApi calls
  • Read error messages—they often explain the issue

7. JavaScript Example: Basic Account + Contact Expand


// Fetch Account with Primary Contact info
var accountId = "00000000-0000-0000-0000-000000000000";

var reqUrl = Xrm.Utility.getGlobalContext().getClientUrl() +
  "/api/data/v9.2/accounts(" + accountId + ")" +
  "?$select=name,accountnumber" +
  "&$expand=primarycontactid($select=fullname,emailaddress1,jobtitle)";

fetch(reqUrl, {
  method: "GET",
  headers: {
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
    "Content-Type": "application/json; charset=utf-8",
    "Prefer": 'odata.include-annotations="*"'
  }
})
.then(response => response.json())
.then(data => {
  console.log("Account Name:", data.name);
  console.log(data.primarycontactid
    ? "Contact: " + data.primarycontactid.fullname
    : "No Contact");
})
.catch(console.error);

Another example - Retrieve Contacts with their parent Account names using $expand=parentcustomerid_account:


// Retrieve Contacts with Parent Account Name
Xrm.WebApi.retrieveMultipleRecords(
  "contact",
  "?$select=fullname,emailaddress1&$expand=parentcustomerid_account($select=name)"
).then(
  function success(result) {
    result.entities.forEach(function (contact) {
      console.log(
        `Contact: ${contact.fullname}, ` +
        `Email: ${contact.emailaddress1}, ` +
        `Account: ${contact.parentcustomerid_account?.name || "No Account"}`
      );
    });
  },
  function (error) {
    console.error("Error retrieving contacts:", error.message);
  }
);

8. Power Automate Example: “List Rows” with OData


// Filter = only active contacts from last 30 days
statuscode eq 1 and createdon ge 2025-05-25T00:00:00Z

// Select columns
fullname,emailaddress1

// Order and limit results
Order by: createdon desc
Top Count: 50

Configure the Dataverse “List rows” step with these fields. This approach minimizes data transfer and maximizes performance.
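One refinement: rather than hard-coding the cutoff date, the 30-day window can be computed with a flow expression (addDays and utcNow are standard Power Automate functions; the field names are from the example above):

```
statuscode eq 1 and createdon ge @{addDays(utcNow(), -30)}
```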


Empower your Dynamics 365 solutions with efficient, accurate OData queries—whether you’re building automations, apps, or APIs. Happy querying!

Power Automate Optimization: Filter Rows vs Trigger Conditions - When and Why to Use Each
