Friday, March 13, 2026

Power Automate Optimization: Filter Rows vs Trigger Conditions - When and Why to Use Each

Filter Rows vs Trigger Conditions in Power Automate

Why your flow might not even start

Recently in a Power Platform WhatsApp group, someone asked a very good question about created/modified trigger filtering in Power Automate.

“Isn't Power Automate being hit either way? You're just deciding whether to continue or not. So you aren't really reducing the number of Power Automate executions with that filter?”

This is actually a very common misunderstanding, even among experienced Power Platform developers.

My answer in the group was:

Trigger Conditions are evaluated server-side before the flow starts. The system decides whether the flow should execute or not.

Because this topic comes up frequently, I thought it would be useful to write a short blog explaining:

  • How Trigger Conditions actually work internally
  • How Filter Rows works
  • Why both help reduce unnecessary flow executions

The Misconception: “The flow runs anyway”

Many developers assume the following sequence happens:

Record Updated → Power Automate flow is triggered → A condition inside the flow is checked → The flow stops if the condition is not met

If this were true, then yes, the flow would still count as an execution.

But this is not how Trigger Conditions work.

What Actually Happens Internally

When you configure a Trigger Condition, the evaluation happens before the flow instance is created.

Record Updated → Power Automate trigger receives the event → Trigger Condition is evaluated on the server side → Condition is True → Flow instance is created → Actions execute

If the condition evaluates to False:

Record Updated → Trigger Condition is evaluated → Condition is False → Flow instance is not created

Meaning:

  • The flow never runs
  • No flow execution is created
  • No actions execute
  • No API calls are consumed

This behavior applies to most modern event-based triggers, such as Dataverse and SharePoint.

Trigger Conditions vs Conditions Inside the Flow

Scenario 1 — Condition inside the Flow

Trigger: When a row is modified → Condition inside the flow: State = Texas → If Yes, send an email

Here is the problem.

Every update still starts the flow when you use a normal Condition action inside the flow. The check happens only after the flow has already begun.

Update Event            | Flow Runs
Customer name changed   | Yes
Phone number updated    | Yes
Notes updated           | Yes
State changed to Texas  | Yes

Even if the condition fails, the flow still ran.

  • Wasted flow runs
  • Extra API calls
  • Unnecessary system load

Scenario 2 — Using Trigger Conditions

Trigger: When a row is modified

Trigger Condition:
@equals(triggerOutputs()?['body/address1_stateorprovince'], 'Texas')

With Trigger Conditions, the check happens before the flow instance is created. If the condition is false, the flow does not start.

Update Event            | Trigger Condition Result | Flow Runs
Customer name changed   | False                    | No
Phone number updated    | False                    | No
Notes updated           | False                    | No
State changed to Texas  | True                     | Yes

Now the flow starts only when the target business condition is met.

Where Trigger Conditions Are Configured

  1. Open your flow
  2. Click the Trigger
  3. Click Settings
  4. Add your expression under Trigger Conditions
@equals(triggerOutputs()?['body/address1_stateorprovince'], 'Texas')

Multiple trigger conditions can be added, and they act as AND conditions.
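For example, two conditions like the following would require the record to be in Texas and also have an email address before the flow starts (the emailaddress1 column is used here for illustration):

```
@equals(triggerOutputs()?['body/address1_stateorprovince'], 'Texas')
@not(empty(triggerOutputs()?['body/emailaddress1']))
```

Both expressions must evaluate to true for the flow instance to be created.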

What About “Filter Rows”?

If you are using Dataverse triggers, you will also see Filter Rows.

address1_stateorprovince eq 'Texas'

Filter Rows uses OData query syntax.
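Simple column filters can be combined with OData operators such as and and or. For instance, to match only active Texas records (assuming the standard statecode column, where 0 means Active):

```
address1_stateorprovince eq 'Texas' and statecode eq 0
```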

Unlike Trigger Conditions, Filter Rows works even earlier in the process.

In Dataverse, Filter Rows is applied at the event subscription level, meaning only matching events are sent to Power Automate.

Example:

address1_stateorprovince eq 'Virginia'

Record Updated    | Flow Triggered
State = Florida   | No
State = Virginia  | Yes

This prevents non-matching events from even reaching the flow trigger.

Filter Rows vs Trigger Conditions

Feature           | Filter Rows                  | Trigger Conditions
Evaluation level  | Dataverse event subscription | Power Automate trigger engine
Syntax            | OData query                  | Workflow expression
Capabilities      | Simple column filters        | Complex logic (and, or, empty, comparisons)
Best use case     | Record filtering             | Business rule validation

Filter Rows runs first, then Trigger Conditions.

In practice, both must evaluate to true for the flow to start.

Using Both Together

The best design is often to use both features together.

Example:

Filter Rows:
address1_stateorprovince eq 'Virginia'

Trigger Condition:
@equals(triggerOutputs()?['body/accountcategorycode'], 1)

In this example:

  • Filter Rows first limits the event to records where the state is Virginia
  • Trigger Condition then checks whether the account category matches the business rule
  • The flow starts only if both conditions are satisfied

Best Enterprise Design Pattern

In high-volume Dataverse environments, the best trigger configuration usually looks like this:

Trigger → Select Columns → Filtering Columns → Filter Rows → Trigger Conditions

  • Select Columns reduces payload size
  • Filtering Columns ensures the flow triggers only when specific fields change
  • Filter Rows limits which records generate events
  • Trigger Conditions applies the final business rule before the flow starts

Real Enterprise Example

Imagine a customer table where you only want to notify a team when a record belongs to Virginia.

Poor design:

Trigger: When a row is modified → Condition inside flow: If State = Virginia → Send notification

This runs every time the record is edited, even when the state is not Virginia.

Better design:

Filter Rows:
address1_stateorprovince eq 'Virginia'

And optionally:

Trigger Condition:
@not(empty(triggerOutputs()?['body/emailaddress1']))

Now the flow starts only when:

  • The updated record belongs to Virginia
  • The record has the business data needed for the next step

SharePoint Example

SharePoint does not offer Dataverse-style Filter Rows in the same trigger experience, but it does support Trigger Conditions.

Example:

@equals(triggerOutputs()?['body/State'], 'California')

This means the flow starts only when the SharePoint item has a State value of California.
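Trigger Conditions on SharePoint triggers also support combined logic. As a sketch, the following would start the flow only for California items that are not closed (the Status column here is hypothetical):

```
@and(equals(triggerOutputs()?['body/State'], 'California'), not(equals(triggerOutputs()?['body/Status'], 'Closed')))
```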

Common Mistakes

Using Conditions inside the flow for basic filtering

Trigger → Condition → Exit

This wastes executions because the flow has already started.

Ignoring Trigger Conditions

Many developers simply do not realize this feature exists.

Putting complex logic in Filter Rows

Filter Rows supports OData syntax only. It is not the place for full workflow expressions.
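A reasonable split is to keep the simple column filter in Filter Rows and move the richer logic into a Trigger Condition. As a sketch (column names are illustrative):

```
Filter Rows:       statecode eq 0
Trigger Condition: @or(equals(triggerOutputs()?['body/address1_stateorprovince'], 'Texas'), equals(triggerOutputs()?['body/address1_stateorprovince'], 'Virginia'))
```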

Not using Filtering Columns

If the flow only cares about a few fields, configure the trigger to watch those fields instead of reacting to every update.

Performance Impact

Using Filter Rows and Trigger Conditions properly helps:

  • Reduce flow runs
  • Reduce API consumption
  • Improve performance
  • Prevent throttling
  • Improve maintainability

This becomes extremely important in enterprise environments where thousands of records may be updated daily.

Even when skipped triggers do not create real flow runs, very high event volumes can still affect throughput and responsiveness. Good trigger design still matters.

Final Takeaway

Coming back to the original WhatsApp question:

“Isn't Power Automate being hit either way?”

The answer is No.

Trigger Conditions are evaluated server-side before the flow instance is created.

If the condition evaluates to false, the flow never starts.

And if you are using Dataverse, Filter Rows can stop irrelevant events even earlier.

That is why Trigger Conditions and Filter Rows are critical optimization techniques when building scalable Power Automate solutions.


-- Warm Regards, Sudip Chakrovorty, M.S. Hand Ph.: +1-(614) 309-8282 (EDT)

Thursday, July 24, 2025

Understanding the 10 Linked Entity Limit in Dataverse FetchXML Queries (Dynamics 365 / CRM)

If you’ve worked with Microsoft Dataverse (previously part of Dynamics CRM and now the data platform for Dynamics 365 and Power Apps), you’re likely familiar with FetchXML—a powerful XML-based query language used to retrieve data from Dataverse tables. FetchXML is particularly useful for building reports, creating custom views, and performing complex data retrieval operations in Dynamics 365 and Power Apps. However, like any technology, it comes with certain constraints. One such limitation is that Dataverse enforces a maximum of 10 linked entities in a single FetchXML query. In this post, we’ll explore what this limitation means, why it exists, and how you can design your queries to work effectively within this constraint.

What is FetchXML?

Before diving into the limitation, let’s briefly recap what FetchXML is. FetchXML is a proprietary query language developed by Microsoft for Microsoft Dataverse. It allows developers and administrators to query data across multiple tables (similar to database tables) and their relationships. Unlike SQL, which is text-based, FetchXML uses an XML structure to define queries. This makes it easy to construct queries programmatically and integrate them into Dynamics 365 applications, Power Apps, and other Power Platform solutions.

For example, a simple FetchXML query might look like this:

<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="account">
    <attribute name="name" />
    <attribute name="accountnumber" />
  </entity>
</fetch>

This query retrieves the name and accountnumber attributes from the account entity. However, FetchXML’s real power comes from its ability to link multiple entities together to retrieve related data.

The 10 Linked Entities Limitation

When building more complex queries, you might need to retrieve data from several related entities. For instance, you might want to fetch data from an account, its related contacts, the opportunities tied to those contacts, and so on. Each of these relationships is represented as a linked entity in FetchXML.

However, Dataverse imposes a hard limit: a single FetchXML query can include no more than 10 linked entities. This means you cannot link more than 10 entities together in one query, no matter how they are nested or related.
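Every link-entity element counts toward the limit, regardless of nesting depth. A sketch with two linked entities (the relationship attribute names are illustrative, not verified against your schema):

```xml
<fetch version="1.0" output-format="xml-platform" mapping="logical">
  <entity name="account">
    <attribute name="name" />
    <!-- linked entity 1 -->
    <link-entity name="contact" from="parentcustomerid" to="accountid" link-type="inner">
      <attribute name="fullname" />
      <!-- linked entity 2: nested links count toward the same limit -->
      <link-entity name="opportunity" from="parentcontactid" to="contactid" link-type="outer">
        <attribute name="name" />
      </link-entity>
    </link-entity>
  </entity>
</fetch>
```

A query containing eleven such elements would fail, whether they are nested or siblings.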

It’s important to note that this limitation is specific to FetchXML. Other query methods in Dynamics CRM, such as the QueryExpression class in the SDK, do not have this same restriction. However, since FetchXML is widely used—especially in scenarios like report building and user-defined views—understanding this limitation is crucial.

Why Does This Limitation Exist?

You might be wondering why Microsoft implemented this restriction. The primary reason is performance. Each linked entity in a FetchXML query adds complexity to the underlying database query. As the number of linked entities increases, so does the potential for slow query execution, which can impact the overall performance of the CRM system.

By capping the number of linked entities at 10, Dataverse ensures that queries remain efficient and don’t consume excessive resources. This is especially important in a multi-tenant cloud environment like Dynamics 365, where system performance must be balanced across many users and organizations.

Illustrating the Limitation with an Example

To better understand the limitation, let’s consider a practical scenario. Suppose you’re building a report that requires data from the following entities:

  • account (the primary entity)
  • contact (linked to account)
  • opportunity (linked to contact)
  • lead (linked to opportunity)
  • quote (linked to opportunity)
  • order (linked to quote)
  • invoice (linked to order)
  • product (linked to invoice)
  • pricelevel (linked to product)
  • territory (linked to account)
  • systemuser (linked to territory)

In this case, you’re trying to link 11 entities together in a single FetchXML query. However, since the limit is 10 linked entities, this query would fail. You’d receive an error indicating that the query exceeds the maximum number of linked entities allowed.

Working Within the Limitation

So, what can you do if your query requires more than 10 linked entities? Here are a few strategies to work within this constraint:

1. Break Down the Query

One approach is to break your large query into smaller, more manageable queries, each with fewer than 10 linked entities. You can then combine the results programmatically or in your reporting tool. For example, you might run one query to retrieve data from the first 10 entities and a second query to retrieve data from the remaining entities, then merge the results based on a common key.

2. Use Multiple Queries with Filters

Another option is to use multiple queries with filters based on the results of previous queries. For instance, you could first query the account and its related entities up to the 10th link, then use the results to filter a second query that retrieves data from the additional entities.

3. Leverage Other Query Methods

If FetchXML’s limitation is too restrictive for your scenario, consider using other query options available in Microsoft Dataverse. For example, the QueryExpression class in the Dataverse SDK for .NET allows developers to build complex queries programmatically and provides greater flexibility for data retrieval. However, this approach requires custom development and is typically used in plugins, custom integrations, or server-side code rather than in standard views or reports within Dynamics 365 or Power Apps.

4. Optimize Your Data Model

In some cases, requiring more than 10 linked entities may indicate that your Dataverse data model could be optimized. Review your table relationships and evaluate whether the number of joins required in your queries can be reduced. For example, restructuring relationships, using calculated or rollup columns, or introducing supporting tables can help simplify queries and improve performance.

Conclusion

The 10 linked entities limitation in Dataverse FetchXML queries is an important constraint to consider when designing complex data retrieval operations. Although it may appear restrictive, the limitation helps maintain platform performance, query efficiency, and overall system stability. By understanding why this constraint exists and designing solutions that work within it—such as simplifying queries, using alternative query approaches, or optimizing your data model—you can build scalable and efficient solutions on the Power Platform.

While FetchXML remains a powerful and widely used query language in Microsoft Dataverse, it is not the only option for retrieving data. Developers can also use tools such as the Dataverse SDK, Web API, or QueryExpression depending on the scenario. For many common use cases in Dynamics 365 and Power Apps, however, working within the 10 linked entities limit is both practical and effective.

Saturday, June 28, 2025

Bulk Updating Over 20,000 Records in Dataverse ExecuteMultipleRequest —Step-by-Step Implementation

Mastering Dataverse Updates with ExecuteMultipleRequest: A Modern C# Guide

Dataverse Updates with ExecuteMultipleRequest

Recently, in a WhatsApp group I’m part of, someone reached out with a challenge: "With pagination, I want to update 20,511 records using Power Automate, but I am not able to do it more than 5,000. Anyone has any idea how to implement?" It’s a classic problem when working with Microsoft Dataverse—Power Automate, while great for many tasks, hits a wall with its pagination limits. In this case, the 5,000-record cap was stopping them cold, even with pagination tweaks.

Power Automate’s low-code simplicity is fantastic for small-to-medium workflows, but for bulk updates like this, it’s not the best fit. The pagination limit forces you into processing records in tiny batches, slowing things down and complicating the flow. For 20,511 records, that’s a lot of hoops to jump through—and a lot of time wasted.

My advice? Ditch Power Automate for this job and use a C# console application with ExecuteMultipleRequest. This Dataverse SDK method lets you bundle up to 1,000 operations into a single request, slashing the number of API calls and turbocharging performance. It’s a better choice because:

  • No Pagination Hassles: Unlike Power Automate, ExecuteMultipleRequest doesn’t choke on large datasets—you can process all 20,511 records without breaking them into 5,000-record chunks.
  • Faster Execution: Batching operations reduces server round trips, making it way quicker for big updates.
  • More Control: A console app lets you customize error handling, logging, and logic—things Power Automate struggles to match.

In this post, I’ll show you how to build that console app step-by-step, complete with code snippets and tips. Let’s get those records updated efficiently!

What You’ll Need

Before we start coding, gather these essentials:

  • Dataverse Environment: Access to a Dataverse instance with a table (e.g., account) containing records to update.
  • Azure AD App Registration: A registered app with client ID, client secret, and tenant ID, plus Dataverse permissions.
  • .NET Development Setup: Visual Studio or VS Code with .NET Core 3.1 or later.
  • NuGet Packages:
    • Microsoft.CrmSdk.CoreAssemblies
    • Microsoft.CrmSdk.XrmTooling.CoreAssembly
    • Microsoft.IdentityModel.Clients.ActiveDirectory

Step-by-Step: Building the Console App

Step 1: Create the Console Project

Fire up your terminal and create a new C# console app, then add the necessary NuGet packages.

dotnet new console -n DataverseBatchUpdater
cd DataverseBatchUpdater
dotnet add package Microsoft.CrmSdk.CoreAssemblies --version 9.0.2.34
dotnet add package Microsoft.CrmSdk.XrmTooling.CoreAssembly --version 9.1.0.68
dotnet add package Microsoft.IdentityModel.Clients.ActiveDirectory --version 5.3.0

Step 2: Set Up Authentication

To connect to Dataverse, configure authentication using your Azure AD app credentials. Open Program.cs and add these constants, replacing placeholders with your values.

private static readonly string dataverseUrl = "https://yourorg.crm.dynamics.com";
private static readonly string clientId = "your-client-id";
private static readonly string clientSecret = "your-client-secret";
private static readonly string tenantId = "your-tenant-id";
private static readonly string authority = $"https://login.microsoftonline.com/{tenantId}";

We’ll use these to authenticate via the Dataverse SDK’s CrmServiceClient.

Step 3: Authenticate with Dataverse

Add a method to establish a connection to Dataverse using the client credentials flow.

private static async Task<CrmServiceClient> GetCrmServiceClient()
{
    var authContext = new AuthenticationContext(authority);
    var clientCred = new ClientCredential(clientId, clientSecret);
    var authResult = await authContext.AcquireTokenAsync(dataverseUrl, clientCred);

    var connectionString = $"AuthType=ClientSecret;Url={dataverseUrl};ClientId={clientId};ClientSecret={clientSecret};RequireNewInstance=true";
    var serviceClient = new CrmServiceClient(connectionString);

    if (serviceClient.IsReady)
    {
        return serviceClient;
    }
    else
    {
        throw new Exception($"Connection failed: {serviceClient.LastCrmError}");
    }
}

Step 4: Prepare Records for Update

To use ExecuteMultipleRequest, we need a list of records to update. For this example, we’ll fetch accounts and modify their name field. Here’s how to query records (we’ll assume you have a pre-populated list or fetch them separately).

// Example: Fetch records (replace with your data source)
var accounts = new List<Entity>();
for (int i = 1; i <= 20511; i++)
{
    var account = new Entity("account") { Id = Guid.NewGuid() };
    account["name"] = $"Account_{i}";
    accounts.Add(account);
}

In practice, you’d retrieve these using RetrieveMultiple, but we’ll focus on the update logic here.

Step 5: Implement ExecuteMultipleRequest

Now, the star of the show: ExecuteMultipleRequest. We’ll process records in batches of 1,000 (the max allowed per request).

private static async Task UpdateRecordsAsync(IOrganizationService service, List<Entity> accounts)
{
    int batchSize = 1000;
    int totalUpdated = 0;

    for (int i = 0; i < accounts.Count; i += batchSize)
    {
        var requestCollection = new ExecuteMultipleRequest
        {
            Settings = new ExecuteMultipleSettings
            {
                ContinueOnError = true, // Keep going if one update fails
                ReturnResponses = true  // Get feedback on each operation
            },
            Requests = new OrganizationRequestCollection()
        };

        int endIndex = Math.Min(i + batchSize, accounts.Count);
        for (int j = i; j < endIndex; j++)
        {
            var entity = accounts[j];
            entity["name"] = $"{entity.GetAttributeValue<string>("name")}_Updated";
            var updateRequest = new UpdateRequest { Target = entity };
            requestCollection.Requests.Add(updateRequest);
        }

        var response = (ExecuteMultipleResponse)service.Execute(requestCollection);
        totalUpdated += requestCollection.Requests.Count;

        foreach (var resp in response.Responses)
        {
            if (resp.Fault != null)
            {
                Console.WriteLine($"Error at index {i + resp.RequestIndex}: {resp.Fault.Message}");
            }
        }

        Console.WriteLine($"Updated {requestCollection.Requests.Count} records. Total: {totalUpdated}");
        await Task.Delay(500); // Throttling prevention
    }

    Console.WriteLine($"Finished! Total records updated: {totalUpdated}");
}

Tip 1: Batch Size Matters

Stick to 1,000 records per batch—Dataverse’s upper limit for ExecuteMultipleRequest. Smaller batches work but slow you down unnecessarily.

Step 6: Tie It All Together

Update Main to connect and run the update.

static async Task Main(string[] args)
{
    try
    {
        var service = await GetCrmServiceClient();
        Console.WriteLine("Connected to Dataverse!");

        // Replace with your record-fetching logic
        var accounts = new List<Entity>();
        for (int i = 1; i <= 20511; i++)
        {
            var account = new Entity("account") { Id = Guid.NewGuid() };
            account["name"] = $"Account_{i}";
            accounts.Add(account);
        }

        await UpdateRecordsAsync(service, accounts);
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Oops! {ex.Message}");
    }
}

Step 7: Run and Verify

Build and run your app, then check Dataverse to confirm the updates.

dotnet run

Look for console output like “Updated 1000 records” and verify in Dataverse that account names now end with "_Updated".

The Full Code

Here’s the complete C# console app. Replace placeholders with your Dataverse and Azure AD details.

using Microsoft.Crm.Sdk.Messages;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Tooling.Connector;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace DataverseBatchUpdater
{
    class Program
    {
        private static readonly string dataverseUrl = "https://yourorg.crm.dynamics.com";
        private static readonly string clientId = "your-client-id";
        private static readonly string clientSecret = "your-client-secret";
        private static readonly string tenantId = "your-tenant-id";
        private static readonly string authority = $"https://login.microsoftonline.com/{tenantId}";

        static async Task Main(string[] args)
        {
            try
            {
                var service = await GetCrmServiceClient();
                Console.WriteLine("Connected to Dataverse!");

                // Simulated record list (replace with your fetch logic)
                var accounts = new List<Entity>();
                for (int i = 1; i <= 20511; i++)
                {
                    var account = new Entity("account") { Id = Guid.NewGuid() };
                    account["name"] = $"Account_{i}";
                    accounts.Add(account);
                }

                await UpdateRecordsAsync(service, accounts);
            }
            catch (Exception ex)
            {
                Console.WriteLine($"Oops! {ex.Message}");
            }
        }

        private static async Task<CrmServiceClient> GetCrmServiceClient()
        {
            var authContext = new AuthenticationContext(authority);
            var clientCred = new ClientCredential(clientId, clientSecret);
            var authResult = await authContext.AcquireTokenAsync(dataverseUrl, clientCred);

            var connectionString = $"AuthType=ClientSecret;Url={dataverseUrl};ClientId={clientId};ClientSecret={clientSecret};RequireNewInstance=true";
            var serviceClient = new CrmServiceClient(connectionString);

            if (serviceClient.IsReady)
            {
                return serviceClient;
            }
            else
            {
                throw new Exception($"Connection failed: {serviceClient.LastCrmError}");
            }
        }

        private static async Task UpdateRecordsAsync(IOrganizationService service, List<Entity> accounts)
        {
            int batchSize = 1000;
            int totalUpdated = 0;

            for (int i = 0; i < accounts.Count; i += batchSize)
            {
                var requestCollection = new ExecuteMultipleRequest
                {
                    Settings = new ExecuteMultipleSettings
                    {
                        ContinueOnError = true,
                        ReturnResponses = true
                    },
                    Requests = new OrganizationRequestCollection()
                };

                int endIndex = Math.Min(i + batchSize, accounts.Count);
                for (int j = i; j < endIndex; j++)
                {
                    var entity = accounts[j];
                    entity["name"] = $"{entity.GetAttributeValue<string>("name")}_Updated";
                    var updateRequest = new UpdateRequest { Target = entity };
                    requestCollection.Requests.Add(updateRequest);
                }

                var response = (ExecuteMultipleResponse)service.Execute(requestCollection);
                totalUpdated += requestCollection.Requests.Count;

                foreach (var resp in response.Responses)
                {
                    if (resp.Fault != null)
                    {
                        Console.WriteLine($"Error at index {i + resp.RequestIndex}: {resp.Fault.Message}");
                    }
                }

                Console.WriteLine($"Updated {requestCollection.Requests.Count} records. Total: {totalUpdated}");
                await Task.Delay(500); // Avoid throttling
            }

            Console.WriteLine($"Finished! Total records updated: {totalUpdated}");
        }
    }
}

Pro Tips for Success

Tip 1: Maximize Batch Efficiency

Use the full 1,000-record capacity of ExecuteMultipleRequest to minimize requests and speed up execution.

Tip 2: Dodge Throttling Limits

Add a small delay (e.g., await Task.Delay(500)) between batches to avoid hitting Dataverse’s throttling limits.

Tip 3: Robust Error Handling

Log errors from response.Responses to a file or table for post-run analysis, ensuring no issues slip through.

Wrapping Up

With ExecuteMultipleRequest, updating large datasets in Dataverse becomes a breeze. This C# console app showcases how to harness batch processing for efficiency and reliability. Try it out, tweak it to your needs, and watch your Dataverse updates soar!

Friday, June 27, 2025

How to Retrieve More Than 5000 Records Using FetchXML and Power Automate

Power Automate: Paging Through 5000+ Records with FetchXML

When Dataverse limits you to 5,000 records per page, Power Automate gives you the tools to break past that limit—without writing C# or JavaScript. In this guide, you’ll create a Flow that loops through large datasets using FetchXML paging logic.

When Should You Use Power Automate for This?

  • You want to export large record sets to Excel, SharePoint, SQL, or external APIs
  • You want to schedule this data sync (e.g., nightly)
  • You don’t want to maintain plugin code or JS logic
  • You need a low-code way to pull 10k+, 50k+ records from Dataverse

💡 Pro Tip: Power Automate handles up to 100k rows using paging in loops. Just be sure to avoid infinite loops or huge memory loads!

Step 1: Create Your Base FetchXML

<fetch mapping="logical" count="5000">
  <entity name="account">
    <attribute name="accountid" />
    <attribute name="name" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>

Step 2: Build the Flow

  1. Create a Manual trigger or use a Scheduled trigger
  2. Initialize these variables:
    • pagingCookie (string, empty)
    • pageNumber (int, default 1)
    • moreRecords (boolean, true)
    • recordsArray (array, empty)
  3. Add a Do Until loop:
    • Condition: moreRecords is equal to false

Step 3: Build Dynamic FetchXML with Pagination

Use a Compose action inside the loop to generate this:

<fetch mapping="logical" count="5000" page="@{variables('pageNumber')}"@{if(empty(variables('pagingCookie')), '', concat(' paging-cookie="', variables('pagingCookie'), '"'))}>
  <entity name="account">
    <attribute name="accountid" />
    <attribute name="name" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
Tip: You can use concat() + if() to conditionally add the paging-cookie only when it exists.

Step 4: Fetch and Merge Results

Inside the loop:

  • Add a “List rows using FetchXML” action (Dataverse connector)
  • Pass in the dynamic FetchXML from Compose
  • Use a Compose or “Set variable” action with union() to merge each page into recordsArray:
union(variables('recordsArray'), outputs('List_rows_using_FetchXML')?['body']?['value'])
(A plain “Append to array variable” would add the entire page as a single nested element.)

Step 5: Handle Paging State

  • Update moreRecords using an expression like:
@if(contains(outputs('List_rows_using_FetchXML')?['body'], '@odata.nextLink'), true, false)
  • Set pagingCookie from @odata.nextLink by parsing the query string (use substring, split, or parse approach)
  • Set pageNumber to add(variables('pageNumber'), 1)
⚠️ Warning: PagingCookie is encoded in @odata.nextLink. You may need to decode or extract from the query string using split() or uriComponentToString().
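As a sketch, one way to pull the cookie out of the next link — treat the 'pagingcookie' parameter name as a placeholder and inspect your connector's actual @odata.nextLink to confirm the parameter name and encoding:

```
uriComponentToString(last(split(outputs('List_rows_using_FetchXML')?['body']?['@odata.nextLink'], 'pagingcookie=')))
```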

Step 6: Post-Loop Actions

Once the loop is done, your recordsArray contains ALL records. You can now:

  • Loop over them to create SharePoint rows
  • Insert into SQL or external APIs
  • Dump them into Excel via Create Table + Add a Row

Real-World Use Cases

  • Export nightly case updates to Excel
  • Push 100k record snapshots to Data Lake
  • Sync contacts to downstream email platform

Advanced Ideas

  • Convert flow into a Custom Connector wrapper with retry control
  • Trigger flow from a Model-Driven App button using HTTP Request trigger
  • Use Secure Inputs to store base FetchXML in Dataverse for reuse

How to Retrieve More Than 5000 Records Using FetchXML and Dynamics 365 Plugin

Server-Side FetchXML Paging Using a Dynamics 365 Plugin

If you're working server-side inside a Plugin or Custom Workflow Activity, you can bypass client-side paging limits and fully control how many records to pull using FetchXML and paging cookies.

When to Use a Plugin for This

  • Fetching large datasets for processing on the server
  • Validating or aggregating records during business logic
  • Keeping logic secure on the server and hidden from the client
  • Integrating real-time, record-heavy automations

Step 1: Base FetchXML Structure

Start with this structure:

<fetch mapping='logical' count='5000'>
  <entity name='account'>
    <attribute name='accountid' />
    <attribute name='name' />
    <filter>
      <condition attribute='statecode' operator='eq' value='0' />
    </filter>
  </entity>
</fetch>
💡 Tip: Keep your base FetchXML free of paging logic—this will be dynamically injected.

Step 2: Plugin Implementation

This plugin fetches all active account records, 5000 at a time, using the PagingCookie and MoreRecords flags.

public class FetchAllAccountsPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        ITracingService tracer = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
        IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        IOrganizationService service = factory.CreateOrganizationService(context.UserId);

        string pagingCookie = null;
        int pageNumber = 1;
        int fetchCount = 5000;
        bool moreRecords = true;
        List<Entity> allAccounts = new List<Entity>();

        while (moreRecords)
        {
            string fetchXml = CreateFetchXml(pagingCookie, pageNumber, fetchCount);
            EntityCollection results = service.RetrieveMultiple(new FetchExpression(fetchXml));

            allAccounts.AddRange(results.Entities);
            tracer.Trace($"Fetched page {pageNumber}: {results.Entities.Count} records");

            moreRecords = results.MoreRecords;
            if (moreRecords)
            {
                pagingCookie = results.PagingCookie;
                pageNumber++;
            }
        }

        tracer.Trace($"✅ Total records fetched: {allAccounts.Count}");
    }

    private string CreateFetchXml(string pagingCookie, int pageNumber, int fetchCount)
    {
        return $@"
<fetch mapping='logical' page='{pageNumber}' count='{fetchCount}'{(pagingCookie != null ? $" paging-cookie='{System.Security.SecurityElement.Escape(pagingCookie)}'" : "")}>
  <entity name='account'>
    <attribute name='accountid' />
    <attribute name='name' />
    <filter>
      <condition attribute='statecode' operator='eq' value='0' />
    </filter>
  </entity>
</fetch>";
    }
}
⚠️ Watch out: Don’t forget to escape the pagingCookie! Use System.Security.SecurityElement.Escape() to avoid breaking XML syntax.

Bonus Tips

  • Use an async plugin or Custom API for long-running tasks
  • Consider streaming results or batching when working with >50,000 records
  • For external systems, push the results into Azure or a Data Export Service

Registration Ideas

This plugin can be triggered via:

  • A Custom API (for on-demand calls)
  • A Workflow or Power Automate Flow
  • Or bound to Entity Events like Create/Update (with filters)

How to Retrieve More Than 5000 Records Using FetchXML and JavaScript

Fetching Over 5000 Records with JavaScript and FetchXML

By default, FetchXML caps results at 5000 per page. But with JavaScript, you can paginate through results using the paging-cookie to retrieve all matching records. Let’s break down how.

Why Paging is Necessary

  • FetchXML will never return more than 5000 rows in a single query.
  • To get beyond 5K, we need to paginate using a paging-cookie from the server response.
  • This involves calling Xrm.WebApi.retrieveMultipleRecords recursively.

Step-by-Step Approach

1. Prepare Your Core FetchXML Body

Start with just the inner <entity> part—leave out the <fetch> wrapper.

<entity name="account">
  <attribute name="name" />
  <attribute name="accountnumber" />
  <filter>
    <condition attribute="statecode" operator="eq" value="0" />
  </filter>
</entity>
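When the <fetch> wrapper is added at query time, the paging cookie must be XML-escaped before it goes into the paging-cookie attribute. A minimal sketch, where escapeXml and wrapFetch are helpers introduced here (not part of the Xrm API):

```javascript
// Minimal XML attribute escaper for the paging cookie; the decoded cookie
// contains quotes and angle brackets that would break the attribute.
function escapeXml(s) {
  return s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;")
          .replace(/"/g, "&quot;").replace(/'/g, "&apos;");
}

// Wrap the inner <entity> body in a <fetch> element with paging attributes.
function wrapFetch(fetchBody, pageNumber, pageSize, pagingCookie) {
  const cookieAttr = pagingCookie
    ? ` paging-cookie='${escapeXml(pagingCookie)}'`
    : "";
  return `<fetch mapping="logical" page="${pageNumber}" count="${pageSize}"${cookieAttr}>` +
         fetchBody +
         `</fetch>`;
}
```

On the first page there is no cookie yet, so the attribute is simply omitted.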

2. Use JavaScript to Retrieve All Records

This full function handles paging, cookie reuse, throttling (429 errors), and returns all your records as one big array.

function retrieveAllWithFetchXml(entityName, fetchBody, pageSize = 5000) {
  let allRecords = [];
  let pageNumber = 1;
  let pagingCookie = null;
  let retryCount = 0;

  function delay(ms) {
    return new Promise(res => setTimeout(res, ms));
  }

  function getPage() {
    const fetchXml = [
      `<fetch mapping="logical" page="${pageNumber}" count="${pageSize}"`,
      pagingCookie ? ` paging-cookie='${pagingCookie}'` : "",
      `>`,
        fetchBody,
      `</fetch>`
    ].join("");

    const query = `?fetchXml=${encodeURIComponent(fetchXml)}`;

    return Xrm.WebApi.retrieveMultipleRecords(entityName, query)
      .then(result => {
        allRecords = allRecords.concat(result.entities);

        if (result["@odata.nextLink"]) {
          // Guard the match: if the link carries no cookie token, stop paging
          const match = result["@odata.nextLink"]
            .match(/fetchXmlPagingCookie=([^&]+)/);
          if (match) {
            pagingCookie = decodeURIComponent(match[1]);
            pageNumber++;
            retryCount = 0;
            return getPage();
          }
        }

        return allRecords;
      })
      .catch(err => {
        if (err.status === 429 && retryCount < 5) {
          retryCount++;
          return delay(Math.pow(2, retryCount) * 500).then(getPage);
        }
        return Promise.reject(err);
      });
  }

  return getPage();
}

3. Sample Usage

const fetchBody = `
  <entity name="account">
    <attribute name="name" />
    <attribute name="accountnumber" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>`;

retrieveAllWithFetchXml("account", fetchBody)
  .then(records => console.log("Total:", records.length))
  .catch(err => console.error(err));
Pro Tip 💡
Always keep your FetchXML small enough to avoid URL length limits. For massive queries, send the FetchXML in a POST $batch request instead of a GET, or move the work to a server-side plugin.
⚠️ Watch out for:
- Server throttling: Implement retry logic
- URL encoding: Always encode your FetchXML before passing it into a GET call
- Link-entities: Keep your inner Fetch clean and modular
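The encoding and URL-length concerns above can be combined into one small guard before issuing the GET; the 2048-character threshold here is an illustrative assumption, since actual limits vary by server configuration:

```javascript
// Sketch: encode the FetchXML for a GET query and fail fast when the
// resulting query string is likely to exceed server URL length limits.
function buildFetchQuery(fetchXml, maxLength = 2048) {
  const query = "?fetchXml=" + encodeURIComponent(fetchXml);
  if (query.length > maxLength) {
    throw new Error(
      "Encoded FetchXML is " + query.length +
      " chars; consider a $batch POST or a server-side plugin instead.");
  }
  return query;
}
```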

Wrapping Up

With this approach, you’ve got full control over paging behavior, and can efficiently retrieve tens of thousands of records from a model-driven app or Dynamics 365 using just JavaScript.

Thursday, June 26, 2025

Moving BPF Stages Backward in Dynamics 365 with JavaScript

A step-by-step guide to dynamically control BPF stages using client-side JavaScript

Introduction

Business Process Flows (BPFs) in Microsoft Dynamics 365 and Dataverse guide users through predefined stages, but moving backward isn’t supported natively in the BPF designer. For example, moving from stage 2 to stage 1 based on a field value like "No" requires custom logic. JavaScript, executed as a Web Resource in a model-driven app form, provides a client-side solution to update the activestageid field. In this post, we’ll show you how to create a JavaScript Web Resource to move BPF stages dynamically.

Why This Matters: JavaScript enables real-time, client-side BPF stage control, offering immediate feedback to users without server-side dependencies.

The Requirement

For an account entity with a BPF named "Account Process Flow," we need to:

  • Monitor a choice field (new_decisionfield).
  • If "Yes," move from stage 2 to stage 3.
  • If "No," move back to stage 1.

We’ll use JavaScript to detect field changes and update the BPF instance’s activestageid using the Web API.

Step-by-Step Guide

Step 1: Gather Required Information

Before writing the JavaScript, collect these details:

  • BPF Stage IDs: Find the GUIDs for stage 1 and stage 3 in the processstages table using XrmToolBox or FetchXML. Example:
    • Stage 1: 12345678-1234-1234-1234-1234567890ab
    • Stage 3: 98765432-4321-4321-4321-9876543210ba
  • BPF Instance Table: Identify the table’s logical name (e.g., accountprocessflow; its plural entity set name would be accountprocessflows).
  • Field Values: Confirm the new_decisionfield OptionSet values (Yes = 100000000, No = 100000001).
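If you go the FetchXML route, a query along these lines returns the stage names and GUIDs; the process GUID is a placeholder you must replace with your BPF's processid:

```xml
<fetch>
  <entity name="processstage">
    <attribute name="processstageid" />
    <attribute name="stagename" />
    <filter>
      <condition attribute="processid" operator="eq" value="{your-bpf-process-guid}" />
    </filter>
  </entity>
</fetch>
```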

Step 2: Create the JavaScript Web Resource

Create a JavaScript file to handle the onchange event of new_decisionfield and update the BPF stage:

var BPFStageMover = BPFStageMover || {};

BPFStageMover.moveBPFStage = function (executionContext) {
    var formContext = executionContext.getFormContext();

    // Get the decision field value
    var decisionField = formContext.getAttribute("new_decisionfield");
    if (!decisionField) {
        console.log("Decision field not found.");
        return;
    }

    var decisionValue = decisionField.getValue();
    if (decisionValue === null) {
        console.log("Decision field is null.");
        return;
    }

    // Define stage GUIDs (replace with actual stage IDs)
    var stage1Id = "12345678-1234-1234-1234-1234567890ab"; // Stage 1 GUID
    var stage3Id = "98765432-4321-4321-4321-9876543210ba"; // Stage 3 GUID

    // Determine new stage based on decision value
    var newStageId = decisionValue === 100000000 ? stage3Id : stage1Id; // Yes: 100000000, No: 100000001

    // Get the account ID
    var accountId = formContext.data.entity.getId().replace(/[{}]/g, "");

    // Retrieve the BPF instance. The lookup column to the account is named
    // bpf_accountid by default; adjust if your BPF uses a different name.
    Xrm.WebApi.retrieveMultipleRecords("accountprocessflow", `?$select=_activestageid_value&$filter=_bpf_accountid_value eq ${accountId}`)
        .then(
            function success(result) {
                if (result.entities.length === 0) {
                    console.log("No BPF instance found for account: " + accountId);
                    return;
                }

                var bpfInstance = result.entities[0];
                var currentStageId = bpfInstance["_activestageid_value"]; // lookups come back as _<name>_value

                // Only update if the stage is different
                if (currentStageId !== newStageId) {
                    var updateData = {
                        "activestageid@odata.bind": `/processstages(${newStageId})`
                    };

                    // Update the BPF instance (BPF tables use businessprocessflowinstanceid as their primary key)
                    Xrm.WebApi.updateRecord("accountprocessflow", bpfInstance.businessprocessflowinstanceid, updateData)
                        .then(
                            function success() {
                                console.log("BPF stage updated to: " + newStageId);
                                formContext.data.refresh(true); // Refresh form to reflect stage change
                            },
                            function error(error) {
                                console.error("Error updating BPF stage: " + error.message);
                                Xrm.Navigation.openAlertDialog({ text: "Error updating BPF stage: " + error.message });
                            }
                        );
                } else {
                    console.log("No stage change needed.");
                }
            },
            function error(error) {
                console.error("Error retrieving BPF instance: " + error.message);
                Xrm.Navigation.openAlertDialog({ text: "Error retrieving BPF instance: " + error.message });
            }
        );
};

Step 3: Register the Web Resource

Add the JavaScript to your Dynamics 365 form:

  1. Upload Web Resource:
    • In your solution, create a Web Resource named new_moveBPFStage (Type: JavaScript).
    • Upload the moveBPFStage.js file.
  2. Add to Form:
    • Open the account form in the form editor.
    • Add new_moveBPFStage to the form libraries.
    • Register the BPFStageMover.moveBPFStage function on the onchange event of new_decisionfield, passing the execution context.
  3. Publish: Save and publish the form and solution.

Step 4: Test the JavaScript

Open an account record, change new_decisionfield to "Yes" or "No," and verify the BPF stage moves to stage 3 or stage 1. Use browser developer tools (F12) to check console logs for errors.

Pro Tip: Use the browser’s developer tools to debug JavaScript errors. Add console logs for key variables (e.g., decisionValue, currentStageId) to trace execution.

Key Notes

  • Replace GUIDs: Update stage GUIDs in the JavaScript with actual values from your processstages table.
  • BPF Table: Use the correct logical name for your BPF instance table (e.g., accountprocessflow).
  • Option Set Values: Adjust 100000000 (Yes) and 100000001 (No) to match your field’s values.
  • Permissions: Ensure the user has read/write access to the accountprocessflows table.

Conclusion

Using JavaScript in a Dynamics 365 form, you can dynamically move BPF stages backward or forward, overcoming the designer’s limitations. This client-side approach provides real-time feedback and is ideal for interactive user experiences. Try it out, and share your feedback or questions in the comments!

