Friday, June 27, 2025

How to Retrieve More Than 5000 Records Using FetchXML and Power Automate

Power Automate: Paging Through 5000+ Records with FetchXML

When Dataverse limits you to 5,000 records per page, Power Automate gives you the tools to break past that limit—without writing C# or JavaScript. In this guide, you’ll create a Flow that loops through large datasets using FetchXML paging logic.

When Should You Use Power Automate for This?

  • You want to export large record sets to Excel, SharePoint, SQL, or external APIs
  • You want to schedule this data sync (e.g., nightly)
  • You don’t want to maintain plugin code or JS logic
  • You need a low-code way to pull 10k+, 50k+ records from Dataverse
💡 Pro Tip: Power Automate can page through 100k+ rows this way, but a Do Until loop is capped at 60 iterations by default (configurable up to 5,000 in the loop's Change limits settings). Raise the limit for big datasets, and keep an eye on variable sizes to avoid memory-heavy runs or accidental infinite loops.

Step 1: Create Your Base FetchXML

<fetch mapping="logical" count="5000">
  <entity name="account">
    <attribute name="accountid" />
    <attribute name="name" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>

Step 2: Build the Flow

  1. Create a Manual trigger or use a Scheduled trigger
  2. Initialize these variables:
    • pagingCookie (string, empty)
    • pageNumber (int, default 1)
    • moreRecords (boolean, true)
    • recordsArray (array, empty)
  3. Add a Do Until loop:
    • Condition: moreRecords is equal to false
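In the expression editor, that exit condition can be written directly against the variable initialized above (the loop keeps running until moreRecords is set to false):

@equals(variables('moreRecords'), false)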

Step 3: Build Dynamic FetchXML with Pagination

Use a Compose action inside the loop to generate this:

<fetch mapping="logical" count="5000" page="@{variables('pageNumber')}"@{if(empty(variables('pagingCookie')), '', concat(' paging-cookie="', variables('pagingCookie'), '"'))}>
  <entity name="account">
    <attribute name="accountid" />
    <attribute name="name" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
Tip: Use concat() + if() to add paging-cookie only when it exists. Note that if() expects a boolean, so test the cookie with empty() rather than passing the string directly. Also, the paging cookie Dataverse returns is itself XML, so it must be properly encoded before being embedded as an attribute value or the FetchXML will fail to parse.

Step 4: Fetch and Merge Results

Inside the loop:

  • Add a “List rows using FetchXML” action (Dataverse connector)
  • Pass in the dynamic FetchXML from Compose
  • Merge the page’s results into recordsArray. Note that “Append to array variable” adds its input as a single element, so loop over the returned rows with an Apply to each and append item(), selecting the rows with:
outputs('List_rows_using_FetchXML')?['body']?['value']
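Alternatively, because a Set variable action cannot reference the variable it is setting, you can merge whole pages with a Compose + Set variable pair. A sketch (the action name “MergedRecords” is illustrative; note that union() also removes exact-duplicate items, which is harmless here since each row carries a unique accountid):

Compose “MergedRecords”:
union(variables('recordsArray'), outputs('List_rows_using_FetchXML')?['body']?['value'])

Set variable recordsArray to:
outputs('MergedRecords')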

Step 5: Handle Paging State

  • Update moreRecords using an expression like:
@not(empty(outputs('List_rows_using_FetchXML')?['body']?['@odata.nextLink']))
  • Set pagingCookie from @odata.nextLink by parsing the query string (use substring, split, or parse approach)
  • Use an Increment variable action to bump pageNumber by 1 (a Set variable action cannot reference its own variable, so add(variables('pageNumber'), 1) inside Set variable would fail)
⚠️ Warning: PagingCookie is encoded in @odata.nextLink. You may need to decode or extract from the query string using split() or uriComponentToString().
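One way to sketch the extraction in the expression editor (the $skiptoken parameter name and the single decode pass are assumptions; verify both against the actual @odata.nextLink your environment returns):

decodeUriComponent(last(split(outputs('List_rows_using_FetchXML')?['body']?['@odata.nextLink'], '$skiptoken=')))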

Step 6: Post-Loop Actions

Once the loop is done, your recordsArray contains ALL records. You can now:

  • Loop over them to create SharePoint rows
  • Insert into SQL or external APIs
  • Dump them into Excel via Create Table + Add a Row

Real-World Use Cases

  • Export nightly case updates to Excel
  • Push 100k record snapshots to Data Lake
  • Sync contacts to downstream email platform

Advanced Ideas

  • Convert flow into a Custom Connector wrapper with retry control
  • Trigger flow from a Model-Driven App button using HTTP Request trigger
  • Store your base FetchXML in a Dataverse environment variable (or table) for reuse, and enable Secure Inputs/Outputs on actions that handle sensitive data

