Friday, June 27, 2025

How to Retrieve More Than 5000 Records Using FetchXML and JavaScript

By default, Dataverse caps FetchXML results at 5,000 records per page. With a bit of JavaScript, though, you can page through the results using the paging cookie the server returns and retrieve every matching record. Let’s break down how.

Why Paging is Necessary

  • A single FetchXML request returns at most 5000 rows per page, regardless of how many records match.
  • To get beyond 5K, we need to paginate using the paging-cookie returned with each server response.
  • In JavaScript, this means calling Xrm.WebApi.retrieveMultipleRecords once per page, recursively, feeding each response’s cookie into the next request.
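
For illustration only, here is roughly what a second-page request looks like once the wrapper is in place. The paging-cookie value is abbreviated here; the real value is the opaque XML cookie returned by the previous response and must be passed back unchanged:

```xml
<fetch mapping="logical" page="2" count="5000" paging-cookie="...">
  <entity name="account">
    <attribute name="name" />
    <attribute name="accountnumber" />
  </entity>
</fetch>
```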

Step-by-Step Approach

1. Prepare Your Core FetchXML Body

Start with just the inner <entity> element; leave out the <fetch> wrapper, which the paging function adds fresh for each page.

<entity name="account">
  <attribute name="name" />
  <attribute name="accountnumber" />
  <filter>
    <condition attribute="statecode" operator="eq" value="0" />
  </filter>
</entity>

2. Use JavaScript to Retrieve All Records

The function below handles paging, cookie reuse, and throttling (retrying 429 responses with exponential backoff), and resolves with all matching records in a single array.

function retrieveAllWithFetchXml(entityName, fetchBody, pageSize = 5000) {
  let allRecords = [];
  let pageNumber = 1;
  let pagingCookie = null;
  let retryCount = 0;

  function delay(ms) {
    return new Promise(res => setTimeout(res, ms));
  }

  // The paging cookie is itself XML; escape it before embedding it in
  // the paging-cookie attribute, or the fetch becomes invalid XML.
  function escapeXmlAttr(value) {
    return value
      .replace(/&/g, "&amp;")
      .replace(/</g, "&lt;")
      .replace(/>/g, "&gt;")
      .replace(/"/g, "&quot;")
      .replace(/'/g, "&apos;");
  }

  function getPage() {
    const fetchXml = [
      `<fetch mapping="logical" page="${pageNumber}" count="${pageSize}"`,
      pagingCookie ? ` paging-cookie='${escapeXmlAttr(pagingCookie)}'` : "",
      `>`,
        fetchBody,
      `</fetch>`
    ].join("");

    const query = `?fetchXml=${encodeURIComponent(fetchXml)}`;

    return Xrm.WebApi.retrieveMultipleRecords(entityName, query)
      .then(result => {
        allRecords = allRecords.concat(result.entities);

        // More pages remain only if the response carries a next link with
        // a paging cookie; guard the match so a missing cookie ends the
        // loop instead of throwing on null.
        const nextLink = result["@odata.nextLink"];
        const cookieMatch =
          nextLink && nextLink.match(/fetchXmlPagingCookie=([^&]+)/);

        if (cookieMatch) {
          pagingCookie = decodeURIComponent(cookieMatch[1]);
          pageNumber++;
          retryCount = 0;
          return getPage();
        }

        return allRecords;
      })
      .catch(err => {
        // 429 = service-protection throttling: back off exponentially.
        if (err.status === 429 && retryCount < 5) {
          retryCount++;
          return delay(Math.pow(2, retryCount) * 500).then(getPage);
        }
        return Promise.reject(err);
      });
  }

  return getPage();
}

3. Sample Usage

const fetchBody = `
  <entity name="account">
    <attribute name="name" />
    <attribute name="accountnumber" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>`;

retrieveAllWithFetchXml("account", fetchBody)
  .then(records => console.log("Total:", records.length))
  .catch(err => console.error(err));

Pro Tip 💡

Always keep your FetchXML compact enough to stay under URL length limits. For very large queries, send the request inside a $batch call (the FetchXML then travels in the request body rather than the URL) or move the logic to a server-side plugin.

⚠️ Watch out for:

- Server throttling: implement retry logic with backoff for 429 responses
- URL encoding: always encode your FetchXML with encodeURIComponent before passing it into a GET call
- Link-entities: joins can duplicate parent rows across pages; keep the inner fetch body clean and modular
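
The URL-length point above can be made concrete with a small guard before issuing the request. This is a sketch, not part of the Xrm API: buildFetchQuery and URL_LENGTH_BUDGET are hypothetical names, and the ~2,000-character budget is a deliberately conservative assumption rather than a documented limit.

```javascript
// Conservative budget for the encoded query string; the real ceiling
// depends on your environment, so this value is an assumption.
const URL_LENGTH_BUDGET = 2000;

// Hypothetical helper: encode the FetchXML and fail fast if the
// resulting GET query string would be uncomfortably long.
function buildFetchQuery(fetchXml) {
  const query = `?fetchXml=${encodeURIComponent(fetchXml)}`;
  if (query.length > URL_LENGTH_BUDGET) {
    throw new Error(
      `Encoded FetchXML is ${query.length} chars; ` +
      `trim attributes or move the query server-side.`
    );
  }
  return query;
}
```

Failing early like this is easier to debug than a generic HTTP error from a truncated or rejected URL.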

Wrapping Up

With this approach, you have full control over paging behavior and can efficiently retrieve tens of thousands of records from a model-driven app or Dynamics 365 using client-side JavaScript alone.
