Saturday, June 28, 2025

Bulk Updating Over 20,000 Records in Dataverse with ExecuteMultipleRequest — Step-by-Step Implementation

Mastering Dataverse Updates with ExecuteMultipleRequest: A Modern C# Guide

Dataverse Updates with ExecuteMultipleRequest

Recently, in a WhatsApp group I’m part of, someone reached out with a challenge: "With pagination, I want to update 20,511 records using Power Automate, but I am not able to do it more than 5,000. Anyone has any idea how to implement?" It’s a classic problem when working with Microsoft Dataverse—Power Automate, while great for many tasks, hits a wall with its pagination limits. In this case, the 5,000-record cap was stopping them cold, even with pagination tweaks.

Power Automate’s low-code simplicity is fantastic for small-to-medium workflows, but for bulk updates like this, it’s not the best fit. The pagination limit forces you into processing records in tiny batches, slowing things down and complicating the flow. For 20,511 records, that’s a lot of hoops to jump through—and a lot of time wasted.

My advice? Ditch Power Automate for this job and use a C# console application with ExecuteMultipleRequest. This Dataverse SDK method lets you bundle up to 1,000 operations into a single request, slashing the number of API calls and turbocharging performance. It’s a better choice because:

  • No Pagination Hassles: Unlike Power Automate, ExecuteMultipleRequest doesn’t choke on large datasets—you can process all 20,511 records without breaking them into 5,000-record chunks.
  • Faster Execution: Batching operations reduces server round trips, making it way quicker for big updates.
  • More Control: A console app lets you customize error handling, logging, and logic—things Power Automate struggles to match.

In this post, I’ll show you how to build that console app step-by-step, complete with code snippets and tips. Let’s get those records updated efficiently!

What You’ll Need

Before we start coding, gather these essentials:

  • Dataverse Environment: Access to a Dataverse instance with a table (e.g., account) containing records to update.
  • Azure AD App Registration: A registered app with client ID, client secret, and tenant ID, plus Dataverse permissions.
  • .NET Development Setup: Visual Studio or VS Code with .NET Core 3.1 or later.
  • NuGet Packages:
    • Microsoft.CrmSdk.CoreAssemblies
    • Microsoft.CrmSdk.XrmTooling.CoreAssembly
    • Microsoft.IdentityModel.Clients.ActiveDirectory (ADAL — now deprecated in favor of MSAL, and strictly optional here, since CrmServiceClient’s ClientSecret connection string handles authentication on its own)

Step-by-Step: Building the Console App

Step 1: Create the Console Project

Fire up your terminal and create a new C# console app, then add the necessary NuGet packages.

dotnet new console -n DataverseBatchUpdater
cd DataverseBatchUpdater
dotnet add package Microsoft.CrmSdk.CoreAssemblies --version 9.0.2.34
dotnet add package Microsoft.CrmSdk.XrmTooling.CoreAssembly --version 9.1.0.68
dotnet add package Microsoft.IdentityModel.Clients.ActiveDirectory --version 5.3.0

Step 2: Set Up Authentication

To connect to Dataverse, configure authentication using your Azure AD app credentials. Open Program.cs and add these constants, replacing placeholders with your values.

private static readonly string dataverseUrl = "https://yourorg.crm.dynamics.com";
private static readonly string clientId = "your-client-id";
private static readonly string clientSecret = "your-client-secret";
private static readonly string tenantId = "your-tenant-id";
private static readonly string authority = $"https://login.microsoftonline.com/{tenantId}";

We’ll use these to authenticate via the Dataverse SDK’s CrmServiceClient.

Step 3: Authenticate with Dataverse

Add a method to establish a connection to Dataverse using the client credentials flow.

private static Task<CrmServiceClient> GetCrmServiceClient()
{
    // CrmServiceClient acquires the token itself when AuthType=ClientSecret
    // is specified, so no separate ADAL/MSAL call is needed.
    var connectionString = $"AuthType=ClientSecret;Url={dataverseUrl};ClientId={clientId};ClientSecret={clientSecret};RequireNewInstance=true";
    var serviceClient = new CrmServiceClient(connectionString);

    if (!serviceClient.IsReady)
    {
        throw new Exception($"Connection failed: {serviceClient.LastCrmError}");
    }

    return Task.FromResult(serviceClient);
}

Step 4: Prepare Records for Update

To use ExecuteMultipleRequest, we need a list of records to update. For this example, we’ll build a simulated list of accounts and modify their name field.

// Example: simulated records (replace with real IDs from your data source)
var accounts = new List<Entity>();
for (int i = 1; i <= 20511; i++)
{
    var account = new Entity("account") { Id = Guid.NewGuid() };
    account["name"] = $"Account_{i}";
    accounts.Add(account);
}

In practice, you’d retrieve these using RetrieveMultiple, but we’ll focus on the update logic here.

Step 5: Implement ExecuteMultipleRequest

Now, the star of the show: ExecuteMultipleRequest. We’ll process records in batches of 1,000 (the max allowed per request).

private static async Task UpdateRecordsAsync(IOrganizationService service, List<Entity> accounts)
{
    int batchSize = 1000;
    int totalUpdated = 0;

    for (int i = 0; i < accounts.Count; i += batchSize)
    {
        var requestCollection = new ExecuteMultipleRequest
        {
            Settings = new ExecuteMultipleSettings
            {
                ContinueOnError = true, // Keep going if one update fails
                ReturnResponses = true  // Get feedback on each operation
            },
            Requests = new OrganizationRequestCollection()
        };

        int endIndex = Math.Min(i + batchSize, accounts.Count);
        for (int j = i; j < endIndex; j++)
        {
            var entity = accounts[j];
            entity["name"] = $"{entity.GetAttributeValue<string>("name")}_Updated";
            var updateRequest = new UpdateRequest { Target = entity };
            requestCollection.Requests.Add(updateRequest);
        }

        var response = (ExecuteMultipleResponse)service.Execute(requestCollection);
        totalUpdated += requestCollection.Requests.Count;

        foreach (var resp in response.Responses)
        {
            if (resp.Fault != null)
            {
                Console.WriteLine($"Error at index {i + resp.RequestIndex}: {resp.Fault.Message}");
            }
        }

        Console.WriteLine($"Updated {requestCollection.Requests.Count} records. Total: {totalUpdated}");
        await Task.Delay(500); // Throttling prevention
    }

    Console.WriteLine($"Finished! Total records updated: {totalUpdated}");
}

Tip 1: Batch Size Matters

Stick to 1,000 records per batch—Dataverse’s upper limit for ExecuteMultipleRequest. Smaller batches work but slow you down unnecessarily.

Step 6: Tie It All Together

Update Main to connect and run the update.

static async Task Main(string[] args)
{
    try
    {
        var service = await GetCrmServiceClient();
        Console.WriteLine("Connected to Dataverse!");

        // Replace with your record-fetching logic
        var accounts = new List<Entity>();
        for (int i = 1; i <= 20511; i++)
        {
            var account = new Entity("account") { Id = Guid.NewGuid() };
            account["name"] = $"Account_{i}";
            accounts.Add(account);
        }

        await UpdateRecordsAsync(service, accounts);
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Oops! {ex.Message}");
    }
}

Step 7: Run and Verify

Build and run your app, then check Dataverse to confirm the updates.

dotnet run

Look for console output like “Updated 1000 records” and verify in Dataverse that account names now end with "_Updated".

The Full Code

Here’s the complete C# console app. Replace placeholders with your Dataverse and Azure AD details.

using Microsoft.Crm.Sdk.Messages;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Tooling.Connector;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace DataverseBatchUpdater
{
    class Program
    {
        private static readonly string dataverseUrl = "https://yourorg.crm.dynamics.com";
        private static readonly string clientId = "your-client-id";
        private static readonly string clientSecret = "your-client-secret";
        private static readonly string tenantId = "your-tenant-id";
        private static readonly string authority = $"https://login.microsoftonline.com/{tenantId}";

        static async Task Main(string[] args)
        {
            try
            {
                var service = await GetCrmServiceClient();
                Console.WriteLine("Connected to Dataverse!");

                // Simulated record list (replace with your fetch logic)
                var accounts = new List<Entity>();
                for (int i = 1; i <= 20511; i++)
                {
                    var account = new Entity("account") { Id = Guid.NewGuid() };
                    account["name"] = $"Account_{i}";
                    accounts.Add(account);
                }

                await UpdateRecordsAsync(service, accounts);
            }
            catch (Exception ex)
            {
                Console.WriteLine($"Oops! {ex.Message}");
            }
        }

        private static Task<CrmServiceClient> GetCrmServiceClient()
        {
            // CrmServiceClient acquires the token itself when AuthType=ClientSecret
            // is specified, so no separate ADAL/MSAL call is needed.
            var connectionString = $"AuthType=ClientSecret;Url={dataverseUrl};ClientId={clientId};ClientSecret={clientSecret};RequireNewInstance=true";
            var serviceClient = new CrmServiceClient(connectionString);

            if (!serviceClient.IsReady)
            {
                throw new Exception($"Connection failed: {serviceClient.LastCrmError}");
            }

            return Task.FromResult(serviceClient);
        }

        private static async Task UpdateRecordsAsync(IOrganizationService service, List<Entity> accounts)
        {
            int batchSize = 1000;
            int totalUpdated = 0;

            for (int i = 0; i < accounts.Count; i += batchSize)
            {
                var requestCollection = new ExecuteMultipleRequest
                {
                    Settings = new ExecuteMultipleSettings
                    {
                        ContinueOnError = true,
                        ReturnResponses = true
                    },
                    Requests = new OrganizationRequestCollection()
                };

                int endIndex = Math.Min(i + batchSize, accounts.Count);
                for (int j = i; j < endIndex; j++)
                {
                    var entity = accounts[j];
                    entity["name"] = $"{entity.GetAttributeValue<string>("name")}_Updated";
                    var updateRequest = new UpdateRequest { Target = entity };
                    requestCollection.Requests.Add(updateRequest);
                }

                var response = (ExecuteMultipleResponse)service.Execute(requestCollection);
                totalUpdated += requestCollection.Requests.Count;

                foreach (var resp in response.Responses)
                {
                    if (resp.Fault != null)
                    {
                        Console.WriteLine($"Error at index {i + resp.RequestIndex}: {resp.Fault.Message}");
                    }
                }

                Console.WriteLine($"Updated {requestCollection.Requests.Count} records. Total: {totalUpdated}");
                await Task.Delay(500); // Avoid throttling
            }

            Console.WriteLine($"Finished! Total records updated: {totalUpdated}");
        }
    }
}

Pro Tips for Success

Tip 1: Maximize Batch Efficiency

Use the full 1,000-record capacity of ExecuteMultipleRequest to minimize requests and speed up execution.

Tip 2: Dodge Throttling Limits

Add a small delay (e.g., await Task.Delay(500)) between batches to avoid hitting Dataverse’s throttling limits.
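A fixed delay is the simplest option; if you still hit 429 responses, exponential backoff is the usual refinement. A minimal sketch of the delay schedule (the 500 ms base and 8 s cap are arbitrary choices, not Dataverse constants):

```javascript
// Delay (in ms) before retry attempt n: doubles each time, capped so a
// long burst of throttling responses never stalls the run indefinitely.
function backoffMs(attempt, baseMs = 500, maxMs = 8000) {
  return Math.min(baseMs * Math.pow(2, attempt), maxMs);
}

for (let attempt = 0; attempt < 6; attempt++) {
  console.log(`attempt ${attempt}: wait ${backoffMs(attempt)} ms`);
}
// Waits 500, 1000, 2000, 4000, 8000, 8000 ms across attempts 0-5.
```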

Tip 3: Robust Error Handling

Log errors from response.Responses to a file or table for post-run analysis, ensuring no issues slip through.

Wrapping Up

With ExecuteMultipleRequest, updating large datasets in Dataverse becomes a breeze. This C# console app showcases how to harness batch processing for efficiency and reliability. Try it out, tweak it to your needs, and watch your Dataverse updates soar!

Friday, June 27, 2025

How to Retrieve More Than 5000 Records Using FetchXML and Power Automate

Power Automate: Paging Through 5000+ Records with FetchXML

When Dataverse limits you to 5,000 records per page, Power Automate gives you the tools to break past that limit—without writing C# or JavaScript. In this guide, you’ll create a Flow that loops through large datasets using FetchXML paging logic.

When Should You Use Power Automate for This?

  • You want to export large record sets to Excel, SharePoint, SQL, or external APIs
  • You want to schedule this data sync (e.g., nightly)
  • You don’t want to maintain plugin code or JS logic
  • You need a low-code way to pull 10k+, 50k+ records from Dataverse
💡 Pro Tip: Power Automate handles up to 100k rows using paging in loops. Just be sure to avoid infinite loops or huge memory loads!

Step 1: Create Your Base FetchXML

<fetch mapping="logical" count="5000">
  <entity name="account">
    <attribute name="accountid" />
    <attribute name="name" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>

Step 2: Build the Flow

  1. Create a Manual trigger or use a Scheduled trigger
  2. Initialize these variables:
    • pagingCookie (string, empty)
    • pageNumber (int, default 1)
    • moreRecords (boolean, true)
    • recordsArray (array, empty)
  3. Add a Do Until loop:
    • Condition: moreRecords is equal to false

Step 3: Build Dynamic FetchXML with Pagination

Use a Compose action inside the loop to generate this:

<fetch mapping="logical" count="5000" page="@{variables('pageNumber')}"@{if(not(empty(variables('pagingCookie'))), concat(' paging-cookie="', variables('pagingCookie'), '"'), '')}>
  <entity name="account">
    <attribute name="accountid" />
    <attribute name="name" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
Tip: You can use concat() + if() to conditionally add the paging-cookie only when it exists.

Step 4: Fetch and Merge Results

Inside the loop:

  • Add a “List rows using FetchXML” action (Dataverse connector)
  • Pass in the dynamic FetchXML from Compose
  • Add a Compose with union(variables('recordsArray'), outputs('List_rows_using_FetchXML')?['body']?['value']), then a “Set variable” that writes the Compose output back to recordsArray (a plain “Append to array variable” would nest each page as a single element)

Step 5: Handle Paging State

  • Update moreRecords using an expression like:
@if(contains(outputs('List_rows_using_FetchXML')?['body'], '@odata.nextLink'), true, false)
  • Set pagingCookie from @odata.nextLink by parsing the query string (use substring, split, or parse approach)
  • Set pageNumber to add(variables('pageNumber'), 1)
⚠️ Warning: PagingCookie is encoded in @odata.nextLink. You may need to decode or extract from the query string using split() or uriComponentToString().
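The extraction step in those bullets can be prototyped outside the flow. Here is a sketch in plain JavaScript — note that the pagingcookie parameter name and URL shape are assumptions; inspect your own @odata.nextLink to see what your environment actually returns:

```javascript
// Pull one query-string parameter out of an @odata.nextLink-style URL
// and URL-decode its value.
function extractParam(url, name) {
  const query = url.split("?")[1] || "";
  for (const pair of query.split("&")) {
    const [key, value] = pair.split(/=(.*)/s, 2);
    if (key === name && value !== undefined) {
      return decodeURIComponent(value);
    }
  }
  return null;
}

const nextLink =
  "https://org.crm.dynamics.com/api/data/v9.2/accounts?pagingcookie=%3Ccookie%20page%3D%221%22%3E";
console.log(extractParam(nextLink, "pagingcookie")); // <cookie page="1">
```

The same split/decode logic maps directly onto the flow's split() and uriComponentToString() expressions.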

Step 6: Post-Loop Actions

Once the loop is done, your recordsArray contains ALL records. You can now:

  • Loop over them to create SharePoint rows
  • Insert into SQL or external APIs
  • Dump them into Excel via Create Table + Add a Row

Real-World Use Cases

  • Export nightly case updates to Excel
  • Push 100k record snapshots to Data Lake
  • Sync contacts to downstream email platform

Advanced Ideas

  • Convert flow into a Custom Connector wrapper with retry control
  • Trigger flow from a Model-Driven App button using HTTP Request trigger
  • Use Secure Inputs to store base FetchXML in Dataverse for reuse

How to Retrieve More Than 5000 Records Using FetchXML and Dynamics 365 Plugin

Server-Side FetchXML Paging Using a Dynamics 365 Plugin

If you're working server-side inside a Plugin or Custom Workflow Activity, you can bypass client-side paging limits and fully control how many records to pull using FetchXML and paging cookies.

When to Use a Plugin for This

  • Fetching large datasets for processing on the server
  • Validating or aggregating records during business logic
  • Keeping logic secure and non-reversible from the client
  • Integrating real-time, record-heavy automations

Step 1: Base FetchXML Structure

Start with this structure:

<fetch mapping='logical' count='5000'>
  <entity name='account'>
    <attribute name='accountid' />
    <attribute name='name' />
    <filter>
      <condition attribute='statecode' operator='eq' value='0' />
    </filter>
  </entity>
</fetch>
💡 Tip: Keep your base FetchXML free of paging logic—this will be dynamically injected.

Step 2: Plugin Implementation

This plugin fetches all active account records, 5000 at a time, using the PagingCookie and MoreRecords flags.

public class FetchAllAccountsPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        ITracingService tracer = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
        IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        IOrganizationService service = factory.CreateOrganizationService(context.UserId);

        string pagingCookie = null;
        int pageNumber = 1;
        int fetchCount = 5000;
        bool moreRecords = true;
        List<Entity> allAccounts = new List<Entity>();

        while (moreRecords)
        {
            string fetchXml = CreateFetchXml(pagingCookie, pageNumber, fetchCount);
            EntityCollection results = service.RetrieveMultiple(new FetchExpression(fetchXml));

            allAccounts.AddRange(results.Entities);
            tracer.Trace($"Fetched page {pageNumber}: {results.Entities.Count} records");

            moreRecords = results.MoreRecords;
            if (moreRecords)
            {
                pagingCookie = results.PagingCookie;
                pageNumber++;
            }
        }

        tracer.Trace($"✅ Total records fetched: {allAccounts.Count}");
    }

    private string CreateFetchXml(string pagingCookie, int pageNumber, int fetchCount)
    {
        return $@"
<fetch mapping='logical' page='{pageNumber}' count='{fetchCount}'{(pagingCookie != null ? $" paging-cookie='{System.Security.SecurityElement.Escape(pagingCookie)}'" : "")}>
  <entity name='account'>
    <attribute name='accountid' />
    <attribute name='name' />
    <filter>
      <condition attribute='statecode' operator='eq' value='0' />
    </filter>
  </entity>
</fetch>";
    }
}
⚠️ Watch out: Don’t forget to escape the pagingCookie! Use System.Security.SecurityElement.Escape() to avoid breaking XML syntax.
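To make the warning concrete, here is what the escaping does, sketched in JavaScript (mirroring the five entities System.Security.SecurityElement.Escape replaces):

```javascript
// Escape the five XML special characters so a paging cookie (which is
// itself XML) can sit safely inside a paging-cookie='...' attribute.
function xmlEscape(value) {
  return value
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&apos;");
}

const cookie = '<cookie page="1"><accountid last="{GUID}" /></cookie>';
console.log(xmlEscape(cookie));
// => &lt;cookie page=&quot;1&quot;&gt;&lt;accountid last=&quot;{GUID}&quot; /&gt;&lt;/cookie&gt;
```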

Bonus Tips

  • Use an async plugin or Custom API for long-running tasks
  • Consider streaming results or batching when working with >50,000 records
  • For external systems, push the results into Azure or a Data Export Service

Registration Ideas

This plugin can be triggered via:

  • A Custom API (for on-demand calls)
  • A Workflow or Power Automate Flow
  • Or bound to Entity Events like Create/Update (with filters)

How to Retrieve More Than 5000 Records Using FetchXML and JavaScript

Fetching Over 5000 Records with JavaScript and FetchXML

By default, FetchXML caps results at 5000 per page. But with JavaScript, you can paginate through results using the paging-cookie to retrieve all matching records. Let’s break down how.

Why Paging is Necessary

  • FetchXML will never return more than 5000 rows in a single query.
  • To get beyond 5K, we need to paginate using a paging-cookie from the server response.
  • This involves calling Xrm.WebApi.retrieveMultipleRecords recursively.

Step-by-Step Approach

1. Prepare Your Core FetchXML Body

Start with just the inner <entity> part—leave out the <fetch> wrapper.

<entity name="account">
  <attribute name="name" />
  <attribute name="accountnumber" />
  <filter>
    <condition attribute="statecode" operator="eq" value="0" />
  </filter>
</entity>

2. Use JavaScript to Retrieve All Records

This full function handles paging, cookie reuse, throttling (429 errors), and returns all your records as one big array.

function retrieveAllWithFetchXml(entityName, fetchBody, pageSize = 5000) {
  let allRecords = [];
  let pageNumber = 1;
  let pagingCookie = null;
  let retryCount = 0;

  function delay(ms) {
    return new Promise(res => setTimeout(res, ms));
  }

  function getPage() {
    const fetchXml = [
      `<fetch mapping="logical" page="${pageNumber}" count="${pageSize}"`,
      pagingCookie ? ` paging-cookie='${pagingCookie}'` : "",
      `>`,
        fetchBody,
      `</fetch>`
    ].join("");

    const query = `?fetchXml=${encodeURIComponent(fetchXml)}`;

    return Xrm.WebApi.retrieveMultipleRecords(entityName, query)
      .then(result => {
        allRecords = allRecords.concat(result.entities);

        const nextLink = result["@odata.nextLink"];
        const match = nextLink ? nextLink.match(/fetchXmlPagingCookie=([^&]+)/) : null;
        if (match) {
          pagingCookie = decodeURIComponent(match[1]);
          pageNumber++;
          retryCount = 0;
          return getPage();
        }

        return allRecords;
      })
      .catch(err => {
        if (err.status === 429 && retryCount < 5) {
          retryCount++;
          return delay(Math.pow(2, retryCount) * 500).then(getPage);
        }
        return Promise.reject(err);
      });
  }

  return getPage();
}

3. Sample Usage

const fetchBody = `
  <entity name="account">
    <attribute name="name" />
    <attribute name="accountnumber" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>`;

retrieveAllWithFetchXml("account", fetchBody)
  .then(records => console.log("Total:", records.length))
  .catch(err => console.error(err));
Pro Tip 💡
Always keep your FetchXML small enough to avoid URL length limits. For massive queries, send the request through a Web API $batch call (which moves the FetchXML out of the URL) or fall back to a server-side plugin.
⚠️ Watch out for:
- Server throttling: Implement retry logic
- URL encoding: Always encode your FetchXML before passing it into a GET call
- Link-entities: Keep your inner Fetch clean and modular
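For the URL-length point specifically, you can measure the encoded query before issuing the GET. A small sketch (the 2,048-character threshold is a conservative placeholder, not an official limit):

```javascript
// Encode FetchXML for a GET query string and flag suspiciously long URLs.
function buildFetchQuery(fetchXml, maxLength = 2048) {
  const query = `?fetchXml=${encodeURIComponent(fetchXml)}`;
  if (query.length > maxLength) {
    console.warn(`Query is ${query.length} chars; consider a server-side approach`);
  }
  return query;
}

console.log(buildFetchQuery('<fetch count="10"><entity name="account" /></fetch>'));
```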

Wrapping Up

With this approach, you’ve got full control over paging behavior, and can efficiently retrieve tens of thousands of records from a model-driven app or Dynamics 365 using just JavaScript.

Thursday, June 26, 2025

Moving BPF Stages Backward in Dynamics 365 with JavaScript

A step-by-step guide to dynamically control BPF stages using client-side JavaScript

Introduction

Business Process Flows (BPFs) in Microsoft Dynamics 365 and Dataverse guide users through predefined stages, but moving backward isn’t supported natively in the BPF designer. For example, moving from stage 2 to stage 1 based on a field value like "No" requires custom logic. JavaScript, executed as a Web Resource in a model-driven app form, provides a client-side solution to update the activestageid field. In this post, we’ll show you how to create a JavaScript Web Resource to move BPF stages dynamically.

Why This Matters: JavaScript enables real-time, client-side BPF stage control, offering immediate feedback to users without server-side dependencies.

The Requirement

For an account entity with a BPF named "Account Process Flow," we need to:

  • Monitor a choice field (new_decisionfield).
  • If "Yes," move from stage 2 to stage 3.
  • If "No," move back to stage 1.

We’ll use JavaScript to detect field changes and update the BPF instance’s activestageid using the Web API.

Step-by-Step Guide

Step 1: Gather Required Information

Before writing the JavaScript, collect these details:

  • BPF Stage IDs: Find the GUIDs for stage 1 and stage 3 in the processstages table using XrmToolBox or FetchXML. Example:
    • Stage 1: 12345678-1234-1234-1234-1234567890ab
    • Stage 3: 98765432-4321-4321-4321-9876543210ba
  • BPF Instance Table: Identify the logical name (e.g., accountprocessflows).
  • Field Values: Confirm the new_decisionfield OptionSet values (Yes = 100000000, No = 100000001).

Step 2: Create the JavaScript Web Resource

Create a JavaScript file to handle the onchange event of new_decisionfield and update the BPF stage:

var BPFStageMover = BPFStageMover || {};

BPFStageMover.moveBPFStage = function (executionContext) {
    var formContext = executionContext.getFormContext();

    // Get the decision field value
    var decisionField = formContext.getAttribute("new_decisionfield");
    if (!decisionField) {
        console.log("Decision field not found.");
        return;
    }

    var decisionValue = decisionField.getValue();
    if (decisionValue === null) {
        console.log("Decision field is null.");
        return;
    }

    // Define stage GUIDs (replace with actual stage IDs)
    var stage1Id = "12345678-1234-1234-1234-1234567890ab"; // Stage 1 GUID
    var stage3Id = "98765432-4321-4321-4321-9876543210ba"; // Stage 3 GUID

    // Determine new stage based on decision value
    var newStageId = decisionValue === 100000000 ? stage3Id : stage1Id; // Yes: 100000000, No: 100000001

    // Get the account ID
    var accountId = formContext.data.entity.getId().replace(/[{}]/g, "");

    // Retrieve the BPF instance
    Xrm.WebApi.retrieveMultipleRecords("accountprocessflow", `?$select=businessprocessflowinstanceid,_activestageid_value&$filter=_bpf_accountid_value eq ${accountId}`)
        .then(
            function success(result) {
                if (result.entities.length === 0) {
                    console.log("No BPF instance found for account: " + accountId);
                    return;
                }

                var bpfInstance = result.entities[0];
                var currentStageId = bpfInstance["_activestageid_value"];

                // Only update if the stage is different
                if (currentStageId !== newStageId) {
                    var updateData = {
                        "activestageid@odata.bind": `/processstages(${newStageId})`
                    };

                    // Update the BPF instance
                    Xrm.WebApi.updateRecord("accountprocessflow", bpfInstance.businessprocessflowinstanceid, updateData)
                        .then(
                            function success() {
                                console.log("BPF stage updated to: " + newStageId);
                                formContext.data.refresh(true); // Refresh form to reflect stage change
                            },
                            function error(error) {
                                console.error("Error updating BPF stage: " + error.message);
                                Xrm.Navigation.openAlertDialog({ text: "Error updating BPF stage: " + error.message });
                            }
                        );
                } else {
                    console.log("No stage change needed.");
                }
            },
            function error(error) {
                console.error("Error retrieving BPF instance: " + error.message);
                Xrm.Navigation.openAlertDialog({ text: "Error retrieving BPF instance: " + error.message });
            }
        );
};

Step 3: Register the Web Resource

Add the JavaScript to your Dynamics 365 form:

  1. Upload Web Resource:
    • In your solution, create a Web Resource named new_moveBPFStage (Type: JavaScript).
    • Upload the moveBPFStage.js file.
  2. Add to Form:
    • Open the account form in the form editor.
    • Add new_moveBPFStage to the form libraries.
    • Register the BPFStageMover.moveBPFStage function on the onchange event of new_decisionfield, passing the execution context.
  3. Publish: Save and publish the form and solution.

Step 4: Test the JavaScript

Open an account record, change new_decisionfield to "Yes" or "No," and verify the BPF stage moves to stage 3 or stage 1. Use browser developer tools (F12) to check console logs for errors.

Pro Tip: Use the browser’s developer tools to debug JavaScript errors. Add console logs for key variables (e.g., decisionValue, currentStageId) to trace execution.

Key Notes

  • Replace GUIDs: Update stage GUIDs in the JavaScript with actual values from your processstages table.
  • BPF Table: Use the correct logical name for your BPF instance table (e.g., accountprocessflows).
  • Option Set Values: Adjust 100000000 (Yes) and 100000001 (No) to match your field’s values.
  • Permissions: Ensure the user has read/write access to the accountprocessflows table.

Conclusion

Using JavaScript in a Dynamics 365 form, you can dynamically move BPF stages backward or forward, overcoming the designer’s limitations. This client-side approach provides real-time feedback and is ideal for interactive user experiences. Try it out, and share your feedback or questions in the comments!


Moving BPF Stages Backward in Dynamics 365 with Power Automate

A step-by-step guide to dynamically control BPF stages using Power Automate

Introduction

Business Process Flows (BPFs) in Microsoft Dynamics 365 and Dataverse guide users through predefined stages, but moving backward isn’t supported natively in the BPF designer or connectors. For example, moving from stage 2 to stage 1 based on a field value like "No" requires custom logic. Power Automate offers a low-code solution to achieve this by updating the activestageid field. In this post, we’ll show you how to create a flow that moves a BPF stage backward or forward based on a field value.

Why This Matters: Power Automate enables non-developers to implement complex BPF logic, making it accessible and maintainable without writing code.

The Requirement

For an account entity with a BPF named "Account Process Flow," we need to:

  • Monitor a choice field (new_decisionfield).
  • If "Yes," move from stage 2 to stage 3.
  • If "No," move back to stage 1.

Since the BPF designer and standard connectors only support forward movement, we’ll use Power Automate to update the BPF instance’s activestageid.

Step-by-Step Guide

Step 1: Gather Required Information

Before building the flow, collect these details:

  • BPF Stage IDs: Find the GUIDs for stage 1 and stage 3 in the processstages table using XrmToolBox or FetchXML. Example:
    • Stage 1: 12345678-1234-1234-1234-1234567890ab
    • Stage 3: 98765432-4321-4321-4321-9876543210ba
  • BPF Instance Table: Identify the logical name (e.g., accountprocessflows).
  • Field Values: Confirm the new_decisionfield OptionSet values (Yes = 100000000, No = 100000001).

Step 2: Set Up Power Automate

Create a flow in Power Automate:

  1. Create a Flow: Go to make.powerautomate.com and create a new automated cloud flow.
  2. Add Trigger: Use "When a row is added, modified, or deleted" (Dataverse connector):
    • Table name: Accounts
    • Change type: Modified
    • Filter attributes: new_decisionfield
  3. Add Condition: Check if new_decisionfield equals 100000000 (Yes) or 100000001 (No).
  4. Retrieve BPF Instance: Add a "List rows" action:
    • Table name: accountprocessflows
    • Filter rows: regardingobjectid eq @{triggerOutputs()?['body/accountid']}
    • Select columns: activestageid,accountprocessflowid
  5. Update BPF Stage:
    • Yes Branch: Add "Update a row" action:
      • Table name: accountprocessflows
      • Row ID: @{first(body('List_rows')?['value'])['accountprocessflowid']}
      • Active Stage: 98765432-4321-4321-4321-9876543210ba (Stage 3)
    • No Branch: Add "Update a row" action:
      • Table name: accountprocessflows
      • Row ID: @{first(body('List_rows')?['value'])['accountprocessflowid']}
      • Active Stage (Stages): processstages(12345678-1234-1234-1234-1234567890ab) (Stage 1)

Step 3: Sample Flow Configuration

Below is a JSON representation of the Power Automate flow for reference (you can import this as a starting point):

{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Condition": {
        "actions": {
          "Yes_Update_Stage_3": {
            "type": "ApiConnection",
            "inputs": {
              "host": {
                "connection": {
                  "name": "@parameters('$connections')['dataverse']['connectionId']"
                }
              },
              "method": "patch",
              "path": "/v9.2/accountprocessflows(@{first(body('List_BPF_Instance')?['value'])['accountprocessflowid']})",
              "body": {
                "activestageid": "98765432-4321-4321-4321-9876543210ba"
              }
            }
          }
        },
        "else": {
          "actions": {
            "No_Update_Stage_1": {
              "type": "ApiConnection",
              "inputs": {
                "host": {
                  "connection": {
                    "name": "@parameters('$connections')['dataverse']['connectionId']"
                  }
                },
                "method": "patch",
                "path": "/v9.2/accountprocessflows(@{first(body('List_BPF_Instance')?['value'])['accountprocessflowid']})",
                "body": {
                  "activestageid": "12345678-1234-1234-1234-1234567890ab"
                }
              }
            }
          }
        },
        "expression": {
          "and": [
            {
              "equals": [
                "@triggerOutputs()?['body/new_decisionfield']",
                100000000
              ]
            }
          ]
        },
        "type": "If"
      },
      "List_BPF_Instance": {
        "type": "ApiConnection",
        "inputs": {
          "host": {
            "connection": {
              "name": "@parameters('$connections')['dataverse']['connectionId']"
            }
          },
          "method": "get",
          "path": "/v9.2/accountprocessflows",
          "queries": {
            "$filter": "regardingobjectid eq @{triggerOutputs()?['body/accountid']}",
            "$select": "activestageid,accountprocessflowid"
          }
        }
      }
    },
    "triggers": {
      "When_a_row_is_modified": {
        "type": "ApiConnection",
        "inputs": {
          "host": {
            "connection": {
              "name": "@parameters('$connections')['dataverse']['connectionId']"
            }
          },
          "method": "get",
          "path": "/v9.2/rows",
          "queries": {
            "table": "accounts",
            "changeType": "Modified",
            "filterAttributes": "new_decisionfield"
          }
        }
      }
    }
  }
}

Step 4: Test the Flow

Save and test the flow by updating new_decisionfield on an account record. Verify the BPF stage moves to stage 3 (Yes) or stage 1 (No). Check the flow’s run history for errors.

Pro Tip: Add an error-handling step (e.g., send an email if no BPF instance is found) to make the flow more robust. Use the flow’s run history to debug issues.

Key Notes

  • Replace GUIDs: Update stage GUIDs in the flow with actual values from your processstages table.
  • BPF Table: Use the correct logical name for your BPF instance table (e.g., accountprocessflows).
  • Option Set Values: Adjust 100000000 (Yes) and 100000001 (No) to match your field’s values.
  • Permissions: Ensure the flow’s connection has permissions to read and update accountprocessflows records.

Conclusion

Power Automate provides a low-code way to move BPF stages backward in Dynamics 365, overcoming the designer’s forward-only limitation. This solution is ideal for non-developers and offers flexibility for dynamic process control. Try it out, and share your feedback or questions in the comments!

© 2025 Your Name. All rights reserved.

Moving Business Process Flow Stages Backward in Dynamics 365 with a Plugin

A step-by-step guide to programmatically move BPF stages backward in Dataverse using a C# plugin

Introduction

In Microsoft Dynamics 365 and Dataverse, Business Process Flows (BPFs) guide users through predefined stages, typically moving forward. But what if you need to move backward? For example, from stage 2 to stage 1 based on a field value like "No"? The BPF designer doesn’t support backward movement natively, but a server-side plugin can make it happen. In this post, we’ll walk through how to create a plugin that moves a BPF stage backward when a condition is met, complete with code and setup instructions.

Why This Matters: Moving BPF stages backward allows dynamic process control, enabling scenarios like revisiting earlier steps based on user input, enhancing flexibility in workflows.

The Requirement

Imagine you’re working with an account entity and a BPF called "Account Process Flow." At stage 2, a choice field (new_decisionfield) determines the next step:

  • If the value is "Yes," move to stage 3.
  • If the value is "No," move back to stage 1.

Since connectors in Power Automate and the BPF designer only support forward movement, we’ll use a plugin to update the activestageid field of the BPF instance.

Step-by-Step Guide

Step 1: Gather Required Information

Before coding, collect these details:

  • BPF Stage IDs: Find the GUIDs for stage 1 and stage 3 in the processstages table using XrmToolBox or FetchXML. Example:
    • Stage 1: 12345678-1234-1234-1234-1234567890ab
    • Stage 3: 98765432-4321-4321-4321-9876543210ba
  • BPF Instance Table: Identify the logical name (e.g., accountprocessflow).
  • Field Name: Confirm the logical name of the choice field (e.g., new_decisionfield) and its option set values (e.g., Yes = 100000000, No = 100000001).
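
As the bullet above mentions, FetchXML can retrieve the stage GUIDs. A sketch (the processstage-to-workflow link shown here is the standard one, but verify the names in your environment):

<fetch>
  <entity name="processstage">
    <attribute name="stagename" />
    <attribute name="processstageid" />
    <order attribute="stagename" />
    <link-entity name="workflow" from="workflowid" to="processid">
      <filter>
        <condition attribute="name" operator="eq" value="Account Process Flow" />
      </filter>
    </link-entity>
  </entity>
</fetch>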

Step 2: Set Up the Development Environment

You’ll need:

  • Visual Studio (2019 or 2022).
  • Microsoft Power Platform CLI or Dynamics 365 SDK for Microsoft.Xrm.Sdk.
  • Plugin Registration Tool (via XrmToolBox or Power Platform CLI).

Create a Class Library (.NET Framework 4.6.2 or 4.7.2) and add the Microsoft.CrmSdk.CoreAssemblies NuGet package:

Install-Package Microsoft.CrmSdk.CoreAssemblies

Step 3: Write the Plugin

The plugin triggers on the Update message for the account entity, checks the new_decisionfield value, and updates the BPF’s activestageid. Here’s the code:

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using System;

namespace BPFStageMover
{
    public class MoveBPFStage : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            // Obtain the execution context
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);
            ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            try
            {
                // Check if the event is an update on the account entity
                if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)
                {
                    Entity target = (Entity)context.InputParameters["Target"];

                    // Check if new_decisionfield is in the target entity
                    if (!target.Contains("new_decisionfield"))
                    {
                        tracingService.Trace("new_decisionfield not updated, exiting plugin.");
                        return;
                    }

                    // Get the decision field value (OptionSetValue)
                    OptionSetValue decisionValue = target.GetAttributeValue<OptionSetValue>("new_decisionfield");
                    if (decisionValue == null)
                    {
                        tracingService.Trace("new_decisionfield is null, exiting plugin.");
                        return;
                    }

                    // Define stage GUIDs (replace with actual stage IDs from processstages table)
                    Guid stage1Id = new Guid("12345678-1234-1234-1234-1234567890ab"); // Stage 1 GUID
                    Guid stage3Id = new Guid("98765432-4321-4321-4321-9876543210ba"); // Stage 3 GUID

                    // Get the BPF instance for the account
                    Guid accountId = context.PrimaryEntityId;
                    QueryExpression query = new QueryExpression("accountprocessflow")
                    {
                        ColumnSet = new ColumnSet("activestageid"),
                        Criteria = new FilterExpression
                        {
                            Conditions =
                            {
                                new ConditionExpression("regardingobjectid", ConditionOperator.Equal, accountId)
                            }
                        }
                    };

                    EntityCollection bpfInstances = service.RetrieveMultiple(query);
                    if (bpfInstances.Entities.Count == 0)
                    {
                        tracingService.Trace("No BPF instance found for account {0}.", accountId);
                        return;
                    }

                    Entity bpfInstance = bpfInstances.Entities[0];
                    Guid currentStageId = bpfInstance.GetAttributeValue<EntityReference>("activestageid")?.Id ?? Guid.Empty;

                    // Only proceed if the current stage is stage 2 (optional: add stage 2 GUID check)
                    // Update activestageid based on decision_field value
                    Guid newStageId = decisionValue.Value == 100000000 ? stage3Id : stage1Id; // 100000000 = Yes, 100000001 = No

                    if (currentStageId != newStageId)
                    {
                        Entity bpfUpdate = new Entity("accountprocessflow")
                        {
                            Id = bpfInstance.Id,
                            ["activestageid"] = new EntityReference("processstage", newStageId)
                        };
                        service.Update(bpfUpdate);
                        tracingService.Trace("BPF stage updated to {0} for account {1}.", newStageId, accountId);
                    }
                    else
                    {
                        tracingService.Trace("No stage change needed.");
                    }
                }
            }
            catch (Exception ex)
            {
                tracingService.Trace("Error in MoveBPFStage plugin: {0}", ex.Message);
                throw new InvalidPluginExecutionException($"Plugin error: {ex.Message}", ex);
            }
        }
    }
}

Step 4: Build and Register the Plugin

  1. Build: Compile the solution in Visual Studio to generate the DLL (e.g., BPFStageMover.dll).
  2. Register: Use the Plugin Registration Tool to:
    • Upload the DLL and select the MoveBPFStage class.
    • Register a step:
      • Message: Update
      • Primary Entity: account
      • Filtering Attributes: new_decisionfield
      • Stage: PostOperation, Synchronous

Step 5: Test the Plugin

Update the new_decisionfield on an account record and verify the BPF stage moves to stage 3 (Yes) or stage 1 (No). Check trace logs in Dataverse if issues arise.

Pro Tip: Enable plugin tracing in Dataverse to debug issues. Use XrmToolBox to view logs in the Plugin Trace Log table.

Key Notes

  • Replace GUIDs: Update the stage GUIDs in the code with actual values from your processstages table.
  • BPF Table: Use the correct logical name for your BPF instance table (e.g., accountprocessflow).
  • Option Set Values: Adjust 100000000 (Yes) and 100000001 (No) to match your field’s values.
  • Security: Ensure the plugin runs under a user with permissions to update BPF instances.

Conclusion

By using a plugin to update the activestageid, you can overcome the BPF designer’s forward-only limitation, enabling dynamic stage movement in Dynamics 365. This solution is robust, scalable, and perfect for scenarios requiring flexible process control. Try it out, and let us know in the comments if you have questions or need further customization!


Tuesday, June 24, 2025

Dynamics 365 / PowerApps Dataverse’s Web API

Leverage Dynamics 365 / PowerApps Dataverse’s RESTful interface for seamless integration, automation, and customization.

The Dataverse Web API is your key to unlocking the full potential of Microsoft Dynamics 365 and PowerApps. Built on the OData v4 protocol, this RESTful API lets you create, read, update, and delete data with ease. Whether you’re a seasoned developer, a low-code enthusiast, or an integration pro, this guide will walk you through everything you need to know about the Web API—complete with practical examples and real-world use cases. Let’s dive in!

What is the Dataverse Web API?

The Dataverse Web API is a RESTful service powered by the OData v4 protocol. It allows you to interact with Dataverse data and metadata using standard HTTP methods through endpoints like https://yourorg.crm.dynamics.com/api/data/v9.2/. From creating records to executing custom business logic, it’s a versatile tool for any data-driven solution.

Why OData v4?

The OData v4 protocol is a standardized way to build and consume data APIs. It offers:

  • RESTful Design: Uses HTTP verbs (GET, POST, PATCH, DELETE) and JSON for lightweight data exchange.
  • Rich Queries: Supports powerful query options like $select, $filter, and $expand.
  • Metadata: Exposes schema details via $metadata, enabling dynamic discovery.
  • Custom Actions: Allows invoking business logic beyond basic CRUD operations.

Pro Tip

Use the $metadata endpoint (/api/data/v9.2/$metadata) to explore your Dataverse schema and understand available entities and relationships.

Where Can You Use the Web API?

The Web API’s flexibility makes it a go-to tool across scenarios:

  • Server-Side Apps: Integrate with C#, Python, or Node.js for backend solutions.
  • Client-Side Scripts: Enhance Dynamics 365 forms with JavaScript.
  • Low-Code Platforms: Automate processes in Power Automate.
  • Integration Tools: Connect with Azure Data Factory, SSIS, or custom apps.
  • External Systems: Any HTTP-capable tool (e.g., Postman) can interact with Dataverse.

How to Use the Web API: Scenarios and Techniques

Let’s explore how the Web API fits into different development contexts, leveraging OData v4’s power.

1. Plugins

Plugins are .NET code that run server-side in response to Dataverse events. While the IOrganizationService SDK is preferred, the Web API can be used for specific operations.

  • When to Use: For custom actions or functions only available via the Web API.
  • How: Use HttpClient with an Azure AD token.
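
A minimal sketch of the pattern, assuming accessToken was already acquired (e.g., via MSAL, not shown); the built-in unbound WhoAmI function is used here as a harmless connectivity check:

using System;
using System.Net.Http;
using System.Net.Http.Headers;

// Sketch: call the Dataverse Web API with an already-acquired Azure AD token.
var client = new HttpClient { BaseAddress = new Uri("https://yourorg.crm.dynamics.com/api/data/v9.2/") };
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
client.DefaultRequestHeaders.Add("OData-Version", "4.0");
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

// WhoAmI is a built-in unbound function; if this succeeds, auth is working.
HttpResponseMessage response = await client.GetAsync("WhoAmI");
response.EnsureSuccessStatusCode();
Console.WriteLine(await response.Content.ReadAsStringAsync());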

Pro Tip

Avoid Web API calls in plugins unless necessary, as IOrganizationService is faster and more efficient.

2. Workflows

Classic workflows can include custom .NET activities that call the Web API, though this is rare.

  • When to Use: For niche scenarios not covered by built-in steps.
  • How: Similar to plugins, use HttpClient.

3. Power Automate

Power Automate simplifies Web API interactions with Dataverse connectors.

  • Standard Way: Use actions like “List rows” (OData queries behind the scenes).
  • Custom Way: Use the HTTP action for direct OData v4 queries.

Pro Tip

Use the “Perform an unbound action” step for custom actions to keep flows clean and maintainable.

4. JavaScript in Model-Driven Apps

JavaScript enhances Dynamics 365 forms using the Xrm.WebApi object, which wraps OData v4 calls.

  • Use Cases: Fetch data dynamically or update records.
  • How: Use Xrm.WebApi.retrieveMultipleRecords for queries.

5. PowerFX in Canvas Apps

PowerFX indirectly uses the Web API via Dataverse connectors.

  • How It Works: Functions like Patch or Filter translate to OData queries.
  • Custom Needs: Use Power Automate for direct API calls.

6. External Tools

Integration tools leverage OData v4 for seamless connectivity:

  • Azure Data Factory: Use the Dataverse connector for ETL.
  • SSIS: Call the Web API via script tasks or use the TDS endpoint.
  • Other Tools: Postman or PowerShell with Azure AD authentication.

Pro Tip

Test API calls in Postman to prototype OData queries before implementing them in code.

Calling the Web API: The OData v4 Way

To use the Web API, you need:

  1. Endpoint: https://yourorg.crm.dynamics.com/api/data/v9.2/.
  2. Authentication: Azure AD token in the Authorization: Bearer header.
  3. HTTP Methods:
    • GET: Retrieve data.
    • POST: Create records or call actions.
    • PATCH: Update records.
    • DELETE: Delete records.
  4. OData Queries: Use $select, $filter, $expand, etc.

Code Examples

C# (Server-Side): Retrieve Accounts

Fetch account names and revenue with OData v4:

using System.Net.Http;
using System.Net.Http.Headers;
using Newtonsoft.Json.Linq;

var client = new HttpClient();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
var response = await client.GetAsync("https://yourorg.crm.dynamics.com/api/data/v9.2/accounts?$select=name,revenue");
if (response.IsSuccessStatusCode)
{
    var accounts = JObject.Parse(await response.Content.ReadAsStringAsync());
    foreach (var account in accounts["value"])
    {
        Console.WriteLine($"{account["name"]}: {account["revenue"]}");
    }
}

JavaScript (Model-Driven App): Fetch Top 3 Accounts

Use Xrm.WebApi for a simple OData query:

Xrm.WebApi.retrieveMultipleRecords("account", "?$select=name&$top=3").then(
    function success(result) {
        console.log(result);
    },
    function (error) {
        console.log(error.message);
    }
);

Power Automate: Call a Custom Action

Use the “Perform an unbound action” step with parameters like orderId.

OData Query Example: Retrieve Active Accounts

Here’s an OData v4 query and its response:

GET /api/data/v9.2/accounts?$select=name&$filter=statecode eq 0&$expand=primarycontactid($select=fullname)&$top=5

{
  "@odata.context": "...",
  "value": [
    {
      "name": "Contoso Inc.",
      "primarycontactid": {
        "fullname": "John Doe"
      }
    },
    {
      "name": "Fabrikam Ltd.",
      "primarycontactid": {
        "fullname": "Jane Smith"
      }
    }
  ]
}

Real-World Use Cases

The Web API shines in practical scenarios:

  1. E-Commerce Sync: A retailer uses POST requests to /salesorders to sync online orders, with $expand for customer details.
  2. Customer Portal: A web app updates profiles via PATCH to /contacts(guid).
  3. Reporting: Query Dataverse with OData filters for Power BI dashboards.
  4. Data Migration: Use Azure Data Factory to load legacy data into Dataverse.

Additional Resources

OData in Dynamics 365: How to Get the Data You Need, Fast

If you’ve worked with Microsoft Dynamics 365 (or Power Platform and Dataverse), you’ve likely encountered OData queries—used in Power Automate, Canvas Apps, JavaScript code, Power BI, and beyond. But what exactly is OData? How does it work? And how can you use it to build powerful, efficient solutions in your business apps?


1. What Is OData?

OData (Open Data Protocol) is a standardized REST API protocol originally developed by Microsoft and now maintained as an OASIS standard. It allows for querying and manipulating data using simple URLs and standard HTTP verbs (GET, POST, PATCH, DELETE). Dynamics 365 and Dataverse use OData v4 through their Web API.

https://<yourorg>.crm.dynamics.com/api/data/v9.2/

Key Features of OData:

  • RESTful architecture based on HTTP
  • Simple URL-based query syntax
  • Supports metadata discovery via $metadata endpoint
  • Returns data in JSON (the Dataverse Web API returns JSON only; Atom XML was an option in earlier OData versions)

2. Under the Hood

OData is built on:

  • HTTP Methods: For CRUD operations
  • URI query options: $filter, $select, $expand, $orderby, $top, $skip
  • Entity Data Model (EDM) via /api/data/v9.2/$metadata
  • JSON payloads for data exchange

Supported HTTP Verbs:

  • GET – Retrieve data
  • POST – Create new record
  • PATCH – Update existing record
  • DELETE – Delete record
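
For instance (illustrative paths and payloads; note that lookups are set via @odata.bind, and If-Match: * on PATCH prevents an accidental create-as-upsert):

// Create an account
POST /api/data/v9.2/accounts
Content-Type: application/json

{ "name": "Contoso Inc." }

// Update one column; fails instead of creating if the record is missing
PATCH /api/data/v9.2/accounts(00000000-0000-0000-0000-000000000001)
Content-Type: application/json
If-Match: *

{ "telephone1": "555-0100" }

// Delete the record
DELETE /api/data/v9.2/accounts(00000000-0000-0000-0000-000000000001)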

3. How OData is Used in Dynamics 365

  • Power Automate – Filter records using “List Rows” with OData filters
  • Canvas Apps – Filter galleries and data tables using OData-based functions
  • JavaScript on Model-driven Apps – Make client-side API calls using OData Xrm.WebApi & fetch()
  • Power BI – Load Dataverse tables via OData feeds
  • Custom Integration – Use OData endpoints from external systems (e.g., Azure Functions, Node.js)

4. Common OData Queries


// Filter active accounts
/accounts?$filter=statecode eq 0

// Select specific columns
/accounts?$select=name,accountnumber

// Sort by creation date
/accounts?$orderby=createdon desc

// Top 10 records
/accounts?$top=10

// Date-based filter
/accounts?$filter=createdon ge 2025-05-01T00:00:00Z

// Text search
/contacts?$filter=contains(fullname,'John')

// Lookup filter
/contacts?$filter=_parentcustomerid_value eq 'GUID'

// Expand lookup (parent contact)
/accounts?$expand=primarycontactid($select=fullname,emailaddress1)

5. Common Mistakes

  • Use eq not ==; and not &&
  • Enclose strings in single quotes: name eq 'Contoso'
  • Using unsupported functions like substring
  • Incorrect lookup schema name, or a missing _value suffix for lookups (e.g., _parentcustomerid_value)
  • Unsupported nested $expand (only one level is supported)
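
To illustrate the lookup rule above, here is a tiny hypothetical helper that produces a correctly shaped lookup filter clause (the underscore prefix and _value suffix are what Dataverse expects):

```javascript
// Build an OData $filter clause for a Dataverse lookup column.
// Lookup columns are filtered via the computed _<name>_value attribute.
function lookupFilter(logicalName, guid) {
  return "_" + logicalName + "_value eq " + guid;
}

console.log(lookupFilter("parentcustomerid", "11111111-2222-3333-4444-555555555555"));
// → _parentcustomerid_value eq 11111111-2222-3333-4444-555555555555
```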

6. Debugging Tips

  • Inspect /api/data/v9.2/$metadata for correct names
  • Test queries in Postman or browser
  • Use Dev Tools → Network for fetch/Xrm.WebApi calls
  • Read error messages—they often explain the issue

7. JavaScript Example: Basic Account + Contact Expand


// Fetch Account with Primary Contact info
var accountId = "00000000-0000-0000-0000-000000000000";

var reqUrl = Xrm.Utility.getGlobalContext().getClientUrl() +
  "/api/data/v9.2/accounts(" + accountId + ")" +
  "?$select=name,accountnumber" +
  "&$expand=primarycontactid($select=fullname,emailaddress1,jobtitle)";

fetch(reqUrl, {
  method: "GET",
  headers: {
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
    "Content-Type": "application/json; charset=utf-8",
    "Prefer": "odata.include-annotations=*"
  }
})
.then(response => response.json())
.then(data => {
  console.log("Account Name:", data.name);
  console.log(data.primarycontactid
    ? "Contact: " + data.primarycontactid.fullname
    : "No Contact");
})
.catch(console.error);

Another example - Retrieve Contacts with their parent Account names using $expand=parentcustomerid_account:


// Retrieve Contacts with Parent Account Name
Xrm.WebApi.retrieveMultipleRecords(
  "contact",
  "$select=fullname,emailaddress1&$expand=parentcustomerid_account($select=name)"
).then(
  function success(result) {
    result.entities.forEach(function (contact) {
      console.log(
        `Contact: ${contact.fullname}, ` +
        `Email: ${contact.emailaddress1}, ` +
        `Account: ${contact.parentcustomerid_account?.name || "No Account"}`
      );
    });
  },
  function (error) {
    console.error("Error retrieving contacts:", error.message);
  }
);

8. Power Automate Example: “List Rows” with OData


// Filter = only active contacts from last 30 days
statuscode eq 1 and createdon ge 2025-05-25T00:00:00Z

// Select columns
fullname,emailaddress1

// Order and limit results
Order by: createdon desc
Top Count: 50

Configure the Dataverse “List rows” step with these fields. This approach minimizes data transfer and maximizes performance.


Empower your Dynamics 365 solutions with efficient, accurate OData queries—whether you’re building automations, apps, or APIs. Happy querying!

Saturday, June 21, 2025

Building a Visitor Check-In App with Microsoft Power Platform: A Hands-On Success

From Paper Logs to Power Platform: Building a Visitor Check-In App in 60 minutes


Welcome, readers! I recently led a 60-minute hands-on session where I built a Visitor Pre-Registration and Check-In System on Microsoft Power Platform using only standard Microsoft 365 licensing components: Microsoft Forms, SharePoint, Power Automate, and a Power Apps Canvas App. The energy in the room was electric, and by the end I had a fully functional app that amazed everyone. In this blog post, I’ll walk you through every step of the process, complete with detailed instructions and all the code used, so you can build your own version of this app.

Why Standard Licensing Makes This Special

Since I’m working within the constraints of standard Microsoft 365 licensing, I used only components whose connectors are included in standard licensing. This means no premium connectors or additional costs are required, making this solution accessible to any organization with a Microsoft 365 subscription.

This approach not only saves money but also democratizes access to powerful automation and app-building tools!

The Problem: Visitor Management Challenges

Many organizations struggle with inefficient visitor management—paper logs, Excel spreadsheets, or manually updated SharePoint lists. These methods lack automation and an intuitive interface. My goal was to create a modern solution using Power Platform’s low-code tools, and I succeeded!

The Vision: A Streamlined System

I aimed to build a system where visitors pre-register via a form, staff manage check-ins through an app, and managers access insights via a dashboard. Using Microsoft Forms, SharePoint, Power Automate, and Power Apps, I created a solution that automates pre-registration, notifies hosts, prints badges, and tracks visitor activity—all within standard licensing.

Technical Deep Dive: Step-by-Step Guide

Let’s dive into the details. Below, I’ll break down each component with in-depth instructions and the exact code used.

1. Setting Up the SharePoint List

The SharePoint list "Visitor Records" is the foundation of my app, storing all visitor data securely.

Detailed Steps to Create the SharePoint List:

  1. Navigate to your SharePoint site (e.g., https://yourdomain.sharepoint.com/sites/yoursite).
  2. In the left navigation, click Site contents.
  3. Click + New and select List.
  4. In the "Name" field, enter Visitor Records, then click Create.
  5. Once created, click + Add column to define the following columns:
    • Visitor Name: Select "Single line of text," set as Required.
    • Email: Select "Single line of text," set as Required.
    • Meeting Purpose: Select "Choice," add options: "Interview," "Review," "Other." Set default to "Interview."
    • Host Employee: Select "Person or Group," allow only "People," set as Required.
    • Expected Arrival Date: Select "Date and time," include date only, set as Required.
    • Expected Arrival Time: Select "Choice," add options: "9:00 AM," "10:00 AM," "11:00 AM," etc. Set default to "9:00 AM."
    • Check-In Status: Select "Choice," add options: "Pending," "Checked In," "Checked Out." Set default to "Pending."
    • Check-In Time: Select "Date and time," include date and time.
    • HostEmail: Select "Single line of text."
  6. Click Save after adding each column.

Tip: Ensure column names match exactly as listed, as they’ll be referenced later.

SharePoint List Setup

2. Microsoft Forms & Power Automate: Pre-Registration Automation

I used Microsoft Forms for visitor pre-registration and Power Automate to sync form responses to SharePoint and send confirmation emails.

Detailed Steps to Create the Form:

  1. Go to forms.office.com and sign in.
  2. Click New Form.
  3. Set the title to Visitor Registration.
  4. Add the following questions:
    • Visitor Name: Click "+ Add new," select "Text," mark as Required.
    • Email: Add "Text," mark as Required, enable "Restrict to email addresses."
    • Meeting Purpose: Add "Choice," enter options: "Interview," "Review," "Other." Mark as Required.
    • Host Employee’s Email: Add "Text," mark as Required, enable "Restrict to email addresses."
    • Expected Arrival Date: Add "Date," mark as Required.
    • Expected Arrival Time: Add "Choice," enter options: "9:00 AM," "10:00 AM," "11:00 AM," etc. Mark as Required.
  5. Click Share, copy the link, and save the form.

Detailed Steps to Create the Power Automate Flow:

  1. Go to powerautomate.microsoft.com.
  2. Click Create > Automated cloud flow.
  3. Name it Form to List, search for "When a new response is submitted," select the Microsoft Forms trigger, and click Create.
  4. In the trigger, select "Visitor Registration" from the Form Id dropdown.
  5. Click + New step, search for "Get response details" (Microsoft Forms), and configure:
    • Form Id: Visitor Registration
    • Response Id: Select dynamic content Response Id from the trigger.
  6. Add a step: Search for "Get user profile (V2)" (Office 365 Users), set User (UPN) to dynamic Host Employee’s Email from the form response.
  7. Add a step: Search for "Create item" (SharePoint), configure:
    • Site Address: Enter your SharePoint site URL (e.g., https://yourdomain.sharepoint.com/sites/yoursite)
    • List Name: Visitor Records
    • Map fields using dynamic content:
      • Visitor Name: Visitor Name
      • Email: Email
      • Meeting Purpose: Meeting Purpose
      • Host Employee: Mail from "Get user profile"
      • Expected Arrival Date: Expected Arrival Date
      • Expected Arrival Time: Expected Arrival Time
      • Check-In Status: Type Pending
      • HostEmail: Mail from "Get user profile"
  8. Add a step: Search for "Send an email (V2)" (Office 365 Outlook), configure:
    • To: Dynamic Email from form response
    • Subject: Visitor Pre-Registration Confirmation
    • Body (use HTML editor):
      Dear [Dynamic "Visitor Name"],
      Thank you for pre-registering for your visit. Here are the details:
      Meeting Purpose: [Dynamic "Meeting Purpose"]
      Host: [Dynamic "DisplayName" from "Get user profile"]
      Expected Arrival Date and Time: [Dynamic "Expected Arrival Date"] at [Dynamic "Expected Arrival Time"]
      Please arrive at the front desk upon arrival.
      Best regards,
      [My Company Name]
  9. Click Save and test by submitting a form response.
Microsoft Forms Integration

3. Building the Canvas App

The Canvas App provides an interactive interface for staff to manage visitors. Here’s how I built it.

Detailed Steps to Create the Canvas App:

  1. Go to make.powerapps.com.
  2. Click Create > Canvas app, select "Tablet layout."
  3. Name it Visitor Check-In and click Create.
  4. In the left pane, click Data, then Add data, select SharePoint, connect to your site, and add the Visitor Records list.
  5. Rename the default screen to scrTodayVisitors (double-click Screen1 in the Tree view).
  6. Add a Gallery:
    • Click Insert > Gallery > Vertical.
    • Set Data Source to Visitor Records.
    • Set Items property:
      Filter(
          'Visitor Records',
          DateValue(
              Text(
                  'Expected Arrival Date',
                  "[$-en-US]yyyy-mm-dd"
              )
          ) = Today(),
          If(
              drpHostFilter.Selected.Value = "All",
              true,
              Host.DisplayName = drpHostFilter.Selected.Value
          )
      )
    • Set OnSelect: UpdateContext({conSelectedId: ThisItem.ID})
    • Set DisplayMode:
      If(
          frmVisitorDetails.Mode = FormMode.View,
          DisplayMode.Edit,
          DisplayMode.Disabled
      )
    • Customize fields: Click "Edit" next to Fields, add "Visitor Name," "Host Employee," "Expected Arrival Time," "Check-In Status."
  7. Add a Dropdown (drpHostFilter):
    • Click Insert > Dropdown, name it drpHostFilter.
    • Set Default: "All"
    • Set Items:
      Ungroup(
          Table(
              { Value: Table({ Value: "All" }) },
              { Value: Sort(Distinct('Visitor Records', Host.DisplayName), Value, SortOrder.Ascending) }
          ),
          "Value"
      )
  8. Add a Form (frmVisitorDetails):
    • Click Insert > Form > Edit Form, name it frmVisitorDetails.
    • Set DataSource: Visitor Records
    • Set DefaultMode: FormMode.View
    • Set Item: LookUp('Visitor Records', ID = conSelectedId)
    • Set OnSuccess:
      ViewForm(frmVisitorDetails);
      Notify(
          "You have updated the record!",
          NotificationType.Success,
          2000
      );
      UpdateContext({conSelectedId: Self.LastSubmit.ID})
    • Add fields: Click "Edit fields," add all columns from the list.
  9. Add Icons:
    • Edit (icoEdit): Insert > Icon > Edit, set OnSelect: EditForm(frmVisitorDetails), DisplayMode:
      If(
          And(
              frmVisitorDetails.Mode = FormMode.View,
              Not(IsBlank(conSelectedId))
          ),
          DisplayMode.Edit,
          DisplayMode.Disabled
      )
    • Save (icoSave): Insert > Icon > Save, set OnSelect: SubmitForm(frmVisitorDetails), DisplayMode:
      If(
          And(
              frmVisitorDetails.Mode = FormMode.Edit,
              Not(IsBlank(conSelectedId))
          ),
          DisplayMode.Edit,
          DisplayMode.Disabled
      )
    • Cancel (icoCancel): Insert > Icon > Cancel, set OnSelect: ResetForm(frmVisitorDetails), DisplayMode:
      If(
          frmVisitorDetails.Mode in [
              FormMode.Edit,
              FormMode.New
          ],
          DisplayMode.Edit,
          DisplayMode.Disabled
      )
    • Print (icoPrint): Insert > Icon > Print, set OnSelect: Print(), DisplayMode:
      If(
          And(
              frmVisitorDetails.Mode = FormMode.View,
              Not(IsBlank(conSelectedId))
          ),
          DisplayMode.Edit,
          DisplayMode.Disabled
      )

    I chose to use icons over buttons primarily because of space optimization and visual clarity. Icons are compact and blend well into tight UI layouts like galleries, forms, or headers without overwhelming the screen. They also allow for a more modern and intuitive design—users can instantly recognize actions like edit, delete, print, or save just by the visual cue. While buttons are great for more prominent actions, icons are ideal for contextual or secondary actions where space and clean design are priorities.

  10. Save and publish the app.
Canvas App Interface

4. Print Badge Functionality

I added a print feature to generate visitor badges.

Detailed Steps for Print Badge:

  1. Click Insert > New Screen, name it scrPrintScreen.
  2. Add a Display form for the badge layout and set its DataSource to Visitor Records.
  3. Set the form's Item: galTodayVisitors.Selected
  4. Add labels for Visitor Name, Host Employee, Expected Arrival Date/Time (e.g., galTodayVisitors.Selected.'Visitor Name').
  5. Update icoPrint OnSelect:
    Set(varOriginScreen, scrTodayVisitors);
    Navigate(scrPrintScreen);
  6. On scrPrintScreen, add a Print button, set OnSelect:
    Print();
    Navigate(varOriginScreen);
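
  As a sketch, the badge labels from step 4 can use Text properties like the following. The control and column names follow the steps above; accessing the host's DisplayName assumes "Host Employee" is a person column, and the date format string is an assumption you can adjust to your locale:

    "Visitor: " & galTodayVisitors.Selected.'Visitor Name'
    "Host: " & galTodayVisitors.Selected.'Host Employee'.DisplayName
    "Arrival: " & Text(galTodayVisitors.Selected.'Expected Arrival Date', "dd mmm yyyy") &
        " at " & galTodayVisitors.Selected.'Expected Arrival Time'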
Print Badge Functionality

5. Power Automate Flows: Check-In and Check-Out

I created two flows to automate check-in and check-out notifications.

Detailed Steps for Check-In Flow:

  1. Go to Power Automate, click Create > Automated cloud flow.
  2. Name it Visitor Check-In - Update List & Notify Host, select "When an item is created or modified" (SharePoint), click Create.
  3. Configure trigger:
    • Site Address: Your SharePoint site URL
    • List Name: Visitor Records
  4. Add a "Condition": Check-In Status is equal to Checked In.
  5. In "If yes," add "Send an email (V2)":
    • To: Dynamic HostEmail
    • Subject: Visitor Check-In Notification
    • Body:
      Your visitor [Dynamic "Visitor Name"] has checked in at [Dynamic "Check-In Time"].
      Meeting Purpose: [Dynamic "Meeting Purpose"]
  6. Save and test.

Detailed Steps for Check-Out Flow:

  1. Create a new flow named Visitor Check-Out - Update List & Thank You Email.
  2. Use the same SharePoint trigger.
  3. Add a "Condition": Check-In Status is equal to Checked Out.
  4. In "If yes," add "Update item" (SharePoint):
    • Site Address and List Name as above
    • ID: Dynamic ID
    • Check-Out Time: utcNow()
  5. Add "Send an email (V2)":
    • To: Dynamic Email
    • Subject: Thank You for Visiting
    • Body:
      Dear [Dynamic "Visitor Name"],
      Thank you for visiting! You checked out at [Dynamic "Check-Out Time"].
      Best regards,
      [My Company Name]
  6. Save and test.
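
Note that both flows use the "When an item is created or modified" trigger, and the check-out flow updates the very item that fired it, so it can re-trigger itself in a loop (and the check-in flow can re-send emails on unrelated edits). A trigger condition (flow Settings > Trigger conditions) keeps the flow from firing unless the status matches. A sketch for the check-out flow, where the internal column name CheckInStatus is an assumption — check your list's actual internal name:

    @equals(triggerOutputs()?['body/CheckInStatus/Value'], 'Checked Out')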
Power Automate Flow

6. Dashboard & Reporting

I added a dashboard for insights.

Detailed Steps for Dashboard:

  1. In the Canvas App, add a new screen named scrDashboard.
  2. Column Chart: Insert > Chart > Column chart, set Items (GroupBy alone doesn't produce a numeric count, so wrap it in AddColumns):
    AddColumns(
        GroupBy('Visitor Records', "Meeting Purpose", "Grouped"),
        "Count", CountRows(Grouped)
    )
  3. Gallery: Add a gallery, set Items: 'Visitor Records', display Check-In Time, Check-Out Time.
  4. Late Arrival Label: Add a label, set Text:
    "Late Arrivals: " & CountIf(
        'Visitor Records',
        'Check-In Time' >
            'Expected Arrival Date' + Time(
                Value(Left('Expected Arrival Time', 2)),
                Value(Mid('Expected Arrival Time', 4, 2)),
                0
            )
    )
Dashboard and Reporting

Conclusion

This Visitor Check-In App demonstrates the power of Microsoft Power Platform, all built within standard Microsoft 365 licensing for zero additional cost. With these detailed steps and code, you can build your own app. Try it out and let me know how it goes!
