Friday, January 24, 2025

Use ClearCollect and Bind Data into a Gallery in PowerApps Canvas Apps

How to Use ClearCollect and Bind Data into a Gallery in PowerApps Canvas Apps: Filter Employees by Age Greater Than 30

Today, someone from one of the WhatsApp groups I am a member of asked for help:

Hello everyone,

I have a scenario where I am fetching records from Dataverse and displaying them in a horizontal gallery control. I need to display certain columns of the Dataverse records in the gallery, but only based on specific conditions. Could anyone guide me on how to achieve this?

Thank you in advance!

In this blog, we will explore how to use the ClearCollect function and bind data into a Gallery, leveraging SharePoint as the data source. Our goal is to filter employees whose age is greater than 30 and display the results in a Gallery.

Step 1: Set Up the SharePoint Data Source

Create a SharePoint List:

Navigate to your SharePoint site and create a list named Employees.

Add the following columns:

  • Title (Single line of text) - Represents the employee's name.
  • Age (Number) - Represents the employee's age.

Populate the List:

Add some sample data:

Title         Age
John Smith    28
Mary Jones    35
Alice Brown   40
David White   29
Sarah Green   32

Step 2: Connect the Data Source in PowerApps

Open PowerApps Studio and create a new Canvas App.

  1. Click on Data in the left-hand menu.
  2. Select + Add Data and search for SharePoint.
  3. Connect to your SharePoint site and select the Employees list.

Step 3: Add a Button to Load the Filtered Data

  1. Insert a button into the Canvas App:
    • Go to the Insert menu and select Button.
    • Set the button's Text property to "Filter Employees".
  2. Set the button's OnSelect property to the following formula:
    ClearCollect(
        FilteredEmployees,
        Filter(Employees, Age > 30)
    )
    Explanation:
    • ClearCollect: Clears any existing data in the collection and adds the new filtered data.
    • FilteredEmployees: Name of the collection to store filtered employee data.
    • Filter(Employees, Age > 30): Filters the Employees list where the Age column is greater than 30.
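
The collection-and-filter semantics can be sketched in plain JavaScript (the sample rows and function names below are illustrative, not part of Power Fx):

```javascript
// Hypothetical sample data mirroring the SharePoint Employees list above.
const employees = [
  { Title: "John Smith", Age: 28 },
  { Title: "Mary Jones", Age: 35 },
  { Title: "Alice Brown", Age: 40 },
  { Title: "David White", Age: 29 },
  { Title: "Sarah Green", Age: 32 }
];

// Equivalent of Filter(Employees, Age > 30): keep rows whose Age exceeds the threshold.
function filterByAge(rows, minAge) {
  return rows.filter(row => row.Age > minAge);
}

// Equivalent of ClearCollect(FilteredEmployees, ...): discard the old
// collection contents and replace them with the filtered rows.
let filteredEmployees = [];
function clearCollect(rows) {
  filteredEmployees = [...rows];
  return filteredEmployees;
}

clearCollect(filterByAge(employees, 30));
```

After the call, filteredEmployees holds Mary Jones, Alice Brown, and Sarah Green, which is exactly what the Gallery will display.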

Step 4: Add a Gallery to Display the Filtered Data

Insert a Gallery:

  1. Go to the Insert menu and select Gallery.
  2. Choose a vertical layout.

Bind the Gallery to the FilteredEmployees Collection:

  1. Select the Gallery.
  2. Set its Items property to:
    FilteredEmployees

Customize the Gallery:

  1. Click on the first item in the Gallery to select the template.
  2. Add a Label to display the employee name:
    • Set the Text property to:
      ThisItem.Title
  3. Add another Label to display the age:
    • Set the Text property to:
      ThisItem.Age

Step 5: Test the App

  1. Save and run the app by pressing F5.
  2. Click the Filter Employees button.
  3. Verify that the Gallery updates to display only employees aged greater than 30. For example:
    • Mary Jones (Age 35)
    • Alice Brown (Age 40)
    • Sarah Green (Age 32)

Step 6: Optional Enhancements

  1. Add Search Functionality:
    • Insert a Text Input box for the search term.
    • Modify the filter formula to include a search condition:
      ClearCollect(
          FilteredEmployees,
          Filter(Employees, Age > 30 && StartsWith(Title, TextInput1.Text))
      )
  2. Show a Message When No Records Are Found:
    • Insert a Label below the Gallery.
    • Set its Visible property to:
      IsEmpty(FilteredEmployees)
    • Set its Text property to:
      "No employees found."
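
The combined condition from the search enhancement can be sketched in plain JavaScript (again illustrative, not Power Fx; the sample rows mirror the SharePoint list from Step 1):

```javascript
// Sketch of Filter(Employees, Age > 30 && StartsWith(Title, TextInput1.Text)).
const employees = [
  { Title: "John Smith", Age: 28 },
  { Title: "Mary Jones", Age: 35 },
  { Title: "Alice Brown", Age: 40 },
  { Title: "David White", Age: 29 },
  { Title: "Sarah Green", Age: 32 }
];

function searchEmployees(rows, minAge, searchTerm) {
  // Power Fx's StartsWith is case-insensitive, so normalize both sides.
  const term = searchTerm.toLowerCase();
  return rows.filter(row => row.Age > minAge && row.Title.toLowerCase().startsWith(term));
}
```

Note that an empty search term matches every over-30 row, which matches the Power Fx behavior of StartsWith with a blank value, so the Gallery shows the full filtered list until the user starts typing.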

Thursday, January 16, 2025

Post Data Migration: Overcoming Challenges with Custom Solutions

Successfully Migrating 20 Years of Data to Production

Recently, I delivered a project involving the migration of 20 years' worth of data to a production environment. This task wasn’t just about moving records—it required meticulous preparation and validation to ensure the data was accurate and the process seamless. Here's how I tackled the challenge and the tools that came into play.

The ETL Journey: From Development to Production

To ensure smooth data extraction, transformation, and loading (ETL), I initially worked within the Development and User Acceptance Testing (UAT) environments. Writing and refining the ETL process in these controlled settings allowed me to:

  • Massage the data for consistency.
  • Validate transformations against business rules.
  • Confirm data accuracy and readiness for production.

Once the ETL process was validated, I performed a bulk delete of data from both Development and UAT to ensure these environments remained clean and uncluttered.

The Power of the Bulk Delete Tool in XrmToolBox

For this project, the Bulk Delete tool in XrmToolBox was instrumental. This GUI-based plugin is designed to handle large-scale record deletions in Microsoft Dataverse or Dynamics 365 environments efficiently. Some of its key features include:

  • Customizable Deletion Criteria: Execute deletions based on specific filters.
  • Query and Preview: Preview records before deletion to avoid errors.
  • Automation: Schedule and automate deletion tasks to save time.
  • Efficiency: Overcome the limitations of manual deletion methods like Advanced Find or built-in Bulk Deletion Jobs.

Challenges and the Need for Innovation

Despite the capabilities of the Bulk Delete tool, I encountered an issue: it stopped after deleting a few hundred records. This was likely due to Dataverse API request limits and throttling for large operations, which interrupted the deletion process.

To work around this, I had to manually re-run the tool multiple times with adjusted queries—an inefficient and time-consuming approach.
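
The manual re-runs could in principle be automated by honoring the throttling responses. A hedged sketch of that retry logic (this is not the tool described below; sendRequest is an injected, hypothetical function returning a { status, retryAfter } object so the logic can be exercised without a live environment):

```javascript
// Retry a single DELETE until it succeeds, fails permanently, or runs out of attempts.
async function deleteWithRetry(sendRequest, recordId, maxAttempts = 5) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const response = await sendRequest(recordId);
    if (response.status === 204) return true;   // DELETE succeeded
    if (response.status !== 429) return false;  // non-retryable failure
    // Dataverse sends a Retry-After header (in seconds) when throttling;
    // fall back to exponential backoff if it is missing.
    const waitSeconds = response.retryAfter ?? 2 ** attempt;
    await new Promise(resolve => setTimeout(resolve, waitSeconds * 1000));
  }
  return false; // still throttled after maxAttempts
}
```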

Custom Solution: A Tailored Console Application

To streamline the process, I decided to build a custom solution—a .NET console application. Below is the complete C# code I developed for this tool, which automates the deletion process and overcomes Bulk Delete Tool limitations:

C# Code:

using Microsoft.Extensions.Configuration;
using Microsoft.Identity.Client;
using Newtonsoft.Json;
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

namespace PowerAppsRecordDeleter
{
    /// <summary>
    /// Program class for deleting records in Microsoft Dataverse using their GUIDs
    /// </summary>
    class Program
    {
        // Static IConfiguration field to hold configuration settings from appsettings.json
        static IConfiguration? Configuration;

        /// <summary>
        /// Main entry point of the application. Sets up configuration, obtains an access token, and deletes records.
        /// </summary>
        static async Task Main(string[] args)
        {
            // Load configuration from appsettings.json
            ConfigureAppSettings();

            // Retrieve configuration values
            string clientId = Configuration["AzureAd:ClientId"];
            string clientSecret = Configuration["AzureAd:ClientSecret"];
            string tenantId = Configuration["AzureAd:TenantId"];
            string environmentUrl = Configuration["Dataverse:EnvironmentUrl"];
            string entityName = Configuration["Dataverse:EntityName"];

            // Prompt user for comma-separated GUIDs
            Console.WriteLine("Enter comma-separated GUIDs of records to delete:");
            string inputGuids = Console.ReadLine();
            string[] guids = inputGuids.Split(',');

            // Get an access token using OAuth2 client credentials
            string accessToken = await GetAccessToken(clientId, clientSecret, tenantId, environmentUrl);

            // Iterate over each GUID and delete the corresponding record
            foreach (string guid in guids)
            {
                // Trim whitespace to ensure GUID format validity
                bool isDeleted = await DeleteRecord(guid.Trim(), accessToken, environmentUrl, entityName);

                // Log the result of each deletion attempt
                if (isDeleted)
                {
                    Console.WriteLine($"Record with GUID {guid} deleted successfully.");
                }
                else
                {
                    Console.WriteLine($"Failed to delete record with GUID {guid}.");
                }
            }
        }

        /// <summary>
        /// Configures app settings from appsettings.json.
        /// </summary>
        private static void ConfigureAppSettings()
        {
            // Initialize ConfigurationBuilder and load appsettings.json
            var builder = new ConfigurationBuilder()
                .SetBasePath(Directory.GetCurrentDirectory()) // Set base path to current directory
                .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true); // Load config file

            // Build the IConfiguration object
            Configuration = builder.Build();
        }

        /// <summary>
        /// Authenticates with Azure AD and retrieves an OAuth2 access token for Dataverse.
        /// </summary>
        /// <param name="clientId">Azure AD Application (client) ID</param>
        /// <param name="clientSecret">Azure AD Application (client) secret</param>
        /// <param name="tenantId">Azure AD tenant ID</param>
        /// <param name="environmentUrl">Dataverse environment base URL</param>
        /// <returns>Access token string</returns>
        private static async Task<string> GetAccessToken(string clientId, string clientSecret, string tenantId, string environmentUrl)
        {
            // Build a ConfidentialClientApplication instance for client credentials authentication
            var app = ConfidentialClientApplicationBuilder.Create(clientId)
                .WithClientSecret(clientSecret)
                .WithAuthority(new Uri($"https://login.microsoftonline.com/{tenantId}"))
                .Build();

            // Set the API scope to Dataverse environment URL
            string[] scopes = new string[] { $"{environmentUrl}/.default" };

            // Acquire the token and return the access token string
            AuthenticationResult result = await app.AcquireTokenForClient(scopes).ExecuteAsync();
            return result.AccessToken;
        }

        /// <summary>
        /// Deletes a record from the Dataverse Web API using the given GUID.
        /// </summary>
        /// <param name="recordId">GUID of the record to delete</param>
        /// <param name="accessToken">OAuth2 access token for API authentication</param>
        /// <param name="environmentUrl">Dataverse environment base URL</param>
        /// <param name="entityName">Entity set name used by the Web API (plural, e.g. "accounts")</param>
        /// <returns>Boolean indicating success or failure of the delete operation</returns>
        private static async Task<bool> DeleteRecord(string recordId, string accessToken, string environmentUrl, string entityName)
        {
            try
            {
                using (HttpClient client = new HttpClient())
                {
                    // Set base address to the Dataverse environment URL
                    client.BaseAddress = new Uri(environmentUrl);

                    // Add the access token to the authorization header
                    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
                    client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

                    // Construct the API delete endpoint URL with entity name and record GUID
                    string deleteUrl = $"/api/data/v9.2/{entityName}({recordId})";

                    // Execute the DELETE request
                    HttpResponseMessage response = await client.DeleteAsync(deleteUrl);

                    // Return true if the delete operation was successful, otherwise false
                    return response.IsSuccessStatusCode;
                }
            }
            catch (Exception ex)
            {
                // Log any exception that occurs during the delete process
                Console.WriteLine($"Error deleting record: {ex.Message}");
                return false;
            }
        }
    }
}


Configuration File (appsettings.json)

JSON Configuration:

{
  "AzureAd": {
    "ClientId": "Your Azure App Registration ID",
    "ClientSecret": "Your Azure App Registration Secret",
    "TenantId": "Your Azure Tenant ID"
  },
  "Dataverse": {
    "EnvironmentUrl": "Your Dataverse Environment URL",
    "EntityName": "Your Table Name"
  }
}
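
The console app above deletes records one at a time. For very large runs it can help to process the GUID list in fixed-size chunks, for example to pause between chunks or to assemble Dataverse $batch payloads. A minimal sketch of the chunking step (shown in JavaScript for illustration; a C# equivalent could use LINQ's Chunk):

```javascript
// Split a list into consecutive chunks of at most `size` items.
function chunk(items, size) {
  const result = [];
  for (let i = 0; i < items.length; i += size) {
    result.push(items.slice(i, i + size));
  }
  return result;
}

// Split a comma-separated GUID string the same way the tool's Main method does.
const guids = "g1, g2, g3, g4, g5".split(",").map(g => g.trim());
const batches = chunk(guids, 2);
```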

Lessons Learned

Delivering this project reinforced the importance of adaptability and innovation in data management. While existing tools like the Bulk Delete tool are incredibly powerful, they’re not always sufficient for complex or large-scale operations. Building custom solutions can bridge these gaps, ensuring efficiency and accuracy.

By combining off-the-shelf tools with bespoke solutions, I successfully navigated the challenges of this project and ensured a smooth migration to production.

Thursday, January 2, 2025

Develop SSRS Reports in Power Apps, Dataverse, or Dynamics 365

1. Prerequisites

  • Microsoft Dynamics 365 instance
  • Power Apps environment or Dataverse instance
  • SQL Server Reporting Services (SSRS) or Visual Studio (for report creation)
  • Access to Dynamics 365 and Power Apps for report deployment and testing

2. Step-by-Step Instructions for SSRS Report Creation

Step 1: Set Up the Power Apps Environment

Ensure that you have access to your Power Apps environment and are connected to Dataverse (or Dynamics 365) where the data is stored.

  • Go to Power Apps and sign in with the correct credentials.
  • Ensure you have the necessary privileges for both creating reports and accessing the data (typically, you need System Administrator or custom roles that allow report creation).

Step 2: Download and Set Up SQL Server Data Tools (SSDT)

  • Download SQL Server Data Tools (SSDT) from the official site.
  • Install SSDT to get the necessary tools to create SSRS reports.

Step 3: Create an SSRS Report Using FetchXML

To create the report, you will use FetchXML queries in your SSRS report. FetchXML is the proprietary, XML-based query language of Dynamics 365 and Dataverse, filling a role similar to SQL.

  1. Open Visual Studio with SQL Server Data Tools.
  2. Create a new Report Project in Visual Studio (File → New → Project → Report Project).
  3. Add a new Report by right-clicking the Reports folder and choosing Add → New Item → Report.
  4. Set up the Data Source: For Dynamics 365, use the CRM data source type and connect it to your Dynamics instance. Provide the connection string and ensure it's authenticated.

Step 4: Use FetchXML Queries in the Report

In the Report Data pane, right-click Datasets → Add Dataset. Choose FetchXML as the query type and paste your FetchXML query.

<fetch>
    <entity name="contact">
        <attribute name="firstname" />
        <attribute name="lastname" />
        <attribute name="emailaddress1" />
        <order attribute="lastname" descending="false" />
    </entity>
</fetch>
            

Step 5: Design the Report Layout

Use Table or List components in SSRS to display the results of the FetchXML query. Bind the table or list to the dataset you created from FetchXML. Add fields like firstname, lastname, and emailaddress1 to the table.

Step 6: Deploy the Report to Dynamics 365 or Power Apps

  1. Publish the Report from Visual Studio to your Dynamics 365 instance (Right-click the project and choose Deploy).
  2. Specify your Dynamics 365 server URL and provide credentials for deployment.
  3. Once deployed, go to Settings → Reports in the Dynamics 365 instance, where your report will appear.

Step 7: Add the Report to Power Apps / Power Portal

  1. Open Power Apps.
  2. Go to the Data section, select Entities, and choose the entity you want the report to be linked with.
  3. In the Forms section of the entity, add a new Subgrid or Embedded Report component.
  4. Configure the subgrid to show the SSRS report and assign appropriate parameters.

3. Example of FetchXML Queries for Reports

Example 1: Simple FetchXML Query to Fetch Contacts

<fetch>
    <entity name="contact">
        <attribute name="firstname" />
        <attribute name="lastname" />
        <attribute name="emailaddress1" />
        <order attribute="lastname" descending="false" />
    </entity>
</fetch>
            

Example 2: FetchXML with Filter for Active Accounts

<fetch>
    <entity name="account">
        <attribute name="name" />
        <attribute name="telephone1" />
        <attribute name="emailaddress1" />
        <filter>
            <condition attribute="statecode" operator="eq" value="0" />
        </filter>
    </entity>
</fetch>
            

Example 3: FetchXML with Join to Retrieve Contacts Related to Accounts

<fetch>
    <entity name="contact">
        <attribute name="firstname" />
        <attribute name="lastname" />
        <attribute name="emailaddress1" />
        <link-entity name="account" from="accountid" to="parentcustomerid">
            <attribute name="name" />
            <filter>
                <condition attribute="name" operator="like" value="Contoso%" />
            </filter>
        </link-entity>
    </entity>
</fetch>
            

Example 4: FetchXML with Grouping and Aggregation

<fetch>
    <entity name="invoice">
        <attribute name="totalamount" alias="total_invoice_amount" aggregate="sum" />
        <groupby>
            <attribute name="customerid" />
        </groupby>
    </entity>
</fetch>
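
The FetchXML queries above can also be run from client script: the standard Xrm.WebApi.retrieveMultipleRecords call accepts a URL-encoded fetchXml option. A small helper that builds that option string (the query below is illustrative):

```javascript
// Build the "?fetchXml=..." options string expected by retrieveMultipleRecords.
function buildFetchXmlOption(fetchXml) {
  return "?fetchXml=" + encodeURIComponent(fetchXml.trim());
}

const contactFetch = `
<fetch>
    <entity name="contact">
        <attribute name="fullname" />
    </entity>
</fetch>`;

const option = buildFetchXmlOption(contactFetch);
// Inside a model-driven form script (not runnable outside Dynamics 365):
// Xrm.WebApi.retrieveMultipleRecords("contact", option)
//     .then(result => console.log(result.entities));
```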
            

Additional Steps: Embedding the Report in a Power Apps Canvas App

If you want to display your SSRS reports in a Power Apps Canvas app:

  1. Canvas apps have no native SSRS viewer control, so the usual approach is to open the deployed report by URL: for example, a button whose OnSelect uses Launch() with the report's URL in your Dynamics 365 environment.
  2. Alternatively, rebuild the report in Power BI and embed it with the Power BI Tile control, which Canvas apps support natively.

JavaScript Cheat Sheet for Power Apps, Dataverse, and Dynamics 365 Development

Introduction

JavaScript is a vital component in extending and customizing Microsoft Power Apps, Dataverse, and Dynamics 365. This cheat sheet aims to provide a collection of commonly used scripts, ranging from basic to advanced, to streamline your development process.

Easy Level Scripts

1. Displaying Alert Messages

function showAlertMessage(context) {
    alert("This is a simple alert message!");
}

2. Hiding/Showing Fields or Sections

function toggleFieldVisibility(context) {
    const formContext = context.getFormContext();
    formContext.getControl("field_name").setVisible(false);
}

3. Setting Default Values

function setDefaultValue(context) {
    const formContext = context.getFormContext();
    formContext.getAttribute("field_name").setValue("Default Value");
}

4. Retrieving Field Values

function getFieldValue(context) {
    const formContext = context.getFormContext();
    const value = formContext.getAttribute("field_name").getValue();
    console.log("Field Value: ", value);
}

Intermediate Level Scripts

5. Formatting Fields

function formatField(context) {
    const formContext = context.getFormContext();
    const control = formContext.getControl("field_name");
    control.setNotification("Invalid format!", "uniqueId");
}

6. Real-time Data Validation

function validateField(context) {
    const formContext = context.getFormContext();
    const value = formContext.getAttribute("field_name").getValue();
    if (!value || value.length < 5) {
        alert("The value must be at least 5 characters long!");
    }
}

7. Filtering Option Sets Dynamically

function filterOptionSet(context, desiredValue) {
    // Keep only the option whose value matches desiredValue; remove the rest.
    const formContext = context.getFormContext();
    const control = formContext.getControl("optionset_name");
    control.getOptions().forEach(option => {
        if (option.value !== desiredValue) {
            control.removeOption(option.value);
        }
    });
}

8. Triggering Workflows with JavaScript

function triggerWorkflow(recordId, workflowId) {
    const request = new XMLHttpRequest();
    const url = `/api/data/v9.2/workflows(${workflowId})/Microsoft.Dynamics.CRM.ExecuteWorkflow`;

    request.open("POST", url, true);
    request.setRequestHeader("OData-Version", "4.0");
    request.setRequestHeader("Content-Type", "application/json; charset=utf-8");
    request.setRequestHeader("Accept", "application/json");

    const data = JSON.stringify({ EntityId: recordId });
    request.send(data);
}

Advanced Level Scripts

9. Using Web API for CRUD Operations

function createRecord() {
    const data = {
        name: "New Record",
        description: "Created via JavaScript"
    };

    Xrm.WebApi.createRecord("entity_name", data).then(
        (response) => {
            console.log("Record created with ID: ", response.id);
        },
        (error) => {
            console.error("Error creating record: ", error);
        }
    );
}

10. Custom Ribbon Button Functionality

function ribbonButtonAction(primaryControl) {
    const recordId = primaryControl.data.entity.getId();
    alert(`Ribbon button clicked for record ID: ${recordId}`);
}

Best Practices

  • Always Comment Your Code: Make your scripts maintainable and easier to understand.
  • Use Try-Catch for Error Handling: Ensure robust error management.
  • Avoid Hardcoding: Use environment variables or configuration entities.
  • Test Thoroughly: Validate your scripts in different scenarios and environments.
  • Follow Naming Conventions: Use meaningful names for fields, functions, and variables.
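
The try-catch recommendation can be folded into a small wrapper so that one failing handler never breaks the rest of the form's scripts. A sketch with illustrative names:

```javascript
// Wrap any form event handler so exceptions are logged instead of propagated.
function safeHandler(handler, handlerName) {
  return function (context) {
    try {
      return handler(context);
    } catch (error) {
      console.error(`Error in ${handlerName}: ${error.message}`);
      return undefined;
    }
  };
}

// Usage: register safeHandler(toggleFieldVisibility, "toggleFieldVisibility")
// on the form event instead of the raw function.
```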
