Streamlining Workflow: Comparing Objects with Inline JavaScript in Azure Functions Powered by Power Automate

  1. What Are Azure Functions?
  2. Exercise
    1. Code
    2. Full code
    3. Power Automate

The short of it is…Power Platform lives on Azure. Now, this article is not an attempt to push people into Azure, but I do at least want you to consider it. If you are like me, your introduction to Azure was most likely Azure AD – now known as Entra ID, and next year probably something else.

Azure is a big universe that supports many different workloads. Of particular interest to me, and for the benefit of this article, is Azure Functions.

What Are Azure Functions?

I don’t think there is a need to dive deep into the subject, but what I will say is this: Azure Functions let us run serverless, on-demand code. Azure Functions support various programming languages such as C# and JavaScript. The ability to call these functions and have them execute a piece of code is beneficial to developers, especially to “low-coders”. The reason being, we don’t have to be as seasoned as a traditional “pro-coder”; we can focus only on the code that matters to us.

With the advent of Copilot and ChatGPT we can write code that serves our needs. I’d hate to cause a stir, so I will say this: be careful with AI-generated code, make sure your prompt is as explicit as you can get it, and if you are unsure about the piece of code you are using…ask for help.

Exercise

I once worked with a client who needed to compare two objects and identify their differences to create exceptions for their customer service representatives to address. Implementing this process in Power Automate significantly saved time, though the solution involved multiple actions, loops, and data transformations, making it less streamlined than desired. However, as I gained more expertise in Azure Functions, I discovered that the same task could be accomplished more efficiently using just a few lines of JavaScript and a single action within the flow.

Before we proceed with this exercise, it’s important to note that a foundational understanding of how to provision an Azure resource and set up your function app is essential. To ensure you’re adequately prepared, I strongly recommend reviewing the information provided in the following link: Getting started with Azure Functions | Microsoft Learn.

Keep in mind that we are utilizing JavaScript for this task, so make sure to configure your function accordingly. While you can create your function directly in the Azure portal, which is the simplest approach, there’s also the option to develop it in Visual Studio Code. This latter method is a bit more complex but certainly achievable if you’re up for the challenge. For our function app, we’ll be using an HTTP trigger as the primary mechanism to initiate the process.

Code

Keep in mind that a function app essentially serves as a container for your code snippet. In this exercise, we’ll be adding code to our function and then obtaining the URL to integrate it with Power Automate. While I utilized ChatGPT to generate my code, it’s crucial to be aware that certain parts of this code are indispensable. One such essential line of code is as follows:

module.exports = async function (context, req) {}

It’s important to operate within this framework, as the context from Power Automate will be fed into this function. Once that’s in place, our code will handle the rest of the process. This setup ensures that the data from Power Automate is seamlessly integrated and processed by our Azure Function.

Full code

Simply copy and paste the entire code into your function. Pay special attention to the parameter names ‘object1’ and ‘object2’. These are the parameters through which we will pass our objects. The code will then perform the comparison and return an object highlighting the differences, which we can utilize on the Power Automate side.

module.exports = async function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');

    try {
        // Assuming the input is a JSON string representing the objects.
        const obj1 = req.query.object1 ? JSON.parse(req.query.object1) : null;
        const obj2 = req.query.object2 ? JSON.parse(req.query.object2) : null;

        // Validate the input
        if (typeof obj1 !== 'object' || obj1 === null || typeof obj2 !== 'object' || obj2 === null) {
            context.res = {
                status: 400,
                body: "Please pass two JSON objects in the query string or in the request body",
                headers: {
                    'Content-Type': 'application/json; charset=utf-8'
                }
            };
            return;
        }

        // Find differences
        const differences = findObjectDifferences(obj1, obj2, "root");
        context.log('Differences:', differences);

        // Return the differences as a JSON string with the header "differences"
        context.res = {
            body: { "differences": differences },
            headers: {
                'Content-Type': 'application/json; charset=utf-8'
            }
        };

    } catch (error) {
        context.res = {
            status: 500,
            body: "Error processing your request",
            headers: {
                'Content-Type': 'application/json; charset=utf-8'
            }
        };
        context.log.error("Exception thrown", error);
    }
};

function findObjectDifferences(obj1, obj2, path) {
    let differences = [];
    let allKeys = new Set([...Object.keys(obj1), ...Object.keys(obj2)]);

    allKeys.forEach((key) => {
        let newPath = path + "." + key;
        if (obj1.hasOwnProperty(key) && !obj2.hasOwnProperty(key)) {
            differences.push({ path: newPath, obj1: JSON.stringify(obj1[key]), obj2: 'Not present' });
        } else if (!obj1.hasOwnProperty(key) && obj2.hasOwnProperty(key)) {
            differences.push({ path: newPath, obj1: 'Not present', obj2: JSON.stringify(obj2[key]) });
        } else if (obj1[key] && obj2[key] && typeof obj1[key] === 'object' && typeof obj2[key] === 'object') {
            differences = differences.concat(findObjectDifferences(obj1[key], obj2[key], newPath));
        } else if (obj1[key] !== obj2[key]) {
            differences.push({ path: newPath, obj1: JSON.stringify(obj1[key]), obj2: JSON.stringify(obj2[key]) });
        }
    });

    return differences;
}
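If you want to sanity-check the comparison logic locally before deploying, the helper above runs fine as a plain Node.js script. The sample records below are made up to mirror the article’s example:

```javascript
// The same findObjectDifferences helper from the full code above,
// run locally against two hand-made sample records.
function findObjectDifferences(obj1, obj2, path) {
    let differences = [];
    let allKeys = new Set([...Object.keys(obj1), ...Object.keys(obj2)]);

    allKeys.forEach((key) => {
        let newPath = path + "." + key;
        if (obj1.hasOwnProperty(key) && !obj2.hasOwnProperty(key)) {
            differences.push({ path: newPath, obj1: JSON.stringify(obj1[key]), obj2: 'Not present' });
        } else if (!obj1.hasOwnProperty(key) && obj2.hasOwnProperty(key)) {
            differences.push({ path: newPath, obj1: 'Not present', obj2: JSON.stringify(obj2[key]) });
        } else if (obj1[key] && obj2[key] && typeof obj1[key] === 'object' && typeof obj2[key] === 'object') {
            differences = differences.concat(findObjectDifferences(obj1[key], obj2[key], newPath));
        } else if (obj1[key] !== obj2[key]) {
            differences.push({ path: newPath, obj1: JSON.stringify(obj1[key]), obj2: JSON.stringify(obj2[key]) });
        }
    });

    return differences;
}

// Hypothetical records: two fields differ, one matches.
const record1 = { Borrower2: "Marsurice DeVan", Address: "Dean", Loan: "1001" };
const record2 = { Borrower2: "Marsurice Devan", Address: "Down", Loan: "1001" };

// Only root.Borrower2 and root.Address are reported; the matching
// Loan value produces no entry.
console.log(findObjectDifferences(record1, record2, "root"));
```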

After configuring your function, save it. Then, locate and click on the ‘Get function URL’ button found in the editor’s ribbon. This URL is crucial as you will need it to make the HTTP request in Power Automate.

Power Automate

  1. Set Up the Trigger: Begin by setting up your Power Automate flow with a trigger of your choice. This trigger will initiate the flow.
  2. Initialize Object Variables: Next, initialize two object variables named ‘varObject1’ and ‘varObject2’. Initially, I attempted to directly write the objects in the HTTP request but encountered issues. It’s more effective to initialize these variables and assign your arrays to them, allowing for a smoother conversion process.
  3. Create an HTTP Request:
    • Choose ‘POST’ as the method for your HTTP request.
    • In the query, you will need to add two lines:
      • The first line should have ‘object1’ as the key, with the value being the ‘varObject1’ variable.
      • The second line should have ‘object2’ as the key, with the value being the ‘varObject2’ variable.
  4. Paste the Function App URL: Paste the URL you obtained from your Azure Function app into the appropriate field in your HTTP request setup.
  5. Save and Run: After setting up everything, save your flow configuration. Then, run your flow to test and see it in action.
[Screenshot: varObject1]
[Screenshot: varObject2]
[Screenshot: HTTP Request]
[Screenshot: successful run and differences in the body]
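Outside of Power Automate, the request the flow makes can be sketched in plain JavaScript. In this sketch, `functionUrl` is a placeholder for the URL you copied from “Get function URL”, and the sample objects are hypothetical:

```javascript
// Building the request the flow sends: both objects are JSON-serialized
// and passed as the "object1" and "object2" query-string parameters.
const functionUrl = "https://your-app.azurewebsites.net/api/CompareObjects"; // placeholder

const varObject1 = { Borrower2: "Marsurice DeVan", Address: "Dean" };
const varObject2 = { Borrower2: "Marsurice Devan", Address: "Down" };

// URLSearchParams handles the URL encoding for us, which is what
// assigning the variables in the HTTP action's query section does.
const query = new URLSearchParams({
    object1: JSON.stringify(varObject1),
    object2: JSON.stringify(varObject2)
});

const requestUrl = functionUrl + "?" + query.toString();
console.log(requestUrl);
```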

The following shows the returned exceptions (differences) between the objects. The object specifies the affected columns, “root.Borrower2” and “root.Address”:

{
  "differences": [
    {
      "path": "root.Borrower2",
      "obj1": "\"Marsurice DeVan\"",
      "obj2": "\"Marsurice Devan\""
    },
    {
      "path": "root.Address",
      "obj1": "\"Dean\"",
      "obj2": "\"Down\""
    }
  ]
}
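To give a sense of how this payload can be consumed downstream (in the flow, or any other client), here is a small sketch that flattens the differences into readable exception lines. The formatting is purely illustrative:

```javascript
// "response" mirrors the body returned by the function.
const response = {
    differences: [
        { path: "root.Borrower2", obj1: "\"Marsurice DeVan\"", obj2: "\"Marsurice Devan\"" },
        { path: "root.Address", obj1: "\"Dean\"", obj2: "\"Down\"" }
    ]
};

// One human-readable line per difference, dropping the "root." prefix.
const exceptions = response.differences.map(function (d) {
    return d.path.replace("root.", "") + ": " + d.obj1 + " vs " + d.obj2;
});

console.log(exceptions.join("\n"));
```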

Why Not Azure (or however you pronounce it)?

Alright, Azure and Power Platform. It’s easy to overlook, but remember, Power Platform is cozily nestled right into Azure. I’ve gone from saying I build apps to now, more accurately, saying I work in the cloud.

Azure is essentially the backbone of the cloud world for Microsoft 365 and Power Platform, and now Azure OpenAI. Between you and me, Azure has long had AI capabilities with Azure Cognitive Services.

Azure has replaced the need for those loud, space-eating server rooms with a sleek, digital space. I remember the days of hiding in server rooms to escape the constant barrage of IT queries. Now, with Azure, those days are just a funny memory.

So, why Azure? Well, it’s vast. When my wife expressed interest in learning it, I had to ask, “Which part?” because it’s that big. It’s not just one thing; it’s a whole ecosystem waiting to be explored.

The cost might make some hesitant. I’m not claiming to be a Microsoft licensing expert, but generally, if the benefits are clear, people find the costs justifiable. And with Azure, the benefits are often worth it.

My journey and new role have led me deeper into Azure, especially Azure Functions. These have been a game-changer in my Power App development. They allow me to execute code, like JavaScript and my new favorite, PowerShell, for complex tasks I’d rather not handle directly in the app. It’s been a great push to brush up on my coding skills too. I’ve reacquainted myself with Visual Studio Code, since for the last 4 years or so I’ve lived in the Power Apps formula bar and Power Automate expressions.

But it’s not just about using Azure for the sake of it. Integrating Azure into your Power Platform solutions can significantly enhance what you’re able to do. It’s about making your solutions more robust and efficient. Azure Logic Apps, for instance, are like Power Automate on steroids, offering more capabilities and flexibility.

In essence, leveraging Azure within your Power Platform solutions isn’t just a good idea; it’s a strategic move. It’s about expanding your capabilities and delivering more sophisticated solutions. So, consider taking that step into Azure. It’s a solid move for anyone looking to enhance their Power Platform projects.

Enhancing Power Automate Desktop Proficiency with Access and PowerShell Integration

  1. Install ImportExcel Module
  2. Creating Desktop Flow
    1. Create Input Variable
    2. Retrieving Excel File From Folder
    3. Create “For each” loop
    4. Launch Excel
    5. Read from Excel worksheet
    6. Running PowerShell Script to insert data into our Access database

I recently had a conversation with a friend about the relevance of Access databases in today’s tech landscape. To my surprise, he emphatically affirmed that Access databases are still widely used. The last time I encountered Access was during my college days when it was used as a teaching tool for database modeling. Honestly, I had no idea that organizations continued to rely on it.

In my personal journey, Access has played a pivotal role in enhancing my skills with Power Automate Desktop. Initially, my goal was to create a Model Driven App and utilize it in conjunction with Power Apps desktop to execute desktop flows. This approach worked reasonably well, but over time, I no longer had the Model Driven App and didn’t want to recreate it. That’s when I turned to Access as my training ground.

However, even my practice with desktop flows on Access forms had its limitations. During my experimentation with PowerShell, I became determined to find a way to insert data into an Access database file without ever having to open Access itself. So, for those who still rely on Access in their workflows, this one is for you…

Here’s what you’ll need: a designated location to store your Excel file. Personally, I store mine in a folder on my desktop.

Install ImportExcel Module

To install the ImportExcel module, please open PowerShell and execute the provided script below:

Install-Module -Name ImportExcel -Scope CurrentUser

Creating Desktop Flow

Create Input Variable


To create an input variable for storing the path to the Access database we are working with, follow these steps to configure it:

  1. Open the Power Automate Desktop workflow where you intend to use the variable.
  2. Locate the area where you can define variables, typically in the workflow settings or variable panel.
  3. Create a new variable and give it a meaningful name, such as “varAccessDb.”
  4. Set the data type of this variable to “String” since it will store the file path, which is typically a text value.
  5. Save your workflow to ensure the variable is retained.

Now, you have an input variable named “varAccessDb” that can be used within your PowerShell script to hold the path to the Access database.

Retrieving Excel File From Folder


To initiate the process, the first action we need to perform is to retrieve the Excel file(s) that will be used in our flow. Follow these steps:

  1. In the Actions pane on the left-hand side of your Power Automate Desktop interface, type “Get files in folder” in the search bar.
  2. Locate the “Get files in folder” action in the search results.
  3. Drag and drop the “Get files in folder” action onto your workflow canvas.

This action will allow you to specify the folder from which you want to fetch the Excel files for further processing in your flow.


To direct the “Get files in folder” action to the folder containing your Excel workbooks and set it up to collect all files with a .xlsx extension, follow these steps:

  1. Select the “Get files in folder” action block on your workflow canvas to access its configuration.
  2. In the “Folder path” input field, specify the path to the folder where your Excel workbooks are located. You can either type the path or use the browse button (…) to navigate to and select the folder.
  3. In the “File filter” input box, input “*.xlsx”. This filter will ensure that the action captures all files with the .xlsx extension.
  4. Save your configuration.

Now, the “Get files in folder” action is configured to retrieve all Excel files with the .xlsx extension from the designated folder for further processing in your flow.

File filter: *.xlsx

Save.

Create “For each” loop

To make your solution dynamic and capable of handling one or multiple files, you can set up a “For Each” loop. Follow these steps:

  1. In the Actions pane, search for “For Each” to find the For Each loop action.
  2. Drag and drop the “For Each” loop action onto your workflow canvas.
  3. In the “Values to iterate” field of the “For Each” loop, use the %Files% variable that was generated from your “Get files in folder” action.
  4. To enhance readability, you can rename the “Store into” section from “CurrentItem” to “CurrentFile.”

By doing this, you’ve set up a loop that will iterate through each file retrieved by the “Get files in folder” action, allowing your workflow to handle both single and multiple files seamlessly.

Launch Excel

To add a “Launch Excel” action within your “For Each” loop and configure it, follow these steps:

  1. In the Actions pane, search for “Launch Excel” to locate the “Launch Excel” action.
  2. Drag and drop the “Launch Excel” action into your workflow editor, making sure it is nested within the “For Each” loop.

Now, let’s configure the “Launch Excel” action with the following parameters:

  • File path: This should point to the current Excel file being processed in the loop. Use the “CurrentFile” variable to specify the file path dynamically.
  • Visible: You can choose whether you want Excel to be visible during the execution of your workflow. Set it to “No” if you want Excel to run in the background without displaying the user interface, or “Yes” if you want Excel to be visible.
  • Create new instance: You can choose to create a new instance of Excel for each file, ensuring that each file is processed independently. Set it to “Yes” if you want a new instance for each file, or “No” if you want to reuse the same instance for all files.
  • Run macro (optional): If you have a specific Excel macro to run, you can specify it here. Otherwise, leave it blank if you don’t need to run a macro.

By configuring the “Launch Excel” action in this manner, you ensure that your workflow opens and processes each Excel file one by one within the “For Each” loop.

Launch Excel: and open the following document
Document Path: %CurrentFile.FullName%

If you prefer to make the Excel instance visible, set the “Visible” option to “Yes” in the “Launch Excel” action configuration. This lets you see the Excel interface while the workflow is running, which can be helpful for monitoring and debugging your automation process.

Read from Excel worksheet

To add a “Read from Excel worksheet” action and configure it as described, follow these steps:

  1. In the Actions pane, search for “Read from Excel worksheet” and drag it into your workflow editor. Ensure that you place it within the same “For Each” loop as the “Launch Excel” action.
  2. Configure the “Read from Excel worksheet” action with the following parameters:
Excel instance: %ExcelInstance%
Retrieve: All available values from worksheet

Running PowerShell Script to insert data into our Access database

This PowerShell script will loop through the Excel data and insert records into the specified Access table without the need to have Access open. It should significantly improve the efficiency of your workflow compared to looping through each record in the spreadsheet using multiple actions in Power Automate Desktop.

  1. Add a “Run PowerShell script” action to your workflow.
  2. Copy and paste the following PowerShell script into the action, making any necessary updates to match your specific file paths, table names, and data variables:
# Define the path to the Access database
# varAccessDb is our input variable
$accessDBPath = "%varAccessDb%"

# Define the connection string
$connectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=$accessDBPath;Persist Security Info=False;"

# Define the path to the Excel file
$excelFilePath = "%CurrentFile%"

# Import data from the Excel file
# Ensure that your Excel file has columns named PartName, SKU, Description, and Price
$dataCollection = Import-Excel -Path $excelFilePath

try {
    # Create a connection object
    $connection = New-Object System.Data.OleDb.OleDbConnection($connectionString)

    # Open the connection
    $connection.Open()

    foreach ($item in $dataCollection) {
        # Define the SQL query for inserting data
        $query = "INSERT INTO Part ([Part Name], [SKU], [Part Description], [Price]) VALUES ('$($item.PartName)', '$($item.SKU)', '$($item.Description)', $($item.Price))"

        # Create a command object
        $command = New-Object System.Data.OleDb.OleDbCommand($query, $connection)

        # Execute the command (perform the insert operation)
        $command.ExecuteNonQuery()
    }

    # Close the connection
    $connection.Close()
}
catch {
    # Handle the error
    Write-Host "An error occurred: $_"
}
finally {
    # Ensure that the connection is closed even if an error occurs
    if ($connection.State -eq [System.Data.ConnectionState]::Open) {
        $connection.Close()
    }
}

Lastly, add an action to close Excel.

Power Automate Desktop – Turning on Track Changes in Word Document

Automation is a game-changer in the world of productivity and efficiency. It’s all about streamlining processes, ensuring consistency, and, believe it or not, even providing a sense of buoyancy. Yes, you read that right – buoyancy. Desktop automation can be your secret weapon in keeping those seemingly irreplaceable legacy systems afloat, even when they’re driving your CIO crazy. After all, many of your colleagues are still in love with those systems because they just work – that’s the magic of automation.

But desktop automation isn’t limited to legacy system support; it can also enhance your everyday tasks, like automating Microsoft Word. Now, you might be thinking, “I can do that with a cloud flow,” and you’re not wrong. Cloud-based automation is fantastic for updating templates and structured data. However, what I’m proposing is taking automation to the next level by working with free text.

But, before we dive into the exciting world of free-text automation, let’s talk safety first. There’s no point in automating something if we can’t roll it back in case of a mishap. We’ve all been there – that heart-pounding moment when you’ve pushed the proverbial ‘red button,’ and suddenly, you’re in a lonely, isolating place where your brain feels like it’s losing blood, and you start wondering if you’ll still have a job in the next five minutes. That’s why it’s crucial to implement safety measures to avoid such nerve-wracking situations.

The Guard Rails

In this exercise, we’ll be focusing on a SharePoint document library, which is already equipped with robust document management features such as check-in/check-out and version history. However, one feature that might be missing from your SharePoint arsenal is the ability to activate ‘Track Changes.’

Desktop automation opens up exciting possibilities, allowing us to activate ‘Track Changes’ effortlessly. One of my personal favorites for achieving this is harnessing the power of PowerShell in combination with desktop flows.

PowerShell provides the ultimate level of control and efficiency, eliminating the need for multiple unnecessary actions by allowing us to solve complex tasks with a single script. It’s a game-changer when it comes to automating tasks and streamlining processes in desktop flows.

The Solution

To work with a document in SharePoint and access its settings, follow these steps:

  1. Right-click on the document you want to work with within your SharePoint document library.
  2. From the context menu that appears, select “Copy Link.”
  3. After copying the link, proceed to select the “Settings” link or option to access and configure the document’s settings as needed. This step may vary slightly depending on your specific SharePoint environment and version.

  4. In the sharing settings, you should see an option to invite people or add users. Enter the email addresses of the users you want to grant “Can Edit” permissions to. You can also use SharePoint groups if applicable.
  5. Select “Apply”; the link will be copied to your clipboard and the sharing settings will be saved.

In Power Automate Desktop

  1. Add the ‘Terminate’ action
  2. Configure the “Terminate” action settings as follows:
    • Process Name: Enter “WINWORD.EXE” (without quotes) in the “Process Name” field. This is the executable name for Microsoft Word.
    • Action: Choose “Terminate” to close any running instances of Microsoft Word.
  3. Save your flow with the “Terminate” action in place.

Next, we’ll add an action called ‘Run application’.

For the “Run application” action with the specified parameters, here’s how you can configure it:

  1. Application Path: Enter “winword.exe” (without quotes) as the application path. This specifies that you want to run Microsoft Word.
  2. Command Line Arguments: Enter “/t” followed by the sharing link from the Word document in your document library. Make sure to replace “sharing link” with the actual URL or path to your Word document. For example, if your document’s sharing link is “https://example.com/document.docx,” your command line argument would be “/t https://example.com/document.docx.”
  3. Window Style: Set the window style to “Maximized.” This will ensure that the Microsoft Word application window is opened in a maximized state.

With these parameters, the “Run application” action will launch Microsoft Word with the specified command line arguments, opening the document in a maximized window.

Sidebar – sort of

Introducing the concept of command line switches is crucial when working with Microsoft Office products and many other applications. Command line switches provide a convenient way to customize how an application behaves when launched from the command line or as part of an automation process.

In our specific case, the “/t” switch used before the document link is a common command line switch for Microsoft Word, indicating that we want to open an existing file. By including this switch in the command line arguments, we instruct Microsoft Word to treat the provided link as the path to an existing document that should be opened.

Back to regularly scheduled programming…


Next, we’ll insert two left clicks on the foreground window, which should be Microsoft Word, using the “Mouse Click” action. The goal is to mitigate any potential dialog interruptions:

  1. Add a “Mouse Click” action to your automation.
  2. In the “Mouse Click” action settings, specify the following:
    • Click Type: Choose “Left Click” to perform a single left-click.
    • Target: Select “Foreground Window” to ensure that the click is directed to the currently active window, which should be Microsoft Word.
    • Repeat: Set the “Repeat” option to “1” to perform the click once.
  3. After the first “Mouse Click” action, add another “Mouse Click” action with the same settings as the first one.

Trust me this step is warranted; I’ve seen some weird things in my time.

Now, let’s incorporate a PowerShell script into our workflow to achieve the goal of enabling track changes for our document. What makes this script particularly useful is its precision: it focuses solely on enabling track changes. Importantly, if track changes are already active, the script won’t “bat an eye”. However, if track changes are not yet enabled, the script will take action, enforcing track changes and ensuring that revisions are visible as they occur.

  1. Add a “Run PowerShell Script” action to your automation.
  2. In the “Run PowerShell Script” action settings, you can paste the provided script into the script editor. Make sure to include the entire script:
# Attach to the running instance of Microsoft Word
$wordApp = [System.Runtime.Interopservices.Marshal]::GetActiveObject("Word.Application")

# Get the active document
$document = $wordApp.ActiveDocument

# Enable track changes
$document.TrackRevisions = $true
$document.ShowRevisions = $true

# Save changes (optional)
$document.Save()
  3. Save your flow with the “Run PowerShell Script” action included.

Let’s introduce a step to test our automation by using the ‘Send Keys’ action. In the input parameters, we’ll enter a specific text of our choice. For instance, I’ll input “tested.” This action will transmit this free-text input directly to our Word document, allowing us to verify the effectiveness of our automation process.

Run the flow

Best Practices: Solutioning Part 2 – Adding Existing Tables to Solution

Welcome back for Part 2 of Solutioning. If you missed Part 1 please check it out.

In this week’s discussion on enhancing your Power Platform solutions, we’ll be delving into the process of incorporating existing tables into your solution. Before we dive into the details, it’s essential to recognize that there’s no mystical formula for adding a component to your solution. This reminds me of a valuable lesson imparted by one of my colleagues, especially when discussing governance: “Just because we can, doesn’t mean we should.” This principle directly applies to the inclusion of tables in a solution. Simply having the ability to add every field from a table into your solution doesn’t automatically mean it’s the best course of action. Let’s explore this further.

Business Value

In the context of Power Platform solutions, being selective about which objects from an existing table you add may not seem immediately impactful, but it holds significant value for your business.

Here’s a brief story to help illustrate this:

To better understand the importance of adding only the necessary fields to your table within a Power Platform solution, let me share a personal experience that highlights this point. It relates to that insightful phrase from my colleague. Here’s what happened:

I was working on a Power Platform solution—whether it was an app or a flow—in a development environment, preparing it for the transition to the testing phase. In the early stages of building the solution, I faced a crucial decision when incorporating an existing table into the solution. At that moment, I chose to select the ‘Include all objects’ checkbox without giving it much thought.

Little did I know, I unintentionally created a situation that would require some effort to resolve. As we all understand, time is valuable in business.

In a development environment, it’s not always easy to track who’s working on what, and how changes to a table schema might affect your deployment. Communication isn’t always perfect. To reduce risks and minimize errors, it’s a good practice to take some precautions.

In simple terms, this approach helps to prevent errors, leading to quicker issue resolution. This saves time, money, and the stress of figuring out why you’re encountering errors related to missing dependencies that you didn’t directly modify.

Adding existing table to solution

Within our solution, follow these steps:

  1. Select “Add Existing.”
  2. Choose “Table” from the options.
  3. Pick the specific table you wish to work with.
  4. Then, click on “Next” to proceed.
  5. Now, go ahead and choose “Select objects.”

At this point, you’ll be presented with a list of all the objects that are currently part of the selected table.

  6. Pick only the objects that you plan to include in your solution.
  7. Then, click the “Add” button to add the selected objects.

On this screen, you’ll notice a count of the objects you’ve selected.

  8. Go ahead and click the “Add” button to proceed.

You will now see that you have added your table to the solution.

  1. To view the contents of your table, click on it. (Please note that if there is data in the table, you’ll be able to see all the columns and data. However, these columns are not automatically included in your solution when you move it.)
  2. Now, choose “Columns,” and you will only see the column or object that you added to the solution.

Thank you for joining us this week. In our next installment, we will continue to explore the world of solutioning. Our ultimate aim in solutioning is to lead into a discussion about Application Lifecycle Management (ALM). So, stay tuned for more exciting insights!

Best Practices: Solutioning Part 1

The Power Platform maker experience can seem like the Wild West. At any moment you can create a component that relies on other components – and quickly create a disaster.

In this series I’d like to cover some best practices that I’ve seen over the past 5 years of working with the Power Platform – some lessons learned – such as ‘DON’T BUILD IN THE DEFAULT ENVIRONMENT’ – and others.

Solutions explained in a not so Microsoft way…A.K.A. – my way.

Imagine you have a kid with a room bursting with toys. It’s like a treasure trove of fun waiting to be explored. But hold on, it’s also a minefield! Stepping on a Lego here, tripping over something there – it’s an adventure you never signed up for. Sound familiar?

One fine day, you, the heroic parent, decide to bring order to this chaos. You’re on a mission to save your head and your feet from more unexpected encounters with toy-related perils. So, what’s your grand plan? You buy a container—a magical vessel that can hold all the toys.

In essence, a solution is your trusty container. It’s like that superhero utility belt, but for organizing digital goodies. You can get as fancy as you want with how you organize your toys – perhaps all the trucks go in one, Legos have their VIP section, and stuffed animals chill in another. Each container’s job? To amp up the fun!

Now, you might be thinking, “Why not just cram them all into one mega-container?” Well, my friend, think about it this way: when a child wants to supercharge their fun, they focus on one toy at a time, right? Or maybe you swap out the old for something shiny and new—a fantastic upgrade! We divide our solutions because not everything needs an upgrade. Remember, the bigger the toybox, the heavier it is to move.

So, there you have it – solutions are like these nifty toy containers, but for the Power Platform. They house all the bits and bobs that make the digital world go ’round. You can move them around effortlessly, using pipelines or a little digital muscle. And how you put these solutions together? Well, that’s where your creativity kicks in.

In a nutshell, solutions are your secret sauce, your digital playroom organizers. They’ll make your life as a Power Platform Developer/Maker a breeze. So go ahead, dive into the world of solutions, and let the fun (and efficiency) begin! Toys sold separately… 🚀

Creating a Solution

First things first, head on over to the Power Platform environment. You can do this by simply going to either make.powerapps.com or make.powerautomate.com – choose the one that suits your needs.

Now, on the left-hand side, look for and click on the “Solutions” option in the navigation bar.

Here comes the fun part – hit the “+ New Solution” button. That’s the one that gets the ball rolling for creating your new solution.

In the form that pops up, give your solution a meaningful name. If you’re not cool with the default publisher for your solution, no worries – you can make a new one, and it’s a good idea to give it a name that makes sense. We’ll cover publishers in future posts; until then, see Microsoft’s “Solution concepts” documentation.

Finally, just click “Create,” and there you have it – a solution.

Thank you for tuning in… until next time.

Dataverse Auditing Part 2 – Retrieve Old Value for Dataverse Record Using HTTP With Azure AD

Extending our exploration of auditing within Dataverse, we’ll employ the ‘HTTP With Azure AD’ action in Power Automate to retrieve the previous value of a modified record.

Before proceeding with this tutorial, it’s essential to have auditing enabled in your environment. If you’re not sure how to do this, please consult the first episode in this series for step-by-step guidance.

Business Value

“Garbage In, Garbage Out.” You’ve probably heard the importance of data integrity. Ensuring data consistency and quality is pivotal because it lays the foundation for reliable analytics and decision-making. Audit trails serve a dual purpose: they not only make users responsible for their actions within the system but also offer invaluable data points for analysts to identify trends, facilitating smarter business decisions.

Framework

Solution

In this tutorial, our focus will primarily be within the context of a solution. A solution serves as a container where we organize and house the components related to the digital transformation we are undertaking. While we won’t delve into an exhaustive explanation, it’s important to understand that solutions provide a structured framework for managing and deploying our project’s components.

Datasource

With auditing activated in our setup, go ahead and pick a table to experiment on. For this demonstration, I’ve selected the “Accounts” table from Dataverse’s Common Data Model.

Retrieve Environment URL

To access the Environment URL required for our flow, navigate to the ‘Power Platform Admin Center.’ This URL is crucial for configuring the flow within its respective environment.

Upon reaching the Admin center, follow these steps:

  1. Navigate to the left-hand side navigation menu.
  2. Select “environments.”
  3. Choose the specific environment you are currently working in.

This will allow you to access the environment URL needed later on.

Copy the environment URL and paste it for use later on.

Environment Variable

In our workflow, especially for the ‘HTTP with Azure AD’ action, we will create an environment variable. This variable will store the URL specific to our environment, which we will subsequently incorporate into the request URL for this action.

By leveraging environment variables, you’re not just streamlining your current workflow; you’re future-proofing your solution. When the time comes to move your solution to a new environment, you can easily update these variables to align with the context of the new environment. This ensures that your solution maintains its agility and effectiveness, regardless of where it’s deployed.

In your solution, follow these steps to create a new environment variable:

  1. Select “+New.”
  2. Click on “More.”
  3. Choose “Environment Variable.”

This will initiate the process of creating a new environment variable within your solution.

For the form, please complete it as follows (note that there is some flexibility with the Display Name and Description):

  1. Copy and paste the previously retrieved environment URL into the “Default Value” field.
  2. Save the changes.

This step ensures that the environment URL is properly integrated into the flow, facilitating the seamless configuration of your flow.

Connection Reference

Before we proceed to configure our flow, there’s one more crucial task to enhance its robustness. We need to create a connection reference for the ‘HTTP with Azure AD’ action. This connection reference serves a similar purpose to the environment variable by enabling the flow to seamlessly transition to a new environment while establishing a connection to the resources within that environment. This step ensures the continuity and adaptability of our flow across different environments.

Inside our solution, follow these steps to create a connection reference:

  1. Select “+New.”
  2. Click on “More.”
  3. Choose “Connection Reference” to open the connection reference form.

This form will allow us to establish a connection reference for our flow, enhancing its flexibility and portability across different environments within the solution.

In the connection reference form, follow these steps:

  1. Name the connection reference.
  2. Optionally, add a description.
  3. From the connector dropdown, select ‘HTTP with Azure AD.’
  4. In the connection selector, click “+ New connection” to create a new ‘HTTP with Azure AD’ connection. This action will open a new tab where you can set up the connection details for ‘HTTP with Azure AD.’

In the new tab for connection details, follow these steps to create our connection:

  1. Select ‘Connect directly (cloud-services).’
  2. Enter the environment URL that we retrieved from the admin center earlier into the “Base Resource URL” field.
  3. Enter the environment URL in Azure AD Resource URI (Application ID URI) field.
  4. Select Create to create connection.
Creating connection

5. Next, you will be prompted to select an account to establish the connection with. Once you’ve chosen the account, the connection will be created, solidifying the link between your flow and the ‘HTTP with Azure AD’ action.

Return to the tab with the connection reference form and follow these steps:

  1. Select the refresh button to ensure your newly created connection is visible.
  2. Choose your newly created connection from the list.
  3. Finally, select “Create” to create the connection reference.

This step will integrate the connection reference into your solution.

Flow

Time to build a flow!

Within your solution, proceed as follows:

  1. Select “New.”
  2. Choose “Automation.”
  3. Select “Cloud Flow.”
  4. Opt for “Automated” to set up your flow.
  5. You will be directed to a screen where you can name your flow and select its trigger.

For the purposes of this demo, name your flow ‘Account Updates’ and select the Dataverse trigger, specifically ‘When a row is added, modified, or deleted.’ Now, proceed by selecting “Create” to start configuring your flow.

In the trigger configuration, set the following parameters:

  1. For “Change type,” select “Modified.”
  2. For “Table name,” choose “Accounts” or the table of your choice.
  3. Set the “Scope” to “Organization.”

These parameters will define the trigger conditions for your flow, ensuring it activates when a row is modified in the specified table.

Let’s add a new step to your flow. Follow these steps:

  1. Search for “HTTP with Azure AD.”
  2. Select “Invoke an HTTP request.”

To ensure that your connection reference is properly selected, follow these steps:

  1. Click on the three dots in the top right corner of the “Invoke an HTTP request” action.
  2. Confirm that your connection reference is correctly chosen. This step is essential for maintaining the flow’s integrity if you decide to migrate the solution to a different environment.

To configure the “Invoke an HTTP request” action, follow these steps:

  1. For the HTTP method, select “GET.”
  2. In the URL field of your request, use the following template:
{Environment Variable URL}/api/data/v9.2/RetrieveRecordChangeHistory(Target=@target,PagingInfo=@paginginfo)?@target={'@odata.id':'[table plural name](GUID of the updated record from our trigger)'}&@paginginfo={"PageNumber": 1,"Count": 1,"ReturnTotalRecordCount": true}

Replace the placeholders with the appropriate values:

  • {Environment Variable URL} should be replaced with the actual environment variable you created.
  • [table plural name] should be replaced with the name of the table (in plural form).
  • (GUID of the updated record from our trigger) should be replaced with the GUID of the updated record obtained from your trigger.

This configuration sets up the HTTP request to retrieve the record change history based on the trigger’s parameters.

Make sure to validate and adjust the URL according to your specific environment and requirements.
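As a sanity check on that template, the request URL can be assembled programmatically. Below is a hedged sketch in Python – the helper and variable names (build_change_history_url, env_url, and so on) are my own, not part of Power Automate – that concatenates the pieces the same way the action does:

```python
# Sketch: assemble the RetrieveRecordChangeHistory request URL the same way
# the "Invoke an HTTP request" action does. Helper and variable names here
# are illustrative, not part of Power Automate or Dataverse.
import json

def build_change_history_url(env_url: str, table_plural: str, record_guid: str) -> str:
    """Build the Dataverse RetrieveRecordChangeHistory URL for one record."""
    target = {"@odata.id": f"{table_plural}({record_guid})"}
    paging = {"PageNumber": 1, "Count": 1, "ReturnTotalRecordCount": True}
    return (
        f"{env_url}/api/data/v9.2/RetrieveRecordChangeHistory"
        f"(Target=@target,PagingInfo=@paginginfo)"
        f"?@target={json.dumps(target)}"
        f"&@paginginfo={json.dumps(paging)}"
    )

# Example with a made-up environment URL and GUID:
url = build_change_history_url(
    "https://contoso.crm.dynamics.com",
    "accounts",
    "00000000-0000-0000-0000-000000000001",
)
print(url)
```

The point is only that the three placeholders – environment URL, plural table name, and record GUID – slot into fixed positions; everything else in the URL is constant.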

For the action headers, follow these steps:

  1. Switch to text mode by selecting the ‘T’ icon.
  2. Copy and paste the following headers configuration:
{
  "Accept": "application/json",
  "OData-MaxVersion": "4.0",
  "OData-Version": "4.0",
  "If-None-Match": "null",
  "Prefer": "odata.include-annotations=\"*\""
}

To simplify, follow these steps:

  1. Add a “Compose” action to your flow.
  2. In the “Compose” action, select the output from the “Invoke an HTTP Request” action as its input.

This way, you’re capturing the output from the HTTP request for further use in your flow.
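To give a feel for what lands in that Compose output, here is a trimmed, representative sketch of a RetrieveRecordChangeHistory response and how the old and new values might be read from it. The sample payload and the changed_fields helper are mine – consult your own flow’s run history for the exact structure your environment returns:

```python
# Sketch: pull old and new values out of a RetrieveRecordChangeHistory-style
# response. The sample payload is trimmed and representative, not a verbatim
# Dataverse response; check your flow's run history for the real shape.
sample = {
    "AuditDetailCollection": {
        "AuditDetails": [
            {
                "OldValue": {"name": "Contoso Ltd"},
                "NewValue": {"name": "Contoso Limited"},
            }
        ]
    }
}

def changed_fields(payload):
    """Yield (field, old, new) for each audited change in the payload."""
    details = payload["AuditDetailCollection"]["AuditDetails"]
    for detail in details:
        old, new = detail.get("OldValue", {}), detail.get("NewValue", {})
        for field in new:
            yield field, old.get(field), new[field]

for field, old, new in changed_fields(sample):
    print(f"{field}: {old!r} -> {new!r}")
```

This is the kind of before/after pair the client scenario in the intro needed for building exceptions.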

Save your flow.

To test your flow, follow these steps:

  1. In Dataverse, open the table you are working on.
  2. Edit the value of a field on an existing record within that table.

This action will trigger your flow, allowing you to verify that it responds correctly to the changes made in Dataverse.

Old Value
New Value
Changes reflected in flow

Dataverse Auditing Part 1 – Configure Auditing Settings for Environment

Ever curious about the previous state of a modified record? In the Power Platform, you have various methods to track these changes. One approach is to set up a transaction or log table in Dataverse, capturing every event and its corresponding value. Alternatively, you can enable audit logs in your environment and then deploy a flow utilizing the ‘HTTP With Azure AD’ action to achieve a similar outcome.

Implementing Audit logs in your organization comes with intrinsic business advantages, chief among them being transparency and accountability. This principle aids in identifying the origin of data or transactions within the system, thereby establishing a framework that holds users accountable for their interactions with the platform.

Turn on Auditing for the Environment

To turn on auditing:

  1. Navigate to the Power Platform Admin Center at https://admin.powerplatform.microsoft.com/
  2. Open the appropriate environment.
Power Platform Admin Center

In the ‘Auditing Tile’, you can easily check whether auditing is activated for your environment. If it’s not yet enabled, simply click the ‘Manage’ link located in the upper right corner of the tile, which will direct you to the Auditing Settings page.

Audit Settings

On the Audit Settings screen, check the ‘Start Auditing’, ‘Log access’, and ‘Read logs’ checkboxes.

There is a prompt that instructs you to set the retention policy for the logs. Options are:

  1. 30 days
  2. 90 days
  3. 180 days
  4. One year
  5. Two years
  6. Seven years
  7. Custom
  8. Forever
Audit Settings – Configuration

Configure your Audit Settings and select the Save button.

Nested If

Coding in a dynamic manner is usually one of the best ways to construct a solution. In doing so, designers and programmers try to reduce the amount of guesswork that the end user has to do, in an effort to keep them productive and the process streamlined. To make this magic happen, developers have to do the guesswork programmatically. This requires them to get creative and inject varying forms of logic into the solution. The end user may enter data the same way every time, but the output may not be as straightforward as the entry.

In Power Apps, one of the forms of logic available to us that lowers the amount of thinking the user has to do is the If() function (https://docs.microsoft.com/en-us/powerapps/maker/canvas-apps/functions/function-if). In short, the If() function checks whether a condition is true and then returns a result. The result of the evaluation can be a function nested inside the If(), or something as simple as a color or a line of text.

Side note: The focus of this tutorial is the nested If() function, though the Switch() function will be featured as well. If I am being honest – and I am – I didn’t want to write another section that spells out the Switch() function. The Microsoft Docs link that I provided above covers both functions.

The focus of this blog will be nesting If() functions inside another If() function. The scenario we are programming for is moving an application from one environment to another. In order to move environments, the application must have approval from the qualified individuals in the current environment. Our sign-offs will be collected when a user checks a box. Each checkbox’s visibility will be controlled by the selected item in a dropdown control.

As always, navigate to make.powerapps.com and create a canvas app – it doesn’t matter which form factor you choose. For this exercise I will be using the mobile phone form factor.

Our first order of business is to add a dropdown control to our canvas.

In the Items property, enter the following:

["","DEV","Test","UAT","All"]

Next we’re going to add a Switch() function into the OnChange property of the dropdown control. The purpose of this is to set a visibility variable based on the selected text in our dropdown.

Enter the following PowerFx expression into the OnChange property of the dropdown.

Switch(
    Self.SelectedText.Value,
    // Blank selection: hide every checkbox. Note this matches the empty
    // string "" from the Items list, not a single space.
    "",
    And(
        UpdateContext({VarCheck1: false}),
        UpdateContext({VarCheck2: false}),
        UpdateContext({VarCheck3: false})
    ),
    "DEV",
    And(
        UpdateContext({VarCheck1: true}),
        UpdateContext({VarCheck2: false}),
        UpdateContext({VarCheck3: false})
    ),
    "Test",
    And(
        UpdateContext({VarCheck1: false}),
        UpdateContext({VarCheck2: true}),
        UpdateContext({VarCheck3: true})
    ),
    "UAT",
    And(
        UpdateContext({VarCheck1: false}),
        UpdateContext({VarCheck2: false}),
        UpdateContext({VarCheck3: true})
    ),
    "All",
    And(
        UpdateContext({VarCheck1: true}),
        UpdateContext({VarCheck2: true}),
        UpdateContext({VarCheck3: true})
    )
)
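The lookup this Switch() implements is just a table: selection in, three visibility flags out. Sketched below in Python (the VISIBILITY and checkbox_flags names are mine; the flag values mirror the Switch() in the formula as written):

```python
# Sketch of the dropdown -> checkbox-visibility mapping the Switch()
# performs. Keys mirror the dropdown's Items list; values are the
# (VarCheck1, VarCheck2, VarCheck3) context variables it sets.
VISIBILITY = {
    "":     (False, False, False),
    "DEV":  (True,  False, False),
    "Test": (False, True,  True),
    "UAT":  (False, False, True),
    "All":  (True,  True,  True),
}

def checkbox_flags(selection: str):
    """Return the three visibility flags for a dropdown selection."""
    return VISIBILITY.get(selection, (False, False, False))

print(checkbox_flags("DEV"))  # (True, False, False)
```

Seeing the mapping as a table also makes it easy to audit which environments reveal which sign-off checkboxes.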

Next we’re going to create our three checkboxes. The visibility of each checkbox will host one of the context variables that we created in the previous step.

For this instructional, the checkbox names will be generic. We will leverage two of the properties on our checkboxes. The first is the Default property: we use it to set the default value of the checkbox to false when the checkbox is not visible. We will also use the Visible property to toggle visibility on and off based on the selected item in the dropdown.

In the Default property of the first checkbox, or Checkbox1, enter the following Power Fx expression: If(VarCheck1 = false, false)

In the visible property of the first checkbox or Checkbox1 insert our context variable VarCheck1.

Using context variables VarCheck2 and VarCheck3, repeat the previous steps for Checkboxes 2 and 3.

Once we have our checkboxes wired up we can test the visibility of our checkboxes.

Now the moment that we’ve all been waiting for… the nested If().

If you have been following along – and I hope that you have – the series of steps we went through will all come together once we implement the nested If(). Tying this all together: if a checkbox is visible, it must be checked in order for our submit button to become active. Hiding a checkbox by changing the selection in the dropdown will clear it, and only the remaining visible checkboxes will need to be checked.

So…let’s insert a button. We will leverage two properties here: the Text property and the DisplayMode property.

In the text property of our button, enter in the following formula.

If(
If(
Checkbox1.Visible = true && Checkbox1.Value = true || If(
Checkbox1.Visible = false && Checkbox1.Value = false,
true
),
true
) && If(
Checkbox2.Visible = true && Checkbox2.Value = true || If(
Checkbox2.Visible = false && Checkbox2.Value = false,
true
),
true
) && If(
Checkbox3.Visible = true && Checkbox3.Value = true || If(
Checkbox3.Visible = false && Checkbox3.Value = false,
true
),
true
)

&& If(
And(
VarCheck1 = false,
VarCheck2 = false,
VarCheck3 = false
),
false,
true
),

"Ready to Move",
"Not Ready to Move"
)

At first glance this may not be easy to understand, so I’ll break it down. Essentially, I have one If() function encasing multiple If() functions, and then another If() function inside of that… still confusing, I know. So let’s go with the basics.

The nested if works like this. We have a parent or top level If() function that houses the first If() function for Checkbox1 and it reads like this.

If(

If(Checkbox1 is visible(true) and Checkbox1 is checked (value = true) or

[another nested if]

If(

Checkbox1 is not visible(false) and Checkbox1 is not checked (value = false), then evaluate this If() to true

), If all conditions in this function are met, evaluate to true

)

If() functions nested inside another roll up to the parent; it is important to keep this in mind when you are designing your nested Ifs.
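For readers who prefer reasoning about the formula to re-reading it, each checkbox gate boils down to “hidden, or visible-and-checked,” and the final clause requires at least one checkbox to be visible. Here is my own simplified restatement of that logic in Python – it is not the formula as written, and it assumes (as the Default property enforces) that a hidden checkbox is always unchecked:

```python
def ready_to_move(boxes):
    """boxes: list of (visible, checked) pairs, one per checkbox.

    Simplified equivalent of the nested If(): every visible checkbox must
    be checked, and at least one checkbox must be visible. Assumes hidden
    checkboxes are reset to unchecked, as the Default property does.
    """
    all_satisfied = all((not visible) or checked for visible, checked in boxes)
    any_visible = any(visible for visible, _ in boxes)
    return all_satisfied and any_visible

# Example: only the DEV checkbox is visible, and it is checked.
print(ready_to_move([(True, True), (False, False), (False, False)]))  # True
```

Spelling the condition out this way makes it easier to spot why the button stays disabled when the dropdown selection is blank: no checkbox is visible, so the any_visible clause fails.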

Finally, we will leverage the DisplayMode property of the button. Paste the following into the DisplayMode property.

If(
If(
Checkbox1.Visible = true && Checkbox1.Value = true || If(
Checkbox1.Visible = false && Checkbox1.Value = false,
true
),
true
) && If(
Checkbox2.Visible = true && Checkbox2.Value = true || If(
Checkbox2.Visible = false && Checkbox2.Value = false,
true
),
true
) && If(
Checkbox3.Visible = true && Checkbox3.Value = true || If(
Checkbox3.Visible = false && Checkbox3.Value = false,
true
),
true
)

&& If(
And(
VarCheck1 = false,
VarCheck2 = false,
VarCheck3 = false
),
false,
true
),
DisplayMode.Edit,
DisplayMode.Disabled
)

And now for the final reveal..

I promise I’ll get better at these…Please leave feedback.

Custom Connector – ClickUp API

Foreword, foreword, foreword. A foreword written by the author. Ok…I like to share things that I’ve learned in practice for a client, things that I think are interesting concepts, and things that I think will extend the Power Platform. I like to extend the Power Platform to users that aren’t in the tenant, to provide users and clients with tools that they are familiar with…Insert custom connector, Power Apps Portal, webhooks, MS forms, emails…yada…yada…yada, this could’ve turned into the song that doesn’t end, and yes it does go on and on my friend….

Today, we will focus on the extensibility via the Custom Connector utilizing a tool that I like to use because it has a free tier that is more than useful — ClickUp. In short I use ClickUp the way that someone would use MS Planner, Monday.com, Jira, and the like. I’ve also worked with clients that use it as a ticketing system.

Because I work with the Power Platform, I thought that it would be neat to merge these two swell technologies.

First things first…you will need a ClickUp account, if you don’t have one visit https://www.clickup.com and get signed up or if you have a trusting friend ask to use theirs…

Lets begin…

First open notepad, a text editor, or have a paper and pencil handy. Honestly, I’d go with the copy and paste electronic method because the things that you are going to copy and paste will be pretty long…fine…nix the paper and pencil.

Now…

Once you’ve signed into your ClickUp Dashboard, copy the team id from the address bar and paste it into notepad.

Your team id will be after the first forward slash – indicated by the orange rectangle.

Next go to settings and click on Integrations

Click on ClickUp API

Create App

Enter the name of your app and the redirect uri then select Create App. I use the standard redirect uri that we get back from Power Platform when we create a connector global.consent.azure-apim.net. It’s perfectly fine if you use it too. I promise I don’t own it…no rights whatsoever.

Once you successfully create your app, you will be provided with a Client ID and Client Secret, please copy and paste these to notepad as well.

Now the fun begins, it really doesn’t take long to go through the next steps…maybe. If you do this right, you will create a custom connector to ClickUp.

Documentation is your best friend, fortunately for us, ClickUp has documentation on how to work with their API. In a new tab/window navigate to https://clickup.com/api to see the docs and different actions that are at your disposal.

The first thing we need to do is generate a code. If you used the redirect URI that I provided in the previous step, copy and paste the following address into your address bar; you will need to add your Client ID to the address.

https://app.clickup.com/api?client_id={client_id}&redirect_uri=global.consent.azure-apim.net

After executing the previous step you will be prompted to connect to your workspace. If you’ve done everything right you will see the name of your app in the prompt. Select the workspace and click connect.

Once you select connect you will navigate to a new page. Please copy the code from the address bar and paste it into notepad.

Next we will generate our authorization code or API Key. You will need the Client ID, Client Secret, and the code from the previous step.

Return to https://clickup.com/api

  • Select Authentication
  • Get Access Token
  • Switch to Console
  • Insert
    • Client ID
    • Client Secret
    • Code
  • Call Resources
  • Scroll down to the response body and copy the authorization code then paste it to notepad
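If you would rather script this step than use the docs console, the exchange is a single POST to ClickUp’s token endpoint. A hedged sketch follows – the build_token_request helper is my own, and the block only constructs the request rather than sending it; with real values, sending it (for example with requests.post) should return a JSON body containing your token:

```python
# Sketch: build the ClickUp OAuth token-exchange request that the docs
# console performs for you. build_token_request is an illustrative helper,
# not a ClickUp or Power Platform API. The request is constructed but not
# sent here; substitute real values before actually posting it.
TOKEN_URL = "https://api.clickup.com/api/v2/oauth/token"

def build_token_request(client_id: str, client_secret: str, code: str):
    """Return the (url, params) pair for the token-exchange POST."""
    params = {
        "client_id": client_id,
        "client_secret": client_secret,
        "code": code,
    }
    return TOKEN_URL, params

url, params = build_token_request("MY_CLIENT_ID", "MY_SECRET", "MY_CODE")
print(url)
```

Either route – console or script – ends with the same token you paste into notepad.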

Next navigate to make.powerapps.com

  • Select Data
  • Select Custom Connectors
  • New Custom Connector
  • New From Blank
  • Name your connector
  • Click Continue

Next we will go through setting up general information for our connector. You can choose to upload an image for your connector or add a description – neither is required. However, you will need to enter the host URL and the base URL.

The host URL is api.clickup.com, and in the base URL input insert a forward slash (/).

Next we will set up the security for our connector and this is one of the easier steps.

The Authentication Type will be API Key, the value for Parameter Label and Parameter Name is “Authorization”, and the Location will be “Header”

Next we will create our connector definitions by navigating to the Definition tab.

Select New Action from the left pane.

In the General box the only required field is the operation id. I tend to name this something similar to whatever action is being performed.

Next we will want to select Import from sample in the Request box

For this tutorial, the action that we will be performing is getting all of the Workspaces in our team

Please select Get from the radio options.

Next, enter the following URL as is into the URL text input. We don’t want to hard-code the team id here; we want to be able to supply it from outside the request. By using curly brackets we create parameters that can be passed in from an app or flow.

https://api.clickup.com/api/v2/team/{team_id}/space

Finally select Import
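Those curly brackets are plain path templating: at run time the connector substitutes the value you pass in for the {team_id} placeholder. A small Python sketch of the same idea (the fill_path helper is mine, not part of the custom connector machinery):

```python
# Sketch of how a {team_id} path template becomes a concrete request URL.
# fill_path is an illustrative helper, not part of the custom connector.
TEMPLATE = "https://api.clickup.com/api/v2/team/{team_id}/space"

def fill_path(template: str, **params: str) -> str:
    """Substitute {name} placeholders in a URL template."""
    return template.format(**params)

print(fill_path(TEMPLATE, team_id="1234567"))
# -> https://api.clickup.com/api/v2/team/1234567/space
```

This is why the team id we copied to notepad earlier is the only input the action needs.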

Next, select Create connector to create the connector. Once the connector is created, navigate to the Test tab – here we will create our connection to the API.

Select New connection when prompted, enter the authorization code we created earlier, then create.

Once you click Create you will be moved to the connections screen. If you’ve done it right, you will see that you have a connection to the API.

Now let’s go back to Data > Custom Connectors > edit our newly created connector > Test. In the GetSpaces box at the bottom of the page, enter the team id into the text field and select Test operation. If all goes well we will receive a 200 code and a JSON payload.

Copy the JSON payload and navigate to the Definition tab. In the response area select Add default response > paste payload into the body > import.

Finally update the connector. By adding this response we now have a schema for how our data is being returned and dynamic content to be used in our apps or flows once we import the custom connector into the process.
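To make the dynamic-content idea concrete, here is a trimmed, representative sketch of a GetSpaces-style payload and how its fields might be read. The sample data and the space_names helper are my own illustration – your actual response will carry more fields, which is exactly why you imported your real payload as the default response:

```python
# Sketch: read space names out of a GetSpaces-style response. The sample
# payload is trimmed and representative, not a verbatim ClickUp response.
sample = {
    "spaces": [
        {"id": "790", "name": "Client Work"},
        {"id": "791", "name": "Internal"},
    ]
}

def space_names(payload):
    """Return the name of each space in the response payload."""
    return [space["name"] for space in payload.get("spaces", [])]

print(space_names(sample))  # ['Client Work', 'Internal']
```

Each key in that schema becomes a piece of dynamic content you can select in Power Apps or Power Automate.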

Now that we have the basic knowledge of how to create a custom connector for the ClickUp API, feel free to see what other actions you can create. At the time of this publication I have 15 actions that I am using in an app. In the weeks to come I will share this with you.

Thanks for spending time with me. You could’ve been anywhere else on the world wide web, but you’re here with me…I appreciate that.