Power Automate Desktop – Turning on Track Changes in Word Document

Automation is a game-changer in the world of productivity and efficiency. It’s all about streamlining processes, ensuring consistency, and, believe it or not, even providing a sense of buoyancy. Yes, you read that right – buoyancy. Desktop automation can be your secret weapon in keeping those seemingly irreplaceable legacy systems afloat, even when they’re driving your CIO crazy. After all, many of your colleagues are still in love with those systems because they just work – that’s the magic of automation.

But desktop automation isn’t limited to legacy system support; it can also enhance your everyday tasks, like automating Microsoft Word. Now, you might be thinking, “I can do that with a cloud flow,” and you’re not wrong. Cloud-based automation is fantastic for updating templates and structured data. However, what I’m proposing is taking automation to the next level by working with free text.

But, before we dive into the exciting world of free-text automation, let’s talk safety first. There’s no point in automating something if we can’t roll it back in case of a mishap. We’ve all been there – that heart-pounding moment when you’ve pushed the proverbial ‘red button,’ and suddenly, you’re in a lonely, isolating place where your brain feels like it’s losing blood, and you start wondering if you’ll still have a job in the next five minutes. That’s why it’s crucial to implement safety measures to avoid such nerve-wracking situations.

The Guard Rails

In this exercise, we’ll be focusing on a SharePoint document library, which is already equipped with robust document management features such as check-in/check-out and version history. However, one feature that might be missing from your SharePoint arsenal is the ability to activate ‘Track Changes.’

Desktop automation opens up exciting possibilities, allowing us to activate ‘Track Changes’ effortlessly. One of my personal favorites for achieving this is harnessing the power of PowerShell in combination with desktop flows.

PowerShell provides the ultimate level of control and efficiency, eliminating the need for multiple unnecessary actions by allowing us to solve complex tasks with a single script. It’s a game-changer when it comes to automating tasks and streamlining processes in desktop flows.

The Solution

To copy a sharing link for the document in SharePoint and configure its settings, follow these steps:

  1. Right-click on the document you want to work with within your SharePoint document library.
  2. From the context menu that appears, select “Copy Link.”
  3. After copying the link, select the “Settings” link or option to configure the document’s sharing settings as needed. This step may vary slightly depending on your specific SharePoint environment and version.
  4. In the sharing settings, you should see an option to invite people or add users. Enter the email addresses of the users you want to grant “Can Edit” permissions to. You can also use SharePoint groups if applicable.
  5. Select “Apply”; the sharing settings will be saved and the link copied to your clipboard.

In Power Automate Desktop

  1. Add the ‘Terminate’ action
  2. Configure the “Terminate” action settings as follows:
    • Process Name: Enter “WINWORD.EXE” (without quotes) in the “Process Name” field. This is the executable name for Microsoft Word.
    • Action: Choose “Terminate” to close any running instances of Microsoft Word.
  3. Save your flow with the “Terminate” action in place.

Next, we’ll add an action called ‘Run application.’

For the “Run application” action with the specified parameters, here’s how you can configure it:

  1. Application Path: Enter “winword.exe” (without quotes) as the application path. This specifies that you want to run Microsoft Word.
  2. Command Line Arguments: Enter “/t” followed by the sharing link from the Word document in your document library. Make sure to replace “sharing link” with the actual URL or path to your Word document. For example, if your document’s sharing link is “https://example.com/document.docx,” your command line argument would be “/t https://example.com/document.docx.”
  3. Window Style: Set the window style to “Maximized.” This will ensure that the Microsoft Word application window is opened in a maximized state.

With these parameters, the “Run application” action will launch Microsoft Word with the specified command line arguments, opening the document in a maximized window.
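Put together, the action effectively launches a command line like the one below. This Python snippet simply assembles and prints that command for illustration; the SharePoint URL is a placeholder, not a real document link:

```python
# Illustrative only: the effective command line the "Run application" action
# launches. The document URL below is a placeholder.
doc_url = "https://contoso.sharepoint.com/sites/demo/Shared%20Documents/report.docx"
command = f'winword.exe /t "{doc_url}"'
print(command)
```

In Power Automate Desktop, the `winword.exe` part goes in the Application Path field and everything after it goes in the Command Line Arguments field.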

Sidebar – sort of

Introducing the concept of command line switches is crucial when working with Microsoft Office products and many other applications. Command line switches provide a convenient way to customize how an application behaves when launched from the command line or as part of an automation process.

In our specific case, the “/t” switch used before the document link is a common command line switch for Microsoft Word, indicating that we want to open an existing file. By including this switch in the command line arguments, we instruct Microsoft Word to treat the provided link as the path to an existing document that should be opened.

Back to regularly scheduled programming…


Next, we’ll insert two left clicks on the foreground window, which should be Microsoft Word, using the “Mouse Click” action. The goal is to mitigate any potential dialog interruptions:

  1. Add a “Mouse Click” action to your automation.
  2. In the “Mouse Click” action settings, specify the following:
    • Click Type: Choose “Left Click” to perform a single left-click.
    • Target: Select “Foreground Window” to ensure that the click is directed to the currently active window, which should be Microsoft Word.
    • Repeat: Set the “Repeat” option to “1” to perform the click once.
  3. After the first “Mouse Click” action, add another “Mouse Click” action with the same settings as the first one.

Trust me, this step is warranted; I’ve seen some weird things in my time.

Now, let’s incorporate a PowerShell script into our workflow to achieve the goal of enabling track changes for our document. What makes this script particularly useful is its precision: it focuses solely on enabling track changes. Importantly, if track changes are already active, the script won’t “bat an eye”. However, if track changes are not yet enabled, the script will take action, enforcing track changes and ensuring that revisions are visible as they occur.

  1. Add a “Run PowerShell Script” action to your automation.
  2. In the “Run PowerShell Script” action settings, you can paste the provided script into the script editor. Make sure to include the entire script:
# Attach to the running Word instance
$wordApp = [System.Runtime.Interopservices.Marshal]::GetActiveObject("Word.Application")

# Get the active document
$document = $wordApp.ActiveDocument

# Enable track changes
$document.TrackRevisions = $true
$document.ShowRevisions = $true

# Save changes (optional)
$document.Save()
  3. Save your flow with the “Run PowerShell Script” action included.

Let’s introduce a step to test our automation by using the ‘Send Keys’ action. In the input parameters, we’ll enter a specific text of our choice. For instance, I’ll input “tested.” This action will transmit this free-text input directly to our Word document, allowing us to verify the effectiveness of our automation process.

Run the flow

Best Practices: Solutioning Part 1

The Power Platform maker experience can seem like the Wild West. At any moment you can create a component that relies on other components, and things can turn into a disaster quickly.

In this series I’d like to cover some best practices that I’ve seen over the past 5 years of working with the Power Platform – some lessons learned – such as ‘DON’T BUILD IN THE DEFAULT ENVIRONMENT’ – and others.

Solutions explained in a not so Microsoft way…A.K.A. – my way.

Imagine you have a kid with a room bursting with toys. It’s like a treasure trove of fun waiting to be explored. But hold on, it’s also a minefield! Stepping on a Lego here, tripping over something there – it’s an adventure you never signed up for. Sound familiar?

One fine day, you, the heroic parent, decide to bring order to this chaos. You’re on a mission to save your head and your feet from more unexpected encounters with toy-related perils. So, what’s your grand plan? You buy a container—a magical vessel that can hold all the toys.

In essence, a solution is your trusty container. It’s like that superhero utility belt, but for organizing digital goodies. You can get as fancy as you want with how you organize your toys – perhaps all the trucks go in one, Legos have their VIP section, and stuffed animals chill in another. Each container’s job? To amp up the fun!

Now, you might be thinking, “Why not just cram them all into one mega-container?” Well, my friend, think about it this way: when a child wants to supercharge their fun, they focus on one toy at a time, right? Or maybe you swap out the old for something shiny and new—a fantastic upgrade! We divide our solutions because not everything needs an upgrade. Remember, the bigger the toybox, the heavier it is to move.

So, there you have it – solutions are like these nifty toy containers, but for the Power Platform. They house all the bits and bobs that make the digital world go ’round. You can move them around effortlessly, using pipelines or a little digital muscle. And how you put these solutions together? Well, that’s where your creativity kicks in.

In a nutshell, solutions are your secret sauce, your digital playroom organizers. They’ll make your life as a Power Platform Developer/Maker a breeze. So go ahead, dive into the world of solutions, and let the fun (and efficiency) begin! Toys sold separately… 🚀

Creating a Solution

First things first, head on over to the Power Platform environment. You can do this by simply going to either make.powerapps.com or make.powerautomate.com – choose the one that suits your needs.

Now, on the left-hand side, look for and click on the “Solutions” option in the navigation bar.

Here comes the fun part – hit the “+ New Solution” button. That’s the one that gets the ball rolling for creating your new solution.

In the form that pops up, give your solution a cool name. If you’re not cool with the default publisher for your solution, no worries – you can make a new one. It’s actually a good idea to give it a name that makes sense. We’ll cover the publisher in future posts; until then, see Microsoft’s “Solution Concepts” documentation for more detail.

Finally, just click “Create,” and there you have it – a solution.

Thank you for tuning in… until next time.

Dataverse Auditing Part 2 – Retrieve Old Value for Dataverse Record Using HTTP With Azure AD

Extending our exploration of auditing within Dataverse, we’ll employ the ‘HTTP With Azure AD’ action in Power Automate to retrieve the previous value of a modified record.

Before proceeding with this tutorial, it’s essential to have auditing enabled in your environment. If you’re not sure how to do this, please consult the first episode in this series for step-by-step guidance.

Business Value

“Garbage In, Garbage Out.” You’ve probably heard the importance of data integrity. Ensuring data consistency and quality is pivotal because it lays the foundation for reliable analytics and decision-making. Audit trails serve a dual purpose: they not only make users responsible for their actions within the system but also offer invaluable data points for analysts to identify trends, facilitating smarter business decisions.

Framework

Solution

In this tutorial, our focus will primarily be within the context of a solution. A solution serves as a container where we organize and house the components related to the digital transformation we are undertaking. While we won’t delve into an exhaustive explanation, it’s important to understand that solutions provide a structured framework for managing and deploying our project’s components.

Datasource

With auditing activated in our setup, go ahead and pick a table to experiment on. For this demonstration, I’ve selected the “Accounts” table from Dataverse’s Common Data Model.

Retrieve Environment URL

To access the Environment URL required for our flow, navigate to the ‘Power Platform Admin Center.’ This URL is crucial for configuring the flow within its respective environment.

Upon reaching the Admin center, follow these steps:

  1. Navigate to the left-hand side navigation menu.
  2. Select “environments.”
  3. Choose the specific environment you are currently working in.

This will allow you to access the environment URL. Copy it and set it aside for use later on.

Environment Variable

In our workflow, especially for the ‘HTTP with Azure AD’ action, we will create an environment variable. This variable will store the URL specific to our environment, which we will subsequently incorporate into the request URL for this action.

By leveraging environment variables, you’re not just streamlining your current workflow; you’re future-proofing your solution. When the time comes to move your solution to a new environment, you can easily update these variables to align with the context of the new environment. This ensures that your solution maintains its agility and effectiveness, regardless of where it’s deployed.

In your solution, follow these steps to create a new environment variable:

  1. Select “+New.”
  2. Click on “More.”
  3. Choose “Environment Variable.”

This will initiate the process of creating a new environment variable within your solution.

For the form, please complete it as follows (note that there is some flexibility with the Display Name and Description):

  1. Copy and paste the previously retrieved environment URL into the “Default Value” field.
  2. Save the changes.

This step ensures that the environment URL is properly integrated into the flow, facilitating the seamless configuration of your flow.

Connection Reference

Before we proceed to configure our flow, there’s one more crucial task to enhance its robustness. We need to create a connection reference for the ‘HTTP with Azure AD’ action. This connection reference serves a similar purpose to the environment variable by enabling the flow to seamlessly transition to a new environment while establishing a connection to the resources within that environment. This step ensures the continuity and adaptability of our flow across different environments.

Inside our solution, follow these steps to create a connection reference:

  1. Select “+New.”
  2. Click on “More.”
  3. Choose “Connection Reference” to open the connection reference form.

This form will allow us to establish a connection reference for our flow, enhancing its flexibility and portability across different environments within the solution.

In the connection reference form, follow these steps:

  1. Name the connection reference.
  2. Optionally, add a description.
  3. From the connector dropdown, select ‘HTTP with Azure AD.’
  4. In the connection selector, click “+ New connection” to create a new ‘HTTP with Azure AD’ connection. This action will open a new tab where you can set up the connection details for ‘HTTP with Azure AD.’

In the new tab for connection details, follow these steps to create our connection:

  1. Select ‘Connect directly (cloud-services).’
  2. Enter the environment URL that we retrieved from the admin center earlier into the “Base Resource URL” field.
  3. Enter the environment URL in Azure AD Resource URI (Application ID URI) field.
  4. Select “Create” to create the connection.
  5. Next, you will be prompted to select an account to establish the connection with. Once you’ve chosen the account, the connection will be created, solidifying the link between your flow and the ‘HTTP with Azure AD’ action.

Return to the tab with the connection reference form and follow these steps:

  1. Select the refresh button to ensure your newly created connection is visible.
  2. Choose your newly created connection from the list.
  3. Finally, select “Create” to create the connection reference.

This step will integrate the connection reference into your solution.

Flow

Time to build a flow!

Within your solution, proceed as follows:

  1. Select “New.”
  2. Choose “Automation.”
  3. Select “Cloud Flow.”
  4. Opt for “Automated” to set up your flow.
  5. You will be directed to a screen where you can name your flow and select its trigger.

For the purposes of this demo, name your flow ‘Account Updates’ and select the Dataverse trigger, specifically, ‘When a row is added, modified or deleted.’ Now, proceed by selecting “Create” to start configuring your flow.

In the trigger configuration, set the following parameters:

  1. For “Change type,” select “Modified.”
  2. For “Table name,” choose “Accounts” or the table of your choice.
  3. Set the “Scope” to “Organization.”

These parameters will define the trigger conditions for your flow, ensuring it activates when a row is modified in the specified table.

Let’s add a new step to your flow. Follow these steps:

  1. Search for “HTTP with Azure AD.”
  2. Select “Invoke an HTTP request.”

To ensure that your connection reference is properly selected, follow these steps:

  1. Click on the three dots in the top right corner of the “Invoke an HTTP request” action.
  2. Confirm that your connection reference is correctly chosen. This step is essential for maintaining the flow’s integrity if you decide to migrate the solution to a different environment.

To configure the “Invoke an HTTP request” action, follow these steps:

  1. For the HTTP method, select “GET.”
  2. In the URL field of your request, use the following template:
{Environment Variable URL}/api/data/v9.2/RetrieveRecordChangeHistory(Target=@target,PagingInfo=@paginginfo)?@target={'@odata.id':'[table plural name](GUID of the updated record from our trigger)'}&@paginginfo={"PageNumber": 1,"Count": 1,"ReturnTotalRecordCount": true}

Replace the placeholders with the appropriate values:

  • {Environment Variable URL} should be replaced with the actual environment variable you created.
  • [table plural name] should be replaced with the name of the table (in plural form).
  • (GUID of the updated record from our trigger) should be replaced with the GUID of the updated record obtained from your trigger.

This configuration sets up the HTTP request to retrieve the record change history based on the trigger’s parameters.

Make sure to validate and adjust the URL according to your specific environment and requirements.
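To make the template concrete, here is a hypothetical Python helper that assembles the same request URL from its parts. The environment URL, table name, and GUID below are placeholders, not real values:

```python
def change_history_url(env_url: str, table_plural: str, record_id: str,
                       page: int = 1, count: int = 1) -> str:
    """Build the RetrieveRecordChangeHistory request URL from its parts.

    Hypothetical helper: env_url, table_plural, and record_id correspond to
    the placeholders in the template above.
    """
    # @target identifies the record whose change history we want
    target = f"{{'@odata.id':'{table_plural}({record_id})'}}"
    # @paginginfo controls paging of the returned audit details
    paging = (f'{{"PageNumber": {page},"Count": {count},'
              f'"ReturnTotalRecordCount": true}}')
    return (f"{env_url}/api/data/v9.2/RetrieveRecordChangeHistory"
            f"(Target=@target,PagingInfo=@paginginfo)"
            f"?@target={target}&@paginginfo={paging}")

url = change_history_url("https://contoso.crm.dynamics.com",
                         "accounts",
                         "00000000-0000-0000-0000-000000000000")
print(url)
```

In the flow itself, the environment variable token replaces the hard-coded base URL, and the GUID comes from the trigger outputs.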

For the action headers, follow these steps:

  1. Switch to text mode by selecting the ‘T’ icon.
  2. Copy and paste the following headers configuration:
{
  "Accept": "application/json",
  "OData-MaxVersion": "4.0",
  "OData-Version": "4.0",
  "If-None-Match": "null",
  "Prefer": "odata.include-annotations=\"*\""
}

To simplify, follow these steps:

  1. Add a “Compose” action to your flow.
  2. In the “Compose” action, select the output from the “Invoke an HTTP Request” action as its input.

This way, you’re capturing the output from the HTTP request for further use in your flow.
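For reference, RetrieveRecordChangeHistory nests the audit rows under AuditDetailCollection → AuditDetails, each carrying OldValue and NewValue objects. The sketch below parses a trimmed, illustrative sample of that shape in Python; a real payload carries additional metadata and OData annotations:

```python
import json

# Trimmed, illustrative sample of a RetrieveRecordChangeHistory response.
# Field values are made up for the example.
sample = json.loads("""
{
  "AuditDetailCollection": {
    "TotalRecordCount": 1,
    "AuditDetails": [
      {
        "OldValue": {"name": "Contoso Ltd"},
        "NewValue": {"name": "Contoso Limited"}
      }
    ]
  }
}
""")

# Pull the previous and current values of the changed column
details = sample["AuditDetailCollection"]["AuditDetails"]
old_name = details[0]["OldValue"]["name"]
new_name = details[0]["NewValue"]["name"]
print(old_name, "->", new_name)
```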

Save your flow.

To test your flow, follow these steps:

  1. In Dataverse, open the table you are working on.
  2. Edit the value of a field on an existing record within that table.

This action will trigger your flow, allowing you to verify that it responds correctly to the changes made in Dataverse.

Screenshots: the old value, the new value, and the changes reflected in the flow run.

Dataverse Auditing Part 1 – Configure Auditing Settings for Environment

Ever curious about the previous state of a modified record? In the Power Platform, you have various methods to track these changes. One approach is to set up a transaction or log table in Dataverse, capturing every event and its corresponding value. Alternatively, you can enable audit logs in your environment and then deploy a flow utilizing the ‘HTTP With Azure AD’ action to achieve a similar outcome.

Implementing Audit logs in your organization comes with intrinsic business advantages, chief among them being transparency and accountability. This principle aids in identifying the origin of data or transactions within the system, thereby establishing a framework that holds users accountable for their interactions with the platform.

Turn on Auditing for the Environment

To turn on auditing:

  1. Navigate to the Power Platform Admin Center @ https://admin.powerplatform.microsoft.com/
  2. Open the appropriate environment.
Power Platform Admin Center

In the ‘Auditing Tile’, you can easily check whether auditing is activated for your environment. If it’s not yet enabled, simply click the ‘Manage’ link located in the upper right corner of the tile, which will direct you to the Auditing Settings page.

Audit Settings

On the Audit Settings screen, check the ‘Start Auditing’, ‘Log access’, and ‘Read logs’ checkboxes.

There is a prompt that instructs you to set the retention policy for the logs. Options are:

  1. 30 days
  2. 90 days
  3. 180 days
  4. One year
  5. Two years
  6. Seven years
  7. Custom
  8. Forever
Audit Settings – Configuration

Configure your audit settings and select the Save button.

Nested If

Coding in a dynamic manner is usually one of the best ways to construct a solution. In doing so, designers and programmers try to reduce the amount of guesswork that the end user has to do, in an effort to keep them productive and the process streamlined. To make this magic happen, developers have to do the guesswork programmatically. This requires them to get creative and inject varying forms of logic into the solution. The end user may enter data the same way every time, but the output may not be as straightforward as the entry.

In Power Apps, one of the forms of logic available to us that lowers the amount of thinking the user has to do is the If() function https://docs.microsoft.com/en-us/powerapps/maker/canvas-apps/functions/function-if. In short, the If() function checks whether a condition is true, then returns a result. The result of the evaluation can be a function that is nested inside of the If() or something as simple as a color or a line of text.

Side note: The focus of this tutorial is the nested If() function; the Switch() function will be featured as well. If I’m being honest, and I am, I didn’t want to write another section that spells out the Switch() function. The Microsoft Docs link that I provided above covers both functions.

The focus of this blog will be nesting If() functions inside of another If() function. The scenario that we are programming for is moving an application from one environment to another. In order to move environments, the application must have approval from the qualified individuals in the current environment. Our signoffs will be collected when a user checks a box. The checkbox visibility will be controlled by the selected item in a dropdown control.

As always, navigate to make.powerapps.com and create a canvas app; it doesn’t matter which form factor you choose. For this exercise I will be using a mobile phone form factor.

Our first order of business is to add a dropdown control to our canvas.

In the Items property, enter the following:

["","DEV","Test","UAT","All"]

Next, we’re going to add a Switch() function to the OnChange property of the dropdown control. The purpose of this is to set visibility variables based on the selected text in our dropdown.

Enter the following PowerFx expression into the OnChange property of the dropdown.

Switch(
    Self.SelectedText.Value,
    "",
    UpdateContext({VarCheck1: false, VarCheck2: false, VarCheck3: false}),
    "DEV",
    UpdateContext({VarCheck1: true, VarCheck2: false, VarCheck3: false}),
    "Test",
    UpdateContext({VarCheck1: false, VarCheck2: true, VarCheck3: false}),
    "UAT",
    UpdateContext({VarCheck1: false, VarCheck2: false, VarCheck3: true}),
    "All",
    UpdateContext({VarCheck1: true, VarCheck2: true, VarCheck3: true})
)

Next, we’re going to create our three checkboxes. The visibility of each checkbox will be driven by one of the context variables that we created in the previous step.

For this instructional, the checkbox names will be generic. We will leverage two of the properties of our checkbox. The first is the Default property; we want to use this property to set the default value of our checkbox to false if the checkbox is not visible. We will also use the Visible property to toggle visibility on and off based on the selected item in the dropdown.

In the Default property of the first checkbox, or Checkbox1, enter the following PowerFx expression: If(VarCheck1 = false, false)

In the Visible property of Checkbox1, insert our context variable VarCheck1.

Using context variables VarCheck2 and VarCheck3, repeat the previous steps for Checkboxes 2 and 3.

Once we have our checkboxes wired up we can test the visibility of our checkboxes.
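The dropdown-to-visibility wiring above can be modeled as a simple lookup table. This is an illustrative Python analogue, not Power Fx, and it assumes each environment toggles only its own checkbox while “All” shows all three:

```python
# Map dropdown selection -> (VarCheck1, VarCheck2, VarCheck3), mirroring the
# Switch() in the OnChange property. Illustrative analogue only.
VISIBILITY = {
    "":     (False, False, False),
    "DEV":  (True,  False, False),
    "Test": (False, True,  False),
    "UAT":  (False, False, True),
    "All":  (True,  True,  True),
}

var_check1, var_check2, var_check3 = VISIBILITY["Test"]
print(var_check1, var_check2, var_check3)
```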

Now, the moment that we’ve all been waiting for… the Nested If().

If you have been following along, and I hope that you have, the series of steps that we went through will all be brought together once we implement the Nested If(). Tying this all together: if a checkbox is visible, it must be checked in order for our submit button to become active. Removing a checkbox by changing the selection in the dropdown will clear it, and only the remaining checkboxes will need to be checked.

So… let’s insert a button. We will leverage two properties here: the Text property and the DisplayMode property.

In the text property of our button, enter in the following formula.

If(
    If(
        Checkbox1.Visible = true && Checkbox1.Value = true || If(
            Checkbox1.Visible = false && Checkbox1.Value = false,
            true
        ),
        true
    ) && If(
        Checkbox2.Visible = true && Checkbox2.Value = true || If(
            Checkbox2.Visible = false && Checkbox2.Value = false,
            true
        ),
        true
    ) && If(
        Checkbox3.Visible = true && Checkbox3.Value = true || If(
            Checkbox3.Visible = false && Checkbox3.Value = false,
            true
        ),
        true
    ) && If(
        And(
            VarCheck1 = false,
            VarCheck2 = false,
            VarCheck3 = false
        ),
        false,
        true
    ),
    "Ready to Move",
    "Not Ready to Move"
)

At first glance this may not be easy to understand, so I’ll break it down. Essentially, I have one If() function encasing multiple If() functions, and I then put another If() function inside of that… still confusing, I know. So let’s go with the basics.

The nested If works like this: we have a parent, or top-level, If() function that houses the first If() function for Checkbox1, and it reads like this.

If(

If(Checkbox1 is visible(true) and Checkbox1 is checked (value = true) or

[another nested if]

If(

Checkbox1 is not visible(false) and Checkbox1 is not checked (value = false), then evaluate this If() to true

), If all conditions in this function are met, evaluate to true

)

Ifs nested inside another roll up to the parent; it is important that you keep this in mind when you are designing your nested Ifs.
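To make the roll-up concrete, here is an illustrative Python model of what the button formula evaluates (a sketch of the logic, not Power Fx): the button is ready only when at least one checkbox is visible and every visible checkbox is checked.

```python
# Illustrative model of the nested If(): each checkbox is a (visible, checked)
# pair; the button is enabled when at least one checkbox is visible and every
# visible checkbox is checked.
def ready_to_move(boxes):
    """boxes: list of (visible, checked) pairs, one per checkbox."""
    any_visible = any(visible for visible, _ in boxes)
    all_satisfied = all(checked for visible, checked in boxes if visible)
    return any_visible and all_satisfied

print(ready_to_move([(True, True), (False, False), (False, False)]))   # True
print(ready_to_move([(True, False), (True, True), (False, False)]))    # False
```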

Finally, we will leverage the DisplayMode property of the button. Paste the following into the DisplayMode property.

If(
    If(
        Checkbox1.Visible = true && Checkbox1.Value = true || If(
            Checkbox1.Visible = false && Checkbox1.Value = false,
            true
        ),
        true
    ) && If(
        Checkbox2.Visible = true && Checkbox2.Value = true || If(
            Checkbox2.Visible = false && Checkbox2.Value = false,
            true
        ),
        true
    ) && If(
        Checkbox3.Visible = true && Checkbox3.Value = true || If(
            Checkbox3.Visible = false && Checkbox3.Value = false,
            true
        ),
        true
    ) && If(
        And(
            VarCheck1 = false,
            VarCheck2 = false,
            VarCheck3 = false
        ),
        false,
        true
    ),
    DisplayMode.Edit,
    DisplayMode.Disabled
)

And now for the final reveal..

I promise I’ll get better at these…Please leave feedback.

How Did We Get Here?

The first thing that I want to get across to whoever reads this is… YOU CAN DO IT. If you’re reading this, you may have just embarked on your journey with the Power Platform, your organization is thinking about making the switch to go low-code, or you’re like me: knee deep in the Power Platform, but routinely looking up to marvel at where you are.

I’m Duke, by the way. I have been working with the Power Platform for two years and have apps and flows in every time zone in the continental United States; hopefully I can get to Alaska and Hawaii, then the rest of the world.

How did we get here? My path to the Power Platform may seem similar to yours, or completely devoid of the nuisances and challenges that you’ve faced. Either way, we’re here.

I started working in a factory at age 25 while attending school full time. Prior to graduation, I accepted an analyst position in the IT department. For about seven months I worked as a help desk technician assisting users with issues related to our ERP system. Although I was grateful for this opportunity, I needed to find my spot, and ultimately something that I owned. Each member of the team had their niche, and they did well at it.

In the middle of all of this, the guy that took a chance on me and hired me left the company and I felt like I was in limbo. For about two months I did my job wondering what the next phase would be. I knew that new leadership would bring new challenges and who knows, maybe I’d lose my job. Fortunately, things turned out well and I was introduced to SharePoint.

I know, you probably thought that I would say I dove right into the Power Platform, but trust me, it’s happened this way for a lot of people. SharePoint… I can’t say I was too happy about my new role as a part-time SharePoint admin, but it was new for me and I had something that I could own. I thought it would be temporary, as I was tasked with moving our company intranet over to SharePoint Online, but it blossomed.

The finance team was looking to replace their matrix approval flows. I knew nothing about flows in general (remember factory worker, now IT). Initially we were going to build them using the 2013 workflows in SharePoint, but they weren’t as robust as the CFO needed them to be. We needed a way to resolve this matter and I didn’t want to let my boss down as he was new to the company.

Playing around in a SharePoint list one day, I noticed that there was a button at the very top. The “Flow” button.

What was this “Flow” thing, and why was it there? Little did I know, it was at this moment that I “discovered” the Power Platform.

To meet the requirements of the finance department, the flow would need to follow our authority matrix for approvals which was manageable because flow has approval functionality, but the fun didn’t stop there, they wanted a dashboard.

There is one small problem with approvals in flow. In order to interact with the request, users would have to use Outlook; for our operation this wasn’t feasible, and I wouldn’t recommend it for high-volume requests.

Suffice it to say, I was stuck and our solution was incomplete. I had only been at this for about three months, and there seemed to be no way up. We needed this dashboard to complete our solution, or we were dead in the water. After googling multiple combinations of “dashboard with flow,” “dashboard with SharePoint,” and the like, I once again made another “discovery”: Power Apps.

One week before I was to deliver the finished product, I scrapped all of my work and took a different approach using flow to move the requests once a determination was made in the custom dashboard in Power Apps. I watched tons of videos and practically slept in my basement to make sure we could deliver the solution on time.

During the project I discovered that I had an affinity for the Power Platform, so much so that I purchased my own tenant so I could make and break things. I began freelancing in addition to my day job and started networking to meet other #PowerAddicts.

Eventually, I left my position at my former employer to become a consultant at Hitachi Solutions of America because I wanted to do all things Power Platform – all day and I’m still learning and growing.

I wanted my first post to be a solution or a tutorial, but I wanted to let you know how we got here. The use of “we” is intentional. If there is anything I learned while working on the Power Platform, it is that it is a community, and you should leverage it. Commonality is an overarching theme in Microsoft’s Power Platform community; we may have different use cases and work in different industries, but believe it or not, we are all solving problems to aid our businesses in streamlining their processes. I have seen so many different solutions created by the community, and I often take bits and pieces and add them to my solutions. Now I’m at the point where I need to pay it forward and share.

We are accountants and former engineers. We are pro developers and citizen developers. The good thing about this platform is that there is a space for everyone; it’s inclusive. I urge you to follow some of the superstars in the community, and if you feel so inclined, subscribe and follow me.

We’re going to have fun…