Securing Internal Documents: Managing SharePoint Locations to Restrict External Portal Access

  1. Introduction
  2. Shoutout to Keegan
  3. Solution
    1. Step 1: Configuring the Basic Form
    2. Aside: Spoofing the System
    3. Step 2: Building the Flow
      1. 1. Trigger: When a row is added, modified, or deleted
    4. SharePoint Actions
      1. 2. Create new folder (Create Parent)
      2. 3. Create new folder (Create External Documents)
    5. Dataverse Actions
      1. 1. Add a New Row (Create External Document Location Record):
      2. 2. Add a New Row (Create Parent Document Location Record):
      3. 3. Update a Row (Change Parent Site on External Folder)
    6. In practice
      1. Steps to Set Up the Internal Subgrid:
      2. Test out internal subgrid

Introduction

I want to start by saying that my recent work with Power Pages has been a significant learning experience. While I had some idea of what I was getting into, I had to quickly upskill. Here’s the key takeaway: don’t fear what you don’t know. Fear of the unknown often holds us back, especially in the tech world, where there are always countless uncertainties. We use tools to build solutions and inevitably face gaps or unknowns along the way. Instead of shying away, embrace those unknowns, expect them, and take action. Soapbox moment over.

Shoutout to Keegan

I have to acknowledge my colleague, Keegan, for this. We worked on a Power Pages project with SharePoint integration and wanted to ensure that only external users could view external-facing documents. Internally, a reviewer uploads documents to the external user’s record. While the internal reviewer has access to all documents in the library, the external user is restricted to documents stored in an external-facing SharePoint folder.

We experimented with several approaches to implement the segregation, but nothing seemed to work. Initially, we considered adding a metadata field to the document library to flag files or folders as external, then using JavaScript in the portal to filter out the results. However, I was concerned that if the JavaScript failed or loaded slowly, it could inadvertently expose sensitive files to users who shouldn’t have access. That potential risk made me uncomfortable with relying on this approach.

We realized there had to be an internal process Microsoft uses to decide which folder gets surfaced in the subgrid. It was Keegan who suggested that the ‘Created On’ field in the Document Locations Dataverse table might determine which folder gets displayed. And sure enough, that was the case. So, big shoutout to Keegan for that insight. He even suggested I write an article about it, but I have no problem giving credit where it’s due—he’s a great guy.

Solution

I won’t be covering the process of setting up SharePoint integration with Power Pages in this article, but I will go over the basics of our table permissions. Anything beyond that is outside the scope of this discussion. If you’re unfamiliar with the SharePoint integration setup, please visit Manage SharePoint documents. Once you’re comfortable with that, return to this article and pick up the instructions from there.

Step 1: Configuring the Basic Form

For our solution, we didn’t need to use the entire Dataverse form; we just needed a simple space to upload files. You can choose to use the full form or modify it to fit your needs. We’ll be uploading our forms to a table we named “Site.” A quick note: avoid naming a table in your Dataverse environment “Site.” Power Pages uses a table called “Site” that holds the site record for the Power Pages website, so using the same name could cause conflicts within the environment.

Whatever permissions you set on the form, I recommend changing the parent permission to “Global” for the purposes of this discussion. You can adjust the permissions later based on your specific needs, but for now, this will help demonstrate the functionality. Once you change the parent permission’s access type, we should be all set. Still, please note—until we complete the Power Automate flow, I advise against uploading anything into the subgrid. You’d likely have to delete it later anyway.

Aside: Spoofing the System


One thing we noticed when manually walking through this process is that in the Document Location table in Dataverse, the first Document Location record “wins” and becomes the default. This is why we need to “spoof” the system a bit. When the first file or folder is added, it automatically creates the parent document location in Dataverse, which we don’t want in this case. Instead, we want to create the external document location first. Once that’s done, we create the parent document location. After the parent is created, we’ll associate it with the external document location. This ensures that the external location takes priority in the subgrid, making it the folder surfaced to external users on the Power Pages website, while internal users can still see everything, including the external folder and any files or folders outside it, via the model-driven app.

Step 2: Building the Flow

Building out the flow is fairly straightforward. We want to automatically create the folder structure every time a new site is created, ensuring that when documents are uploaded from Power Pages, they are always directed into the external-facing folder. To implement this design effectively, we’ll use an automated trigger.

1. Trigger: When a row is added, modified, or deleted

This trigger will monitor the creation of new sites and automatically build the necessary folder structure, making sure the external folder is ready for document uploads.

SharePoint Actions

To achieve the necessary design for our folder structure, we will use two SharePoint actions. These will create the parent folder and the external folder nested inside it. Both steps use the “Create new folder” action.

2. Create new folder (Create Parent)

  • Site Address:
    This should be the location of your document integration. If you’re unsure of the SharePoint site location for the integration, you can review the document management settings in Advanced Settings to find it.
  • List or Library:
    The library where your documents are stored. Check the site contents of the SharePoint site to verify the document library where you want your documents to be stored.
  • Folder Path:
    The folder path will be a concatenation of the Site Name and the Site GUID, separated by an underscore (_). This ensures a unique folder structure for each site created.
replace(concat(triggerOutputs()?['body/dul_name'],'_',toUpper(triggerOutputs()?['body/dul_siteid'])),'-','')

This setup will create the parent folder, setting the foundation for further customization with the external folder inside.
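To make the expression concrete, here is a worked example with hypothetical values substituted in (dul_name and dul_siteid are the column logical names in my environment; yours will differ):

```
replace(
    concat('LinkedInTestSite', '_', toUpper('a1b2c3d4-e5f6-4a0b-9c1d-222233334444')),
    '-',
    ''
)
```

After toUpper() runs, concat() produces LinkedInTestSite_A1B2C3D4-E5F6-4A0B-9C1D-222233334444, and replace() strips the hyphens, leaving the folder name LinkedInTestSite_A1B2C3D4E5F64A0B9C1D222233334444—unique per site and safe for a SharePoint folder path.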

3. Create new folder (Create External Documents)

  • Site Address:
    Use the same site address as the Create Parent action.
  • List or Library:
    Select the same document library as the Create Parent action.
  • Folder Path:
    Using the dynamic content from the previous action, select the Title of the folder created in the parent action, then add a forward slash (/) and specify the name of your external documents folder (e.g., 0External_Documents).

This will nest the external folder inside the parent folder, ensuring that all external-facing documents are uploaded into the appropriate folder structure.

Dataverse Actions

Once the folder structure is set up in SharePoint, the next step is to establish the corresponding Document Location in Dataverse. Before proceeding with any Dataverse actions, navigate to make.powerapps.com and open the Document Locations table. Find and extract the GUID for the default site’s Document Location record, which should be titled “Documents on Default Site.” This GUID is crucial for the Dataverse actions you’re about to implement, so make sure the record is available and correctly identified before continuing.

1. Add a New Row (Create External Document Location Record):

  • Table Name:
    Document Locations
  • Name:
    External Documents
  • Parent Site or Location (Document Locations):
    sharepointdocumentlocations({GUID of Default Site record})
  • Regarding (Sites):
    dul_sites({GUID of the site in the trigger})
  • Relative URL
    0External_Documents
    (Note: “Sites” is the table name being used for SharePoint integration in this example. Your table name may differ.)

It is essential to create the external document location record before proceeding with any further steps.

2. Add a New Row (Create Parent Document Location Record):

  • Table Name:
    Document Locations
  • Name:
    The Name from the trigger output (the site’s name)
  • Parent Site or Location (Document Locations):
    sharepointdocumentlocations({GUID of Default Site record})
  • Regarding (Sites):
    dul_sites({GUID of the site in the trigger})
  • Relative URL
    replace(concat({Name of Trigger},'_',toUpper({GUID of Trigger})),'-','')
    (Note: “Sites” is the table name being used for SharePoint integration in this example. Your table name may differ.)

3. Update a Row (Change Parent Site on External Folder)

  • Table Name: Document Locations
  • Row ID: {GUID of External Folder Document Location}
  • Parent Site or Location (Document Locations): sharepointdocumentlocations({GUID of Parent folder})

It’s crucial to update the parent site or location to align with the folder structure in SharePoint. This process leverages relative URLs, so it’s important that the relative URLs follow a consistent hierarchy. For example:

  • Default Site Relative URL: dul_site
  • Parent Relative URL: LinkedInTestSite
  • External Folder Relative URL: 0External_Folder

By correctly linking these records, the fully constructed URL will follow the pattern: dul_site/LinkedInTestSite/0External_Folder. This ensures the folder structure in Dataverse matches the structure in SharePoint, maintaining consistency and proper document organization.

Automated Dataverse trigger and creating folder structure in SharePoint
Creating external document location record in Dataverse
Creating parent folder document location
Reparenting external document location with Parent record

In practice

Steps to Set Up the Internal Subgrid:

1. Open the main form for your table.

2. Add a subgrid to the form.

3. In the subgrid properties, select Related Records.

4. Choose Documents (Regarding) from the list.

5. Select All SharePoint Documents.

6. Save the changes.

7. Publish the form to apply the updates.

Test out internal subgrid

  1. Navigate to the table with the integration and select a record.
  2. Click Edit to open the form.
  3. In the subgrid, choose the Document Location dropdown and select All Locations. This will display all files in the folder structure.

In my case, the internal subgrid shows 5 records, including those uploaded from Power Pages or residing in the External Documents folder, but on the Power Pages side, only 3 files are visible.

  • This is because documents on the Power Pages side are written to the External Folder, so only documents in that folder will be visible to external users.

Important: If you upload a file to any location other than the External Folder, it will not be accessible to external users.

If you select the External Documents tab in the internal subgrid and upload a file, it will be added directly to the External Folder, making it visible to external users.

Internal-facing subgrid with all files including files uploaded from Power Pages
Files residing in the External Documents folder shown here in Power Pages

Canvas App Optimization: Collections

  1. Prelude
    1. Thank you “L”
  2. Introduction
  3. Collections
    1. Collection Construction
      1. Filter Criteria
    2. Example 1: Filtering Data Source based on Status
    3. Example 2: ShowColumns()
    4. Example 3: DropColumns()
    5. Example 4: 1 Record Collection
    6. Example 5: Real Time Feedback
  4. Conclusion

Prelude

I’ve just come back from a trip to Las Vegas, Nevada, USA, where I had the honor of attending the Power Platform Community Conference — it was fantastic. Before I continue, I must express how incredible the power of community is. My growth over the past six years has been significant, and I believe it has progressed in tandem with the Power Platform Community.

The inspiration for this post comes from the community. Professionally, I hold the position of Power Platform Supervisor, essentially an architect. Before reaching my current role, I was in the same position as many of you, trying to understand the intricacies of the Power Platform. There were a few trailblazers who had already started to unravel its mysteries years ago, like Matthew Devaney, Hardit Bhatia, Reza Dorrani, and Shane Young. Their pioneering work has been instrumental in my journey. Matthew even gifted me a bottle of Canadian maple syrup along with a touching note. This community is amazing!

Thank you “L”

During my time in Vegas, I encountered a kind young woman who, coincidentally, shares the same surname as one of my close friends, though they are not related. We’ll refer to her as L. Her humility and determination to resolve the issue with her app compelled me to offer some guidance.

I noticed many similarities between myself and L, particularly that she was using SharePoint as her data source, which is where many of us seasoned pros on the Power Platform started. Her dedication to her organization’s needs was clear. I sensed her passion, and on the first day, I offered her some advice that seemed to open up a new perspective on her app. The next day, she came back to the conference room, presented the progress she had made overnight, and discussed her successes and challenges with me.

It’s incredible to think that nearly six years ago, I was in L’s position. I suspect that if her passion for the Power Platform endures, she’ll likely surpass me. That’s what it’s all about: spreading ideas and fostering a community that’s the antithesis of gatekeeping. If it’s a matter of money and career, there’s plenty to go around. Here, you’ll find no closed doors. L, I want to express my gratitude for reminding me of what it felt like six years ago and for giving me the opportunity to pay it forward. So, here’s to you, L – thank you.

Introduction

Canvas Apps are both fun and frustrating. I find the beauty to be in creating these pixel-perfect custom applications, the angst in the patterns we must construct to make the app run as smoothly as possible. When done right, not only do you have a beautiful creation, but you also have beautifully written patterns that retrieve data efficiently and in an optimized manner that will make any “pro” developer proud.

Collections

Using collections is a method to enhance our application’s performance. Collections are, in essence, temporary tables or tables stored in the application’s memory, acting as a type of variable. Their strength lies in their capacity to store tabular data, which can be a precise replica of our data source.

This data can stay in the app until the collection is updated or destroyed, either by command or upon the application’s launch or closure. Storing data within the app significantly boosts performance, as it eliminates the need to repeatedly retrieve data, thus avoiding performance lags. When fields in your forms or galleries draw from the collection, updating them provides instant feedback to those fields, bypassing the need for a patch to the data source and the later retrieval of the updated value or reloading of the collection.

Moreover, collections’ value extends to improving the UI of our canvas app, enabling us to write conditional patterns in our formulas that adjust the visibility of controls, borders, colors, and similar elements.

Collection Construction

Filter Criteria

Before we continue with our exploration of collections, it’s important to consider certain aspects of how we will construct our collection. Although collections can be an exact mirror of our data source, it is crucial to contemplate the techniques we use to filter our data source.

Ensuring we have proper filter criteria is crucial because it prevents us from retrieving unnecessary data. We aim to avoid overloading our application with irrelevant rows that do not contribute to the operations we intend to execute. When considering data sources or datasets, think about the number of operations that can be derived from a single source. With this consideration, meticulously decide your objectives and set filter criteria that serve as a pathway to those goals. Otherwise, you end up creating collections that deviate from the original intent of your project.

Example 1: Filtering Data Source based on Status

A prevalent method for applying filter criteria is to set up the database to return only active records. The first example will show this approach.


Using my SharePoint List:

  1. Create Collection called colBooks
  2. Filter SharePoint List by Status field to retrieve records that are active
  3. Display records in items property of table
Creating collection with no filter criteria
Unfiltered collection in data table control
Using Filter function to retrieve records with Active status
Filtered collection
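For reference, the formulas shown in the screenshots are roughly the following sketch. I’m assuming a SharePoint list named Books with a choice column named Status; your names may differ:

```
// Unfiltered: a full copy of the list in memory
ClearCollect(colBooks, Books);

// Filtered: only rows whose Status choice is "Active"
ClearCollect(colBooks, Filter(Books, Status.Value = "Active"))
```

Set the data table’s Items property to colBooks; once the filtered ClearCollect runs, only active records appear.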

Example 2: ShowColumns()

Another way to create a collection in Power Apps is by using the ShowColumns() function. This function enables developers to explicitly choose which columns they want to include and make available within the app.

Our method for building collections improves performance by retrieving only the essential columns. This reduces the app’s load, resulting in faster execution, especially when working with large datasets.

In my experience, there’s rarely a valid reason to retrieve every column from a data source. While clients may sometimes ask for this, it’s crucial to remember that doing so can pull in unnecessary system columns. For example, in SharePoint, this will include columns like {ModerationComment} and {ModerationStatus}, among others.

Formula for ShowColumns(), explicitly selects the columns needed and leaves out the rest
Explicitly selected columns on view in table
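A sketch of the formula, assuming the same hypothetical Books list and illustrative column names:

```
// Collect only the columns the app actually needs
ClearCollect(
    colBooks,
    ShowColumns(Books, "Title", "Author", "Price")
)
```

Everything else, including the SharePoint system columns mentioned above, stays out of memory.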

Example 3: DropColumns()

The inverse of ShowColumns() is DropColumns(). While ShowColumns() lets you explicitly select which columns to show in the app, DropColumns() lets you specify which columns should be excluded. This is particularly useful when you need most of the data but want to omit specific columns, like unnecessary system fields or sensitive information, keeping the app more efficient and secure.

Streamline your app with DropColumns() by removing unnecessary fields and keeping only essential data.
The ISBN field is empty because we used DropColumns() to remove it when refreshing our collection
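The equivalent sketch with DropColumns(), again using the hypothetical Books list:

```
// Keep every column except ISBN
ClearCollect(
    colBooks,
    DropColumns(Books, "ISBN")
)
```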

Example 4: 1 Record Collection

What’s the point of having a collection with just one record? While collections are typically used to display tabular data, they’re also highly flexible and can be updated. I often pull a single record into a collection to isolate it and work with that specific data. In my opinion, it’s more efficient to modify a single record in a collection than to reset an entire variable.

To illustrate this with a use case: In Power Apps, the out-of-the-box form experience can be limited, and there are situations where creating a custom form is necessary. I often do this using a gallery, as the custom form only needs to handle one record at a time. In such cases, I create a single-record collection and assign it to the gallery’s Items property. This makes the contents of that collection available to the gallery or custom form, allowing for more flexibility and control in handling the record.

Using First() function to return the first record from the data source and create a single record collection
Single record collection
Using LookUp function to filter based on unique value (ISBN) to create single record collection
Single record collection filtered by ISBN
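As a sketch, both techniques from the screenshots look something like this (the Books list and the ISBN value are hypothetical):

```
// Single-record collection from the first row of the source
ClearCollect(colSelectedBook, First(Books));

// Single-record collection isolated by a unique value
ClearCollect(colSelectedBook, LookUp(Books, ISBN = "9780000000001"))
```

Because LookUp() returns a single record, ClearCollect() wraps it into a one-row collection you can bind to a gallery acting as a custom form.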

Example 5: Real Time Feedback

A key advantage of a collection is that it stores tabular data in memory on the client side. Since the collection is available within the app, you can change it without needing to alter the data on the server. This allows any control using the collection as its data source to instantly show changes once the collection is updated, providing real-time flexibility and responsiveness in the app.

To update a collection, there are several techniques you can use depending on your needs. You can leverage functions like Update(), UpdateIf(), and Patch(), among others. Each of these offers a different approach—Update() replaces an entire record, UpdateIf() modifies records based on a condition, and Patch() lets you selectively update specific fields within a record. These options give flexibility in how you manipulate data within the collection, enabling efficient real-time updates.

Notice price before modification
Formula to update Price of last book in the collection
Updated price
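A sketch of the update shown above, assuming colBooks from the earlier examples and a Price column:

```
// Change the Price of the last book in the collection;
// any control bound to colBooks reflects the change instantly
Patch(colBooks, Last(colBooks), { Price: 9.99 })
```

Nothing is sent to the server here; to persist the change you would still Patch() the data source separately.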

Conclusion

Once again, I’d like to thank my friend “L.” As a consultant who works with many clients throughout the year, it’s easy to forget those early days and the challenges you face when starting out as a Power Platform developer. If you’re like “L” and need some guidance, don’t hesitate to reach out to the community through various social media forums or the official Power Platform community. There’s a wealth of knowledge and support available.

Collections are a powerful tool, and I encourage all canvas app developers to take full advantage of them. Not only can collections help improve performance in terms of data retrieval, but they can also enhance the user interface and overall app functionality, as shown in the examples throughout this article. There are countless ways to leverage collections, so I urge you to explore their potential. Happy building!

Enhancing Microsoft Bookings with Power Platform: Streamlining Scheduling and Automation


  1. Introduction
  2. Bookings – 30,000-Foot Overview
  3. Our Solution
  4. Building Bookings Site
    1. Step 1 – Bookings Homepage
    2. Step 2 – Create Shared Bookings Page
    3. Step 3 – Add a new service
    4. Service configuration
  5. Creating Appointment Spreadsheet
    1. Step 1: Launch OneDrive
    2. Step 2: Create Excel Workbook
    3. Step 3: Establish a formatted field using Data Validation to generate a dropdown menu
  6. Creating Power Automate Flow
    1. Step 1: Select Flow Type
    2. Step 2: Configure Flow
  7. Conclusion

Introduction

I remember my second IT leader always emphasizing the importance of leveraging out-of-the-box solutions. That initially clashed with how I envisioned my tech career. I imagined myself as a coder, creating innovative, never-before-seen applications. However, over time, I realized the true value lies in making the most of the tools and features already available on our platforms. While it’s possible to build an API for nearly anything, we have to consider that others will maintain what we create. I’ve learned that just because we can build something doesn’t mean we should. The focus should always be on delivering value without overcomplicating the solution.

One of the questions I often ask myself as a consultant is why a business customer would purchase Microsoft 365 Business Licensing and not fully utilize its features. Is it due to a lack of knowledge? Are users simply sticking to familiar tools like Word, Outlook, and Excel? Whatever the reason, I believe we should aim to maximize the value of our existing licensing and platform(s) before seeking external services that could be recreated in-house or eliminating redundant software to save costs.

It is not my intent to talk anyone out of purchasing a platform that they are familiar with. However, I want customers to know what they already have available with their Microsoft 365 subscription. In some cases, newer customers to the Microsoft 365 ecosystem aren’t fully aware of all the services and tools they’re paying for. For example, I have found that customers are unaware of their access to Power Apps and Power Automate—tools that can unlock tremendous potential for customization and automation.


Bookings – 30,000-Foot Overview

When you hear “Microsoft Bookings,” what comes to mind? Or do you even think about Microsoft Bookings at all? If not, you’re not alone. I like to think of Bookings as a web-based interface for Outlook because it essentially builds on Outlook’s scheduling capabilities, like creating Teams meetings. In some ways, Bookings is similar to Calendly, though it lacks support for multiple backend calendars.

If you’re reading this, you likely already have, or are considering, a Microsoft 365 license. Thanks to the Power Platform’s flexibility, integrating with external services like Calendly is quite simple, especially with its available API connector. However, the focus of this article is on maximizing extensibility without incurring extra costs beyond your standard business licensing. As of this writing, the Calendly connector is classified as a premium feature, which means additional charges for using it in automation or apps—approximately $15 per month for Power Automate and $20 per user per month for Power Apps.


Our Solution

We want to provide our clients with the ability to book appointments with us. After the booking is confirmed, we will send an email to the client containing details on how to remit payment before the upcoming appointment. Using Power Automate, we’ll automatically log the booking details into a spreadsheet. This will include the client’s name, meeting description, meeting notes, meeting date, and a boolean field to track whether payment has been received before the scheduled appointment.


Building Bookings Site

Step 1 – Bookings Homepage

To get started, navigate to Bookings – Admin – Outlook (office.com). Once you arrive at the site, you will see the homepage (pictured below).

Bookings homepage

Step 2 – Create Shared Bookings Page

Next, we’re going to create our booking page. There are two ways to create a booking page.

  1. Create from scratch
  2. Clone an existing booking page
Booking Page creation prompt


During this demo, we will create our page from scratch. To begin, select “Create from scratch,” which will launch a creation wizard with the following options:

  • Name – Name of Booking Page
  • Logo – Company logo. I would use a vector graphic of some sort so the image does not get distorted
  • Business type
    • Healthcare
    • Recruiting
    • IT Support
    • Education
    • Financial Services
    • Sales
    • Other
Configuring booking page name, logo, business type, business hours
Inviting users in organization
Selecting service – service is an appointment type
Configuring visibility of bookings page
System is creating bookings page

Bookings page created
Bookings Calendar

Step 3 – Add a new service


Once our Bookings page has been created, the next step is to add services that will be displayed to users. When creating a service, there are several categories with configurable options, including:

  • Basic Details
  • Availability Options
  • Assign Staff
  • Custom Fields
  • Notifications

These settings allow us to tailor the meeting scheduling experience to fit both our availability and the needs of our clients.

Add service modal

Service configuration

As we configure the basic settings for our new service, you’ll learn how the booking system works. Feel free to explore and configure additional options for your service during this demo.

The options we are going to configure can be found under the Basic details tab.

  • Name
  • Description
  • Location
  • Duration
  • Buffer Time
  • Cost
  • Maximum number of attendees

We will also include a custom field to capture how customers discovered our company.

Configuring basic details
Adding custom field

Creating Appointment Spreadsheet

Let’s focus on our outputs; we will utilize a spreadsheet to monitor whether a customer has paid for their appointment. We will store our spreadsheet on OneDrive.

Step 1: Launch OneDrive

  1. To access OneDrive, navigate to office.com
  2. From the app launcher, select the OneDrive logo to launch OneDrive

Step 2: Create Excel Workbook

  1. Click the Plus (+) sign.
  2. Choose ‘Excel Workbook’.
  3. Enter the following column headers for the workbook: Client Name, Meeting Description, Meeting Notes, Meeting Date, Paid.
  4. Highlight the first row, then go to the ‘Styles’ tab and choose ‘Format as Table’.
  5. Tick the box that says “My table has headers”.
  6. Click ‘OK’.

Step 3: Establish a formatted field using Data Validation to generate a dropdown menu

  1. Select the “Paid” field.
  2. Choose the Data tab from the ribbon.
  3. Click on Data Validation.
  4. From the Allow dropdown, choose List.
  5. Enter a comma-separated list with the values true, false.
  6. Click Apply.

Creating Power Automate Flow

Step 1: Select Flow Type

  1. To create our automation to process newly created appointments navigate to make.powerautomate.com
  2. Select Create
  3. Select Automated cloud flow as flow type
  4. Name flow
  5. Select the Bookings trigger “When an appointment is created”
  6. Select Create
Power Automate landing page
Selecting automated flow
Selecting trigger

Step 2: Configure Flow

  1. Sign in to the Bookings calendar using the same credentials you used to create the Bookings page.
  2. Add the Excel action “Add a row into a table.”
  3. Fill in the required details for the action, including file location, document library, file name, table name, and any other necessary fields.
Bookings trigger – “When an appointment is created”
Action to add a new row into our spreadsheet each time a new booking is created.
Flow run to insert row into spreadsheet
New row in our spreadsheet from our new client booking
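One mapping deserves a note: the trigger returns appointment times as ISO 8601 strings, so I format them before writing to the Meeting Date column. A hedged sketch, where startTime is an assumed property name (pick the actual field from the dynamic content pane in your own flow):

```
formatDateTime(triggerOutputs()?['body/startTime'], 'yyyy-MM-dd HH:mm')
```

For the Paid column, enter the literal value false so it matches the dropdown list created with Data Validation.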


Conclusion

While it’s easy to turn to familiar tools or third-party platforms, the true value lies in unlocking the full potential of your existing Microsoft 365 subscription. By harnessing the power of the platform—specifically tools like Power Apps and Power Automate—you gain a comprehensive solution that can streamline processes, drive productivity, and reduce unnecessary costs.

Microsoft 365 isn’t just a collection of isolated applications; it’s a fully integrated platform designed to help you build solutions that fit your unique business needs. Whether you’re automating workflows or creating custom apps, the flexibility and extensibility of Power Platform allow you to create tailored solutions without the complexity of building from scratch or relying on external services.

By investing in this platform as a solution, you’re empowering your business to do more with less—maximizing the value of what you already have while ensuring that any custom developments are truly meaningful and efficient. With the right approach, complex problems can be solved in simpler, more effective ways, giving you the agility to grow and adapt without over complicating your tech stack. Businesses can not only streamline processes and enhance productivity but also avoid unnecessary costs. The goal should always be to maximize the value of the tools at hand, ensuring that any custom development or external service is truly adding value. After all, the power of the platform is in its flexibility and extensibility, often making complex solutions easier than we might initially imagine.

“Printing” in Canvas Apps – Leveraging OneDrive

  1. Introduction
  2. Solution
  3. Demo
    1. Step 1
    2. Step 2
    3. Step 3

Introduction

I’ll see if I can manage this in 20 minutes. I’m currently at a terminal in O’Hare, about to head to Las Vegas for the Power Platform Conference, and I’ve just thought of an intriguing scenario for which I recently devised a solution in a canvas app.

The print functionality in Canvas Apps, while serviceable, is not the best. I found a workaround by utilizing OneDrive’s native capabilities to print a PDF.

Solution

For demonstration purposes, my example is abbreviated. Initially, I had to generate a PDF using Power Automate and Microsoft Word templates, which would then be printable by the user. In the first version, I created the PDF and emailed it to the user, who could then print the attachment. However, this method necessitated that they access Outlook to retrieve the email.

My second approach was to determine how to print the PDF directly from the app, maintaining the format as it appeared when generated in Word. I considered creating a PCF but decided against it due to budget constraints. I experimented with opening and closing the PDF in OneDrive via the browser, attempting to manipulate the URL parameters for use in the app to prompt the browser to initiate printing. Unfortunately, this method was unsuccessful.

As I reflected, I began to count the number of clicks. As developers, our aim is to reduce the number of clicks required for a user to complete a task, since an excess can lead to irritation. My objective was to enable users to complete the printing task in fewer than five clicks, with the app managing the interaction directly, rather than having to press a button, wait for an email, retrieve the attachment, and then print.

I utilized Power Automate to create a Word document, convert it to PDF, save it to OneDrive, retrieve the sharing link for the app, launch the PDF in a new tab, and then print the document.

This enabled me to achieve my objective of printing the document in fewer than five clicks.

Demo

Again, our demo is abbreviated.

Step 1

Add your document library as your data source.

Step 2

Use a button to set a global variable with the LookUp function against your document library, using the ID as the criterion to fetch a single record. Then apply dot notation to set the variable to the record’s ‘Link to item’ value.

Wrap the variable in the Launch function and set the target to ‘new’ so the PDF opens in a new tab.

Step 3

Click the print button.

Quick Tip – Masking Phone Number

I had an interesting question posed to me recently. The customer was using a model-driven app and wanted to format a 10-digit number as a phone number. I thought: sure, create a JavaScript web resource and have it do its jazz after the user enters the 10-digit number, prior to submission of the record.

The customer requested we use no code because they have a strict process for checking in code. So I suggested the following.

  1. Create a Power Automate flow that runs when a record is created or updated
  2. Write the following formula into the phone number field: concat('(', substring(triggerBody()?['text'], 0, 3), ') ', substring(triggerBody()?['text'], 3, 3), '-', substring(triggerBody()?['text'], 6, 4))
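The slicing logic behind that expression can be sketched in Python for a quick sanity check (a hypothetical helper for illustration only; the flow itself uses the workflow expression above):

```python
def mask_phone(digits: str) -> str:
    """Format a 10-digit string as (XXX) XXX-XXXX, mirroring the
    concat/substring expression used in the flow."""
    if len(digits) != 10 or not digits.isdigit():
        raise ValueError("expected exactly 10 digits")
    # substring(x, 0, 3) / substring(x, 3, 3) / substring(x, 6, 4)
    return f"({digits[0:3]}) {digits[3:6]}-{digits[6:10]}"

print(mask_phone("5551234567"))  # (555) 123-4567
```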

Configuration of flow

The number entered as a 10-digit value without formatting
Outputs

Power Pages: Leveraging Dataverse Views for Custom HTML Lists


Harnessing Dataverse Views for Power Pages

Dataverse Views are incredibly powerful tools that can save you significant time by allowing you to create custom views with the filters and sorting arrangements you need. These low-code views can be seamlessly integrated into Power Apps, Power Automate workflows, and Power Pages, making them a versatile asset in your development toolkit.

Accessing FetchXML from a Dataverse View

Did you know you can directly access the FetchXML for any view you create in Dataverse? This is particularly useful when you need to extend or customize the functionality of your views beyond the standard capabilities.

How to Retrieve FetchXML:

  1. In your Dataverse environment, navigate to Tables.
  2. Locate the ‘View’ table.
  3. Locate the ‘FetchXML’ column for the view you want to work with.
    • Tip: Use developer tools to inspect the cell containing the FetchXML and copy it from between the tags; copying the FetchXML directly can be a bit annoying.
  4. Copy the FetchXML code.
  5. Paste it into a Liquid FetchXML object in your Power Pages web page or web template editor.
  6. Add any additional fields or filters.

Why is FetchXML Important for Power Pages?

FetchXML is essential when you want to leverage a view to look up related records with custom logic or design. While out-of-the-box functionality allows you to do this on a page using form context, there are scenarios where you might want to go beyond the standard interface—perhaps to build a custom grid or list using HTML for a more visually appealing website.

Instead of starting from scratch, you can utilize the existing FetchXML from a Dataverse view, apply additional filters (such as the page record ID on a lookup column), and integrate this into your custom code. This approach enables you to display related records in a way that aligns with your desired page design while retaining the power and flexibility of Dataverse.
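As a sketch of how this comes together in a web template, the copied FetchXML can be dropped into a Liquid fetchxml block and the results rendered as a custom HTML list. The table, column, and variable names below are hypothetical, not taken from any specific project:

```liquid
{% fetchxml related_docs %}
<fetch top="10">
  <entity name="annotation">
    <attribute name="filename" />
    <attribute name="createdon" />
    <filter type="and">
      <!-- Additional filter: only records related to the current page record -->
      <condition attribute="objectid" operator="eq" value="{{ request.params['id'] }}" />
    </filter>
  </entity>
</fetch>
{% endfetchxml %}

<ul class="document-list">
  {% for doc in related_docs.results.entities %}
    <li>{{ doc.filename }}</li>
  {% endfor %}
</ul>
```

The fetchxml tag exposes the returned records through its results object, so you are free to style the surrounding markup however the page design requires.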

By using FetchXML, you can achieve a more robust and tailored user experience on Power Pages without reinventing the wheel.


Copying FetchXml from Views table
FetchXML from Views table being consumed in Power Pages webpage with an additional filter to filter by the GUID in the request parameters.

Canvas Apps: Data Source Polling

I don’t consider myself a Power Platform savant, but I will say this: it’s a lot of fun to take the best components of the Power Platform and craft a solution that works for customers. My primary responsibility is to recommend the best possible solution using the most suitable tools available. What does that mean in practice? For instance, I wouldn’t try to build a reporting canvas app with complex PowerFx patterns. Instead, I would create the reports in Power BI and embed them in the app if needed.

The same concept applies to the integration between Power Automate and Power Apps. Why struggle with complex patterns and many lines of PowerFx when you can leverage Power Automate to handle backend processing? Power Automate is designed to handle these tasks efficiently.

But what if you need feedback from Power Automate within your app? If your processing takes less than two minutes, you’re in good shape. But those pesky edge cases do exist, and there will be times when processing might take a bit longer. That’s when a bit of ingenuity and creativity come into play.

Polling

Have you heard of polling? It’s a process where we repeatedly check a data source to see if a specific condition we’re looking for has been met. In a canvas app, we can implement this with a timer. When the timer finishes, the app checks Dataverse, SharePoint, SQL, or another data source to see if the condition has been satisfied. If it has, we provide feedback directly in the app. If it hasn’t, the polling continues until the condition is met or until a counter mechanism or variable stops the polling after a set number of attempts.
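Independent of any particular tool, the polling pattern can be sketched as follows (the hypothetical check_condition stands in for the Dataverse, SharePoint, or SQL query):

```python
import time

def poll(check_condition, interval_seconds=6, max_attempts=10):
    """Repeatedly evaluate check_condition until it returns True
    or the attempt cap is reached (the timer + counter idea)."""
    for attempt in range(1, max_attempts + 1):
        if check_condition():
            return True   # condition met: stop polling and give feedback
        time.sleep(interval_seconds)
    return False          # gave up after max_attempts

# Simulated data source: the condition becomes true on the third check.
state = {"checks": 0}
def file_linked():
    state["checks"] += 1
    return state["checks"] >= 3

print(poll(file_linked, interval_seconds=0))  # True
```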

Step 1: Set Up the Timer

In our canvas app, add a Timer control and set each property listed below to its respective value:

  • Duration: 6000
    The timer duration will be 6 seconds (6000 ms).
  • OnTimerEnd:
    If(
        IsBlank(LookUp(colTestTimer, Id = 0).Column1),
        UpdateContext({ctxEmptyCol: true}),
        UpdateContext({ctxEmptyCol: false})
    )
    If Column1 in the colTestTimer collection is empty, set the context variable ctxEmptyCol to true; otherwise, set it to false. When the context variable is false, the timer will stop.
  • Repeat: true
    The timer will continue to repeat until the polling results are satisfactory.
  • Reset: !ctxEmptyCol
    Reset the timer once all conditions are fulfilled.
  • Start: ctxEmptyCol
    Start the timer while Column1 in the collection is empty.

Step 2: Test

To test our timer, we will require two buttons: one to create the collection with Column1 being empty, and another to initiate a write to Column1 and stop the polling.

Button 1

In the OnSelect property of the first button, paste the following code:

ClearCollect(colTestTimer, {Column1: Text(Blank()), Id: 0}); UpdateContext({ctxEmptyCol: true})

Button 2

In the OnSelect property of the second button, paste the following code:

UpdateIf(colTestTimer, Id = 0, {Column1: "Not Empty"})

Test

Launch the application. Press the first button to initiate the timer and fill the collection. Allow the timer to complete several cycles, and once you’re prepared, hit the second button to refresh the collection. Upon the timer’s completion, it will check the collection, recognize that the condition has been met, and cease polling.

Conclusion

In this article, the focus is on polling a collection; however, your options are not confined to this collection alone. It’s crucial to use polling judiciously and construct patterns that do not impede your app’s performance. An example of this is polling a table to check if a file has been created and linked to a record. Undoubtedly, you can discover numerous intriguing use cases to further this concept.

Data Transformation: Liquid Templates as Maps in Logic Apps and Integrating Them with Power Automate

Table of Contents

  1. Intro
  2. Scenario
  3. Demonstration
    1. Logic App
    2. Power Automate
      1. Secondary option
      2. Result
  4. Recommendations

Intro

In an earlier post, I mentioned transforming data using inline JavaScript powered by an Azure Function. I want to resurrect that idea. This time, I want to extend that conversation and open it up to integrating with Azure Logic Apps and Liquid Templating.

If you don’t want to read this, skip to the meat of the content.

As a consultant, I feel it is part of my duty to inform. Much of what we do with our daily interactions with data is transform it. We take raw data and manipulate it to serve different purposes based on our primary function. Other functions then transform it to suit their needs. In my line of work, it often amazes me how many different objects, presentations, automations, or apps can live off the same dataset, ultimately serving the same purpose: to drive the organization forward.

Power Platform experts may ask why I need a Logic App workflow to do what I can do in Power Automate, and someone proficient in Logic Apps may ask the same thing about Power Automate. Well, besides the similarities between the two technologies, architecting solutions shouldn’t be limited to the tools we are familiar with. The best solutions are made up of the best parts of various tools.

What do I mean by “the best parts”? The scenario I am going to present today could, in theory and practice, be created using an Office Script that lives in SharePoint or an Excel file in OneDrive. In this case, the “best part” of Excel would be the Office Script, as we would not deal with any other functionality tied to the workbook. We could run our scenario through that Office Script and return its output to the flow, achieving virtually the same result. However, that is a different article for a different time.

The concept I want us to grasp is the solution itself. As a Power Platform developer who has now spent time working with Azure resources, I want to understand how I can benefit from both worlds. I know Azure can be daunting because it is a vast platform encompassing many things. When one declares Azure, you have to ask yourself…Azure what?

Scenario

The data we are trying to transform or manipulate comes in many forms. Sometimes, data is returned as a CSV, XML, JSON, plain text, HTML, YAML, or binary data such as files or images. In some cases, it even comes in custom formats specific to an application or system. It is our job as developers to figure out the best way and best tools to transform this data and serve it to our end user in a usable way on the destination server.

We will be working with a JSON payload. This concept was inspired by Microsoft Learn; however, I will not detail the creation of the liquid template. Instead, I will demonstrate how to access the template from Power Automate.
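To make the idea of a JSON-to-JSON map concrete, here is a hedged Python sketch of the kind of reshaping a Liquid template performs inside the Logic App (the output shape is illustrative, not taken from the Learn tutorial; the input keys match the variables used later in this post):

```python
import json

def transform(payload: dict) -> dict:
    """Reshape an input document into a new output document,
    the same job a Liquid map does inside the Logic App."""
    return {
        "fullName": f"{payload['FirstName']} {payload['LastName']}",
        "contact": {"phone": payload["Phone"]},
        # Promote the comma-separated device list to an array of objects.
        "devices": [{"name": d.strip()} for d in payload["Devices"].split(",")],
    }

source = {
    "FirstName": "Ada", "LastName": "Lovelace",
    "Phone": "555-0100", "Devices": "laptop, tablet",
}
print(json.dumps(transform(source), indent=2))
```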

If you have not done so already, please follow the link I previously provided to set up your template and Logic App. Additionally, please add an extra action to your Logic App beyond what is described in the Microsoft Learn documentation, as we will need to return the value to our Power Automate flow.

Demonstration

Logic App

Following the documentation should result in your logic app containing the following trigger and actions:

  • When a HTTP Request is received (Trigger)
  • Transform JSON To JSON
  • Response (needed to provide a response to Power Automate)

In the Logic App portal, you need to copy the URL of the Logic App workflow. This URL will then be invoked in your Power Automate flow.

Power Automate

In Power Automate, we will create a flow with a manual trigger. Following the trigger, we will initialize four variables: one each for FirstName, LastName, Phone, and Devices. These are the same keys used in the Microsoft Learn documentation for creating the liquid template.

As you proceed, please populate the variables with values that align with the documentation. For the ‘devices’ variable, ensure you input values separated by commas, akin to the example shown below. Ensuring that devices are represented as objects in our output will be crucial.

In our Compose action, we’ll construct a JSON object similar to the one shown in the image below. Naturally, we won’t be constructing it exactly as depicted; I’ve included the formulas for reference purposes.

Excluding the trigger, we have five actions and several formulas dedicated to handling data transformation, which constitutes the ideal workflow.

Secondary option

To create a new flow, start with a manual trigger and then add an HTTP action. The flow should resemble the one depicted below.

In the parameters for the HTTP action:

  1. Paste the URI of your Logic App.
  2. Select POST as the method.
  3. In the body, paste the JSON to be parsed.

Result

Now, when you run the flows, you will end up with essentially the same results.

Recommendations

You might be wondering, “What did I lose?” or “What did I gain?” Both inquiries are pertinent. Our brief demonstration revealed that multiple actions are required to achieve what one action could previously accomplish. However, with templating, we gain increased flexibility for complex transformations. Separating the transformation logic from the workflow also enhances maintainability, making updates easier over time. Additionally, there’s the benefit of integrating with a broader range of Azure services. As mentioned before, Azure’s capacity to handle more advanced workloads allows us to enhance our Power Automate flows significantly.

Ultimately, the decision depends on what you, as a developer and an organization, are comfortable with. For smaller workloads, a strict Power Automate approach may suffice, but for enterprise-level solutions, exploring integrations with Azure might be the better option.

Thank you for reading!

Power Automate: Transforming DevOps Queries into Automated Notifications

Since I began my journey in IT, I’ve managed all my projects through Azure DevOps. When used correctly, DevOps is a powerful tool for solution development. However, it’s easy to lose track of tasks—especially if, like me, you tend to complete tasks but sometimes forget to update the hours or close them. In such cases, setting up a reminder can be incredibly helpful.

To create this reminder, we’ll use Power Automate to pull information from a custom query set up in DevOps. While I won’t cover setting up queries in DevOps here, you can refer to Microsoft Learn’s documentation for guidance on managing and organizing your queries.

Once your query is ready, let’s move over to Power Automate. For this demonstration, we’ll use an instant trigger.

Step 1: Call the Azure DevOps Query

Our first action after the trigger will be to fetch the query results from Azure DevOps. Use the “Get query results” action, then select the project name and the query ID.

The query will return a JSON payload, and there are multiple ways to handle it. For our purposes, we’ll send an email to notify the relevant parties of their time status.

Step 2: Process and Format the JSON Payload

Power Automate’s Send Email action supports simple HTML and CSS styling, which allows us to format the email content. We will structure the JSON data as an HTML table. To start, we’ll initialize a string variable that will serve as our row constructor, holding the content for each column in the table.

To build our rows, we will use the “Append to String” variable action to add rows based on the number of rows returned in our JSON object. You’ll notice that we’ll be using HTML tags and inserting our dynamic content between these tags. We’ll also apply inline CSS styling, which will be useful when we construct the final email notification.

Once we insert our dynamic content, Power Automate will automatically create an “Apply to each” loop. This loop will iterate over each item in the JSON object, appending a new row to our table for each iteration. This approach ensures that every entry is neatly formatted and included in the final email notification.
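The append-to-string loop is easy to prototype outside the flow. Here is a Python sketch of the same pattern (the work-item fields are illustrative, not the exact columns your DevOps query returns):

```python
# Build HTML table rows the way the "Append to string variable" action
# does inside the "Apply to each" loop.
work_items = [
    {"Title": "Close task 101", "State": "Active", "RemainingWork": 2},
    {"Title": "Update hours",   "State": "Active", "RemainingWork": 1},
]

rows = ""  # the string variable initialized before the loop
for item in work_items:  # the "Apply to each" loop
    rows += (
        "<tr>"
        f"<td style='padding:4px'>{item['Title']}</td>"
        f"<td style='padding:4px'>{item['State']}</td>"
        f"<td style='padding:4px'>{item['RemainingWork']}</td>"
        "</tr>"
    )

# Wrap with table tags and headers before dropping it into the email body.
table = (
    "<table border='1'>"
    "<tr><th>Title</th><th>State</th><th>Remaining</th></tr>"
    f"{rows}</table>"
)
print(table)
```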

Step 3: Send Email Notification

In our final step, we’ll send an email notification containing the content of our string variable. You’ll see that I’ve included the table tags to encapsulate our formatted rows, along with headers for each column and some additional styling for clarity.

One important detail to remember when setting up the “Send Email” action is to enable the HTML content option. This tells Power Automate that we are using our own HTML and CSS styling. By doing so, our email will properly display the structured table and styling we’ve applied, ensuring a professional and readable notification for the recipients.

By following these steps, you can ensure that your DevOps tasks are always up to date and that all team members are aware of their status, ultimately enhancing productivity and organization.

Things I wish I knew: Converting File Content to JSON

  1. Introduction
    1. Leveling Up Your Power Automate Game with Expressions
    2. Why Expressions Matter
    3. Working with JSON Files Using Expressions
    4. Real-World Example: Inspecting a File from FTP
    5. Benefits of Using Expressions
  2. Solution
    1. Step 1
    2. Step 2
    3. Step 3

Introduction

Leveling Up Your Power Automate Game with Expressions

When I first started using Power Automate, there were a few things I wish I had known that would have saved me time and frustration. One of the biggest lessons? Mastering expressions. Expressions in Power Automate are like a secret superpower that can take your automation to the next level.

Why Expressions Matter

Expressions let you handle logic and data transformations directly within your flow. For example:

• Instead of using a Condition action to validate if outputs meet specific criteria, you can use a single expression to evaluate the data inline.

• This makes your flows more efficient, cleaner, and often faster.

Working with JSON Files Using Expressions

Let’s talk about working with JSON files—a scenario many of us face when automating workflows.

Instead of:

1. Taking the content of a JSON file.

2. Saving it to SharePoint (or another location).

3. Opening the file to inspect its contents.

You simply use an expression in Power Automate to directly parse and view the content of the JSON at runtime.

For instance:

• Use the json() expression to convert file content into a readable JSON object.

• This lets you work with the data directly in the flow without needing to save or manipulate the file externally.

Real-World Example: Inspecting a File from FTP

Imagine this scenario:

• A file is generated on a server and fetched into Power Automate using the SSH – FTP connector.

• Instead of copying this file to SharePoint just to review its contents, you can apply an expression like json(outputs('Get_file_content')) to convert the file content directly into JSON format.

• This approach enables you to extract, validate, or manipulate data from the file immediately during the flow’s runtime.

Benefits of Using Expressions

1. Efficiency: Skip unnecessary steps like saving files or creating extra conditions.

2. Flexibility: Work with complex data structures, like arrays and objects, without additional processing.

3. Cleaner Flows: Reduce the number of actions in your flow, making it easier to manage and debug.

Solution

Step 1

Create a Power Automate flow with a “Button” trigger.

Step 2

Add a Compose action, and in this action, write the following expression.

json(base64ToString('ewogICAgImludm9pY2VOdW1iZXIiOiAiSU5WLTAwMSIsCiAgICAiaW52b2ljZURhdGUiOiAiMjAyNC0xMS0yOCIsCiAgICAiZHVlRGF0ZSI6ICIyMDI0LTEyLTI4IiwKICAgICJjdXN0b21lciI6IHsKICAgICAgICAibmFtZSI6ICJKb2huIERvZSIsCiAgICAgICAgImFkZHJlc3MiOiAiMTIzIE1haW4gU3RyZWV0LCBDaXR5dmlsbGUsIENvdW50cnkiLAogICAgICAgICJlbWFpbCI6ICJqb2huLmRvZUBleGFtcGxlLmNvbSIKICAgIH0sCiAgICAiaXRlbXMiOiBbCiAgICAgICAgewogICAgICAgICAgICAiZGVzY3JpcHRpb24iOiAiUHJvZHVjdCBBIiwKICAgICAgICAgICAgInF1YW50aXR5IjogMiwKICAgICAgICAgICAgInVuaXRQcmljZSI6IDUwLjAsCiAgICAgICAgICAgICJ0b3RhbFByaWNlIjogMTAwLjAKICAgICAgICB9LAogICAgICAgIHsKICAgICAgICAgICAgImRlc2NyaXB0aW9uIjogIlByb2R1Y3QgQiIsCiAgICAgICAgICAgICJxdWFudGl0eSI6IDEsCiAgICAgICAgICAgICJ1bml0UHJpY2UiOiA3NS4wLAogICAgICAgICAgICAidG90YWxQcmljZSI6IDc1LjAKICAgICAgICB9CiAgICBdLAogICAgInRvdGFsQW1vdW50IjogMTc1LjAsCiAgICAiY3VycmVuY3kiOiAiVVNEIgp9'))
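What that expression does can be reproduced in Python for a quick sanity check (a small sample payload stands in for the full invoice JSON encoded above):

```python
import base64, json

# Sample payload, round-tripped through Base64 the way file content arrives.
sample = {"invoiceNumber": "INV-001", "totalAmount": 175.0}
encoded = base64.b64encode(json.dumps(sample).encode()).decode()

decoded_text = base64.b64decode(encoded).decode()   # base64ToString(...)
document = json.loads(decoded_text)                 # json(...)

print(document["invoiceNumber"])  # INV-001
```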

Step 3

Run the flow to see the JSON output.