M365 OOB Series: Planner

  1. Introduction
  2. Solution
    1. Step 1: Microsoft Forms
    2. Step 2: Microsoft Planner
      1. Create New Plan
        1. Select ‘New Plan’
        2. Name your plan
        3. Set Privacy and Sensitivity
        4. Create
      2. Buckets
        1. Creating New Buckets
        2. Add new bucket
        3. Enter bucket name: {Name of your bucket}
        4. Tasks
    3. Step 3: Power Automate
      1. Trigger
        1. When a new response is submitted
        2. Action (Get response details)
        3. Action (Update task details)
    4. End to end test

Introduction

I believe Microsoft 365 licenses often go underutilized, with many businesses purchasing them without fully exploring their potential. Time and again, I implement solutions for enterprise clients whose needs might not initially seem to require all M365 apps. However, I see these applications as building blocks for powerful solutions in today’s workplace.

I’m passionate about empowering organizations to achieve more, and I believe Microsoft shares this vision, as reflected in the capabilities offered in their standard licenses. One app with great potential is Microsoft Planner. While Planner isn’t as feature-rich as Project or DevOps, it’s a strong alternative to tools like Trello and can significantly benefit task management.

In this discussion, we’ll explore how to enhance Planner using Power Platform, particularly Power Automate. You’ll gain insights into Planner itself and see how Power Automate extends its functionality with M365 components. Best of all, this solution doesn’t require premium licensing—so if you have at least a Business Standard license, you’re good to go!


Solution

Our solution will utilize three main components: Microsoft Forms, Microsoft Planner, and Power Automate.

  • Front-end: We’ll use Forms to collect input from prospective customers.
  • Automation: Power Automate will process this input, creating a follow-up task in Planner for each prospect.
  • Back-end: Planner will guide the onboarding journey for each customer. Once they’re onboarded, their information will be stored in a SharePoint list, ready for use in a Canvas App for customer management (to be introduced in the next discussion).

This streamlined approach ensures a smooth transition from prospecting to customer onboarding.


Step 1: Microsoft Forms

We’ll start by creating a basic form in Microsoft Forms, which is fortunately very user-friendly. The field types are limited, so it’s hard to go wrong here. For this solution, Forms is a solid choice since we’re only collecting general data—not sensitive information. It’s akin to signing up for a newsletter (unless, of course, it’s from a political party during election season!). Overall, Forms is a safe and straightforward option for our data collection needs.

For clarity, Microsoft Forms provides several field types, including choice, text, date, ranking, Net Promoter Score, file upload, rating, and Likert scale. In our form, we’ll use the text field type for each of our four questions, with all fields set as required. The fields will be:

  1. First Name
  2. Last Name
  3. Email
  4. Reason for contacting us

This setup keeps things simple and captures just the essential information.


Step 2: Microsoft Planner

The next component of our solution is Microsoft Planner. There are multiple ways to create a plan in Planner, but since the goal is to encourage users to explore business applications they already have, we’ll stick with the standard, non-premium approach.

For fully automated plan creation, the Graph API for Planner can be used, either through a custom connector or by making HTTP requests in Power Automate. However, for this solution, we’ll take a straightforward approach by manually creating the plan directly in the Planner interface.
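For context, here’s a minimal sketch (in Python) of the request body the Graph API expects when creating a plan. This only builds the body; authentication and the actual HTTP call are omitted, and the group ID is a placeholder.

```python
import json

# Sketch: request body for the Microsoft Graph "Create plannerPlan" call
# (POST https://graph.microsoft.com/v1.0/planner/plans). The group ID is a
# placeholder; in Power Automate this would be sent via an HTTP action or
# a custom connector.
GRAPH_PLANS_URL = "https://graph.microsoft.com/v1.0/planner/plans"

def build_plan_body(group_id: str, title: str) -> dict:
    # A plan is contained by the Microsoft 365 group that owns it.
    return {
        "container": {"url": f"https://graph.microsoft.com/v1.0/groups/{group_id}"},
        "title": title,
    }

body = build_plan_body("00000000-0000-0000-0000-000000000000", "Prospect Onboarding")
print(json.dumps(body, indent=2))
```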

Create New Plan

Navigate to Planner and take the following steps:

Select ‘New Plan’

Name your plan

Set Privacy and Sensitivity

Create

Buckets

When you create your plan in Microsoft Planner, you’ll see it displayed in a board view or Kanban view, where each category is referred to as a bucket. You can customize these buckets based on your onboarding strategy for prospects, organizing them in a way that best fits your workflow.

Creating New Buckets

Follow these steps to create new buckets. You can create as many as needed for your process. For example, I created three buckets: ‘To do,’ ‘Won,’ and ‘Lost.’

Add new bucket

Enter bucket name: {Name of your bucket}

Adding bucket

Tasks

Each bucket holds tasks that are linked to it. While you can create tasks manually, we rely on automation to handle this. When a new form submission is received, a task will automatically be created in the ‘To do’ or starting bucket.


Step 3: Power Automate

Our flow will be simple: we’ll take the form responses and automatically create a Planner task to follow up with the client.

Trigger

When a new response is submitted

When a new form response is submitted, our flow will trigger automatically. Make sure to select the correct form from the dropdown menu in Power Automate to ensure the flow works with the form you’re using.

Triggered when a new response is submitted

Action (Get response details)

The next step in the flow is to add the action to Get Response Details from the form submission. For the Form Id, select the same form you want to receive responses from. Then, use the dynamic content from the previous step to set the Response Id and retrieve the submission details.

Next, add the Create a task action. The only available text field we can use here is the Title field. To make it work, we’ll use dynamic content from the Get response details action, specifically the submitter’s First Name and Last Name. This information will be combined in the Title field so the agent in Planner can easily identify who to follow up with.
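As an illustration, here is the Title assembly expressed in Python. The “Follow up:” prefix is my own choice for this sketch, not something the flow requires.

```python
# Illustrative only: how the Planner task Title is assembled from the
# form response's First Name and Last Name dynamic content. The
# "Follow up:" prefix is an arbitrary choice for this sketch.
def build_task_title(first_name: str, last_name: str) -> str:
    return f"Follow up: {first_name.strip()} {last_name.strip()}"

print(build_task_title("Jane", "Doe"))  # Follow up: Jane Doe
```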

Action (Update task details)

After creating the task, we need to update its details to include relevant information from the submitter, such as their email address. Additionally, we will add a checklist to help the agent track the necessary steps for completing the task efficiently.

Using the dynamic content from the previous action, we will set the Task Id, update the Description field with the necessary data (such as the submitter’s email), and create a checklist item to ensure the agent follows up by emailing the prospect.

End to end test

Contact form
Successful submission

Securing Internal Documents: Managing SharePoint Locations to Restrict External Portal Access

  1. Introduction
  2. Shoutout to Keegan
  3. Solution
    1. Step 1: Configuring the Basic Form
    2. Aside: Spoofing the System
    3. Step 2: Building the Flow
      1. 1. Trigger: When a row is added, modified, or deleted
    4. SharePoint Actions
      1. 2. Create new folder (Create Parent)
      2. 3. Create new folder (Create External Documents)
    5. Dataverse Actions
      1. 1. Add a New Row (Create External Document Location Record):
      2. 2. Add a New Row (Create Parent Document Location Record):
      3. 3. Update a Row (Change Parent Site on External Folder)
    6. In practice
      1. Steps to Set Up the Internal Subgrid:
      2. Test out internal subgrid

Introduction

I want to start by saying that my recent work with Power Pages has been a significant learning experience. While I had some idea of what I was getting into, I had to quickly upskill. Here’s the key takeaway: don’t fear what you don’t know. Fear of the unknown often holds us back, especially in the tech world, where there are always countless uncertainties. We use tools to build solutions and inevitably face gaps or unknowns along the way. Instead of shying away, embrace those unknowns, expect them, and take action. Soapbox moment over.

Shoutout to Keegan

I have to acknowledge my colleague, Keegan, for this. We worked on a Power Pages project with SharePoint integration and wanted to ensure that external users could view only external-facing documents. Internally, a reviewer will upload documents to the external user’s record. While the internal reviewer will have access to all documents in the library, the external user will be restricted to documents stored in an external-facing SharePoint folder.

We experimented with several approaches to implement the segregation, but nothing seemed to work. Initially, we considered adding a metadata field to the document library to flag files or folders as external, then using JavaScript in the portal to filter out the results. However, I was concerned that if the JavaScript failed or loaded slowly, it could inadvertently expose sensitive files to users who shouldn’t have access. That potential risk made me uncomfortable with relying on this approach.

We figured there had to be an internal process Microsoft uses to decide which folder gets surfaced in the subgrid. It was Keegan who suggested that maybe the ‘Created On’ field in the Document Locations Dataverse table was being used to determine which folder gets displayed. And sure enough, that was the case. So, big shoutout to Keegan for that insight. He even suggested I write an article about it, but I have no problem giving credit where it’s due—he’s a great guy.

Solution

I won’t be covering the process of setting up SharePoint integration with Power Pages in this article, but I will go over the basics of our table permissions. Anything beyond that is outside the scope of this discussion. If you’re unfamiliar with the SharePoint integration setup, please visit Manage SharePoint documents. Once you’re comfortable with that, return to this article and pick up the instructions from there.

Step 1: Configuring the Basic Form

For our solution, we didn’t need to use the entire Dataverse form; we just needed a simple space to upload files. You can choose to use the full form or modify it to fit your needs. We’ll be uploading our files to a table we named “Site.” A quick note from experience: avoid naming a table in your Dataverse environment “Site.” Power Pages already uses a table called “Site” that holds the site record for the Power Pages website, so reusing the name could cause conflicts within the environment.

Whatever permissions you set on the form, I recommend changing the parent permission to “Global” for the purposes of this discussion. You can adjust the permissions later based on your specific needs, but for now, this will help demonstrate the functionality. Once you change the parent permission’s access type, we should be all set. Still, please note: until we complete the Power Automate flow, I suggest not uploading anything into the subgrid, since you’d likely have to delete it later anyway.

Aside: Spoofing the System


One thing we noticed when manually walking through this process is that in the Document Location table in Dataverse, the first Document Location record “wins” and becomes the default. This is why we need to “spoof” the system a bit. When the first file or folder is added, it automatically creates the parent document location in Dataverse, which we don’t want in this case. Instead, we want to create the external document location first. Once that’s done, we create the parent document location and then associate it with the external document location. This ensures that the external location takes priority in the subgrid, so it is the folder surfaced to external users on the Power Pages website, while internal users can still see all files and folders, including the external folder, via the model-driven app.
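The ordering trick can be modeled in a few lines of Python. The field and record names here are illustrative, not the actual Dataverse schema.

```python
from datetime import datetime, timedelta

# Toy model of the behavior we observed: among a record's Document
# Location rows, the row with the earliest "Created On" value is the one
# the subgrid surfaces by default. Creating the external location first
# therefore makes it win.
def surfaced_location(locations):
    return min(locations, key=lambda loc: loc["created_on"])["name"]

t0 = datetime(2024, 1, 1)
locations = [
    {"name": "External Documents", "created_on": t0},                    # created first
    {"name": "Parent Folder", "created_on": t0 + timedelta(seconds=5)},  # created second
]
print(surfaced_location(locations))  # External Documents
```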

Step 2: Building the Flow

Building out the flow is fairly straightforward. We want to automatically create the folder structure every time a new site is created, ensuring that when documents are uploaded from Power Pages, they are always directed into the external-facing folder. To implement this design effectively, we’ll use an automated trigger.

1. Trigger: When a row is added, modified, or deleted

This trigger will monitor the creation of new sites and automatically build the necessary folder structure, making sure the external folder is ready for document uploads.

SharePoint Actions

To achieve the necessary design for our folder structure, we will be using two SharePoint actions. These actions will create both the parent folder and the external folder nested inside the parent folder. Both actions will utilize the “Create new folder” action.

2. Create new folder (Create Parent)

  • Site Address:
    This should be the location of your document integration. If you’re unsure of the SharePoint site location for the integration, you can review the document management settings in Advanced Settings to find it.
  • List or Library:
    The library where your documents are stored. Check the site contents of the SharePoint site to verify the document library where you want your documents to be stored.
  • Folder Path:
    The folder path will be a concatenation of the Site Name and the Site GUID, separated by an underscore (_). This ensures a unique folder structure for each site created.
replace(concat(triggerOutputs()?['body/dul_name'],'_',toUpper(triggerOutputs()?['body/dul_siteid'])),'-','')

This setup will create the parent folder, setting the foundation for further customization with the external folder inside.
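In Python terms, the Folder Path expression above does the following (sample values are made up):

```python
# Python equivalent of:
# replace(concat(triggerOutputs()?['body/dul_name'], '_',
#                toUpper(triggerOutputs()?['body/dul_siteid'])), '-', '')
def build_folder_path(site_name: str, site_guid: str) -> str:
    # Concatenate name and upper-cased GUID, then strip every hyphen.
    return f"{site_name}_{site_guid.upper()}".replace("-", "")

print(build_folder_path("LinkedInTestSite", "0a1b2c3d-4e5f-6789-abcd-ef0123456789"))
```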

3. Create new folder (Create External Documents)

  • Site Address:
    Use the same site address as the Create Parent action.
  • List or Library:
    Select the same document library as the Create Parent action.
  • Folder Path:
    Using the dynamic content from the previous action, select the Title of the folder created in the parent action, then add a forward slash (/) and specify the name of your external documents folder (e.g., 0External_Documents).

This will nest the external folder inside the parent folder, ensuring that all external-facing documents are uploaded into the appropriate folder structure.

Dataverse Actions

Once the folder structure is set up in SharePoint, the next step is to establish the corresponding Document Location in Dataverse. Before proceeding with any Dataverse actions, navigate to make.powerapps.com and open the Document Locations table. Find and extract the GUID for the default site’s Document Location record, which should be titled “Documents on Default Site.” This GUID is crucial for the Dataverse actions you’re about to implement, so make sure the record is available and correctly identified before continuing.

1. Add a New Row (Create External Document Location Record):

  • Table Name:
    Document Locations
  • Name:
    External Documents
  • Parent Site or Location (Document Locations):
    sharepointdocumentlocations({GUID of Default Site record})
  • Regarding (Sites):
    dul_sites({GUID of the site in the trigger})
  • Relative URL
    0External_Documents
    (Note: “Sites” is the table name being used for SharePoint integration in this example. Your table name may differ.)

It is essential to create the external document location record before proceeding with any further steps.

2. Add a New Row (Create Parent Document Location Record):

  • Table Name:
    Document Locations
  • Name:
    Trigger Name
  • Parent Site or Location (Document Locations):
    sharepointdocumentlocations({GUID of Default Site record})
  • Regarding (Sites):
    dul_sites({GUID of the site in the trigger})
  • Relative URL
    replace(concat({Name of Trigger},'_',toUpper({GUID of Trigger})),'-','')
    (Note: “Sites” is the table name being used for SharePoint integration in this example. Your table name may differ.)

3. Update a Row (Change Parent Site on External Folder)

  • Table Name: Document Locations
  • Row ID: {GUID of External Folder Document Location}
  • Parent Site or Location (Document Locations): sharepointdocumentlocations({GUID of Parent folder})

It’s crucial to update the parent site or location to align with the folder structure in SharePoint. This process leverages relative URLs, so it’s important that the relative URLs follow a consistent hierarchy. For example:

  • Default Site Relative URL: dul_site
  • Parent Relative URL: LinkedInTestSite
  • External Folder Relative URL: 0External_Folder

By correctly linking these records, the fully constructed URL will follow the pattern: dul_site/LinkedInTestSite/0External_Folder. This ensures the folder structure in Dataverse matches the structure in SharePoint, maintaining consistency and proper document organization.
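A small Python sketch of that resolution, with illustrative field names:

```python
# Sketch of how the full SharePoint path is resolved by walking each
# Document Location's parent chain and joining the Relative URLs.
def resolve_url(location):
    parts = []
    node = location
    while node is not None:
        parts.append(node["relative_url"])
        node = node.get("parent")
    return "/".join(reversed(parts))

default_site = {"relative_url": "dul_site", "parent": None}
parent_folder = {"relative_url": "LinkedInTestSite", "parent": default_site}
external_folder = {"relative_url": "0External_Folder", "parent": parent_folder}

print(resolve_url(external_folder))  # dul_site/LinkedInTestSite/0External_Folder
```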

Automated Dataverse trigger and creating folder structure in SharePoint
Creating external document location record in dataverse
Creating parent folder document location
Reparenting external document location with Parent record

In practice

Steps to Set Up the Internal Subgrid:

1. Open the main form for your table.

2. Add a subgrid to the form.

3. In the subgrid properties, select Related Records.

4. Choose Documents (Regarding) from the list.

5. Select All SharePoint Documents.

6. Save the changes.

7. Publish the form to apply the updates.

Test out internal subgrid

  1. Navigate to the table with the integration and select a record.
  2. Click Edit to open the form.
  3. In the subgrid, choose the Document Location dropdown and select All Locations. This will display all files in the folder structure.

In my case, the internal subgrid shows 5 records, including those uploaded from Power Pages or residing in the External Documents folder, but on the Power Pages side, only 3 files are visible.

  • This is because documents on the Power Pages side are written to the External Folder, so only documents in that folder will be visible to external users.

Important: If you upload a file to any location other than the External Folder, it will not be accessible to external users.

If you select the External Documents tab in the internal subgrid and upload a file, it will be added directly to the External Folder, making it visible to external users.

Internal-facing subgrid with all files including files uploaded from Power Pages
Files residing in the External Documents folder shown here in Power Pages

Things I wish I knew: Converting File Content to JSON

  1. Introduction
    1. Leveling Up Your Power Automate Game with Expressions
    2. Why Expressions Matter
    3. Working with JSON Files Using Expressions
    4. Real-World Example: Inspecting a File from FTP
    5. Benefits of Using Expressions
  2. Solution
    1. Step 1
    2. Step 2
    3. Step 3

Introduction

Leveling Up Your Power Automate Game with Expressions

When I first started using Power Automate, there were a few things I wish I had known that would have saved me time and frustration. One of the biggest lessons? Mastering expressions. Expressions in Power Automate are like a secret superpower that can take your automation to the next level.

Why Expressions Matter

Expressions let you handle logic and data transformations directly within your flow. For example:

• Instead of using a Condition action to validate if outputs meet specific criteria, you can use a single expression to evaluate the data inline.

• This makes your flows more efficient, cleaner, and often faster.

Working with JSON Files Using Expressions

Let’s talk about working with JSON files—a scenario many of us face when automating workflows.

Instead of:

1. Taking the content of a JSON file.

2. Saving it to SharePoint (or another location).

3. Opening the file to inspect its contents.

You simply use an expression in Power Automate to directly parse and view the content of the JSON at runtime.

For instance:

• Use the json() expression to convert file content into a readable JSON object.

• This lets you work with the data directly in the flow without needing to save or manipulate the file externally.

Real-World Example: Inspecting a File from FTP

Imagine this scenario:

• A file is generated on a server and fetched into Power Automate using the SFTP – SSH connector.

• Instead of copying this file to SharePoint just to review its contents, you can apply an expression like json(outputs('Get_file_content')) to convert the file content directly into JSON format.

• This approach enables you to extract, validate, or manipulate data from the file immediately during the flow’s runtime.

Benefits of Using Expressions

1. Efficiency: Skip unnecessary steps like saving files or creating extra conditions.

2. Flexibility: Work with complex data structures, like arrays and objects, without additional processing.

3. Cleaner Flows: Reduce the number of actions in your flow, making it easier to manage and debug.

Solution

Step 1

Create a Power Automate flow with a “Button” trigger.

Step 2

Add a Compose action, and in this action, write the following expression.

json(base64ToString('ewogICAgImludm9pY2VOdW1iZXIiOiAiSU5WLTAwMSIsCiAgICAiaW52b2ljZURhdGUiOiAiMjAyNC0xMS0yOCIsCiAgICAiZHVlRGF0ZSI6ICIyMDI0LTEyLTI4IiwKICAgICJjdXN0b21lciI6IHsKICAgICAgICAibmFtZSI6ICJKb2huIERvZSIsCiAgICAgICAgImFkZHJlc3MiOiAiMTIzIE1haW4gU3RyZWV0LCBDaXR5dmlsbGUsIENvdW50cnkiLAogICAgICAgICJlbWFpbCI6ICJqb2huLmRvZUBleGFtcGxlLmNvbSIKICAgIH0sCiAgICAiaXRlbXMiOiBbCiAgICAgICAgewogICAgICAgICAgICAiZGVzY3JpcHRpb24iOiAiUHJvZHVjdCBBIiwKICAgICAgICAgICAgInF1YW50aXR5IjogMiwKICAgICAgICAgICAgInVuaXRQcmljZSI6IDUwLjAsCiAgICAgICAgICAgICJ0b3RhbFByaWNlIjogMTAwLjAKICAgICAgICB9LAogICAgICAgIHsKICAgICAgICAgICAgImRlc2NyaXB0aW9uIjogIlByb2R1Y3QgQiIsCiAgICAgICAgICAgICJxdWFudGl0eSI6IDEsCiAgICAgICAgICAgICJ1bml0UHJpY2UiOiA3NS4wLAogICAgICAgICAgICAidG90YWxQcmljZSI6IDc1LjAKICAgICAgICB9CiAgICBdLAogICAgInRvdGFsQW1vdW50IjogMTc1LjAsCiAgICAiY3VycmVuY3kiOiAiVVNEIgp9'))
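If you want to see what that expression is doing, here is the equivalent in Python, round-tripping a small sample invoice instead of the full Base64 string above:

```python
import base64
import json

# Equivalent of json(base64ToString(...)): decode Base64 text, then parse
# the result as JSON. We round-trip a small sample invoice to show both halves.
invoice = {"invoiceNumber": "INV-001", "totalAmount": 175.0, "currency": "USD"}
encoded = base64.b64encode(json.dumps(invoice).encode("utf-8")).decode("ascii")

decoded = json.loads(base64.b64decode(encoded).decode("utf-8"))
print(decoded["invoiceNumber"])  # INV-001
```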

Step 3

Run the flow to see the parsed JSON in the Compose output.

SharePoint is SharePoint

  1. Introduction
  2. Why SharePoint to begin with
    1. Technical Debt from Using SharePoint
  3. Dataverse
  4. Conclusion

Introduction

This article goes against a formerly unenlightened Duke. When I started on the Power Platform, I, like many of us in the community, used SharePoint as my back-end data source. At the time, it was a suitable database for the solutions we needed to implement as an organization. Fast-forward six years, and I have long since arrived at the notion that SharePoint is SharePoint.

That’s not to say SharePoint isn’t a powerful solution in its own right, but it is not a relational database, nor was it ever designed to be. While SharePoint excels at document management, collaboration, and metadata tagging, it falls short when it comes to handling complex, relational data structures or scaling to meet the demands of enterprise-level applications. Over time, I’ve realized that forcing SharePoint to act as a database often leads to performance bottlenecks, limitations in data relationships, and challenges with governance.


Why SharePoint to begin with

Let me share a bit of my personal journey. For me, it all comes down to cost. SharePoint is included with most Microsoft 365 licenses, which essentially makes it a “free” option.

SharePoint is great in that a list is easy to spin up, making it a quick and accessible solution for simple data storage. It also allows developers to work with flat data structures, which are ideal for straightforward use cases like task tracking, basic form submissions, or lightweight data management. However, as the complexity of your application grows—requiring relational data, scalability, or advanced querying capabilities—SharePoint’s limitations become more apparent.

While SharePoint shines in scenarios where document management and collaboration are key, it struggles with handling intricate relationships, large datasets, or scenarios demanding real-time performance. For these cases, leveraging a relational database like Dataverse or SQL Server ensures your solution is built on a foundation designed for robust, enterprise-grade applications. The key is to use SharePoint for its strengths and know when to transition to more advanced data solutions for evolving requirements.

Technical Debt from Using SharePoint

Technical debt can easily accumulate with ‘freemium’ offerings, as these often require workarounds to implement effectively. One thing that has consistently saved me when working with SharePoint is Power Automate. It enables functionality in SharePoint that is otherwise native to relational databases like Dataverse.

For example, in Power Apps, there’s a delegation limit when using SharePoint as a database. To address this, you might apply filters to narrow down your data. But what happens when your filters still exceed the delegation limit? In such cases, you’d need to get creative with your approach—either by devising clever patterns to work around the limitation or by using Power Automate to fetch the required rows and return the dataset to your app. This may work for smaller apps and for a limited time, but as your data grows, it can quickly become obsolete.


Dataverse

Today, solutions like Dataverse provide the robust, scalable, and secure foundation needed for modern Power Platform applications. While SharePoint still has its place for document-centric workflows or lightweight data storage, Dataverse is purpose-built for managing structured, relational data with advanced capabilities like business rules, row-level security, and seamless integration across the Power Platform.


If your solution design requires a model-driven app, you can use SharePoint data, but it would need to be integrated via a virtual table in Dataverse. This is because model-driven apps rely on Dataverse and can only access tables that exist in Dataverse, either directly or through virtual tables. However, this would make your solution a premium offering because of the Dataverse integration.

This applies not only to model-driven apps but also to other Power Platform solutions. For example, if a canvas app or a Power Automate flow needs to handle complex relationships, larger datasets, or enhanced security, using Dataverse may become necessary. SharePoint works well for lightweight scenarios, such as simple lists or document-centric workflows, but its limitations in relational data handling and delegation can pose challenges for more complex solutions.

Dataverse is designed to handle structured, relational data and works seamlessly across the Power Platform. While virtual tables can connect SharePoint to Dataverse, this approach adds complexity and costs due to the premium licensing required.

When deciding between SharePoint and Dataverse, it’s important to consider the needs of your solution. SharePoint can be a great option for straightforward use cases with a focus on cost efficiency, while Dataverse is better suited for scenarios requiring scalability, advanced relationships, and deeper integration with Power Platform features. Balancing these factors will help you choose the best approach for your project.

Key Differences

  • Purpose
    SharePoint: Document management system with lightweight data storage capabilities.
    Dataverse: Enterprise-grade relational database for structured data.
  • Data Type
    SharePoint: Suited for flat or semi-structured data (e.g., lists).
    Dataverse: Designed for structured, relational data with complex relationships.
  • Scalability
    SharePoint: Handles small to medium datasets (up to 30M items per list).
    Dataverse: Built for large, enterprise-level datasets and scalable storage.
  • Complex Relationships
    SharePoint: Limited support for lookups and relationships.
    Dataverse: Supports many-to-many and one-to-many relationships, and hierarchical data models.
  • Security
    SharePoint: Permissions are list- or document-level.
    Dataverse: Fine-grained, row-level security with role-based access control (RBAC).
  • Data Capacity
    SharePoint: Limited by SharePoint site collection storage quota.
    Dataverse: Capacity based on allocated Dataverse storage in your tenant.
  • Offline Capability
    SharePoint: Limited offline capabilities (e.g., via synced lists).
    Dataverse: Supports offline data access in Power Apps and Dynamics 365.
  • Integration
    SharePoint: Seamless with Microsoft 365 (Teams, OneDrive, Office).
    Dataverse: Seamless with Power Platform (Power Automate, Power Apps, Power BI).
  • Data Validation
    SharePoint: Basic validation (via Power Automate or custom scripts).
    Dataverse: Advanced validation with calculated columns, business rules, and workflows.
  • Licensing Cost
    SharePoint: Included with Microsoft 365 subscriptions.
    Dataverse: Requires premium licensing (e.g., Power Apps Plan 2, Dynamics 365 licenses).
  • API Integration
    SharePoint: Limited REST API capabilities, slower for bulk operations.
    Dataverse: Rich APIs, faster for batch and complex operations (OData, Web API).
  • Governance
    SharePoint: Harder to implement strict data governance.
    Dataverse: Strong governance with auditing, compliance, and managed environments.

Conclusion

SharePoint can be a great lightweight data source, and I’m not here to steer you away from it. Instead, my aim is to encourage careful planning when designing your solutions. I still use SharePoint for quick proofs of concept or in production environments to hold files that need to be manipulated within a flow. If SharePoint works for your needs and you’re confident in managing its limitations, by all means, use it. Just be mindful of the potential technical debt and additional programming it may require to handle tasks that Dataverse supports natively.

The key takeaway is this: leverage SharePoint for what it excels at—collaboration, document management, and lightweight data storage. When your application requires more complexity, scalability, or advanced capabilities, consider Dataverse or other dedicated database solutions. Understanding and respecting these distinctions has been pivotal in creating solutions that not only address current needs but also provide a solid foundation for future growth.