Beginning Microsoft Dataverse

Exploiting Microsoft’s Low-Code Database for the Power Platform

Brian Hodel
Bothell, WA, USA
Table of Contents

Record Ownership .......................... 26
Other Advanced Table Options .............. 27
Configuring a Custom Table ................ 28
Columns ................................... 30
Relationships ............................. 39
Relationship Types ........................ 40
Creating Relationships .................... 42
Summary ................................... 49
Index .................................... 221
About the Author
Brian Hodel is a Microsoft Power Platform developer who
is passionate about solving problems. His background in
Lean Six Sigma and interest in application development
converged as he began using what would eventually come
to be known as the Power Platform. Since then, he has been
developing enterprise solutions based on Dataverse in the
Power Platform, speaking at conferences, participating
on customer advisory boards with Microsoft product
development teams, and leading the internal Power
Champions group at his current company.
About the Technical Reviewer
Doug Holland is a software architect with over 20 years of professional experience,
including 13 years at Microsoft Corporation. He is a former Microsoft MVP and Intel
Black Belt Developer and holds a master’s degree in software engineering from Oxford
University.
Introduction
As businesses have evolved to be more nimble in response to changes in customers,
economies, and workforces, they have put more pressure on technology organizations
to evolve faster and deliver more than ever before. Information technology (IT) is no longer
an isolated services organization that can move at a glacial pace, delivering a new
monolithic system every few years that the business must learn to adapt to. Technology
must be released quickly, adjusted frequently, and be maintainable by many. This book
shows how to utilize Microsoft Dataverse in the Microsoft Power Platform to enable
technology organizations to move and change faster than ever before. By learning how to
use this low-code/no-code development platform, you will be able to deliver solutions
with, not to, business teams, challenge the status quo, and step confidently into the
future with a solid foothold in the next generation of platform development tools.
Microsoft had the basic building blocks: applications, workflows, and reports,
but it still needed to put them together in a meaningful way. In comes Dynamics 365,
and, with it, Common Data Service, soon to be called Dataverse. These tools, together,
represent a tremendous amount of power with a relatively low entry threshold. This
means you are no longer required to rely on a small group of highly skilled developers to
solve software-related problems.
The time-to-value is faster and the barrier to entry into the field of software development
has gotten lower, which means that the tools are, once again, placed in the hands of the
front-line workers. Enter the “Low-Code Revolution.”
Low-code Revolution
The “Low-Code Revolution,” or, more recently, “Low-Code / No-Code Revolution”
as the capabilities of the platform have expanded, is a term that Microsoft has
adopted to describe the application development tools in the Power Platform. Just as
assembly language led to functions, which led to libraries, the evolution of application
development has moved from pure code to WYSIWYG, making development tools
more accessible to everyone. Microsoft began taking all those pieces and putting them
together to create an amazing self-serve productivity platform where users could build
apps, workflows, and reports, and store data, all in one place.
The concept behind low-code/no-code is not unique to the Microsoft Power
Platform. There are other vendors that have solutions that support the concept of
allowing the development of applications, workflows, and databases without requiring
a user to have extensive knowledge or skills. However, Microsoft has, arguably, done this
better than any other vendor.
Intended Audience
The Power Platform consists of an array of components that enable the development of
a variety of solutions by all levels of developers, from citizen to professional. This book
is for Microsoft Power Platform users who want to exploit Microsoft’s low-code database
offering for that platform; for power users and citizen developers who are looking for
tools to quickly build scalable business solutions that don’t require a strong developer
background; and for Dynamics 365 developers who want to better understand the
backend of that system.
CHAPTER 1

Microsoft Power Platform

Power Apps
Power Apps is the application development component of the platform. It is primarily
a What You See is What You Get (WYSIWYG) interface, but also has the capability
to handle full-code development through custom components. Far from its roots
as a simple form builder, Power Apps enables users to build robust applications
that can handle complex business needs and be quickly developed and changed as
business needs shift. Power Apps can be broken into two distinct types: Canvas and
Model-Driven.
© Brian Hodel 2023
B. Hodel, Beginning Microsoft Dataverse, https://doi.org/10.1007/978-1-4842-9334-8_1
• Canvas Apps. These are typically apps that require very specific
custom interfaces and layouts. Canvas apps allow developers to
design completely custom layouts, colors, fonts, animations, etc.
Canvas apps can be used with Dataverse, but they can also be used
on any of the hundreds of other data sources that are available
through the connectors. (See Figure 1-1.)
Power Automate
Power Automate, previously known as Flow (which you will still see all over the product),
is the workflow engine of Power Platform. It allows the automation of tasks like record
updates, notifications, approvals, and many more. It also has powerful analytical
features to discover processes, analyze them, and improve them, as well as a variety of
artificial intelligence (AI) capabilities. Power Automate can be broken down into three
distinct types:
• Cloud Flow. These run entirely in the cloud and are incredibly
customizable. You can choose from thousands of trigger actions
and logic paths to build out process automations quickly and easily.
Its graphical interface is similar to that of Microsoft Visio in nature,
so even new users can pick up this tool quickly and begin building
workflows.
• Power Pages. These are websites that allow users outside your tenant
to access and interact with data in Dataverse tables. There are several
options for authenticating users or allowing anonymous access, so
they are more versatile than the other app options available in Power
Platform. Power Pages is a new, more feature-rich version of the previous Portals product.
Power BI
Power BI allows users to build advanced reporting and analytics with interactive visuals
and self-service features for end users, such as drilling down to further investigate data
points or exporting report data to Excel to do ad-hoc analysis. Newer features such as
Datamarts and Microsoft Teams integration have extended the self-serve aspects of
Power BI even further, keeping with the theme of the platform.
While Power BI does not have any Dataverse-dependent features, Dataverse does
natively integrate with Power BI seamlessly. Pulling a Dataverse data source into Power
BI also brings any table relationships, record permissions, and many other features that
make using Dataverse a clear winner as a data source for Power BI.
Dataverse
Dataverse is often viewed as a database, and, technically, that is correct. However, the
database is only one small part of what Dataverse is. Calling Dataverse a database is like
calling the ocean a bunch of water. The ocean is indeed a body of water, but it also has a
great many other things that it is and supports. The ocean supports an immense variety
of life both in the water and on land, is integral to the weather, and provides a method
by which to move people and goods around the world. The ocean is a foundation of the
ecosystem here on Earth, and, in a similar way, Dataverse acts as a support structure
for Power Platform with data storage, security and compliance, application lifecycle
management (ALM) enablement, and much more. Without Dataverse, Power Platform
would simply be a set of tools existing in Microsoft Office. But, with Dataverse, Power
Platform is a cohesive platform that enables users to solve problems faster, better, and
cheaper than they could with traditional tools available in the market.
Environment Types
There are five types of environment: default, sandbox, production, Microsoft Teams,
and trial. Each of these has a unique set of features and limitations, so a good
understanding of them is important to using Power Platform effectively.
Here is an overview of the different types for reference:
• Sandbox. These are meant to be used for testing, so they have some
additional features, such as the ability to reset, copy, and delete, that
are not available in production environments. Typically, you would
have a sandbox environment for your development work and possibly
another one for your test environment. You could even get fancy and
create one in a different region that gets updates prior to yours to test
out new features, but that is a bit much for most use cases.
• Production. For any solutions that you have that are in use for
production purposes, you want to make sure you are using a
production environment. Production environments are specifically
set up to prevent you from accidentally losing your work because they
require extra steps to do any major changes, like deleting or resetting.
Also, you typically want to restrict development work in a production
environment to avoid any production downtime. This means
restricting who has elevated permissions in these environments as
well as using managed solutions (more on that later).
Each environment is isolated from the others, so you can make any changes you
want in your Development and Test environments, or even delete them, without
affecting your Production environment. This also applies to data stored in Dataverse
and any connections to external data. When migrating a solution from your test to your
production environment, it will not carry over your data or connections, so it will not
impact what you have set up in production. This makes it very convenient if you have
confidential data in production that your test users don’t have clearance to see.
It is also worth noting that environments don’t have a Dataverse database installed
by default. If you want to use Solutions, which are critical to proper ALM practices, you
are going to need to install the database by going to the Power Platform Admin Center
at https://admin.powerplatform.microsoft.com/environments, selecting your
environment, and selecting Add Database, as shown in Figure 1-4.
Power Platform supports full ALM capabilities and has a variety of ways to
implement it successfully to ensure that you are building and releasing quality products
while maintaining maximum uptime. The core feature that enables effective ALM in
Power Platform is called solutions.
Solutions
Solutions are a way of storing all the artifacts of a project, such as apps, flows, and tables.
You can think of a solution as a folder of all your project documents. Solutions keep your
projects separate and organized, so you don’t make changes that cause issues in someone
else’s work. They also make it easier to ensure that all your dependencies are accounted
for as you prepare to migrate between environments. You can add and remove artifacts from
solutions, and an individual artifact can even exist in multiple solutions, since solutions
simply reference artifacts rather than storing them.
Because solutions allow you to organize your environment assets into a folder-like
structure, you don’t necessarily need to have a different set of environments for each
solution. You may have a team with many apps that are not related, but have a similar set
of permissions, users, or datasets. If this is the case, you can simplify things by building
several projects in one environment set and using solutions to organize them. Solutions
can also have different Publishers, which can help to organize who owns each solution.
You can choose the Default publisher, which is typically set as Default Publisher for
orgxxxxx, but you can easily create different publishers for individual solutions or groups of solutions.
Solution Types
Every environment starts with at least one solution, the default solution. This solution includes
every artifact in that environment. You cannot export the default solution, so it cannot be
used for ALM. Instead, you will want to create a new solution for each project to keep things
organized and easy to migrate to your downstream environments, such as test and production.
Tip The default solution is a good place to look for artifacts that you cannot find
in your environment, because it contains every artifact from every other solution.
There are two types of solutions, managed and unmanaged, and each has a distinct use.
When you create a new solution, it will be unmanaged.
Although there are settings at the individual artifact level that allow or restrict
specific changes in managed solutions, the primary differences are outlined in Table 1-1.
Deployment
There are two primary methods of deploying your solutions to downstream
environments: manual and automated. While the goal of an effective ALM strategy
should involve automation of deployments, I generally recommend that new developers
use the manual method initially because it gives them a chance to see the individual
steps as they occur and where to look for errors if they come up.
The most basic method of implementing ALM is to manually export a solution from one
environment and import it into another, one step at a time.
The export process begins with a screen titled “Before you export,” as shown in
Figure 1-5. Always select “Publish all changes” before you proceed, as only published
changes will be exported with your solution file. The other option is “Check for issues”;
while not required, it is a good idea to run this to see whether it identifies any
performance or stability issues before you move on.
Select “Next” to move to the next screen. On this screen you will be prompted
to select either “Managed” or “Unmanaged.” (See Figure 1-6.) If you are exporting
to a production environment, always select “Managed.” If you are exporting to a test
environment, you can choose either “Managed” or “Unmanaged,” depending on your
preference.
You also have the option to set the version on this screen, although the system
automatically increments the version number every time you export. The format is
major.minor.build.revision, such as 2.1.2.112, and you can adjust it as necessary
for your needs.
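The versioning scheme can be mimicked in a short sketch. This is illustrative only: the `bump_version` helper and the choice of which segment to increment are assumptions for demonstration, not part of any Power Platform API (the platform increments the number for you on export).

```python
def bump_version(version: str, part: str = "revision") -> str:
    """Increment one segment of a major.minor.build.revision solution version."""
    parts = ["major", "minor", "build", "revision"]
    nums = [int(n) for n in version.split(".")]
    i = parts.index(part)
    nums[i] += 1
    # Reset every segment to the right of the one that was bumped.
    for j in range(i + 1, len(nums)):
        nums[j] = 0
    return ".".join(str(n) for n in nums)

print(bump_version("2.1.2.112"))           # 2.1.2.113
print(bump_version("2.1.2.112", "minor"))  # 2.2.0.0
```

The reset step mirrors common versioning conventions: bumping minor restarts build and revision from zero.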
After choosing either “Managed” or “Unmanaged,” and optionally adjusting the
version number, select “Export,” and the system will begin processing your request. After
a short time, you will see a prompt appear at the top of the screen that allows you to
download the solution file to your computer.
• Upgrade. This is the most common option and will be used in most
cases. It will make any changes to the solution in the destination
environment, such as updating, adding, or removing any assets
to match what is in the new solution file you are importing. This
process basically consists of two stages: importing the new version,
then deleting the old version. They are done in sequence and
automatically. If you choose this and move forward with selecting
“Import,” you will not need to take any further action to complete the
deployment.
• Update. This is rarely used because it replaces the older solution with
the new version instead of upgrading it. This means that any assets
that are not in the upgrade will not be removed from your solution, so
your destination environment will no longer match what you have in
your source environment. There are cases where this is a valid option,
such as if you need to make updates but not, for instance, remove an
existing workflow that is part of the solution. This will complete your
deployment.
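The difference between Upgrade and Update can be sketched with plain dictionaries standing in for a solution's asset inventory. The asset names and versions below are hypothetical; the point is only the set behavior just described.

```python
# Hypothetical asset inventories, keyed by asset name, valued by version.
old = {"App": "v1", "Flow": "v1", "LegacyWorkflow": "v1"}
new = {"App": "v2", "Flow": "v1"}  # LegacyWorkflow was removed upstream

def upgrade(current: dict, incoming: dict) -> dict:
    # Upgrade: the destination ends up matching the incoming solution exactly,
    # so assets absent from the new file (LegacyWorkflow) are deleted.
    return dict(incoming)

def update(current: dict, incoming: dict) -> dict:
    # Update: incoming assets overwrite existing ones, but assets missing from
    # the new file are left in place, so the environments can drift apart.
    return {**current, **incoming}

print(upgrade(old, new))  # LegacyWorkflow is gone
print(update(old, new))   # LegacyWorkflow survives
```

This is why Update is called out as the rarer choice: after an Update, the destination no longer mirrors the source environment.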
Automated Deployments
Once you are familiar with the operations and how things work when manually updating
your solutions between environments, you will want to look into how to automate those
deployments. Both Azure DevOps and GitHub have pre-built actions that you can utilize
to build out pipelines to make your deployments easy and predictable. In addition, you
can utilize other features in the tools, such as approval steps, test plans, version history,
and variable updates. Utilizing these automated tools allows you to have total confidence
in your ALM process and ensure the highest-quality product gets to your customers.
As you can see in Figure 1-7, the steps are basically the same as those for the manual
process, but the pipelines allow you to automate the tasks and add in additional controls
and steps. You can also add in approval phases between the steps if you have a team
of testers that must sign off before a deployment is completed. This is often the case
for controlled environments where a team of people performs user acceptance testing
(UAT) to verify changes in a test environment before things are moved to production.
You can see an example of this in Figure 1-8.
Solution Layers
As I said before, an asset can be part of multiple solutions simultaneously. For instance,
you may have a products table that is used in two different projects in the same
environment. This is completely normal in app development, and in Power Platform is
made visible and manageable in what are referred to as solution layers. There are two
distinct types of layers in Dataverse:
If you were to then import Solution 1 to the environment, it would become the top
layer, and, as such, would update the runtime behavior to a max length of 150, as shown
in Figure 1-10.
The runtime behavior of an asset applies to the entire environment, not just the
assets in that layer. This is why it is important to understand how changes to your
solution may impact others who are working in your environment if they are using that
same asset.
Unmanaged Layers
Unmanaged layers can occur in one of two ways: by importing an unmanaged solution
into the environment, or by directly customizing an asset in the environment (for
example, through the default solution). Either of these changes results in an
unmanaged layer, and the
unmanaged layer always persists at the top level of the layer stack. You can see an
example of this in Figure 1-11. In this example, a user went into the environment and
updated the max length property to 400, which created an unmanaged layer to that asset
and changed the runtime behavior to a max length of 400. Now, that unmanaged layer
will persist even if the managed solutions are updated, because the unmanaged layer
always stays at the top. Therefore it is risky to make unmanaged changes.
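A rough mental model of these layering rules can be written in a few lines. The solution names and the max length property are borrowed from the figures; the list-based stack is a simplification, not how Dataverse stores layers internally.

```python
# Managed layers in import order: the most recently imported sits highest.
managed_layers = [
    {"name": "Solution 2", "max_length": 300},
    {"name": "Solution 1", "max_length": 150},  # imported last, top managed layer
]
unmanaged_layer = {"name": "Unmanaged edit", "max_length": 400}

def runtime_value(managed, unmanaged, prop):
    # The unmanaged layer, if one exists, is pinned above all managed layers;
    # otherwise the most recently imported managed layer wins.
    stack = managed + ([unmanaged] if unmanaged else [])
    for layer in reversed(stack):
        if prop in layer:
            return layer[prop]
    return None

print(runtime_value(managed_layers, None, "max_length"))             # 150
print(runtime_value(managed_layers, unmanaged_layer, "max_length"))  # 400
```

Once the unmanaged edit exists, re-importing either managed solution never changes the runtime value, which is exactly why unmanaged changes are risky.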
Summary
While Power Platform has a great many tools, many more than are even discussed here,
you can see how Dataverse is very much the core of the platform. And while you can
use the separate components without Dataverse, you are going to struggle with building
truly enterprise-grade tools. Dataverse gives you a tremendous number of features and
capabilities, from model-driven apps and business process flows to scalable data storage
and advanced ALM capabilities.
Now that you have a decent understanding of Power Platform and the major
components that are available, we can discuss how to use those components to design
and build solutions that can change the world.
CHAPTER 2
Data Layer
In the previous chapter, we discussed the high-level components of the Microsoft Power
Platform and how they work together to enable enterprise-grade solution development
using low-code/no-code tools such as Power Automate and Power Apps. In this chapter,
we start getting into the details of how to design and build the data layer of your solution.
The data layer handles the table design and architecture, as well as storage and access to
the data. In Dataverse, like in other tools in Power Platform, the abstraction layer removes
a great deal of the complexity of building and configuring the database from the users and
allows them to focus on what they need, not how it is done. Creating tables, configuring
fields, and managing keys can get a bit confusing for someone who isn’t familiar with SQL
development, but is very intuitive in Dataverse. In short, you don’t have to worry about the
backend because Dataverse handles that for you. So, in this chapter, we will simply discuss
the Dataverse interfaces and how to set up your data structures; we won’t worry about the
layer beneath Dataverse where the actual data is stored, or how it operates.
Tables
Because Dataverse is an abstraction of data storage, it is important to be aware that
you may run into different terminology as you browse documentation. You can see this
terminology breakdown in Table 2-1.
Table 2-1. Dataverse Terminology
Power Apps UI Dataverse SDK Dataverse Web API
Chapter 2 Data Layer
Tables are essentially structures that hold data. Each table is defined by a set of
columns that specify the type of data that can be stored in it. As data is added, it is stored
in the form of rows. The size of a table, therefore, is a combination of the number of
columns and number of rows, as a table is a matrix of the two concepts.
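Although this book works through the Power Apps UI, the same tables are reachable through the Dataverse Web API, where each table is exposed as a plural entity set. The sketch below only composes an OData query URL; the org URL is a placeholder and no request is sent.

```python
BASE = "https://yourorg.api.crm.dynamics.com/api/data/v9.2"  # placeholder org URL

def table_query(entity_set: str, select: list, top: int) -> str:
    # Compose an OData query: choose columns with $select, limit rows with $top.
    return f"{BASE}/{entity_set}?$select={','.join(select)}&$top={top}"

url = table_query("accounts", ["name", "createdon"], 3)
print(url)
# https://yourorg.api.crm.dynamics.com/api/data/v9.2/accounts?$select=name,createdon&$top=3
```

In a real call you would send this URL with an OAuth bearer token; that plumbing is out of scope here.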
Creating a Table
To create a table, go into your solution, select “New” and then “Table.” You will see a
dialogue like the one shown in Figure 2-1 to configure your table properties.
All tables have a base set of attributes that describe them, as follows:
• Display Name: This is the friendly name of the table and generally
refers to the content of the table, such as Account or User. It is best
to name tables in singular form since there is also a plural name
in Dataverse. I also find that using Pascal case naming, such as
PascalCaseExample, eliminates annoying spaces when referring to
tables in code.
• Plural Name: This is the plural of the friendly name, such as Accounts
or Users. The plural name can be different than the display name, but
this can get confusing when referencing tables in the future.
Filling out this information is enough to configure a standard table with basic
options enabled. However, it is good to be familiar with the advanced table options as
some options cannot be changed after the fact, such as table type or record ownership
settings. Figure 2-2 shows the first section of advanced options for table creation.
Schema name is automatically populated from the display name as you type it.
However, like the plural name, it too can be changed, although it is advisable to leave
this synced to the display name to avoid confusion in the future. The schema name also
has a prefix portion that is set automatically depending on what is set as the publisher
of your solution. Generally this will be set as Default Publisher for orgxxxxx… but can be
changed if you are working on different projects or need different publishers for some
reason. Refer to the Application Lifecycle Management (ALM) section in Chapter 1 for
more details on solutions and publishers. This prefix, however, cannot be changed after the
fact, even if you change publishers. So, it is advisable to pick a publisher and stick with it
so that all your solution assets share the same prefix.
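The naming behavior described above can be sketched as follows. The exact algorithm and the prefix value are assumptions for illustration; Dataverse generates the real schema name for you from the display name and the solution's publisher.

```python
def schema_name(display_name: str, prefix: str = "cr123") -> str:
    """Mimic how Dataverse derives a schema name: publisher prefix plus the
    display name with spaces removed (illustrative, not the exact algorithm)."""
    return f"{prefix}_{''.join(display_name.split())}"

print(schema_name("Purchase Order"))         # cr123_PurchaseOrder
print(schema_name("Purchase Order", "abc"))  # abc_PurchaseOrder
```

Notice how a Pascal-case display name survives cleanly into the schema name, which is one reason the earlier naming advice pays off when you reference tables in code.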
Table Types
There are multiple types of tables in Dataverse and each has its own purpose. It is
important to understand and plan for the correct type of table to meet the needs of your
architecture. While you can use custom tables for your entire solution, you will not be
able to utilize the out-of-the-box features that come with the other types, which may
make your development work harder and more complex.
Standard
There are a number of standard, or out-of-the-box, tables that are included in an
environment. These are tables such as Users, Accounts, and Contacts, which are created
when an environment is created and are available to help standardize data structures
as part of the common data model. Any custom tables that are imported into an
environment as part of a managed solution also show up as standard tables.
Custom
Any table that is created from scratch in an environment is classified as custom. Custom
tables still have default columns, such as Modified By, Created By, and Created On,
but they are mostly configured to store data in a structure that does not conform to the
structures contained in the standard tables.
Tip Whenever possible, use the out-of-the-box tables when building out
your data structures. This helps to keep your data structures consistent across
the platform and allows you to use some of the other out-of-the-box features
associated with those tables.
Activity
Activity tables are intended to store data that can be organized on a calendar, such as
appointments and phone calls, as seen in Table 2-2. They have columns such as Start
Time, End Time, and Due Date to facilitate scheduling, as well as a variety of other
columns that are commonly associated with activities.
Note Since activity tables’ date/time fields are set as part of the common data
model, they are pre-configured with the Time Zone Adjustment set to User Local,
so the UTC Conversion Time Zone Code should be utilized to specify the time zone
information for records.
Virtual
Virtual tables enable you to work with external data sources within Dataverse without
the need to duplicate the data in Dataverse tables. Virtual tables give you a view of the
data in your external data sources and allow you to relate data to data in other Dataverse
tables and even update data in the external source without having to build any sync
operations.
While these are worth mentioning in the data layer discussion, they are really more
of a data integration tool and are therefore discussed more in the Integration chapter.
Record Ownership
The Record Ownership setting dictates how security can be configured for the records in
your table. Record Ownership has the following possible options:
Note Record ownership cannot be changed once a table is created. If you are
unsure whether you will need row-level security (RLS) on a table, it is best to choose
User or Team ownership and leave yourself the option down the road.
Color and Image are completely optional aesthetic settings that can be modified at
any time.
Caution Options with an asterisk in Table 2-3 cannot be turned off if enabled.
Be sure to only turn these on if you need them.
• Columns and Data: This section allows you to view a sample of the
records that are in the table. This view is customizable, so you can
add/remove and move the columns that are shown to customize this
view. You can also edit the data directly in this view by typing directly
in the cells. The Edit button in this section will open your data in a
larger window to view and update records.
Columns
Except for virtual tables, all tables in Dataverse have a standard set of columns that
facilitate reporting and permissions. Columns such as Created On and Modified On
allow you to track important metadata around record transactions, while Created By and
Owner allow you to assign permissions to records. Beyond that, however, you are free
to customize the table by adding any columns you want, creating a table that works for
your needs.
Adding Columns
Like many things in Power Apps, there are various ways to add a column. However, I find
the easiest way is to start from the Table Properties screen and go to the Columns list, as
seen in Figure 2-4. From there you can see a full list of the columns in your table, which
helps you to orient yourself to what you have currently and keep track of fields as you
add them, as seen in Figure 2-5.
From this screen, simply select “New Column” from the top of the screen and begin
filling in the properties of your new column.
As seen in Figure 2-6, there is a section for basic column configuration settings, as
well as an Advanced Options section, as with the interface for creating a new table.
• Data Type: This is the setting for the type of data that can be stored
in the column. There are both basic data types, such as text, number,
and date/time, as well as complex data types, such as lookup and
file. These types will be discussed in more detail in the “Data Type”
section.
• Format: This section allows you to define the format of the data
that is to be stored in the column. This is different from typical
databases in that you define the data type as text and then restrict its
format to, for example, URL or Rich Text. It is important to think about
the context of your data before setting these options as they are
not always editable after the column is created. This will also be
discussed in more detail in the “Data Type” section.
• Simple: Allows you to store data in the field via normal write
operations
• Related Table: This option is only available if the Data Type setting
is set to Lookup. This is because lookup type fields are references to
other tables and need a target table to be specified. This type of field
also creates a relationship to the related table, which can be used for
a variety of things in apps, workflows, and calculations. You can find
more about this type of field in the “Relationships” section.
The Advanced Settings section for columns is driven by the data type selected.
Options can include max length and min/max values allowed. These are important
to evaluate because adequate input validation helps to increase data integrity in your
environment.
There are a few notable mentions in this section that I will go over here:
• Schema Name: This is the technical name for your field. Dataverse
automatically creates this based on the display name that is entered
but removes any spaces and prepends the publisher prefix to it.
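As a rough illustration of that derivation (the exact normalization rules are Dataverse's own, and the "cr123" prefix and column name here are placeholders):

```python
def schema_name(display_name: str, publisher_prefix: str = "cr123") -> str:
    """Approximate how Dataverse derives a schema name: strip spaces
    from the display name and prepend the publisher prefix.
    This is a sketch, not Dataverse's actual algorithm."""
    return f"{publisher_prefix}_{display_name.replace(' ', '')}"

# A column displayed as "Project Phase" under prefix "cr123":
print(schema_name("Project Phase"))  # cr123_ProjectPhase
```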
Note All Dataverse data is encrypted using SQL Server Transparent Data
Encryption (TDE), which encrypts data as it is written to disk. This is known as
encryption at rest. More details on Dataverse data privacy can be found in
Microsoft's "Compliance and data privacy" documentation.
• Date Only: This option will only store a date and will not store
time zone information. Users from around the world will always
see the same date.
• User Local: This option will save data with the user’s local time
zone. Users around the world will see this value automatically
offset for their local time zones.
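Behind the scenes, a User Local value is stored once (in UTC) and offset for each viewer, while a Date Only value carries no time component at all. A small Python sketch of the difference (the time zones are arbitrary examples):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A "User Local" value is stored once, in UTC...
stored_utc = datetime(2023, 6, 1, 17, 0, tzinfo=timezone.utc)

# ...and each user sees it offset to their own time zone.
seattle = stored_utc.astimezone(ZoneInfo("America/Los_Angeles"))
tokyo = stored_utc.astimezone(ZoneInfo("Asia/Tokyo"))
print(seattle.isoformat())  # 2023-06-01T10:00:00-07:00
print(tokyo.isoformat())    # 2023-06-02T02:00:00+09:00

# A "Date Only" value discards the time component entirely,
# so every user sees the same calendar date.
date_only = stored_utc.date()
print(date_only)  # 2023-06-01
```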
• Sync with Global Choice: This option specifies whether the list
of choices is available to other tables or not. If you choose “Yes” it
will create a global list of options that can be used across multiple
fields and multiple tables. This is very helpful for keeping options
synced across your environment and reducing the work of
updating values as needed. Selecting “No” creates a list of options
that is only available to be referenced from the field that it is
created for. This can be useful if you have different values for each
table. For instance, you may have five phases for one table but
three phases for another table.
• Sync This Choice With: This option is available when you choose
“Yes” on the Sync with Global Choice option and allows you to
select an existing set of options if one already exists. You can also
choose “New Choice,” which will allow you to create a new list of
options. For more on this, see the “Choices” section.
Once you have completed setting the options, you simply click Save, and your
column will be added to your table. Since you are already in the table columns list, it will
be easy to tell which fields you have and which are not yet added.
Alternate Keys
While every record in Dataverse automatically has a unique identifier, you can create
alternate keys as well. This can be useful in reporting or when integrating existing tools
and applications with data.
To create a key, navigate to the table dashboard and select “Keys” to view the key list,
as seen in Figure 2-8.
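One practical payoff of alternate keys is that the Dataverse Web API lets you address a row by its key value instead of its GUID. The sketch below only builds the request URL; the environment URL, table name, and key column are invented placeholders:

```python
# Hypothetical Dataverse environment URL.
BASE = "https://yourorg.crm.dynamics.com/api/data/v9.2"

def alternate_key_url(table: str, key_column: str, key_value: str) -> str:
    """Build a Web API URL that addresses a row by an alternate key
    rather than by GUID, e.g. /cr123_projects(cr123_projectnumber='P-001')."""
    return f"{BASE}/{table}({key_column}='{key_value}')"

url = alternate_key_url("cr123_projects", "cr123_projectnumber", "P-001")
print(url)
```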
The New Key dialogue is shown in Figure 2-10 and has the following settings:
• Columns: Select the columns that you want to use to create the key.
The key is formed from the combination of values in the selected
columns, so that combination must be unique for each row.
Note Keys will not be available immediately after creating them. After you
click the Save button in the dialogue, a system job is initiated to create database
indexes to support the key.
Relationships
Relationships in Dataverse are a valuable tool that helps to preserve data integrity,
automate business processes, enforce security, and structure data in a meaningful
way. Creating well-structured relationships can also increase performance and make
creating apps faster and easier by informing the apps of how data across your database is
interconnected.
Relationship Types
There are three types of relationship shown in the Power Apps UI because
relationships are created in the designer in the context of the table currently being
worked on. A one-to-many relationship is simply a many-to-one relationship viewed from the
other table. Both one-to-many and many-to-one are classified as “N:1” relationships,
whereas the many-to-many relationship is classified as an “N:N” relationship.
One-to-Many
In a one-to-many relationship, a row in table A can be related to multiple rows in table
B, but a row in table B can only be related to one row in table A. An example of this is a
single account might have multiple invoices, but an invoice can have only one account.
When creating a one-to-many relationship in Dataverse, a column is created in table
B that stores the ID of the record from table A, as seen in Figure 2-11.
Many-to-One
In a many-to-one relationship, multiple rows in table A can be related to a single row
in table B, but each row in table A can be related to only one row in table B. An
example of this is multiple accounts might have the same primary contact record.
When creating a many-to-one relationship in Dataverse, a column is created in table
A that stores the ID of the parent row in table B, as seen in Figure 2-12.
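To make the lookup mechanics concrete, here is a minimal Python sketch (the table and column names are invented for illustration): the row on the "many" side carries the ID of its parent on the "one" side, and both traversal directions follow from that one column.

```python
# One account, many invoices: each invoice's lookup column ("account_id")
# stores the ID of its parent account.
accounts = {
    "A1": {"name": "Contoso"},
}
invoices = {
    "I1": {"amount": 100, "account_id": "A1"},
    "I2": {"amount": 250, "account_id": "A1"},
}

# Traversing many-to-one: each invoice resolves to exactly one account.
parent = accounts[invoices["I1"]["account_id"]]

# Traversing one-to-many: one account fans out to all of its invoices.
contoso_invoices = [i for i in invoices.values() if i["account_id"] == "A1"]
print(parent["name"], len(contoso_invoices))  # Contoso 2
```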
From this, you can see that the many-to-one and one-to-many relationships are, in
fact, the same type of relationship, only looked at from the opposite direction.
Note In one-to-many and many-to-one relationships, the row on the “one” side
is commonly referred to as the parent row and the rows on the “many” side are
commonly referred to as child rows. This is because it is common to architect
things like permissions to flow downstream from the primary record to the related
records.
Many-to-Many
In a many-to-many relationship, a row in table A can be related to multiple rows in table
B, and a row in table B can be related to multiple rows in table A. This is different from
the other types of relationship because there is no restriction to have only a single record
in the related table from either direction. An example of this would be multiple invoices
might have multiple items on them and each item might be on multiple invoices.
A many-to-many relationship is a bit more complex because you cannot simply store
a record ID in either of the rows because there are multiple IDs that need to be stored
in either table’s records. This means that there needs to be an intermediate table that
matches the IDs from either record together, as seen in Figure 2-13.
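The intersect table can be sketched the same way (names invented for illustration): each of its rows pairs one ID from each side, so both directions of the relationship are just filters over those pairs.

```python
invoices = {"I1": "March order", "I2": "April order"}
items = {"P1": "Widget", "P2": "Gadget"}

# Each row in the intersect table records one invoice-item pairing.
invoice_items = [("I1", "P1"), ("I1", "P2"), ("I2", "P1")]

# From an invoice, find its items...
items_on_i1 = [items[p] for (i, p) in invoice_items if i == "I1"]
# ...and from an item, find the invoices it appears on.
invoices_with_p1 = [invoices[i] for (i, p) in invoice_items if p == "P1"]
print(items_on_i1, invoices_with_p1)
```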
Creating Relationships
To create a relationship, navigate to the table dashboard of one of the tables that you
want to create the relationship to. As described in the “Relationship Types” section, you
can choose either table to create the relationship from and the result is the same.
From the table dashboard, as seen in Figure 2-14, select “Relationships” to go to the
relationship list for that table.
Select “New Relationship” from the toolbar and select the type of relationship you
want to create, as seen in Figure 2-15.
One-to-Many/Many-to-One Relationships
If you select one-to-many, you will see a dialogue like the one in Figure 2-16. In this
example, you can see that one-to-many is selected by the title of the dialogue.
However, if you select a many-to-one, you will see a dialogue like the one in
Figure 2-17. If you look carefully, you can see that the dialogues are simply looking at the
same relationship from different sides.
• Current (One) Table: This displays the current selected table and the
type of that side of the relationship.
• Related (Many) Table: This is where you select the table that you
want to create the relationship with. This list shows all the tables in
the environment, so be careful to select the correct one.
• Lookup Column Display Name: This will be the name of the column
that is created to store the ID of the parent record, as discussed in the
“Relationship Types” section.
• Lookup Column Name: This is the technical name of the field and is
automatically created for you like in other places in Dataverse. There
is usually no need to change this.
Note The Lookup Column properties are always listed under the table on the
“many” side of the relationship on both the many-to-one and one-to-many
relationships. This is because the field that holds the ID of the parent row is always
on the “many” side of the relationship, as discussed in the “Relationship Types”
section.
• Type of Behavior: This setting dictates what will happen to related
table rows when an action is taken on the primary table row.
The cascade actions, when they occur, and their available behaviors are as follows:

• Assign: When the primary table row is reassigned. Behaviors: Cascade All, Active, User-Owned, or None
• Reparent: When the lookup value of a related table row in a parental relationship is changed. Behaviors: Cascade All, Active, User-Owned, or None
• Share: When the primary table row is shared. Behaviors: Cascade All, Active, User-Owned, or None
• Delete: When the primary table row is deleted. Behaviors: Cascade All, Remove Link, or Restrict
• Unshare: When a primary table row is unshared. Behaviors: Cascade All, Active, User-Owned, or None

The behaviors themselves work as follows:

• Cascade Active: Perform the action on all related table rows with a status of active.
• Cascade All: Perform the action on all related table rows regardless of status.
• Cascade None: Perform no actions on related table rows.
• Remove Link: Remove the lookup value from all related rows.
• Restrict: Prevent deletion of the table row when related table rows exist.
• Cascade User-Owned: Perform the action on all related table rows that are owned by the same user as the primary table row.
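As a rough sketch of the Delete behaviors in particular (a simplified model for intuition, not how Dataverse implements them): Cascade All deletes the child rows, Remove Link clears their lookups, and Restrict blocks the delete while children exist.

```python
def delete_parent(behavior: str, parent_id: str, children: list[dict]) -> list[dict]:
    """Simulate deleting a parent row under a given Delete behavior,
    returning the resulting set of child rows."""
    related = [c for c in children if c.get("parent_id") == parent_id]
    if behavior == "Restrict" and related:
        raise ValueError("Cannot delete: related rows exist")
    if behavior == "Cascade All":
        # Delete every child of this parent along with it.
        return [c for c in children if c.get("parent_id") != parent_id]
    if behavior == "Remove Link":
        # Keep the children but clear their lookup values.
        return [
            {**c, "parent_id": None} if c.get("parent_id") == parent_id else c
            for c in children
        ]
    return children

rows = [{"id": "C1", "parent_id": "P1"}, {"id": "C2", "parent_id": "P2"}]
print(delete_parent("Cascade All", "P1", rows))   # only C2 remains
print(delete_parent("Remove Link", "P1", rows))   # C1 keeps its row, loses its lookup
```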
Note Generally speaking, each pair of tables can have multiple one-to-many or
many-to-one relationships between them, but only one can be parental. If your
tables already have a parental relationship established, you will not be allowed
to create another relationship with actions from the Parental column, as seen in
Table 2-6.
Many-to-Many Relationships
Because many-to-many relationships use a relationship (or intersect) table, they do not
have the same settings as N:1 relationships. This makes creating the relationship much
simpler.
• Current (Many) Table: This displays the current selected table and
the type of that side of the relationship.
• Related (Many) Table: This is where you select the table that you
want to create the relationship with. This list shows all the tables in
the environment, so be careful to select the correct one.
Because the intersect table is hidden from the interfaces and is not interacted with directly, it does not have a
display name. However, knowing this table exists and where to find
the name may come in handy if you run into the need to do advanced
development tasks in the future.
Summary
Now that you have learned how to build a solid data structure in Dataverse, you can
move on to the next layers of your solution development with the confidence of having
a reliable foundation. With effective table and relationship designs in place, you can
confidently move to the business logic layer, where you can define how data is entered
and maintained, preserving data integrity across your solution.
CHAPTER 3
Business Rules
Business rules allow you to apply logic and validation to columns and their
corresponding forms. While there are background processes that can accomplish some
of the same behavior, business rules execute in real time. Here are some of the things you
can accomplish with business rules:
• Set column values
Limitations
While business rules are a robust tool, they do have some limitations to be aware of
when deciding on the best tools to address your data integrity controls.
© Brian Hodel 2023
B. Hodel, Beginning Microsoft Dataverse, https://doi.org/10.1007/978-1-4842-9334-8_3
Chapter 3 Business Logic Layer
• Choices (multi-select)
• File
• Language
While these fields are currently not available in business rule logic, there are ways of
implementing these using other code methods. We won’t cover those in this book, but
there are many resources available online that walk through how to accomplish the same
functionality using code.
To add a node to the canvas, you can select the operation you want and drag it over the
canvas. As you drag the new node over the existing nodes on the canvas, you will see the
placeholders light up, indicating where you can place the new node, as seen in Figure 3-3.
Once you place the new node, you will see that it is automatically connected to the
existing node by a line, indicating the logic flow path, as seen in Figure 3-4.
By selecting any node, you will see that the Properties flyout changes to display
properties for that node. An example of this is the Properties section for the condition
node, as is seen in Figure 3-5.
• Display Name: This is just a friendly name to help you navigate your
logic map. As the business rules get larger and more complicated,
useful names become more important to keep your work organized.
• Entity Name: The Entity Name field is already filled out since you
started from the table. This field is not editable.
• Rules: This section allows you to configure the criteria for each
condition node. You must have at least one rule for each condition
node, but you can add more rules as necessary by clicking the
+New button.
• Rule Logic: This option shows up when you add more than one rule
to the condition and dictates how the rules are related. If AND is
selected, all the conditions must be met for the logic path to be taken
on the TRUE side of the condition node. If OR is selected, only one
of the conditions needs to be met for the logic path to be taken on the
TRUE side of the condition node.
Note It is important to always click Apply before selecting a different part of the
business rules canvas. If you don’t, your work will be lost for that step.
Note If you are building a Canvas app, you must choose “Entity” as the scope for
the logic to be applied to the app forms.
Note You cannot save a business rule that is not valid, so make sure you don’t
get too far into building it expecting to fill in details later. It is best to work in
sections, completing each section as you go.
After you have successfully saved your business rule, you must activate the rules to
turn them on. The Activate button will display after your rule has been successfully saved
and will appear in the upper-right toolbar, next to the Save button.
Note You cannot edit an activated business rule. You must deactivate the
business rule in order to make changes.
After activating your business rules, they will be enforced for users. Since rules
run on demand, they will not modify any data that is already entered. However,
when a form is opened where a field is required by a new business rule, the user will be
unable to save any changes until that new required field is populated.
Next, you will be given a flyout to configure the new BPF with the following options,
as displayed in Figure 3-7:
• Display Name: This is the friendly name for your BPF.
• Table: Select the table that you want to create your BPF for.
Once you click Create, you will be taken to the BPF design canvas, where you will be
able to start building your BPF, as seen in Figure 3-8.
As you can see in Figure 3-8, there are two categories of components: flow
and composition. Flow components are what you build your BPF flow out of, and
composition components go inside the stage components to enforce data points or
trigger events. Here is an outline of the options:
• Stage: These define the BPF stages that the record must go through.
Each stage can have different requirements and triggered events.
• Condition: These decision points define the active path of the BPF by
deciding which path will be assigned to the record. In a model-driven
app, users will only see the stages that are on the active path and
the BPF will update as the record values change.
• Data Step: These define which fields are required at each stage.
• Action Step: These allow you to trigger actions for Dataverse process
automations. This is displayed as a button in the BPF stage in model-
driven apps that users can select, as seen in Figure 3-9.
• Flow Step: These allow you to trigger a Power Automate cloud flow
from a BPF step. This is displayed as a button on the BPF stage in
model-driven apps that the users can select, as seen in Figure 3-10.
From here, you can start building your BPF by adding components by dragging them
onto the canvas area. You will notice that placeholders will light up next to the existing
components where you can add new components. Simply drag the new component over
the placeholder until it lights up and then release to place the component there, as seen
in Figure 3-11.
To insert decision points in your BPF to alter its active path for a given record, insert
conditions. Configure the condition in the Properties flyout, as seen in Figure 3-12.
• Display Name: This is just a friendly name to help you navigate your
logic map. As the business rules get larger and more complicated,
useful names become more important to keep your work organized.
• Rules: This section allows you to configure the criteria for each
condition node. You must have at least one rule for each condition,
but you can add several more conditions if necessary by clicking the
+New button.
• Rule Logic: This option shows up when you add more than one rule
to the condition and dictates how the rules are related. If AND is
selected, all the conditions must be met for the logic path to be taken
on the TRUE side of the condition node. If the OR is selected, only
one of the conditions needs to be met for the logic path to be taken
on the TRUE side of the condition node.
Once you build out your workflow with the flow components, you can begin adding
the composition components to the flow components to enforce field values and trigger
actions.
To do this, select the composition component from the list and drag it to the flow
component that you want to add it to. As you can see in Figure 3-13, the placeholder will
highlight gray, which indicates that you can place the component there.
• Data Step: Select the field you want to add and check the “Required”
box if the user must enter data in the field prior to advancing the
BPF stage.
• Flow: Select the flow you want to trigger and check the “Required”
box if the user must trigger the flow prior to advancing the BPF stage.
• Action Step: Select the action that you want to trigger, or click the
+New button to create a new action.
• Workflow: Select the Trigger option of either Stage Entry or Stage Exit
to dictate when the workflow runs, and select the workflow from the
list or click +New to create a new one.
Once you are done, you will see something like Figure 3-14. After saving your BPF,
you must activate it before it becomes available for use.
Note You cannot edit a BPF that is active. To edit a BPF, you must first
deactivate it.
Cloud Flows
Cloud flows are incredibly powerful workflows that have hundreds of integrations
with different platforms and services. It is even possible to call custom APIs to perform
actions that the out-of-the-box connectors cannot yet accomplish. Cloud flows are built
in Power Automate and are distinguished from desktop flows in that they run entirely
in the cloud and do not require access to a running desktop, either hardware or virtual.
While you are able to create Dataverse workflows, they are limited in their capabilities
and are slowly being replaced by Power Automate cloud flows.
As you saw in the business process flow section, you can trigger cloud flows from a
BPF stage. However, you can also trigger them from a button in an app, when a record
is modified, or even on a schedule to run background operations. The interface is very
intuitive and easy to learn, so we will walk through building a basic cloud flow, which
will give you the building blocks to create your own cloud flows.
To create a new cloud flow, you will navigate to your solution and select New ➤
Automation ➤ Cloud Flow. Next, you will need to decide which type of trigger you want
for your cloud flow. The options are Automated (runs when a specified event
occurs), Instant (runs when triggered manually, such as from a button in an app), and
Scheduled (runs at defined times or intervals).
Once you have selected your trigger type, you will be given a dialogue where you will
give the flow a name and select the specific trigger from the list.
As you can see in Figure 3-15, I have given the flow a name that is descriptive of
its purpose and searched the triggers to find the “When a row is added, modified, or
deleted” trigger for Dataverse. This is a very common trigger in Dataverse, so I will use
this one for the example. However, it is good to spend some time with the list to see what
options are available.
Next, click the Create button, and Power Automate will create a new cloud flow with
the trigger you selected, as seen in Figure 3-16.
Given that this is a Dataverse trigger, you must configure the options available. Note
that not all the options are required, only those with a red asterisk next to them. Options
are as follows:
• Change Type: This allows you to select the type of record change the
flow will be triggered on, such as when a record is created, modified,
and/or deleted.
• Table Name: Select the table that you want the trigger to be triggered
from. A flow will only be triggered on the change to the specified
table, so, if you need flows for different tables, you will need to set up
multiple flows.
• Scope: This defines when the flow will run based on who the user is
that is making the change and how they are related to your account.
See Table 3-1 for the options available.
• Select Columns: This is where you specify which columns should
trigger the flow when they change. For instance, you may only want
the flow to trigger when the Phase column is modified.
• Filter Rows: Specify the types of records that can trigger the flow.
For instance, you may only want records where the project type is
Engineering to trigger the flow.
• Delay Until: This allows you to delay the flow run to a later date. For
instance, you may want to only run flows on the weekend.
• Run As: This allows you to specify whose account the flow runs as.
You can select Flow Owner, Modifying User, or Row Owner. This
can be helpful when you need to track who is making the changes
to downstream records or if the flow owner doesn’t have access to
certain records.
• Business Unit: When an action is taken on a row owned by the same business unit as the user
• Organization: When an action is taken on any row, regardless of ownership
• Parent: Child Business Unit: When an action is taken on a row owned by the same business unit as the user or a child business unit of the user's business unit
• User: When an action is taken on a row owned by the user
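Conceptually, Select Columns takes a comma-separated list of column logical names, and Filter Rows takes an OData-style filter expression. A sketch of what a configured trigger amounts to (the table and column names are invented for illustration):

```python
# "Select Columns" is a comma-separated list of column logical names;
# only changes to these columns will fire the flow.
select_columns = ",".join(["cr123_phase", "cr123_projecttype"])

# "Filter Rows" is an OData-style filter; only rows matching it
# can trigger the flow.
filter_rows = "cr123_projecttype eq 100000000"

trigger_settings = {
    "Change Type": "Modified",
    "Table Name": "cr123_projects",
    "Scope": "Organization",
    "Select Columns": select_columns,
    "Filter Rows": filter_rows,
}
print(trigger_settings["Select Columns"])
print(trigger_settings["Filter Rows"])
```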
Once you have your trigger configured, you need to set up the rest of your flow. To
do this, simply click the +New Step button below the initial step, and you will be given a
dialogue where you can select the next step in your flow.
The Choose an Operation dialogue will be displayed with a list of options, as seen in
Figure 3-17.
There are multiple categories of connectors and actions in this dialogue, and it is
good to spend some time exploring them to get an understanding of what is available to
use here. Here is an outline of the categories:
• Custom: This section lists custom connectors that have been created
in your environment. These are generally used in place of API calls
and can be very powerful for performing operations that are not yet
built in the connectors available in the platform.
Note Flow licensing requirements depend on who is triggering the flow run. For
instance, if you have a flow with a Premium connector operation that is triggered
from a button push in an app, then the user clicking the button must be licensed
for premium connectors. However, if you have a flow that is triggered on record
change, and a user changes a record, then the user does not have to have a
Premium license, only the creator of the flow requires a license.
For this example, I am going to choose a logic operation, so I will choose Control
from the list of operations in the Built-In section, as seen in Figure 3-18.
After clicking the Control option, you will be given a list of actions to choose from.
This behavior is consistent with all operations selections and allows you to select the
specific action that you want to call, as seen in Figure 3-19.
For this example, I will choose Condition, as that is a very commonly used option.
Note Hovering over the “I” icon on the action will give you more details about the
action and what it does.
As you can see in Figure 3-20, selecting the condition action adds a set of steps
under the trigger action created earlier. This is how you will work through building your
workflow, adding actions and building out your workflow just as you would do in a
process-mapping program like Visio.
Figure 3-20. New condition action added after the trigger action
To configure your logic in the condition action, you will start by clicking in the left-
side field of the Condition control. Once you click in the field, the Dynamic Content
dialogue will appear, listing all the options for data fields that you can choose from.
Environment variables will always show up first, followed by outputs from previous
steps. You can see in Figure 3-21 the section in the Dynamic Content dialogue called
“When a row is added, modified, or deleted,” followed by a list of fields. That section
title comes from the preceding step, which, in this case, is your trigger that is named the
same. The field options shown below that are the outputs of that action and will return
the values of that field when a flow is run.
In this case, I will choose “Project Type,” select “is equal to” in the condition field,
and enter 100000000 in the right-side field. This will check to see if the project type is
100000000 and run either the “If Yes” side or the “If No” side of the logic, depending on
what the logic check evaluates to.
Note The values returned from Dataverse connectors are the stored values of
the field. Since Project Type is a Choice field, the field stores the numeric value
of the selected choice, not its label. So, it is good to go back
to review your field options and the values when writing logic. You can see an
example of the Choices table in Figure 3-22 from the field settings in the demo
table created previously.
Figure 3-22. Options labels and values in the table field settings for project type
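For example, if the hypothetical Project Type choice defined Engineering as 100000000, the flow condition compares against that number, never the label:

```python
# Hypothetical choice values for the Project Type column. Dataverse
# stores the numeric value, so conditions compare against numbers.
project_type = {
    100000000: "Engineering",
    100000001: "Marketing",
}

stored_value = 100000000  # what the trigger output actually contains

# The condition "Project Type is equal to 100000000" is effectively:
is_engineering = stored_value == 100000000
print(project_type[stored_value], is_engineering)  # Engineering True
```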
To complete the condition section of your flow, select the “Add an Action” button on
the “If Yes” side and add any actions you want to be completed when the condition is
evaluated to TRUE, and do the same for the opposite side. Repeat this process until your
flow is complete. Click Save once complete and return to the flow dashboard to turn the
flow on by clicking “Turn On” in the top control bar.
Note To test your flows, you can go into the edit mode of the flow and select
“Test” from the upper control bar. If your flow has run before, you can select
“Automatically” and select a previous run to re-run for testing, as seen in
Figure 3-23. This can help you to troubleshoot changes as you work through issues
or changes to your flow.
Desktop Flows
Desktop flows are different from cloud flows in that they involve robotic process
automation. The process for creating these flows is a bit more elaborate as you need to
have either a computer or a virtual machine to run these operations on. Either way, the
point of the desktop flow is to log in to the specified desktop and emulate the actions of a
user. This is very useful for integrating legacy applications that don’t have APIs and require
a user to log in to a browser to enter or retrieve information. You can also run Windows
commands and operations since it is logging in to a machine to perform the operations.
While the scope of this book does not cover how to build these flows, it is worth
mentioning that the tool is available and very powerful if you have a need for it.
Summary
Now, you can build out the basic components of the business logic layer in Dataverse.
With these skills, you can help to ensure that your processes are followed, your data
is accurate, and your tools are predictable. The next step is to create the security
requirements to ensure that data is available only to those who are intended to see it.
CHAPTER 4
Security Layer
In the previous chapter, we discussed the Business Logic Layer and how business rules
and processes can be used to ensure data integrity and consistency. In this chapter, we
will look at the Security Layer, which creates a barrier between the user and the data.
This can take the form of restricting who can view data, perform updates, delete data,
or append records to each other. Data can be restricted at the app, field, record, and
table levels, and more. There are many tools to help accomplish these goals, and those
tools are often used in conjunction with each other to accomplish complex security
implementations.
When it comes to Dataverse, the layers work together seamlessly and quickly. Every
call to data is passed through a series of security layers to return the appropriate data to
the user as well as to grant the user only the allowed operational capabilities.
Security Roles
Security roles allow you to define the permissions that are granted to a user either
directly or indirectly through teams. Once you have a security role built, you can assign
that role to a user or team, and where it is assigned will impact how it affects the user and
in what context the role is active.
Chapter 4 Security Layer
This will open a new window that will allow you to start configuring your new
security role. The primary items to fill out are as follows:
• Business Unit: Specify the business unit that you want the role to
apply to. By default, the top-level business unit is selected, but you
can assign the role to only specific business units if you want to be
more granular, such as for a matrix security structure.
• Direct User (Basic) Access Level and Team Privileges: This role
type applies the permissions to records owned/created by the
user and the user’s team(s). You can grant this security role to a
user and/or a team and the user will have permissions to both
records that they own as well as records that their team(s) own.
Once you have the basics configured for the security role, you will move into the
details of what that security role enables the user to see and/or do. There are a lot of
capabilities in the security role configuration that control everything from what a user
can do in an environment to what they can see in the data. For now, we will just talk
about the basics of managing data security through table permissions.
Across the top of the Security Role settings screen, you will see a series of tabs. The
tabs organize the types of features/tables by what they are related to, such as Sales,
Service, Custom Entities (tables), etc. If you select Core Records, you will see a list of the
default tables in the environment, as seen in Figure 4-2.
The general layout is a matrix of tables and their permissions. At the intersection of
each permission level (top) and table name (left), there is a circle that defines the scope
of that permission.
Note Not all permission types and scopes are available for all tables. For
instance, an organization-owned table does not have Share or Assign permissions
available because records cannot be shared or assigned individually as the records
in those tables are always owned by the organization.
Permission scopes can be applied by simply selecting the circle where you want to
apply the permission. Each selection cycles the scope to the next level, indicated by a
circle that progressively fills in. The scope options are outlined in the key at the bottom
of the screen and operate as follows:
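The scopes run from no access up to organization-wide. As a conceptual aid, the way a Read privilege at each scope reaches (or misses) a record can be sketched in Python — this is an illustrative model only, not how Dataverse evaluates access internally, and the scope names are the standard ones shown in the key:

```python
from enum import IntEnum

class Scope(IntEnum):
    """Dataverse permission scopes, from most to least restrictive."""
    NONE = 0           # no access at all
    USER = 1           # only records the user (or their teams) own
    BUSINESS_UNIT = 2  # records owned within the user's business unit
    PARENT_CHILD = 3   # user's business unit plus all of its child units
    ORGANIZATION = 4   # every record in the environment

def can_read(scope: Scope, owner_bu: str, user_bu: str,
             child_bus: set[str], owned_by_user: bool) -> bool:
    """Decide whether a Read privilege granted at `scope` reaches a record
    owned in `owner_bu`, given the user's own business unit and its children."""
    if scope is Scope.ORGANIZATION:
        return True
    if scope is Scope.PARENT_CHILD:
        return owner_bu == user_bu or owner_bu in child_bus
    if scope is Scope.BUSINESS_UNIT:
        return owner_bu == user_bu
    if scope is Scope.USER:
        return owned_by_user
    return False
```

For example, a user in the Contoso business unit with Parent:Child scope can read a record owned in a child division, but with Business Unit scope they cannot.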
80
Chapter 4 Security Layer
In either case, once you open the Teams or Users list, you will select the record to
which you wish to apply the security role, select the ellipsis next to the record, and select
Manage Security Roles, as seen in Figure 4-4.
You will be shown a dialogue listing all of the security roles available, as seen in
Figure 4-5. Simply select the security role or roles and click Save, and the security roles
will be applied.
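The same assignment can also be automated. The following is a hedged sketch against the Dataverse Web API (assuming the v9.2 endpoint; the organization URL and GUID are placeholders) that builds the association body POSTed to a user's `systemuserroles_association/$ref` navigation property, or to `teamroles_association/$ref` for a team:

```python
def role_association(org_url: str, role_id: str) -> dict:
    """Body for POST {org_url}/api/data/v9.2/systemusers(<user-id>)/
    systemuserroles_association/$ref, which assigns an existing security
    role to that user. For a team, POST the same body to
    teams(<team-id>)/teamroles_association/$ref instead."""
    return {"@odata.id": f"{org_url}/api/data/v9.2/roles({role_id})"}

payload = role_association("https://yourorg.crm.dynamics.com",
                           "00000000-0000-0000-0000-000000000001")
```

An HTTP client would send this with a bearer token; the UI route above remains the supported low-code path.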
Note Security roles applied to teams can sometimes take a while to propagate
to the user. If security roles need to be applied and used immediately, such as for
testing, it is best to have the user log out of and back in to their O365 profile to
ensure that the roles update.
Teams
Applying security roles directly to a user’s account is effective, but can be very
difficult to manage at scale. Teams are a great way to manage permissions for
groups of users quickly and easily. Teams also make it easier to review permissions
for a group rather than user by user, because you can quickly open a team and view
who is a member, or even create reports of a team’s members.
Team Types
While all team types are essentially groups of users, there are different team types for
different use cases. Understanding them will help you build the most stable and
maintainable security design possible for your needs.
Owner Teams
Owner teams allow more flexibility in managing records as they are not tied to a user
and that user’s defined business unit. Instead, records can be assigned directly to an
owner team. As with users, every owner team has one and only one business unit that it
is associated with. In fact, by default, every business unit has an associated owner team
that you can utilize for assigning records. However, you are free to create additional
owner teams, as needed, to expand that model.
One of the common use cases for owner teams is to apply permissions to groups of
users in bulk. Since you can apply security roles to a team directly, you can quickly and
easily apply and update permissions of a group of users. You can also quickly audit and
update user permissions by reviewing who is a member of which team.
User-Created Access
User-created access teams are similar to owner teams in that they are created
individually, and users are added to them. However, user-created access teams do not
own records. Instead, records are shared with these teams as you would share records
with a user. This allows a lot of flexibility in your design, because these teams can be
created and removed as needed without affecting the overall architecture of
the system.
In this example, you can see that Activity 1 has one of each access team type, Activity 2
has only the read-only access team type, and Activity 3 has only the read/write
access team type. Also, you can see that every access team has members associated with
it. If you were to remove all the members from Access Team 2, then Access Team 2 would
be removed because access teams are on-demand teams. If you were to add members
to a read-only access team for Activity 3, then a new access team would be created, and
the members would be associated with it. This whole process is automated for you by
the system.
Creating Teams
The process for creating teams in Dataverse will vary depending on the type of team that
is being created.
Next, select Environments from the left-hand navigation bar, find your environment,
and select “Settings” from the menu, as seen in Figure 4-8.
Search for “Teams” or select “Teams” under the Users + Permissions section, as seen
in Figure 4-9.
Once the Teams window opens up, select “Create Team” from the top control bar and
a dialogue will open to begin the process of creating your team, as seen in Figure 4-10.
You will be prompted for the following attributes to create your team:
• Business Unit: Specifies the business unit of the team and, in turn,
the records that get assigned to the team. This becomes important
when you have a hierarchy of business units to organize your record
structures and permissions.
• Team Type: In this case you will select “Owner,” but this dropdown is
also used for Access, AAD Security Groups, and AAD Office Groups.
Next, you will be prompted to begin adding users to the team, as seen in Figure 4-11.
You can continue to add/remove members after your team is created as well.
Next, you will select the security roles that will be applied to the team, as seen in
Figure 4-12. You must select at least one security role, but can select as many as you
want, except for special roles, such as System Administrator.
Once you are done, click Save, and your team will be provisioned.
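Team creation can also be scripted. Below is a hedged sketch of the Web API request bodies involved (v9.2 endpoint assumed; the organization URL and GUIDs are placeholders): one payload creates an owner team bound to a business unit, another adds a member to the created team.

```python
# teamtype option-set values used by the Dataverse teams table
TEAM_TYPES = {"Owner": 0, "Access": 1,
              "AAD Security Group": 2, "AAD Office Group": 3}

def owner_team_payload(org_url: str, name: str, business_unit_id: str) -> dict:
    """Body for POST {org_url}/api/data/v9.2/teams, creating an owner team
    in the specified business unit."""
    return {
        "name": name,
        "teamtype": TEAM_TYPES["Owner"],
        "businessunitid@odata.bind":
            f"{org_url}/api/data/v9.2/businessunits({business_unit_id})",
    }

def member_payload(org_url: str, user_id: str) -> dict:
    """Body for POST {org_url}/api/data/v9.2/teams(<team-id>)/
    teammembership_association/$ref, which adds the user to the team."""
    return {"@odata.id": f"{org_url}/api/data/v9.2/systemusers({user_id})"}
```

As in the UI flow, security roles would then be associated with the team so that its members inherit the permissions.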
Another differentiation between owner and access teams is that access teams do not
have security roles applied to them, so you will skip that step when setting up your team.
Once you select the team type of AAD Security Group or AAD Office Group,
depending on your group type, you will be shown two additional fields: Group Name
and Membership Type.
• Group Name: The name of the group from AAD. You can select
your group by typing the name of your group in the Group Name
field; a list of search suggestions will show up for you to select the
correct one.
To set up your access team template, you will go to the environment settings by
selecting the settings gear in the upper-right corner and then “Admin center,” as shown
in Figure 4-17.
Next, select “Environments” from the left-hand navigation bar, find your
environment, and select “Settings,” as seen in Figure 4-18.
Once in the environment settings, either search for “Access Team Templates” or
navigate to Templates and select “Access Team Templates,” as seen in Figure 4-19.
Once in the access team templates settings, select “New” from the top control bar, as
seen in Figure 4-20.
Once the Access Team Template dialogue shows up, you can begin filling out the
fields to create your access team, as shown in Figure 4-21.
The following attributes will need to be filled out to create your access team:
• Name: This is the display name for your auto-created access team.
Note It is a good idea to include the table name and the general permissions in
the name so you can easily distinguish this team from others. Example: Device
Orders – Read Only
• Entity: This allows you to select the table to which you want to apply
the access team. This means whenever a new record is created on
this table, a new access team will be available for that record using
this template.
Note If your table is not listed here, it means that the table is not configured for
access teams. Go to the table settings and enable “Have an access team” as seen
in the beginning of this section.
• Access Rights: Here you configure the privileges of users who are
added to the access team. The options include Delete, Append,
Append To, Assign, Share, Read, and Write.
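Under the hood, an access team template stores these rights as a single bitmask built from the `AccessRights` flag values defined in the Dataverse SDK. A small sketch of combining them:

```python
# AccessRights bit-flag values from the Dataverse SDK
READ      = 1
WRITE     = 2
APPEND    = 4
APPEND_TO = 16
CREATE    = 32
DELETE    = 65536
SHARE     = 262144
ASSIGN    = 524288

def rights_mask(*rights: int) -> int:
    """Combine individual access rights into the single mask stored on
    the access team template."""
    mask = 0
    for r in rights:
        mask |= r
    return mask

read_write = rights_mask(READ, WRITE)  # a typical read/write template
```

A "Device Orders – Read Only" template, for example, would carry just the `READ` flag, while a read/write template combines `READ` and `WRITE`.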
Business Units
Business units (BUs) organize users and teams into a hierarchy that enables
cascading permissions based on who owns records. For instance, you might want to
allow a team to work on records that are submitted by others on their team, or a manager
to view all the records submitted by their subordinates’ team members. Assigning your
users and teams to business units, and organizing those business units into logical
hierarchies, can reduce the need for users to manage permissions for their work and
reduce the opportunity for data being accessed or modified by the wrong people.
There are two distinct types of security structures to be aware of when planning your
design: hierarchical and matrix.
However, it is not possible to give a user in the Contoso BU access to ONLY Division
A contacts and not Division B records, because the configuration for roles applies to ALL
child BUs, not specific child BUs. If this is a requirement for your implementation, you
will want to look at the other permission model, which is called a matrix model.
In Figure 4-25, you can see that a user is assigned to Division 1. At this level, if a user
is given Parent:Child Business Units scope for permissions for the Contact table, they
will be able to operate on contacts within Division 1.
If the user were to be reassigned to the Contoso, or parent, business unit level, then
the permissions would allow them to operate on Contoso contacts, as well as all the
contact records owned by the child business units of Contoso (Figure 4-26).
Figure 4-26. User with parent business unit assigned, with cascading permissions
If you were to keep the user at the same level, but change the permission scope to be
Business Unit instead of Parent:Child Business Units, they would only be able to operate
on records owned by the Contoso business unit and nothing below it. An illustration of this
concept can be seen in Figure 4-27.
Figure 4-27. User with parent business unit assigned, without cascading
permissions
Record Inheritance
Record inheritance allows permissions to be passed from a parent record to a child
record across the relationships between them. For instance, you may want a user who
has edit rights to a project record to automatically have edit rights to the activities that
are related to that project, but not activities that are unrelated to that project. This can
make administration a lot easier: you set permissions on the parent record, and the
child records inherit them automatically. It can also make your data more secure,
because it avoids sharing records at different levels, which creates segmented
permissions that are difficult to audit. In general, the more you can rely on a
well-structured security model, the less users need to share records directly, and the
less your permissions become segmented.
While there are cases where removing the parent records is desired, such as when
a record has more than one parent record and you want to keep the record for the
sake of the other relationship, it is wise to keep your data clean by avoiding orphaned
records caused by removing a parent record but not the child. Depending on how you
have the relationships set up, you may or may not be able to delete the parent record
while there are child records assigned to it. If you have configured the relationships to
not be parental, or not restrict delete, then you will be able to delete the parent records
without first deleting or reassigning the child records. This is why it is good to have one
relationship set as Parental or configured to restrict delete when there is a practical
parent/child relationship between tables, instead of simply a referential relationship.
This is another reason it is best to plan your data structure carefully, so you don’t run
into this issue down the road. Fortunately, there are ways around this, as follows:
An example of why you may want multiple parent tables for a record is illustrated
in Figure 4-30. Here you can see how a feature may be related to a release, but also be
related to a product. The product may be associated with feature 1 and feature 2, but not
feature 3, while all three features are associated with release 1. In this model, both the
product table and the release table are parent tables to the feature table, as can be seen
by the direction of the one-to-many relationships. However, only one of the relationships
can be parental due to circular logic restrictions.
Hierarchy Security
Hierarchy security allows users to be organized hierarchically so that a user has access
to the data of the users below them in the hierarchy. This feature can be used in
conjunction with other security features to further automate the security model,
primarily by reducing the number of business units required to organize data.
Note Users who have their account status set as Disabled are excluded from the
hierarchy security model, and their records will not be available to the users higher
in the hierarchy.
There are two types of hierarchy security models: manager and position. With
a manager hierarchy, the manager must be in the same business unit as their direct
report (employee), or within the parent business unit of the direct report’s business unit,
whereas with a position hierarchy, data can be accessed across business units. Deciding
which method works for you will depend on whether you want to segment your data by
business unit.
Manager Hierarchy
The manager hierarchy utilizes the Manager field on the system user table to dictate
the reporting structure. Each user can have one and only one manager, but a manager
can have one or more direct reports. Within this model, a manager will have access to
the data to which their reports have access. This includes records that are owned by
or shared with the direct report or a team that the direct report is a member of. The
manager is also granted read, write, append, and append-to privileges for any records
to which their direct report has access. This allows managers to not only see what work
their direct report is doing, but also perform work on behalf of their direct report.
Note For a manager to have access to a direct report’s records on a table, the
manager must have at least read-level access to that table, or they will not be able
to see the records of their direct report.
For non-direct reports in the same management chain, the manager has read-
only access to the data of the users below them. This can be further explained using
Figure 4-31 as an example.
In this example, the support manager will have read, write, append, and append-
to rights on any records that either support supervisor has access to. However, support
supervisor 1 will have read-only access to the records of the hardware support team users
and application support team users. The same applies one layer up, where
the director will have read, write, append, and append-to rights on both managers’
records but read-only access to the supervisors’ records.
There is also a depth setting in the hierarchy security settings that allows you to
specify how deep the permissions go. For example, if the depth setting is set to 2, then
the director will have access to the manager- and supervisor-level users’ records, but not
the support- and sales-level records.
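The depth rule can be modeled as a walk down the reporting tree. The following is an illustrative Python sketch, not how Dataverse computes access internally; the org chart and role names are hypothetical, loosely mirroring Figure 4-31:

```python
def visible_reports(reports: dict[str, list[str]], manager: str,
                    depth: int) -> set[str]:
    """Users whose records `manager` can reach, walking at most `depth`
    levels down the Manager-field reporting tree."""
    visible: set[str] = set()
    frontier = [manager]
    for _ in range(depth):
        # Descend one level: collect the direct reports of the current layer.
        frontier = [r for m in frontier for r in reports.get(m, [])]
        visible.update(frontier)
    return visible

org_chart = {
    "director": ["support manager", "sales manager"],
    "support manager": ["support supervisor 1", "support supervisor 2"],
    "support supervisor 1": ["hardware support user"],
}
```

With a depth of 2, the director reaches both managers and both supervisors, but not the hardware support user three levels down.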
Position Hierarchy
Position hierarchy is not based on reporting structure, but rather on positions that are
set up and assigned to users. The positions are arranged in a hierarchy, and the users
that are assigned those positions inherit permissions through the hierarchy of those
positions, as seen in Figure 4-32.
A user can have only one position, but a position can be assigned to multiple users.
Similarly, a position can have only one parent position, but multiple positions can be
assigned to a single parent position. This allows for building a pyramid-style structure
like that of the manager hierarchy model. However, unlike the manager hierarchy model,
since users are not reporting to other users, users can be organized across multiple
business units. This allows for a more flexible security design for users to work in a
structure, like a support team where a manager can have team members working across
multiple business units.
If you compare Figure 4-31 and Figure 4-32, you will see that the same users are
structured differently. This is because in a position hierarchy, users are structured
using positions, not reporting managers. This means that all the support supervisors
are lumped together and the hardware support team reports to the group of support
supervisors instead of a specific support supervisor. The same applies to the sales team
and sales supervisors because they have been “tagged” with the same position.
The permissions cascade is the same as the manager hierarchy, where the director
has read, write, append, and append-to rights on both managers’ records, or any records
of their associated teams, but read-only rights on the levels below that to the extent the
depth setting allows.
Because positions are not assigned to business units and users are not assigned
to report to a specific user, it is easy to see how access can be organized across
multiple business units. An application support team can have users spanning multiple
business units, and the support supervisors will retain inherited permissions to all the
records accessible to users in positions below their own.
Next, you will set up the basics of the hierarchy security, as seen in Figure 4-34.
• Select tables to exclude from the hierarchy: Move any tables that
you do not want to be included in the hierarchy security model to the
“Excluded Tables” section. Otherwise, all tables are included in the
model by default.
Next, you will want to configure your hierarchy model. The method of configuration
for this depends on the model you choose.
Note If you do not see the Manager field in this view, click the Edit Columns
button and add it to the view.
Here, you will either select each user’s manager directly, or populate the Manager
field in bulk using an Excel template, which you can download and upload via the Excel
Templates button in the top control bar, as seen in Figure 4-36.
Additionally, you can populate this field using flows, apps, or various other means;
essentially, the Manager field must be populated for any user who is going to
participate in the hierarchy security.
To add a new position, select “+New” from the top command bar. A new form opens,
as seen in Figure 4-38.
Enter a name for the position and select a parent position from the dropdown. You
can also add a description to help identify the purpose of the role.
After clicking Save in the top control bar, the form will change to allow the addition
of users to the position, as seen in Figure 4-39.
Once you are done with the configuration of this section, be sure to click Save and
Close from the top control bar of the hierarchy security window to apply your new
security model.
Once the column settings flyout opens, select “Advanced Options” and check
“Enable Column Security,” then click Save to apply changes, as seen in Figure 4-41.
This enables the column for column-level security but does not yet specify who can
interact with the column.
A dialogue will open asking for name and description, as shown in Figure 4-43.
Note The term field security profile has been replaced by column security profile
as Microsoft has moved to their Unified Interface model. Until the migration is
complete, the terms can be used interchangeably.
Once the Name and, optionally, Description, fields have been filled out, click Save in
the upper control ribbon to create the security profile, as seen in Figure 4-44.
This will open a Look Up Records dialogue where teams can be searched for and
added to the security profile. Simply select the teams you wish to add by checking the
box next to each, then click the Select button, as seen in Figure 4-46. Once you are done,
select “Add,” and the teams will be added to your security profile.
To add individual users to the security profile, follow the same steps as outlined for
adding teams, but select “Users” in the first step instead.
Select the row of the field you wish to adjust permissions for and select “Edit,” as
seen in Figure 4-47. A dialogue will open to edit the field security settings of the selected
field, as seen in Figure 4-48. Here, you can configure whether the members of the
security profile can read, update, and/or create data in the field. To add another column
to the security profile, simply select another field and configure it like the previous one.
A single profile can have multiple columns configured in it, and a single column can be
configured for multiple security profiles. This is another reason to use the description
field when creating profiles—so you can understand the implications of applying a field
security profile to a user/team and easily troubleshoot security issues if they arise.
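For automation scenarios, each row in this grid corresponds to a `fieldpermission` record linked to the profile. Below is a hedged Web API sketch (v9.2 endpoint assumed; the organization URL, profile GUID, and the `account`/`revenue` names are placeholder examples) of the body POSTed to `fieldpermissions`:

```python
ALLOWED, NOT_ALLOWED = 4, 0  # picklist values used by the fieldpermission table

def field_permission_payload(org_url: str, profile_id: str, table: str,
                             column: str, *, read: bool = True,
                             update: bool = False, create: bool = False) -> dict:
    """Body for POST {org_url}/api/data/v9.2/fieldpermissions, granting a
    secured column to a column (field) security profile."""
    return {
        "entityname": table,
        "attributelogicalname": column,
        "canread": ALLOWED if read else NOT_ALLOWED,
        "canupdate": ALLOWED if update else NOT_ALLOWED,
        "cancreate": ALLOWED if create else NOT_ALLOWED,
        "fieldsecurityprofileid@odata.bind":
            f"{org_url}/api/data/v9.2/fieldsecurityprofiles({profile_id})",
    }
```

One such record per secured column mirrors the per-field rows you configure in the dialogue.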
Auditing
Dataverse’s auditing capabilities are extensive and allow you to monitor actions such
as system access, setting updates, and data changes. The auditing capabilities are
also highly customizable so you can adjust what is audited and how those logs are
maintained.
Enabling Auditing
By default, an environment’s auditing is disabled, so no audit data is collected. To enable
auditing, go to the Power Platform Admin Center and select your environment to open
the environment settings summary screen. From there, find the Auditing section, as seen
in Figure 4-49.
You can see the status of the environment’s auditing settings here. If Auditing
Enabled is set to No, select “Manage” to open the Auditing settings for the environment.
This will open a new form, as seen in Figure 4-50, where you will be able to enable and
configure auditing.
• Retain These Logs For: Specifies how long the audit logs will be
retained. Since there is a cost to all data storage, including audit logs,
it is good to evaluate how long you want audit data to be retained in
your system.
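Once collected, audit entries land in the audit table and can be queried like any other data. A hedged sketch of building an OData query against the Web API (v9.2 endpoint assumed; the organization URL is a placeholder) for the newest entries touching one table:

```python
def recent_audits_url(org_url: str, table: str, top: int = 10) -> str:
    """OData URL returning the newest `top` audit rows whose audited
    record belongs to `table` (e.g. 'account')."""
    return (f"{org_url}/api/data/v9.2/audits"
            f"?$filter=objecttypecode eq '{table}'"
            f"&$orderby=createdon desc"
            f"&$top={top}")
```

Issuing a GET against this URL with a bearer token would return the same entries the Audit Summary View displays.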
Note This interface looks a bit different because it still uses the legacy
Dynamics 365 design; not all features have been migrated to the new Unified
Interface as of the writing of this book.
Once in the advanced audit settings screen, you will be able to access advanced audit
log configuration settings, as well as the audit summary view, as seen in Figure 4-52.
Selecting “Audit Summary View” displays a list of all the audit log entries for your
environment. To filter the entries, select “Enable/Disable Filters,” as seen in Figure 4-53.
Note Only the legacy audit log interface is discussed in this section because the
new Purview integration is not yet available as of the writing of this book.
If auditing is not enabled for the environment, a warning will appear below
the checkbox stating that no data will be audited until auditing is enabled for the
environment, as seen in Figure 4-55. The option can still be enabled, but no audit data
will be collected until environment-level auditing is enabled.
Summary
As you can see, there are many ways to apply security to the Dataverse data, and the
ability to layer those security methods can create a very complex array of security
options. This can be a great asset in your toolbox for building solutions, but can also be
very troublesome if it is not configured correctly. This is why it is important to always
plan your architecture carefully and document as you go so your solutions are robust
as well as resilient. Next, you will learn how to display the data to your users by building
reports and apps.
CHAPTER 5
Presentation Layer
In the previous chapter, we discussed the Security Layer and how to utilize the advanced
security features in Dataverse, such as Teams, Business Units, and auditing, to ensure
data is secure. In this chapter, we discuss the Presentation Layer, which consists of
various methods of presenting and interacting with your data. In Power Platform, there
are several tools with which to do this, such as Power Apps, Power BI, and Power Pages.
Designing an interface for your data is one of the more challenging aspects of designing
a solution because it is where users interact with data. It needs to be built in such a way
that users can intuitively find and navigate data structures. In my experience, this part
of the process involves numerous iterations to find a balance between functionality and
what makes sense to users.
The tool you choose depends on the audience, user account type, type of data
interaction, and various other factors. Choosing the correct tool, or tools, for the job is
imperative not only to build an effective tool, but also to ensure user adoption is smooth
and successful.
Apps
Applications, typically referred to as apps, are visual representations of data. Most
end users do not understand terms like schema, foreign key, and attribute, and they
should not have to. You would never expect a project manager to go into a database
and start updating the project status by writing SQL statements, or a director to look at
project budgets by running queries in SQL Server Management Studio. Neither of these
users would likely have the skill set or bandwidth to do such things. Also, to do this type
of work, the user would have to know the tables, relationships, attributes, and so on
to even start. This is why we have apps—to make accessing and interacting with data
simple and intuitive. In fact, I use the amount of training and support required to gauge
how effective an app is. If users can go into an app and quickly learn how to use it and
get the information that they need, then it is a good app. If extensive training is required
and users struggle navigating the menus to find the information they need, then it is a
bad app.
In Power Platform, there are multiple tools for developing apps, and each has its
advantages and disadvantages, so it is good to familiarize yourself with each type to ensure
you have the best tool for the outcome you seek. Like the rest of the platform, though, it is
common to mix tools. Often, a canvas app is deployed in parallel with a model-driven app,
each with a different target audience. I have often developed a model-driven app for
administrative or advanced users who need very granular data manipulation but are not
concerned with a “pretty” interface, alongside a canvas app for basic users and executives
who need to quickly get what they need from the app without seeing all the details, as seen
in Figure 5-1. However, there are many reasons to use each type, and many capabilities
do overlap.
Canvas Apps
Canvas apps allow the development of apps with nearly unlimited customization
capability. While model-driven apps do allow for some customization and theming,
canvas apps give pixel-perfect control over nearly every aspect of the app design,
allowing layouts, colors, sizes, and fonts to be fully customized to meet the needs of
your customers. Any limitations in the native app development tools can be
overcome using full-code components that can be integrated into the app.
Once the app has opened, there will be a single blank screen. Being familiar with the
Power Apps Studio layout for canvas apps will help you be more efficient while building
your apps. Figure 5-5 maps out the various sections of Power Apps Studio.
To get started, add a data source by selecting “Add Data” from the command bar.
Select a Dataverse table, and it will be added to your app. The added table will also move
from the Tables section to the In Your App section, as seen in Figure 5-6.
Note Since Dataverse is the native data source for Power Apps, the tables in the
environment are automatically connected to the app and available to be added
without creating a connection. Other data sources require the additional step of
setting up the connection, which varies depending on the connector.
Once you have a data source in your app, you can begin adding controls to interact
with the data. The most common control in an app is the form control, since it
allows you to easily interact with data and requires little configuration to be functional.
To start, select “Insert” from the top command bar, and select “Edit Form,” as seen in
Figure 5-7.
Note You can also access the control list by selecting “Insert” from the app
authoring menu and selecting “Edit Form” from the app authoring options list. As
the Power Apps tool has evolved, the interface has evolved, so you will see multiple
options for performing the same task in a few places.
Once you have added the form control to the page, you will see a square that says,
“This form is not connected to any data yet. Connect to data.” Select the form control,
and the Properties pane on the right will display the list of properties for the control.
Select the “Data Source” menu from the Properties pane and select the table you want
the form to interact with, as seen in Figure 5-8.
Once you have selected the table, Power Apps will begin setting up your form by
adding fields to it. If you want to adjust the fields by adding or removing fields, select
“Edit Fields” from the Properties pane, as seen in Figure 5-9.
The next step is to set the form mode to New instead of the default of Edit since we
want to create a new record. To do this, select the form control, then select “Advanced”
in the Properties pane. Find the “Data” section and start typing “FormMode.New” in the
DefaultMode field. As you type, you can see that predictive text suggests text to complete
the formula, which makes it much easier to format your formulas correctly, as seen in
Figure 5-10.
Note You can also use the formula bar by selecting “DefaultMode” from the
properties list then typing the formula into the formula bar.
A form control is actually a group of controls rather than a single control. You can see
the layers of controls in the tree view of the App Authoring Options pane after selecting
the tree view app-authoring menu, as seen in Figure 5-11. This view is especially useful
for seeing how your controls are layered and grouped on the page.
Each control has its own set of properties and can be configured independently;
Power Apps configures them automatically when inserting a form to make the
process faster.
As you can see in Figure 5-11, each control has a unique name. These names matter
because formulas reference them, so rename controls to something meaningful by
selecting the ellipsis next to the control in the tree view.
Note Using a naming convention for your controls helps to keep things
organized. I prefer to use three letters that refer to the type of control, an
underscore, then a description of the control, such as frm_NewAccountForm for a
new account form control.
The next step is to add a button control to submit the form. To do this, add a Button control and place it where you want it on the page. You can change the text on the button, along with many other aspects of its appearance, in the Properties pane for the button. To make the button save the form, select the button, select “Advanced” in the Properties pane, and type “SubmitForm(Form1)” in the OnSelect field, as seen in Figure 5-12. This will submit the data as a new record to the accounts table.
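Sketched out, the button’s OnSelect and a typical follow-up look like this; the OnSuccess handling is an optional pattern, not a required step:

```powerfx
// OnSelect property of the submit button (Form1 is the default form name):
SubmitForm(Form1)

// Optionally, in Form1's OnSuccess property, reset the form and confirm:
ResetForm(Form1); Notify("Record created", NotificationType.Success)
```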
Now that you have a functional app that creates new records, simply click Save and
Publish in the app actions menu.
Note Since the app is only an interface to the data layer, any users will still
need the appropriate permissions to perform data operations. Refer to Chapter 4
for guidance on how to manage the security layer to ensure your users have the
appropriate permissions in place.
Components
To get started with building a component library, go to your environment, select “Apps”
from the left-hand navigation bar, and select the Component Libraries tab under the
Apps section. To create a new library, select “New Component Library,” as seen in
Figure 5-13. Give your library a name and click Create.
Building a component is like building a canvas app except each “page” is a separate
component. Start by adding your controls to the screen, then customize them as
necessary. Selecting the component level in the tree view on the left will display the
Properties pane for the component on the right, as seen in Figure 5-14. Here you can
change the default height and width of the component.
The Properties pane also has a Custom Properties section, which allows you to set inputs,
outputs, and actions for your component. You can think of components as functions
in programming where you pass in a value, something happens, and it passes back a
value. The scope of the controls in a component is the component itself. So, if you want
to access the values of a date picker in a component, you must set up an output property
with the value of the date picker.
To set up a custom property, select “New Custom Property” to display the New
Custom Property pane, as seen in Figure 5-15.
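As a sketch, suppose the component contains a date picker named dtePicker and you define an output property called SelectedDate (both names are hypothetical):

```powerfx
// Inside the component: the formula of the custom output property
// "SelectedDate" simply returns the inner control's value:
dtePicker.SelectedDate

// In the consuming app: read the value through the component instance,
// e.g., in a label's Text property:
Text(cmpDateFilter_1.SelectedDate)
```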
Data Connections
Another differentiating factor between model-driven and canvas apps is that the latter
allow the addition of data sources other than Dataverse. Canvas apps allow you to
connect to hundreds of data sources across many vendors. This allows a single app to
integrate data from different systems and tie that data together in a single interface.
This can simplify tasks for users as they can use a single app to accomplish tasks across
multiple systems instead of having to switch between systems.
This integration is accomplished through connectors, which are essentially just
wrappers built around an application programming interface (API) that allows one
system to talk to another. Connectors will be covered in Chapter 6 in greater depth.
Model-Driven Apps
Model-driven apps are designed and built around Dataverse. This means that if you
want to integrate with external data, the use of virtual tables in Dataverse is your best
option. These apps also cannot be customized as much as canvas apps, because they
are built around the Microsoft Unified Interface design. However, while model-driven
apps lack flexibility, they more than make up for it in other areas.
Because model-driven apps are built specifically for Dataverse, they work very
well with Dataverse. Field security profiles, covered in Chapter 4, automatically flow
through to the controls in model-driven apps so a user who cannot edit a field on a
record will see the control in read-only mode. Business rules, covered in Chapter 3, also
pull through natively to the model-driven controls so dynamic filtering and conditional
visibility of fields can be built using business rules and be applied across multiple apps
and tables at once. Even business process flows (BPFs) pull into model-driven apps
natively and provide a clean, simple, and functional interface.
Once your app is created, you will start adding pages, navigation, data, automation,
and so forth. But, before you start doing that, it is important to become familiar with the
layout of the model-driven app designer. Figure 5-17 outlines the basic sections.
3. Asset Pane: Displays the assets associated with the selected layer
of the app in the navigation pane
Navigation Elements
There are layers of navigation elements in model-driven apps that help you organize
your app and make it more intuitive as well as target your interface to the audience. The
elements are as follows:
• Subarea: These are active navigation elements that take you to pages
for tables and dashboards, as well as web pages and resources. This is
how a user would navigate, for instance, from a table to a dashboard.
Note By default, areas are disabled. If you wish to use areas in your app, select
“Navigation” from the Navigation pane, select “Navigation Bar” from the Assets
pane, and check the box next to “Enable Areas” in the Properties pane.
As you can see in Figure 5-18, the groups and subareas map directly to the structure
shown in the app preview, but the areas are selected using a menu at the bottom of the
navigation.
Select Navigation from the Navigation pane, and you will see that the basic
navigation structure has already been built for you, as seen in Figure 5-19. This is the
beginning of the navigation structure that will be displayed in your app.
Select a subarea, and the Properties pane will display the configuration options
for that subarea. Select “Table” from the Content Type dropdown, select a table from the
table dropdown, and enter a name for the section in the Title field; you will see that the
app is updated with an interface for a table using the Unified Interface format, as seen in
Figure 5-20.
As you continue to add navigation elements to the navigation list, you will see them
reflected in your app, as seen in Figure 5-21, so it is important to plan this grouping out.
It should be logical for users to navigate and find the information they seek.
Pages
Pages are a category of subarea and include Dataverse table, dashboard, and custom
pages. While the first two are self-explanatory, the third one is a bit more obscure.
Custom pages refer to canvas pages that are built specifically to integrate with
model-driven apps. They provide a crossover between the two app types and allow you
to leverage the layout and connector options of canvas apps, but within the context of
a model-driven app. Since custom pages are a separate development effort, refer to the
“Custom Pages” section in this chapter to read more about them.
Just as with other security roles, you can assign this role to a user or a team that the
user is a member of to apply it.
Once the user has permission to run model-driven apps in general, you can then
share the specific model-driven app with either a team or a user, just as you would
share a canvas app. However, with model-driven apps, you can also share the app with a
security role directly, as seen in Figure 5-23, by selecting the ellipses next to the app that
you wish to share, selecting the app in the Share pane, then selecting the security role
you wish to share the app with.
This method gives you an additional option for security management that is not
available in canvas apps, and it makes sense here because the security role already
dictates access to this type of app.
Forms
Since model-driven apps are made to integrate with existing layers of the platform, you
customize views and forms not in the app, but rather in the table settings. This has the
advantage that you can configure these items in one place and utilize them across
several apps. All updates to the forms and views also propagate to every app they
are used in, which makes updating apps efficient. In addition, the forms and views
are leveraged in the native Dataverse interfaces for managing records.
Note While views are accessible within canvas apps in the form of a query, there
is no integration of native Dataverse forms in canvas apps. This means that any
forms created in a canvas app need to be built in that app, or copied and pasted
from another canvas app.
Form Types
There are four types of forms that can be used in model-driven apps, as follows:
You can select which forms are available to your app by adding or removing
them from the In This App section of the Form Properties pane. Select “Pages” in the
Navigation pane, expand the table in the Assets pane, then select the table you want to
operate on. Then, adjust the forms in the Properties pane, as seen in Figure 5-24.
Editing Forms
To edit a form, you can navigate to your table settings in Dataverse and select the form to
manage, as discussed in Chapter 2. Alternatively, select “Pages” from the Navigation pane,
select the table whose forms you want to edit, expand the options for your table in
the Assets pane, select the form, then select the ellipses and “Edit in new tab,” as
seen in Figure 5-25. This will open a new window where you can edit the form.
Figure 5-25. Launching the form designer from the model-driven app designer
The form editor makes updating forms very easy with drag-and-drop functionality
and simple conditions.
The basic layout of the form editor can be seen in Figure 5-26 and includes the
following:
6. Preview Size Switcher: Since forms are reactive, they can appear
different depending on the screen size. This allows you to test
those changes.
Editing Fields
To add fields to the form, select the table columns from the Navigation pane and drag
the required fields from the Asset pane onto the form canvas, placing each one where
you want it. All changes to the canvas will be reflected in the tree view, regardless of
whether the fields are visible or hidden.
Selecting a field will change the context of the Properties pane to that field so you can
adjust properties such as label, visibility, and size. To build logic into your forms, you can
utilize business rules, as discussed in Chapter 3, or events, which are custom JavaScript
code segments. These are visible in the corresponding tabs in the Properties pane.
Certain types of fields also have alternate control types that you can use to change
the look and feel of your app. For instance, some developers prefer to display a toggle
instead of a dropdown, or a set of buttons instead of a multi-select combo-box dropdown
control. This can be accomplished in the Components section of the Properties pane, as
seen in Figure 5-27.
Once you select a component, you will select which interfaces (Web, Mobile, and/or
Tablet) it will be displayed on, as not all controls are optimal for all devices.
App Layout
Your form’s layout is not restricted to a simple series of rows and columns of fields. Here
is a list of some commonly used components in forms that organize data into logical
groups so users can quickly find the information they are looking for:
• Quick View: Display a set of key fields from a related record, such
as name, email, and department for a user record selected for the
project sponsor.
Tip Users tend to not like excessive vertical scrolling or an excessive number
of tabs, so it is important to not lean too heavily on either method; try to find a
balance of the two. It also helps to keep field width and height to a minimum to
make the most of the form’s visible space.
To add any of these components, simply select them from the Navigation pane, then
select or drag them to place them on the form canvas.
When you are done working on your form, click Save and Publish; the form will
automatically be updated anywhere that form is used.
Custom Pages
Custom pages allow a crossover between model-driven and canvas apps so that the
design elements and data sources from canvas apps can be incorporated into model-driven
apps. Custom pages run in the model-driven app as a subsection, like a table or a
dashboard, so it is easy for a user to switch between the contexts in the single navigation
bar of the model-driven app.
As you can see in Figure 5-29, the custom page uses the Power Apps Studio discussed
in the “Canvas Apps” section of this chapter. However, there are additional options in
the development canvas area that are specific to pages. There are also no options to add
pages, as a custom page is limited to a single page.
Tip Refer to this section to become familiar with the Power Apps Studio if you
are not already.
One of the neat features available in this section is that you can select the With Data
template and select a table, and the pages designer will create a standard interface based
on that table. From there you can continue to customize the page as you would any other
canvas app, then save and publish.
The next screen will give you the option to either create a new page or use an existing
one, as seen in Figure 5-31.
Note As you can see in Figure 5-31, you can create a new page from the app
as well, but I typically lean toward creating new assets in the Solution pane to be
consistent.
Reports
While apps are primarily meant to edit data, reports are meant to read it. Because
reports focus on reading data, they contain visualizations that tell a story and
interpret the data in a way a form in an app would not.
Dataverse Reports
The native Dataverse reports allow some basic reporting capabilities, primarily focused
around displaying data in a list or hierarchy view, as seen in Figure 5-32.
To create a new Dataverse report, navigate to your solution and select New ➤ Report.
A new window will open with the report builder, as seen in Figure 5-33.
In the Source section, select “Run Report Wizard” from the Report Type dropdown.
The other options in this list are for importing an existing report file or pointing to
a web page.
Next, click the Report Wizard button to start building your report. A new window will
open, as seen in Figure 5-34. The first screen allows you to select “Start a new report” or
“Start from an existing report.” Often, you can start with existing reports and modify
them for your needs. However, in this case we will create a new report, so select
“Start a new report” and click Next.
Once you get to the Report Properties window, you will see several options to begin
your report, as seen in Figure 5-35.
• Report Name: Give your report a name that will describe what it
does. This name should be short and specific as it will show up in the
list of reports in the model-driven app.
• Primary Record Type: The primary table that your report will be
based on
As you can see in Figure 5-36, this filter will return only records where the User field
is equal to the current logged-in user and the status of the record is Active.
Note Since Status is an Option Set value, you will select the ellipses next to the
field value to choose the appropriate option, as an Option Set cannot be queried
by a text value.
You can create more complex queries by selecting the “Group AND” or “Group OR”
buttons to group conditions with And / Or logic.
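Behind the scenes, Dataverse report queries are expressed in FetchXML. A sketch of a filter equivalent to Figure 5-36 might look like the following; the attribute names are placeholders, not the book’s actual schema:

```xml
<filter type="and">
  <!-- "eq-userid" matches the currently logged-in user -->
  <condition attribute="new_user" operator="eq-userid" />
  <!-- statecode 0 is the Active state on most Dataverse tables -->
  <condition attribute="statecode" operator="eq" value="0" />
</filter>
```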
Once you are done setting up your query parameters, click the Next button to move
to the Lay Out Fields screen, as seen in Figure 5-37.
Selecting “Click here to add a grouping” opens a new dialogue to select a field for the
selected level, as well as properties of that field, as seen in Figure 5-38.
Selecting “Click here to add a column” will bring up a similar dialogue to select the
field that will show up in the record row.
Tip Since the grouping already contains information about the record, you usually
do not need to add those same columns to the record row, as that would be duplicative.
Once you have finished configuring the settings for your report layout, click Next to
move to the Format Report window, as seen in Figure 5-39.
The basic report will have Table Only selected. However, if you have summary
columns set up in the Lay Out Fields screen, you can select the option to add a chart to
your report.
Click the Next button to view the report summary, then click the Finish button to
close the wizard. You will notice that the fields on the Report Builder screen are now
filled out. Simply click the Save and Close button and your report is done.
Once your report is saved, it will show up in your model-driven apps when the
appropriate table is selected, as seen in Figure 5-40.
Now, users can run formatted reports right from the context of the tables they are
looking at in the model-driven app, which can reduce the number of tools that they need
to manage.
Power BI Reports
As I said before, Dataverse reports are fairly limited in the amount of customization that
can be done to them. If you have more complex reporting requirements, Power BI can be
integrated into your solutions and apps as well.
While how to use Power BI is beyond this book’s scope, adding Power BI reports to
Dataverse is simple.
Navigate to your solution, select Add Existing ➤ Analytics ➤ Power BI Report, as
seen in Figure 5-41.
After adding your report to the solution, you can also add the associated Power BI
dashboards by navigating to your solution and selecting New ➤ Dashboard ➤ Power BI
Embedded, as seen in Figure 5-42.
A dialogue will open where you can select a Power BI report or Power BI dashboard
to add to your solution, as seen in Figure 5-43.
After you add these assets from Power BI, you can add them to your apps the same
way you add other dashboards to your app.
This allows you to build far more complex reports and dashboards to add to
your apps.
Summary
In summary, the presentation layer allows users to interact with data in meaningful and
intuitive ways. Apps allow the reading and editing of data at a record level, and reports
and dashboards allow analysis of the data. Both are important in the scope of business,
and the more you can merge the two methods, the better the experience your users will
likely have. Next, you will learn how to integrate data from other systems to allow your
systems to work together and reduce or eliminate data discrepancies.
CHAPTER 6
Integration with
Third-Party Tools
In the previous chapter, we discussed the presentation layer and how to make your data
useful to end users through apps and reports. In this chapter, we will discuss integrating
your solutions with third-party data to create seamless processes that simplify work
across teams and tools. Through the use of connectors, dataflows, and virtual tables,
you can adapt or extend legacy tools more easily than ever before. These features help
you stay agile as your business changes without requiring teams to move off of tools
that work for what they do now.
Connectors
Connectors are essentially wrappers built around an application programming interface (API)
that allow one system to talk to another. Connectors consist of two components: actions
and triggers.
• Actions: Requests to perform operations on data, such as create, read, write,
and delete.
• Triggers: Notifications that an event has occurred. There are two types
of triggers: push and polling. Push triggers listen to an endpoint until an action has
occurred, such as a file being uploaded or a record being updated. Polling triggers run
on a schedule, such as every day at 1 AM.
Connectors are available for use in Power Apps, Power Automate, and Azure Logic
Apps, though the availability of actions and triggers may differ depending on which
product you are using.
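For example, once a connector is added to a canvas app as a data source, its actions are called like functions in Power Fx. Here is a sketch using the Microsoft-published Office 365 Users connector (the variable name is hypothetical):

```powerfx
// OnSelect of a button: call a connector action and store the result
Set(varMe, Office365Users.MyProfile())

// A label's Text property could then show: varMe.DisplayName
```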
It is also important to be familiar with the various categories of connectors available
in Power Platform and understand where each one stands with regard to source,
ownership, and licensing.
© Brian Hodel 2023
B. Hodel, Beginning Microsoft Dataverse, https://doi.org/10.1007/978-1-4842-9334-8_6
Out-of-the-Box Connectors
Microsoft provides an extensive list of connectors certified for use in Power Platform.
Microsoft Published
Microsoft-published connectors are created and published by Microsoft. This includes
connectors for many Microsoft systems as well as many non-Microsoft systems. This differs
from Verified Publisher connectors that are created by a third party and verified by Microsoft.
Verified Publisher
Verified-publisher connectors are connectors that were built by third parties that own
the underlying service they are integrating with. There is an extensive review process
involving testing by both Microsoft and Microsoft partners before a verified-publisher
connector is published to the platform. After that initial review, any updates are also
subject to a review process. This is to ensure that any verified-publisher connectors are
stable and reliable to be used in production systems.
Independent Publisher
Independent-publisher connectors are a new concept on the platform and allow third
parties who do NOT own the underlying service to develop new connectors. This adds a
new stream of development that has increased the availability of integration tools between
Power Platform and other services. It is also a fantastic way to contribute to the community.
If you have a need to integrate with another system, it is highly likely that others do as well.
So, if you are going to build a new connector, you might as well share it on the platform.
Custom Connectors
While the Power Platform has hundreds of connectors available Out-of-the-Box (OOB),
it also provides the ability to create Custom Connectors to extend those capabilities.
While you can make API calls from Power Automate flows using the HTTP actions, you
cannot make them directly from apps unless you create a custom connector or invoke
a flow. In addition, API calls made from within flows need to be configured individually
and can be time-consuming. Custom connectors allow you to create a single place where
the code and authentication are managed. Once you have done that, you simply add that
connector to your apps and flows and ask it to perform the operations you need, just like
any other connector.
The easiest way to create a custom connector is to create one from an existing
Postman collection or an OpenAPI definition. These are two standard API definition
formats that can be downloaded from many service providers’ websites in their
developer resources sections, or found by searching Postman for existing collections
of API calls.
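For context, here is a minimal sketch of what an API definition looks like in OpenAPI 2.0 (Swagger) format, which custom connectors can import; the host and operation are made up for illustration:

```json
{
  "swagger": "2.0",
  "info": { "title": "Sample Service", "version": "1.0" },
  "host": "api.example.com",
  "basePath": "/v1",
  "schemes": [ "https" ],
  "paths": {
    "/items": {
      "get": {
        "operationId": "GetItems",
        "summary": "List items",
        "responses": { "200": { "description": "OK" } }
      }
    }
  }
}
```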
Once you have the API definition file downloaded, navigate to your Power Platform
environment, expand the Data section on the left-hand navigation bar, and select
“Custom Connectors,” as seen in Figure 6-1.
Here you will see all the custom connectors, if there are any, and can create new ones
by selecting one of the options in the list, as seen in Figure 6-2.
As you can see, there are many ways to begin creating a connector, but importing an
existing definition is the easiest method. In this example, I will show the process of importing
an existing Postman collection, as it is one of the most common methods. Postman is a great
tool for building, editing, and testing APIs. If you haven’t used it before, I highly recommend
you download it and begin learning how to use it (https://www.postman.com/).
After selecting “Import a Postman collection,” you will be asked for the name of
the connector and to upload the Postman collection that you previously downloaded.
Then click Continue, and the collection will be imported, generating the structure of
your connector.
Once the custom connector page opens, you will see there are five tabs dedicated to
configuring the connector, as follows:
• General: Description and base API settings, as well as the option to
connect over an on-premises data gateway.
• Security: Authentication settings used when connecting to the API.
• Definition: Contains the API call definitions that will allow you to
request and post data.
• Code (Preview): Allows running of code along with the API call.
• Test: Allows testing of API calls prior to adding the connector to your
flows and apps.
The General tab contains the base configuration for the connector and needs to
be filled out if it was not already populated during import, as seen in Figure 6-3. The
elements are as follows:
• Icon and Description: Load an icon, color, and description for the
connector to help identify it to users.
• Scheme, Host, and Base URL: Root connection parameters for the
API calls.
The Security tab contains multiple authentication methods and may need some
additional details if the API call requires authentication. In this example, we are using
an API Key authentication, so we will specify the parameter name and location for the
parameter, as well as provide a friendly label for users when they are prompted for the
API Key while setting up the connection, as seen in Figure 6-4.
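In an OpenAPI 2.0 definition, the same API Key choice corresponds to a securityDefinitions entry like the sketch below; the header name X-API-Key is a placeholder for whatever your service expects:

```json
"securityDefinitions": {
  "api_key": {
    "type": "apiKey",
    "name": "X-API-Key",
    "in": "header"
  }
}
```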
Next is the Definition tab, as seen in Figure 6-5. This section has all the API
definitions and details for the actual requests. The Actions section lists all the API calls
that are defined, and the General and Request sections show the details for each request.
Since we started from a Postman file, you should not need to modify this section, but
it is good to understand what is going on here and customize the calls if you want to by
adding or removing elements as you see fit.
The Code tab is for advanced operations that require running code at the time of
the calls. We will skip this section, as it is beyond the scope of this book, but it is
available if needed.
When you select the Test tab, you will see a message stating that you need to create
the connector before testing; creating the connector saves and provisions it. To do
this, select “Create Connector” from the top command bar, as seen in Figure 6-6.
After creating your connector, you will notice that the Test Operation button is
grayed out and the Selected Connection field says “None,” as seen in Figure 6-7. To
test your connector, you will need to create a connection to provide the authentication
information. To do this, select “+ New Connection,” as seen in Figure 6-7.
Since our authentication method was API Key, we are prompted to enter the API Key,
as seen in Figure 6-8. Once your connection is set up, you can begin testing your API to
ensure everything is working as expected.
Note If the Test Operation button is still grayed out after creating your
connection, you may need to select the refresh icon in the upper-right corner of the
Connections box to refresh the interface.
The Test section contains the connection configuration and testing capabilities for
your API calls, as seen in Figure 6-9. Test API calls by entering values in the parameter
fields and clicking the Test Operation button. Your request will be sent to the service, and
a response will be returned to the Response section.
Once your testing is complete, you can start using your connector to integrate data
from other systems into your solution.
Licensing
Licensing across Power Platform is complex and involves a lot of variables. However,
the class of connector used in your app or flow will dictate whether you may need an
additional license for your app or flow. Here is an overview of the two categories of
connectors to be aware of:
The licensing for Power Platform, especially for Premium connectors like Dataverse,
can get complicated. It is important to understand how your users are licensed and
whether there may be additional costs to deploying your Dataverse solutions. Spend
some time with your organization’s licensing team and review the Microsoft licensing
guides to ensure that you don’t run into unexpected licensing roadblocks down the road.
There are a couple of loopholes in the licensing for Dataverse, however. One is
Dataverse for Teams, which is discussed later in this book. The other is developer
environments, which allow developers to build and test solutions without licensing
considerations. These environments are meant for testing and development, not for
production, and are automatically deleted if inactive for 90 days. Still, they are a
fantastic way to start building solutions on Dataverse without having to worry about
licensing barriers.
Virtual Tables
Virtual tables enable you to work with external data sources within Dataverse without
the need to duplicate the data in Dataverse tables. Virtual tables give you a view of the
data in your external data sources, allow you to relate it to data in other Dataverse
tables, and even let you update data in the external source without having to build any
sync operations.
While connectors allow you to integrate with data using the same systems as virtual
tables do, such as SQL and SharePoint, virtual tables allow you to interact with that data
just like you would any other table in Dataverse. Not only can you perform full Create,
Read, Update, and Delete (CRUD) operations on the data, but you can also relate the data
in a virtual table to other Dataverse tables, such as by creating lookup fields to those tables.
This allows for more seamless data integration with regard to architecture.
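To illustrate, once a virtual table is created it can be read and updated with the same Power Fx functions as any native table. The table and column names below are hypothetical:

```powerfx
// Update a row in a virtual table — the change is written straight
// through to the external source (for example, a SharePoint list):
Patch(
    'Project Tasks',
    First(Filter('Project Tasks', Title = "Kickoff")),
    { Status: "Complete" }
)
```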
To create a new Virtual Table, select New Table ➤ New Table from External Data
from within your solution, as seen in Figure 6-10.
If you do not already have a connection, you will need to click New Connection,
which will open a new browser tab/window where you can create a new connection, as
seen in Figure 6-11.
If you were to connect to a SharePoint Online list, you would select SharePoint, then
Connect directly, then click Create, as seen in Figure 6-12.
After you authenticate the connection, it will be created. Switch back to the New
Table from External Data tab and click Refresh; your new data source will show up as
shown in Figure 6-13.
Figure 6-13. Connection listed in New Table from External Data window
Click Next, and you will name the connection reference, as seen in Figure 6-14. Give
it a friendly display name; the Logical Name field will be populated automatically, and
there is no need to change it from the default value.
After clicking Next, you will enter the SharePoint site URL or select from the list of
recent sites if available. Click Next again and select the list to which you want to connect,
as seen in Figure 6-15.
Next, you can set the display name of the table and select the primary field from
the list of fields that exist in the SharePoint list, as seen in Figure 6-16. The grayed-out
properties cannot be changed and are used by Dataverse with default values.
After clicking Create, you will be shown both the SharePoint list details and the new
Dataverse virtual table details side-by-side. Click Finish to complete the operation.
Once the creation operation is finished, you can navigate to the table and view the
table columns and data just as you would with a normal Dataverse table.
You can now create lookup columns in other Dataverse tables that connect to the
new virtual table just like any other table. In this manner, you can relate records from
your SharePoint list source to Dataverse tables in the same way you do with other tables
without ever having to duplicate your SharePoint list data in Dataverse.
You can also edit the data in your virtual table, which will automatically update the
data in your SharePoint list because you are operating on the SharePoint data directly,
not on Dataverse data. This seamless integration opens a great number of capabilities in
system integration.
Dataverse for Teams (DfT)
A safe way to look at Dataverse for Teams (DfT) is that it allows you to build and use
tools for a team but is not intended for enterprise-level tools. However, if you were to
build a set of tools in a DfT environment and needed to extend them, there is a process
to upgrade your DfT environment to a full environment. This is a great feature that
prevents you from needing to rebuild or migrate your tools into Dataverse if they scale
beyond the limitations of DfT.
Get Started
To start using Dataverse for Teams, open the App Store from within Teams, find Power
Apps, and click Add. This will add the Power Apps app to your Teams so you can start
building in DfT.
Use Power Apps in Teams by selecting the ellipses on the left-hand navigation bar to
open the More Apps section, then search for “Power Apps,” as seen in Figure 6-17.
Tip You can pin the Power Apps app to your left-hand navigation bar by selecting
the ellipses on Power Apps and selecting Pin, as seen in Figure 6-17.
If you are ready to start building a new canvas app, you can click Start Now on the
Home screen to select an environment and begin building in the DfT Power Apps
Studio, or select "Learn More" to be taken to resources on how to build canvas apps, as
seen in Figure 6-18.
Further down the Home page are recently accessed apps, links to DfT templates
available for download from GitHub, and additional learning resource links. The
templates are fully functional solutions and can be downloaded and installed to your
DfT environment or even published to your tenant for others to install. These are great
ways to get going in DfT, but can also be customized or expanded as you see fit using the
Power Apps app in Teams. Modifying the template apps is also an effective way to learn
how to build apps in DfT. I suggest you set up a "playground" team to do this in, just in
case you break something while learning.
Note Since DfT environments are not listed in the Power Platform interfaces, you
must access any DfT resources through the Teams Power Apps app.
This view shows a list of each DfT environment you are a member of and a list of
assets in the selected DfT environment, as seen in Figure 6-19. It also shows any installed
apps or dataflows that exist in the selected environment.
To create a new app for a team, select Create from the bottom of the Build screen, as
seen in Figure 6-19. To see all of the existing apps, flows, tables, and other assets in an
existing DfT environment, or to create new ones, select "See All," as seen in Figure 6-20.
This is typically the view that I work in, as it allows full control over asset creation.
If you do not see your environment listed, it could be because you have not created
a DfT environment for your team yet. DfT environments are only created when an
asset is added to them, so simply select "Create" at the bottom of the screen to start your
environment creation. Select your environment from the list and add an app to it.
Dataflows
Dataflows are not unique to Dataverse. In fact, they exist throughout Power Platform.
Their purpose is to take data from one system, manipulate it using Power Query,
and bring it into the Microsoft ecosystem; in the case of Dataverse, to move it into a
Dataverse table.
Dataflows can load data into either Dataverse
or Power BI. They also have a much greater ability to transform the data as it is
ingested, whereas virtual tables essentially produce a view into the source data.
Where virtual tables beat dataflows is data duplication. Since dataflows copy data
from the source to the destination, the data is duplicated in your environment; virtual
tables never copy data from the source and instead query it on demand to return
the values.
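The distinction can be sketched in a few lines of Python. This is a toy model, not Dataverse internals: the "dataflow" takes a snapshot copy, while the "virtual table" re-reads the live source on every query.

```python
import copy

# Toy external source, standing in for a SharePoint list or SQL table.
source = [{"id": 1, "status": "Open"}]

# Dataflow-style: copy the rows once; the copy goes stale until the
# next scheduled refresh runs.
dataflow_copy = copy.deepcopy(source)

# Virtual-table-style: a query-on-demand view over the live source.
def virtual_table():
    return source

# The external system changes after the dataflow's refresh...
source[0]["status"] = "Closed"

print(dataflow_copy[0]["status"])    # snapshot: still "Open"
print(virtual_table()[0]["status"])  # live view: "Closed"
```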
Creating a Dataflow
To create a dataflow to Dataverse, select “Dataflows” from under Dataverse in the left-
hand navigation bar in Power Apps, as seen in Figure 6-21.
Once the dataflows interface opens, select “New Dataflow” from the top navigation
bar then “Start From Blank.”
A dialogue will open where you will give the dataflow a name and select whether it is
an analytical entity. Analytical entities are loaded to a data lake instead of Dataverse
because they are meant for analysis. In this case, do not select "Analytical Entities Only."
Click the Create button.
A new window will open listing all the data sources available, as seen in Figure 6-22.
Depending on what source you select, your next steps will be different, but the guide
walks you through the setup.
Selecting “Excel” will open a new dialogue to select your document and create a
connection to Excel if one does not exist already, as seen in Figure 6-23.
After clicking Next, you will see a list of the sheets found in the Excel document,
as seen in Figure 6-24. Select the sheet with the data you want to import and click
Transform Data.
A dialogue will open with Power Query where you can apply transformations to your
data, such as filtering, grouping, formatting, and so on, as seen in Figure 6-25. As you
manipulate the data, you will see the applied steps build up on the right under Query
Settings. This is helpful if you need to work backward to see what you did when your
data does not look right.
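Each applied step is just a transformation in a pipeline. As a rough analogy, in Python rather than Power Query's M language and with made-up ticket data, a filter step followed by a group-and-aggregate step might look like:

```python
from collections import defaultdict

# Made-up rows standing in for the imported Excel sheet.
rows = [
    {"Region": "West", "Status": "Open",   "Hours": 4.0},
    {"Region": "West", "Status": "Closed", "Hours": 2.5},
    {"Region": "East", "Status": "Open",   "Hours": 3.0},
    {"Region": "East", "Status": "Open",   "Hours": 5.5},
]

# Step 1 (filter): keep only open tickets.
open_only = [r for r in rows if r["Status"] == "Open"]

# Step 2 (group and aggregate): total hours per region.
totals = defaultdict(float)
for r in open_only:
    totals[r["Region"]] += r["Hours"]

print(dict(totals))
```

Because each step is recorded separately, you can walk the list backward in Query Settings to find where the data stopped looking right.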
Clicking Next brings you to the Map Tables dialogue, where you can define the
properties of the tables you are moving the data into, as seen in Figure 6-26.
In the Load Settings section, specify whether you want to load to a new table or an
existing one. There is also an option for Do Not Load, which is used only if you want this
dataflow to feed into another one.
Note Selecting "Delete rows that no longer exist in the query output" will remove
rows from your table that the dataflow created but that no longer appear in the
source query. It also has a negative performance impact, which can be a problem
for large datasets.
Selecting “Load to New Table” or “Load to Existing Table” will display the column
mapping section where you will select the source column and destination column.
Power Apps will attempt to set these by detecting column types, but any that are missing
must be set manually.
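Power Apps' exact matching logic is not documented here, but as a rough illustration, automatic mapping by column name (all column names below are hypothetical) could work like this:

```python
def auto_map(source_cols, dest_cols):
    """Pair source and destination columns by case-insensitive name.

    Returns (mapping, unmapped); unmapped columns are the ones you
    would have to set by hand on the mapping screen.
    """
    dest_by_key = {c.lower(): c for c in dest_cols}
    mapping, unmapped = {}, []
    for col in source_cols:
        if col.lower() in dest_by_key:
            mapping[col] = dest_by_key[col.lower()]
        else:
            unmapped.append(col)
    return mapping, unmapped

mapping, unmapped = auto_map(
    ["Title", "Owner", "DueDate"],         # source (Excel) columns
    ["title", "owner", "cr123_priority"],  # destination table columns
)
print(mapping, unmapped)
```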
Clicking Next brings you to the Refresh Settings screen, which allows you to
configure the refresh to be either manual or automatic based on a schedule.
Click Publish when you are done, and it will be active. Once the dataflow runs, you
will have a table in Dataverse with the specified data ready to use.
Summary
In conclusion, there are many ways to integrate Dataverse with other systems. Whether
you must import data related to a migration, sync data from a legacy system to relate to
records in your tool, or connect directly to other tools to create tickets, there is a solution
to meet your needs. Next, you will learn how to go about planning sessions with your
users and stakeholders to gather requirements and reduce rework.
CHAPTER 7
Organizing Thoughts
One of the benefits of developing in a low-code/no-code environment is that you
typically end up working closely with the business teams, which means you can be more
involved in the requirements-gathering process. However, gathering requirements can
be a challenging ordeal if you are unprepared for such an endeavor, especially since
business users rarely understand concepts of data architecture.
© Brian Hodel 2023
B. Hodel, Beginning Microsoft Dataverse, https://doi.org/10.1007/978-1-4842-9334-8_7
Chapter 7 Planning Your Solution Design
Compare to Current
Often, the business is operating in an existing process or system that they are trying to
replace or improve. This is a good opportunity to learn what they are trying
to get out of the new solution. I will often set up time to just watch a user interact with
the current tools and take notes of what they do along the way. This can help uncover
opportunities to improve the process. Next, I will have the users tell me what they are
doing as they do the work and describe what frustrates them about the process. Usually,
what users find frustrating are the things that get in the way of getting work done, which
is a good indicator that there may be waste in the process.
In lean manufacturing, waste is defined as anything that does not add value to
a process; i.e., something that the customer is not willing to pay for. There are many
definitions of the types of waste, but I am partial to the Toyota Production System’s set,
which uses TIMWOODS as a mnemonic device to remember them by. Those wastes are
as follows:
• Transportation: Unnecessary movement of materials or information
between process steps.
• Inventory: Work, materials, or information piling up faster than the
next step can consume it.
• Motion: Unnecessary movement by people, such as walking,
searching, or switching between systems.
• Waiting: Idle time spent waiting on approvals, handoffs,
information, or equipment.
• Overproduction: Producing more, sooner, or faster than the next
step requires.
• Overprocessing: Doing more work or adding more detail than the
customer values.
• Defects: Errors and rework caused by incorrect or
incomplete output.
• Skills: Employees working below their skill level. Not only is this
a waste of resources, but it can also lead to dissatisfied employees
because they are not challenged.
Having business users tell you what works and what does not work helps you to
understand their needs and desires in a new tool. While most users will not readily
identify wastes in a process, they will be happy to see those annoyances and barriers to
getting work done disappear.
So, as you can see, once the team interrogated the issue further, it changed from
needing to accommodate expedited projects, which causes a lot of chaos and added
cost, to simply needing an easier way for approvers to review and approve phases. By
asking the users for more specifics on their reasoning, they found a root cause of the
problem. Once you have a root cause, it is much easier to develop a solution and address
the problem instead of accommodating variation in the process.
Tip While detailed diagrams and documentation can be overwhelming for some
end users, try to use as much detail as you can without losing them. Tools
like Unified Modeling Language (UML) diagrams give you a very structured and
powerful way to document requirements. Even if you don't use them with end
users in workshops, they are a good way to keep track of requirements as you go along.
Once we have figured out the data structures, we can discuss security layers by
adding onto the model that we just built. Add person and team icons, color coding
them with the associated levels, to easily illustrate record permissions and inheritance,
as seen in Figure 7-2.
This sort of visual model makes it easy for a business user to think through the
permissions structure in real scenarios and allows me, as a developer, to get the
information that I need to begin designing.
This technique can save you, as a developer, time by not having to gather the
requirements yourself and allowing users to think through the requirements that make
up the fields. Using Excel Online makes this process much more friendly as you can track
all changes, leave notes, and assign tasks to users by tagging them in the comments. This
collaborative process can save time and reduce confusion.
Some of the key attributes that you want to gather are as follows:
• Form Section: Whether or not the forms are separated into sections,
it often helps to organize the fields into groups to establish logical
relationships between the data.
• Guidance Text: Hover text, placeholder text, and subtext are all
common types of guidance text related to fields.
• Comments: It never hurts to have some open text space for users to
make notes or leave additional instructions.
Although the fields are created on the tables, it can be easier for business users to
think of things as forms instead of tables. This also allows you more freedom to architect
the data in a way that works best for the solution. Encourage users to focus on the
process and what needs to be tracked instead of how it is accomplished so they do not
get lost in the details.
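Once the workbook is filled in, its rows are easy to sanity-check programmatically. A small Python sketch with hypothetical fields, grouping them into their form sections:

```python
# Hypothetical rows exported from the requirements workbook.
fields = [
    {"field": "Request Title", "section": "General",  "guidance": "Short summary"},
    {"field": "Due Date",      "section": "General",  "guidance": "When is it needed?"},
    {"field": "Approver",      "section": "Approval", "guidance": "Who signs off?"},
]

# Group fields by form section to review the logical layout.
by_section = {}
for f in fields:
    by_section.setdefault(f["section"], []).append(f["field"])

for section, names in by_section.items():
    print(f"{section}: {', '.join(names)}")
```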
I usually add the backlog option so users have a place to put things that are not that
important anymore, but that they do not want to lose sight of. Reassuring users that their
time and efforts are valuable goes a long way to ensuring the process moves forward
without hiccups.
Note While I have found that in-person workshops are the most effective,
tools like Microsoft Teams with chat and whiteboards can work well for teams
where meeting in person is difficult. Hybrid workshops can also be effective if you
can ensure that the remote team members are not neglected.
Before the session, collect a variety of sticky notes in assorted colors, markers, poster
paper, and painter's tape. Set up a poster paper as a key or legend with one of each sticky
note and describe what that color means, such as start, end, approval, decision, and
so on. It is also a good idea to have smaller keys printed out that can be posted at each
section of the room so the teams can easily refer to them. Next, attach poster paper to the
walls around the room. The poster paper is where the team will map the processes and
allows you to remove the processes to keep for later sessions. If you have a room where
you can leave them up throughout the entire process, that is ideal.
Note You should test your sticky notes on the walls or poster paper before your
session to ensure they stick. Some sticky notes do not play well with certain
surfaces, and you do not want a workshop where the notes keep falling off
the walls!
In the end, you will have a bunch of process maps like those seen in Figure 7-4.
Note The Post-it corporation has a great app for capturing Post-it notes from a
wall and saving them in the app for future reference. It is a clever way to document
your process maps if you want to preserve them or use them later.
If you want to get fancy, you can identify the types of waste and quantify them, but
that is not necessary for building a tool unless you want to quantify the business impact
of the changes.
Root-cause analysis helps you confirm that the problem you are addressing
is indeed the one that needs to be solved. The "5 Whys" is a great tool, but there are
many other tools that are easy to use and very effective, such as the fishbone diagram
seen in Figure 7-7.
The fishbone diagram is great for larger root-cause analysis work where a defect may
have multiple contributing factors. Start by listing the defect at the head of the “fish,”
then set up the primary branches with the primary causes of the defect. The “6Ms” listed
in Figure 7-7 are standard but feel free to adjust them as needed. From there, work your
way down each “bone,” identifying causes, then contributing causes. The more granular
you get, the more root causes you can identify. This also works well with the “5 Whys”
because as you ask the Whys, you will uncover various root causes, and this can be a way
to document those.
Since you may have a tough time working on all the items at once, a team voting
system can help narrow down the top contributors to the problem. By giving every
team member a set number of stickers, say two each, they can place them on the causes
that they feel have the highest impact on the defect. Ideally, this would be done in an
analytical method using data, but voting can get you going in a good direction during
workshops or when data is lacking.
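Tallying the votes afterward is trivial to automate. A sketch with hypothetical causes:

```python
from collections import Counter

# One entry per sticker a team member placed on a cause.
votes = [
    "unclear approval steps", "unclear approval steps",
    "missing data", "unclear approval steps",
    "manual handoffs", "missing data",
]

tally = Counter(votes)
# most_common() orders causes by vote count, highest first.
for cause, count in tally.most_common():
    print(f"{count} x {cause}")
```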
Summary
In summary, the best solutions are built when the solution design and requirements
gathering are done in parallel in an iterative approach. This process relies not only
on clear documentation, but also on effective communication and expectation
management. Recognizing that business users are not technical and meeting them at a
level that they are comfortable with will make for a productive engagement and a better
solution build.