title: "Create and use dataflows in Power Apps | MicrosoftDocs"
description: "Learn how to create and use dataflows in Power Apps"
ms.custom: ""
ms.date: 08/06/2024
ms.reviewer: ""
ms.suite: ""
ms.tgt_pltfrm: ""

A dataflow is a collection of tables that are created and managed in environments in the Power Apps service. You can add and edit tables in your dataflow, as well as manage data refresh schedules, directly from the environment in which your dataflow was created.

Once you create a dataflow in the Power Apps portal, you can get data from it using the Dataverse connector or Power BI Desktop Dataflow connector, depending on which destination you chose when creating the dataflow.
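
As an illustration, in Power BI Desktop the Power Platform dataflows connector generates a query that starts from a navigation table. Here's a minimal sketch; only the first generated step is shown, because the drill-down steps that follow depend on your environment's and dataflow's identifiers:

```powerquery-m
let
    // Top-level navigation table exposed by the Power Platform dataflows
    // connector; drill into your environment, dataflow, and table from here
    Source = PowerPlatform.Dataflows(null)
in
    Source
```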

There are three primary steps to using a dataflow:

1. Author the dataflow in the Power Apps portal. You select the destination to load the output data to, the source to get the data from, and the Power Query steps to transform the data using Microsoft tools that are designed to make doing so straightforward.

2. Schedule dataflow runs. This is the frequency at which the Power Platform Dataflow should refresh the data that your dataflow loads and transforms.

3. Use the data you loaded to the destination storage. You can build apps, flows, Power BI reports, and dashboards, or connect directly to the dataflow's Common Data Model folder in your organization's lake using Azure data services like Azure Data Factory, Azure Databricks, or any other service that supports the Common Data Model folder standard.

The following sections look at each of these steps so you can become familiar with the tools provided to complete each step.

## Create a dataflow

Dataflows are created in one environment. Therefore, you can only see and manage them from that environment. In addition, individuals who want to get data from your dataflow must have access to the environment in which you created it.

> [!NOTE]
> Creating dataflows is currently not available with Power Apps Developer Plan licenses.

1. Sign in to Power Apps, and verify which environment you're in; the environment switcher is near the right side of the command bar.
1. On the left navigation pane, select **Dataflows**. [!INCLUDE [left-navigation-pane](../../includes/left-navigation-pane.md)]
1. Select **New dataflow**. On the **New Dataflow** page, enter a **Name** for the dataflow. By default, dataflows store tables in Dataverse. Select **Analytical entities only** if you want tables to be stored in your organization's Azure Data Lake storage account. Select **Create**.

> [!IMPORTANT]
> There is only one owner of any dataflow—the person who created it. Only the owner can edit the dataflow. Authorization

Once you choose data and a source, the Power Platform Dataflow service will subsequently reconnect to the data source in order to keep the data in your dataflow refreshed, at the frequency you select later in the setup process.

Now that you've selected the data to use in the table, you can use the dataflow editor to shape or transform that data into the format necessary for use in your dataflow.

## Use the dataflow editor to shape or transform data

You can shape your data selection into a form that works best for your table using a Power Query editing experience, similar to the Power Query Editor in Power BI Desktop. To learn more about Power Query, see [Query overview in Power BI Desktop](/power-bi/desktop-query-overview).

If you want to see the code that Query Editor is creating with each step, or if you want to create your own shaping code, you can use the advanced editor.
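
For illustration, here's a minimal sketch of the kind of M code the advanced editor might display for a simple query. The file path and column names are hypothetical; the pattern of named steps chained in a `let` expression is what the editor shows:

```powerquery-m
let
    // Read a hypothetical CSV file of orders
    Source = Csv.Document(File.Contents("C:\data\orders.csv"), [Delimiter = ",", Encoding = 65001]),
    // Promote the first row to column headers
    PromotedHeaders = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Set an explicit data type on each column
    ChangedTypes = Table.TransformColumnTypes(PromotedHeaders, {{"OrderId", Int64.Type}, {"OrderDate", type date}, {"Amount", type number}}),
    // Keep only rows with a positive amount
    FilteredRows = Table.SelectRows(ChangedTypes, each [Amount] > 0)
in
    FilteredRows
```

Each named step corresponds to an entry in the editor's applied steps list, so edits you make in the advanced editor and in the graphical editor stay in sync.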

## Dataflows and the Common Data Model

Dataflow tables include new tools to easily map your business data to the Common Data Model, enrich it with Microsoft and non-Microsoft data, and gain simplified access to machine learning. You can use these capabilities to provide intelligent and actionable insights into your business data. Once you've completed any transformations in the edit queries step described below, you can map columns from your data source tables to standard table columns as defined by the Common Data Model. Standard tables have a known schema defined by the Common Data Model.

For more information about this approach, and about the Common Data Model, see [The Common Data Model](/common-data-model/).

To leverage the Common Data Model with your dataflow, select the **Map to Standard Entity** transformation in the edit queries dialog.

When you map a source column to a standard column, the following occurs:

1. The source column takes on the standard column name (the column is renamed if the names are different).

2. The source column gets the standard column data type.

To keep the Common Data Model standard table, all standard columns that aren't mapped get *Null* values.

All source columns that aren't mapped remain as is to ensure that the result of the mapping is a standard table with custom columns.
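
Conceptually, the mapping behaves like the following M sketch. The table, the `custname` and `region` source columns, and the `name` and `accountnumber` standard columns are hypothetical illustrations, not the transformation the service actually generates:

```powerquery-m
let
    // A hypothetical source table before mapping
    Source = Table.FromRecords({[custname = "Contoso", region = "West"]}),
    // Mapping renames the source column to the standard column's name...
    Renamed = Table.RenameColumns(Source, {{"custname", "name"}}),
    // ...and applies the standard column's data type
    Typed = Table.TransformColumnTypes(Renamed, {{"name", type text}}),
    // Standard columns that aren't mapped are added with Null values
    WithUnmapped = Table.AddColumn(Typed, "accountnumber", each null, type text)
    // Source columns that aren't mapped ("region") survive as custom columns
in
    WithUnmapped
```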

Once you've finished your selections and your table and its data settings are complete, you're ready for the next step: selecting the refresh frequency of your dataflow.

## Set the refresh frequency

Once your tables have been defined, schedule the refresh frequency for each of your connected data sources.

1. Dataflows use a data refresh process to keep data up to date. In the **Power Platform Dataflow authoring tool**, you can choose to refresh your dataflow manually or automatically on a scheduled interval of your choice. To schedule a refresh automatically, select **Refresh automatically**.
2. Enter the dataflow refresh frequency, start date, and time, in UTC.
3. Select **Create**.

Some organizations might want to use their own storage for creation and management of dataflows. You can integrate dataflows with Azure Data Lake Storage Gen2 if you follow the requirements to set up the storage account properly. More information: [Connect Azure Data Lake Storage Gen2 for dataflow storage](/power-query/dataflows/connect-azure-data-lake-storage-for-dataflow)

## Troubleshooting data connections

There might be occasions when connecting to data sources for dataflows runs into issues. This section provides troubleshooting tips when issues occur.

- **Salesforce connector.** Using a trial account for Salesforce with dataflows results in a connection failure with no information provided. To resolve this, use a production Salesforce account or a developer account for testing.

- **SharePoint connector.** Make sure you supply the root address of the SharePoint site, without any subfolders or documents. For example, use a link similar to `https://microsoft.sharepoint.com/teams/ObjectModel`.

- **JSON File connector.** Currently, you can connect to a JSON file using basic authentication only. For example, a URL similar to `https://XXXXX.blob.core.windows.net/path/file.json?sv=2019-01-01&si=something&sr=c&sig=123456abcdefg` is currently not supported. (For an illustration, see the sketch after this list.)

- **Azure Synapse Analytics.** Dataflows don't currently support Microsoft Entra authentication for Azure Synapse Analytics. Use basic authentication for this scenario.
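
As an illustration of the JSON File constraint, here's a minimal M sketch of a query that works under basic authentication. The URL is a hypothetical placeholder, and the JSON payload is assumed to be a list of records:

```powerquery-m
let
    // Works: a plain file URL; the username and password are supplied through
    // the connection's basic-authentication prompt, not in the query itself.
    // Not supported: a URL that embeds a SAS token, such as
    // "...file.json?sv=2019-01-01&si=something&sr=c&sig=123456abcdefg"
    Source = Json.Document(Web.Contents("https://example.com/data/file.json")),
    // Convert the parsed JSON (assumed here to be a list of records) to a table
    AsTable = Table.FromRecords(Source)
in
    AsTable
```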

> [!NOTE]
> If you use data loss prevention (DLP) policies to block the **HTTP with Microsoft Entra (preauthorized)** connector, the **SharePoint** and **OData** connectors will fail. The **HTTP with Microsoft Entra (preauthorized)** connector needs to be allowed in DLP policies for the **SharePoint** and **OData** connectors to work.

### Troubleshoot the error: Connection to Dataverse failed. Please check the link below on how to fix this issue

Users might receive an error message if the connection they're using for export requires a fix. In this case, the error message states **Connection to Dataverse failed. Please check the link below on how to fix this issue**.

To resolve this issue:

1. In Power Apps (make.powerapps.com), select **Connections** from the left navigation pane. [!INCLUDE [left-navigation-pane](../../includes/left-navigation-pane.md)]
2. Locate the **Microsoft Dataverse (legacy)** connection.
3. Select the **Fix connection** link in the **Status** column, and follow the instructions on your screen.