`powerapps-docs/maker/data-platform/create-and-use-dataflows.md`
There are three primary steps to using a dataflow:
1. Author the dataflow in the Power Apps portal. You select the destination to load the output data to, the source to get the data from, and the Power Query steps to transform the data, using Microsoft tools designed to make doing so straightforward.
2. Schedule dataflow runs. This is the frequency at which the Power Platform Dataflow should refresh the data that your dataflow loads and transforms.
3. Use the data you loaded to the destination storage. You can build apps, flows, Power BI reports, and dashboards, or connect directly to the dataflow’s Common Data Model folder in your organization’s lake using Azure data services like Azure Data Factory, Azure Databricks, or any other service that supports the Common Data Model folder standard.
The following sections look at each of these steps so you can become familiar with the tools provided to complete each step.
## Create a dataflow
Dataflows are created in one environment. Therefore, you'll only be able to see and manage them from that environment. In addition, individuals who want to get data from your dataflow must have access to the environment in which you created it.
After you choose data and a source, the Power Platform Dataflow service reconnects to the data source to keep the data in your dataflow refreshed, at the frequency you select later in the setup process.
Now that you've selected the data to use in the table, you can use the dataflow editor to shape or transform that data into the format necessary for use in your dataflow.
## Use the dataflow editor to shape or transform data
You can shape your data selection into a form that works best for your table using a Power Query editing experience, similar to the Power Query Editor in Power BI Desktop. To learn more about Power Query, see [Query overview in Power BI Desktop](/power-bi/desktop-query-overview).
If you want to see the code that the Query Editor is creating with each step, or if you want to create your own shaping code, you can use the advanced editor.
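
As a rough illustration, here's the kind of M query the advanced editor exposes. This is a minimal sketch: the workbook path, sheet name, and column names are hypothetical, not values from this article.

```powerquery-m
// A minimal sketch of shaping code as it might appear in the advanced editor.
// The workbook path, sheet name, and column names are hypothetical.
let
    // Connect to an Excel workbook and pick up the Sales worksheet
    Source = Excel.Workbook(File.Contents("C:\Data\Sales.xlsx"), null, true),
    SalesSheet = Source{[Item = "Sales", Kind = "Sheet"]}[Data],
    // Use the first row as column headers
    PromotedHeaders = Table.PromoteHeaders(SalesSheet, [PromoteAllScalars = true]),
    // Set explicit data types on the columns the dataflow will load
    ChangedTypes = Table.TransformColumnTypes(
        PromotedHeaders,
        {{"OrderDate", type date}, {"Amount", type number}}
    ),
    // Keep only rows with a positive amount
    FilteredRows = Table.SelectRows(ChangedTypes, each [Amount] > 0)
in
    FilteredRows
```

Each step in the query corresponds to a step you'd see in the editor's applied steps list.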
## Dataflows and the Common Data Model
Dataflow tables include new tools to easily map your business data to the Common Data Model, enrich it with Microsoft and non-Microsoft data, and gain simplified access to machine learning. These new capabilities can provide intelligent and actionable insights into your business data. Once you’ve completed any transformations in the edit queries step described above, you can map columns from your data source tables to standard table columns as defined by the Common Data Model. Standard tables have a known schema defined by the Common Data Model.
For more information about this approach, and about the Common Data Model, see [The Common Data Model](/common-data-model/).
To leverage the Common Data Model with your dataflow, select the **Map to Standard** transformation.
When you map a source column to a standard column, the following occurs:
1. The source column takes on the standard column name (the column is renamed if the names are different).
2. The source column gets the standard column data type.
To keep the Common Data Model standard table, all standard columns that aren't mapped get *Null* values.
All source columns that aren't mapped remain as is to ensure that the result of the mapping is a standard table with custom columns.
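
As an illustrative sketch only (dataflows perform the mapping for you; this is not code the service generates), the effect of mapping can be pictured as the following M transformations. Here `AcctName` is a hypothetical source column mapped to a hypothetical standard column `name`, and `description` is an unmapped standard column:

```powerquery-m
// Illustrative only: what mapping a source column to a standard column does.
// All table and column names here are hypothetical.
let
    // Hypothetical source table with two custom columns
    Source = #table({"AcctName", "Region"}, {{"Contoso", "West"}}),
    // 1. The mapped source column takes on the standard column name
    Renamed = Table.RenameColumns(Source, {{"AcctName", "name"}}),
    // 2. The mapped column gets the standard column's data type
    Typed = Table.TransformColumnTypes(Renamed, {{"name", type text}}),
    // Unmapped standard columns (here, "description") get null values,
    // while the unmapped source column "Region" remains as a custom column
    WithStandardColumns = Table.AddColumn(Typed, "description", each null, type text)
in
    WithStandardColumns
```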
Once you’ve made your selections and your table and its data settings are complete, you’re ready for the next step: selecting the refresh frequency of your dataflow.
## Set the refresh frequency
Once your tables have been defined, you should set the refresh frequency for each of your connected data sources.
1. Dataflows use a data refresh process to keep data up to date. In the **Power Platform Dataflow authoring tool**, you can choose to refresh your dataflow manually or automatically on a scheduled interval of your choice. To schedule a refresh automatically, select **Refresh automatically**.
2. Enter the dataflow refresh frequency, start date, and time in UTC.
3. Select **Create**.
Some organizations might want to use their own storage for creation and management of dataflows. You can integrate dataflows with Azure Data Lake Storage Gen2 if you follow the requirements to set up the storage account properly. More information: [Connect Azure Data Lake Storage Gen2 for dataflow storage](/power-query/dataflows/connect-azure-data-lake-storage-for-dataflow)
## Troubleshooting data connections
Connecting to data sources for dataflows can occasionally run into issues. This section provides troubleshooting tips for when that happens.
- **Salesforce connector.** Using a trial account for Salesforce with dataflows results in a connection failure with no information provided. To resolve this, use a production Salesforce account or a developer account for testing.
- **SharePoint connector.** Make sure you supply the root address of the SharePoint site, without any subfolders or documents. For example, use a link similar to `https://microsoft.sharepoint.com/teams/ObjectModel`.
- **JSON File connector.** Currently, you can connect to a JSON file using basic authentication only (see the sketch after this list). For example, a URL similar to `https://XXXXX.blob.core.windows.net/path/file.json?sv=2019-01-01&si=something&sr=c&sig=123456abcdefg` isn't supported.
- **Azure Synapse Analytics.** Dataflows don't currently support Microsoft Entra authentication for Azure Synapse Analytics. Use basic authentication for this scenario.
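
For the JSON File connector, a working query is a plain `Web.Contents` call over a URL that basic authentication can satisfy. This is a minimal sketch with a placeholder URL; the credentials are entered in the connection dialog, not in the query itself:

```powerquery-m
// A minimal sketch of reading a JSON file with basic authentication.
// The URL is a placeholder; credentials come from the connection dialog.
let
    Source = Json.Document(Web.Contents("https://example.com/data/file.json")),
    // Assuming the file contains a JSON array of records, turn it into a table
    AsTable = Table.FromRecords(Source)
in
    AsTable
```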
> [!NOTE]
> If you use data loss prevention (DLP) policies to block the **HTTP with Microsoft Entra (preauthorized)** connector, then the **SharePoint** and **OData** connectors will fail. The **HTTP with Microsoft Entra (preauthorized)** connector needs to be allowed in DLP policies for the **SharePoint** and **OData** connectors to work.
### Troubleshoot the error: Connection to Dataverse failed. Please check the link below on how to fix this issue
Users might receive an error message if the connection they're using for export requires a fix. In this case, the error states **Connection to Dataverse failed. Please check the link below on how to fix this issue**.
To resolve this issue:
1. In Power Apps (make.powerapps.com), select **Connections** from the left navigation pane. [!INCLUDE [left-navigation-pane](../../includes/left-navigation-pane.md)]
2. Locate the **Microsoft Dataverse (legacy)** connection.
3. Select the **Fix connection** link in the **Status** column, and follow the instructions on your screen.
After the fix completes, retry the export.
## Next steps
The following articles are useful for further information and scenarios when using dataflows:

`powerapps-docs/maker/data-platform/data-platform-import-export.md`

There are two ways to import data from Excel:

- [Option 1: Import by creating and modifying a file template](#option-1-import-by-creating-and-modifying-a-file-template)
- [Option 2: Import by bringing your own source file](#option-2-import-by-bringing-your-own-source-file)
> [!IMPORTANT]
>
> - To import or export data, you must have the **Environment Maker** security role.
> - Import from Excel or CSV file using the **Import** > **Import data from Excel** command isn’t available in GCC, GCC High, and DoD environments. To work around this limitation, from the **Tables** area in Power Apps, select **Import** > **Import data**, and then choose a data source, such as **Excel workbook** or **Text/CSV**.
### Option 1: Import by creating and modifying a file template
Use a connector to import data from a selection of many different sources, such as Azure, SQL Server database, SharePoint, Access, OData, and more. More information: [Create and use dataflows in Power Apps](create-and-use-dataflows.md)
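
As a rough sketch of what a connector-backed query looks like in M, the following reads a table from the public Northwind sample OData service (the service URL and table name are for illustration only):

```powerquery-m
// A minimal sketch of a connector-backed query in Power Query M,
// using the public Northwind sample OData service for illustration.
let
    Source = OData.Feed("https://services.odata.org/V4/Northwind/Northwind.svc"),
    // Navigate to the Customers table exposed by the service
    Customers = Source{[Name = "Customers", Signature = "table"]}[Data]
in
    Customers
```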
## See also