Commit de54953

Merge branch 'main' into trdehove-patch-debugJSiOS

2 parents: 9a5b91e + 8a941e8
File tree: 5 files changed (+69, -126 lines)

powerapps-docs/maker/TOC.yml

Lines changed: 6 additions & 2 deletions
@@ -1903,8 +1903,12 @@
   href: ./data-platform/create-edit-entities-portal.md
 - name: "Create a custom table that has components"
   href: ./data-platform/create-custom-entity.md
-- name: Import or export data
-  href: ./data-platform/data-platform-import-export.md
+- name: Import and export table data
+  items:
+  - name: Import data from Excel and export to CSV
+    href: ./data-platform/data-platform-import-export.md
+  - name: Import data with a dataflow connector
+    href: ./data-platform/create-and-use-dataflows.md
 - name: Long term data retention for tables
   items:
   - name: Long term data retention overview

powerapps-docs/maker/data-platform/create-and-use-dataflows.md

Lines changed: 38 additions & 85 deletions
@@ -2,7 +2,7 @@
 title: "Create and use dataflows in Power Apps | MicrosoftDocs"
 description: "Learn how to create and use dataflows in Power Apps"
 ms.custom: ""
-ms.date: 10/03/2023
+ms.date: 08/06/2024
 ms.reviewer: ""
 ms.suite: ""
 ms.tgt_pltfrm: ""
@@ -32,45 +32,33 @@ A dataflow is a collection of tables that are created and managed in environment
 schedules, directly from the environment in which your dataflow was created.

 Once you create a dataflow in the Power Apps portal, you can get data from it
-using the Common Data Service connector or Power BI Desktop Dataflow connector, depending on
+using the Dataverse connector or Power BI Desktop Dataflow connector, depending on
 which destination you chose when creating the dataflow.

 There are three primary steps to using a dataflow:

-1. Author the dataflow in the Power Apps portal. You select the destination
-   to load the output data to, the source to get the data from, and the Power
-   Query steps to transform the data using Microsoft tools that are
-   designed to make doing so straightforward.
+1. Author the dataflow in the Power Apps portal. You select the destination to load the output data to, the source to get the data from, and the Power Query steps to transform the data, using Microsoft tools that are designed to make doing so straightforward.

-2. Schedule dataflow runs. This is the frequency in which the Power Platform
-   Dataflow should refresh the data that your dataflow will load and transform.
+2. Schedule dataflow runs. This is the frequency at which the Power Platform Dataflow should refresh the data that your dataflow loads and transforms.

-3. Use the data you loaded to the destination storage. You can build apps,
-   flows, Power BI reports, and dashboards or connect directly to the dataflow’s
-   Common Data Model folder in your organization’s lake using Azure data services like Azure
-   Data Factory, Azure Databricks or any other service that supports the Common Data Model
-   folder standard.
+3. Use the data you loaded to the destination storage. You can build apps, flows, Power BI reports, and dashboards, or connect directly to the dataflow’s Common Data Model folder in your organization’s lake using Azure data services such as Azure Data Factory, Azure Databricks, or any other service that supports the Common Data Model folder standard.

-The following sections look at each of these steps so you can become familiar
-with the tools provided to complete each step.
+The following sections look at each of these steps so you can become familiar with the tools provided to complete each step.

 ## Create a dataflow

-Dataflows are created in one environment. Therefore, you'll only be able to see
-and manage them from that environment. In addition, individuals who want to get
-data from your dataflow must have access to the environment in which you created
+Dataflows are created in one environment, so you can see and manage them only from that environment. In addition, individuals who want to get data from your dataflow must have access to the environment in which you created
 it.

 > [!NOTE]
 > Creating dataflows is currently not available with Power Apps Developer Plan licenses.

-1. Sign in to Power Apps, and verify which environment you're in, find the environment switcher near the right side of the command bar.
+1. Sign in to Power Apps and verify which environment you're in; the environment switcher is near the right side of the command bar.

    ![Environment switcher.](media/environment-switcher.png)

 1. On the left navigation pane, select **Dataflows**. [!INCLUDE [left-navigation-pane](../../includes/left-navigation-pane.md)]
-1. select **New dataflow**, and then select **Start from blank**.
-1. On the New Dataflow page, enter a **Name** for the dataflow. By default, dataflows store tables in Dataverse. Select **Analytical entities only** if you want tables to be stored in your organization's Azure Data Lake storage account. Select **Create**.
+1. Select **New dataflow**. On the **New Dataflow** page, enter a **Name** for the dataflow. By default, dataflows store tables in Dataverse. Select **Analytical entities only** if you want tables to be stored in your organization's Azure Data Lake storage account. Select **Create**.

 > [!IMPORTANT]
 > There is only one owner of any dataflow—the person who created it. Only the owner can edit the dataflow. Authorization
@@ -95,31 +83,21 @@ choose data and a source, the Power Platform Dataflow service will subsequently
 reconnect to the data source in order to keep the data in your dataflow
 refreshed, at the frequency you select later in the setup process.

-
 ![Choose data.](media/choose-data.png)

-Now that you've selected the data to use in the table, you can use the dataflow editor to
-shape or transform that data into the format necessary for use in your dataflow.
+Now that you've selected the data to use in the table, you can use the dataflow editor to shape or transform that data into the format necessary for use in your dataflow.

 ## Use the dataflow editor to shape or transform data

-You can shape your data selection into a form that works best for your table using a
-Power Query editing experience, similar to the Power Query Editor in Power BI
-Desktop. To learn more about Power Query, see [Query overview in Power BI Desktop](/power-bi/desktop-query-overview).
+You can shape your data selection into a form that works best for your table using a Power Query editing experience, similar to the Power Query Editor in Power BI Desktop. To learn more about Power Query, see [Query overview in Power BI Desktop](/power-bi/desktop-query-overview).

-If you want to see the code that Query Editor is creating with each step, or
-if you want to create your own shaping code, you can use the advanced editor.
+If you want to see the code that the Query Editor is creating with each step, or if you want to write your own shaping code, you can use the advanced editor.

 ![Advanced editor.](media/advanced-editor.png)

 ## Dataflows and the Common Data Model

-Dataflows tables include new tools to easily map your business data to the
-Common Data Model, enrich it with Microsoft and third-party data, and gain simplified access to machine learning. These new capabilities can be leveraged to provide intelligent and actionable insights
-into your business data. Once you’ve completed any transformations in the edit
-queries step described below, you can map columns from your data source tables to standard
-table columns as defined by the Common Data Model. Standard tables have a
-known schema defined by the Common Data Model.
+Dataflow tables include new tools to easily map your business data to the Common Data Model, enrich it with Microsoft and non-Microsoft data, and gain simplified access to machine learning. You can use these capabilities to gain intelligent and actionable insights into your business data. Once you’ve completed any transformations in the edit queries step described below, you can map columns from your data source tables to standard table columns as defined by the Common Data Model. Standard tables have a known schema defined by the Common Data Model.

 For more information about this approach, and about the Common Data Model, see [The Common Data Model](/common-data-model/).
@@ -129,83 +107,58 @@ To leverage the Common Data Model with your dataflow, select the **Map to Standa

 When you map a source column to a standard column, the following occurs:

-1. The source column takes on the standard column name (the column is renamed if
-   the names are different).
+1. The source column takes on the standard column name (the column is renamed if
+   the names are different).

-2. The source column gets the standard column data type.
+2. The source column gets the standard column data type.

-To keep the Common Data Model standard table, all standard columns that aren't
-mapped get *Null* values.
+To keep the Common Data Model standard table, all standard columns that aren't mapped get *Null* values.

-All source columns that aren't mapped remain as is to ensure that the result
-of the mapping is a standard table with custom columns.
+All source columns that aren't mapped remain as is, ensuring that the result of the mapping is a standard table with custom columns.
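The mapping rules above are easy to picture in code. Here is a minimal Python sketch of the semantics only, using plain dict rows and a hypothetical standard schema (`STANDARD_SCHEMA` and `map_to_standard` are illustrative names, not part of the dataflow service):

```python
# Hypothetical Common Data Model-style standard schema: column name -> data type.
STANDARD_SCHEMA = {"name": str, "accountnumber": str, "revenue": float}

def map_to_standard(row: dict, mapping: dict) -> dict:
    """Sketch of the mapping rules: rename, coerce type, null-fill, keep extras."""
    # Rule 1: the source column takes on the standard column name.
    out = {mapping.get(col, col): value for col, value in row.items()}
    for std_col, std_type in STANDARD_SCHEMA.items():
        if out.get(std_col) is not None:
            # Rule 2: the source column gets the standard column data type.
            out[std_col] = std_type(out[std_col])
        else:
            # Standard columns that aren't mapped get Null values.
            out[std_col] = None
    # Source columns that aren't mapped (here, "Region") remain as-is,
    # so the result is a standard table with custom columns.
    return out

source_row = {"AccountName": "Contoso", "Region": "EMEA", "revenue": 1000}
mapped = map_to_standard(source_row, {"AccountName": "name"})
# mapped == {"name": "Contoso", "Region": "EMEA", "revenue": 1000.0, "accountnumber": None}
```

Note how the unmapped standard column (`accountnumber`) is null-filled while the unmapped source column (`Region`) survives as a custom column.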

-Once you’ve completed your selections and your table and its data settings are
-complete, you’re ready for the next step, which is selecting the refresh frequency of your
-dataflow.
+Once you’ve completed your selections and your table and its data settings are complete, you’re ready for the next step: selecting the refresh frequency of your dataflow.

 ## Set the refresh frequency

-Once your tables have been defined, you’ll want to schedule the refresh
-frequency for each of your connected data sources.
+Once your tables have been defined, you should schedule the refresh frequency for each of your connected data sources.

-1. Dataflows use a data refresh process to keep data up to date. In the **Power Platform Dataflow authoring tool**, you can choose to refresh your dataflow manually or automatically on a scheduled
-   interval of your choice. To schedule a refresh automatically, select **Refresh automatically**.
+1. Dataflows use a data refresh process to keep data up to date. In the **Power Platform Dataflow authoring tool**, you can choose to refresh your dataflow manually or automatically on a scheduled interval of your choice. To schedule a refresh automatically, select **Refresh automatically**.

    ![Refresh automatically.](media/refresh-automatically.png)

 2. Enter the dataflow refresh frequency, start date, and time, in UTC.

 3. Select **Create**.
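Because the start time is entered in UTC, it can help to reason about the whole schedule in UTC as well. A small illustrative Python sketch (not the dataflow service's scheduler) that lists upcoming refresh times for a chosen frequency:

```python
from datetime import datetime, timedelta, timezone

def upcoming_refreshes(start_utc: datetime, frequency: timedelta, count: int) -> list:
    """Return the first `count` scheduled refresh times from a UTC start."""
    return [start_utc + i * frequency for i in range(count)]

# Example: refresh every 12 hours starting 2024-08-06 at 02:00 UTC.
start = datetime(2024, 8, 6, 2, 0, tzinfo=timezone.utc)
runs = upcoming_refreshes(start, timedelta(hours=12), 3)
# runs -> 02:00 and 14:00 on Aug 6, then 02:00 on Aug 7, all UTC
```

Converting these instants to local wall-clock time (for example, to predict when a refresh lands during business hours) is then a separate, display-only step.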
-<!--
-## Connect to your dataflow in Power BI Desktop
-Once you’ve created your dataflow and you have scheduled the refresh frequency
-for each data source that will populate the model, you’re ready for the final task, which is connecting to your dataflow from within Power BI Desktop.
-
-To connect to the dataflow, in Power BI Desktop select **Get Data** > **Power Platform** > **Power Platform dataflows** > **Connect**.
-
-![Connect to the dataflow.](media/get-data.png)

-Navigate to the environment where you saved your dataflow, select
-the dataflow, and then select the tables that you created from the list.
+Some organizations might want to use their own storage for creation and management of dataflows. You can integrate dataflows with Azure Data Lake Storage Gen2 if you follow the requirements to set up the storage account properly. More information: [Connect Azure Data Lake Storage Gen2 for dataflow storage](/power-query/dataflows/connect-azure-data-lake-storage-for-dataflow)

-You can also use the search bar, near the top of the window, to quickly find the
-name of your dataflow or tables from among many dataflow tables.
+## Troubleshooting data connections

-When you select the table and then select the **Load** button, the tables
-appear in the **Columns** pane in Power BI Desktop, and appear and behave just
-like tables from any other dataset. -->
+There might be occasions when connecting to data sources for dataflows runs into issues. This section provides troubleshooting tips when issues occur.

-## Using dataflows stored in Azure Data Lake Storage Gen2
+- **Salesforce connector.** Using a trial account for Salesforce with dataflows results in a connection failure with no information provided. To resolve this issue, use a production Salesforce account or a developer account for testing.

-Some organizations might want to use their own storage for creation and management
-of dataflows. You can integrate dataflows with Azure Data Lake Storage Gen2 if
-you follow the requirements to set up the storage account properly. More information: [Connect Azure Data Lake Storage Gen2 for dataflow storage](/power-query/dataflows/connect-azure-data-lake-storage-for-dataflow)
+- **SharePoint connector.** Make sure you supply the root address of the SharePoint site, without any subfolders or documents. For example, use a link similar to `https://microsoft.sharepoint.com/teams/ObjectModel`.

-## Troubleshooting data connections
+- **JSON File connector.** Currently you can connect to a JSON file using basic authentication only. For example, a URL similar to `https://XXXXX.blob.core.windows.net/path/file.json?sv=2019-01-01&si=something&sr=c&sig=123456abcdefg` is currently not supported.

-There might be occasions when connecting to data sources for dataflows runs into
-issues. This section provides troubleshooting tips when issues occur.
+- **Azure Synapse Analytics.** Dataflows don't currently support Microsoft Entra authentication for Azure Synapse Analytics. Use basic authentication for this scenario.
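For the SharePoint case above, one way to check a URL before pasting it in is to trim it down to the site root. A hedged Python sketch, assuming the common `/sites/<name>` or `/teams/<name>` URL layout (`sharepoint_site_root` is an illustrative helper, not part of the connector):

```python
from urllib.parse import urlparse

def sharepoint_site_root(url: str) -> str:
    """Trim a SharePoint URL to the site root, dropping subfolders and documents.
    Assumes the common /sites/<name> or /teams/<name> layout."""
    parts = urlparse(url)
    segments = [s for s in parts.path.split("/") if s]
    if segments and segments[0].lower() in ("sites", "teams"):
        root_path = "/" + "/".join(segments[:2])  # keep only /teams/<name>
    else:
        root_path = ""  # tenant root site
    return f"{parts.scheme}://{parts.netloc}{root_path}"

full_link = "https://microsoft.sharepoint.com/teams/ObjectModel/Shared%20Documents/report.xlsx"
root = sharepoint_site_root(full_link)
# root == "https://microsoft.sharepoint.com/teams/ObjectModel"
```

The trimmed value is what the connector expects as the site address.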

-- **Salesforce connector.** Using a trial account for Salesforce with
-  dataflows results in a connection failure with no information provided. To
-  resolve this, use a production Salesforce account or a developer account for
-  testing.
+> [!NOTE]
+> If you use data loss prevention (DLP) policies to block the **HTTP with Microsoft Entra (preauthorized)** connector, the **SharePoint** and **OData** connectors will fail. The **HTTP with Microsoft Entra (preauthorized)** connector must be allowed in DLP policies for the **SharePoint** and **OData** connectors to work.

-- **SharePoint connector.** Make sure you supply the root address of the
-  SharePoint site, without any subfolders or documents. For example, use a link
-  similar to `https://microsoft.sharepoint.com/teams/ObjectModel`.
+### Troubleshoot the error: Connection to Dataverse failed. Please check the link below on how to fix this issue

+Users might receive an error message if the connection they're using for export requires a fix. In this case, the user receives an error message that states **Connection to Dataverse failed. Please check the link below on how to fix this issue**.

-- **JSON File connector.** Currently you can connect to a JSON file using
-  basic authentication only. For example, a URL similar to `https://XXXXX.blob.core.windows.net/path/file.json?sv=2019-01-01&si=something&sr=c&sig=123456abcdefg` is currently not supported.
+To resolve this issue:

-- **Azure Synapse Analytics.** Dataflows don't currently support Azure
-  Active Directory authentication for Azure Synapse Analytics. Use
-  basic authentication for this scenario.
+1. In Power Apps (make.powerapps.com), select **Connections** from the left navigation pane. [!INCLUDE [left-navigation-pane](../../includes/left-navigation-pane.md)]
+2. Locate the **Microsoft Dataverse (legacy)** connection.
+3. Select the **Fix connection** link in the **Status** column, and follow the instructions on your screen.

-> [!NOTE]
-> If you use data loss prevention (DLP) policies to block the **HTTP with Microsoft Entra (preauthorized)** connector then **SharePoint** and **OData** connectors will fail. The **HTTP with Microsoft Entra (preauthorized)** connector needs to be allowed in DLP policies for **SharePoint** and **OData** connectors to work.
+After the fix completes, retry the export.

 ## Next steps
