Commit f7c6fc5

Revising for better structure and formatting
1 parent 7571a34 commit f7c6fc5

File tree

2 files changed: +40 −74 lines


powerapps-docs/maker/data-platform/create-and-use-dataflows.md

Lines changed: 35 additions & 59 deletions
@@ -37,28 +37,17 @@ which destination you chose when creating the dataflow.
 
 There are three primary steps to using a dataflow:
 
-1. Author the dataflow in the Power Apps portal. You select the destination
-to load the output data to, the source to get the data from, and the Power
-Query steps to transform the data using Microsoft tools that are
-designed to make doing so straightforward.
+1. Author the dataflow in the Power Apps portal. You select the destination to load the output data to, the source to get the data from, and the Power Query steps to transform the data using Microsoft tools that are designed to make doing so straightforward.
 
-2. Schedule dataflow runs. This is the frequency in which the Power Platform
-Dataflow should refresh the data that your dataflow will load and transform.
+2. Schedule dataflow runs. This is the frequency at which the Power Platform Dataflow should refresh the data that your dataflow will load and transform.
 
-3. Use the data you loaded to the destination storage. You can build apps,
-flows, Power BI reports, and dashboards or connect directly to the dataflow’s
-Common Data Model folder in your organization’s lake using Azure data services like Azure
-Data Factory, Azure Databricks or any other service that supports the Common Data Model
-folder standard.
+3. Use the data you loaded in the destination storage. You can build apps, flows, Power BI reports, and dashboards, or connect directly to the dataflow’s Common Data Model folder in your organization’s lake using Azure data services such as Azure Data Factory, Azure Databricks, or any other service that supports the Common Data Model folder standard.
 
-The following sections look at each of these steps so you can become familiar
-with the tools provided to complete each step.
+The following sections look at each of these steps so you can become familiar with the tools provided to complete each step.
 
 ## Create a dataflow
 
-Dataflows are created in one environment. Therefore, you'll only be able to see
-and manage them from that environment. In addition, individuals who want to get
-data from your dataflow must have access to the environment in which you created
+Dataflows are created in one environment. Therefore, you can only see and manage them from that environment. In addition, individuals who want to get data from your dataflow must have access to the environment in which you created
 it.
 
> [!NOTE]
@@ -94,31 +83,21 @@ choose data and a source, the Power Platform Dataflow service will subsequently
 reconnect to the data source in order to keep the data in your dataflow
 refreshed, at the frequency you select later in the setup process.
 
-
 ![Choose data.](media/choose-data.png)
 
-Now that you've selected the data to use in the table, you can use the dataflow editor to
-shape or transform that data into the format necessary for use in your dataflow.
+Now that you've selected the data to use in the table, you can use the dataflow editor to shape or transform that data into the format necessary for use in your dataflow.
 
 ## Use the dataflow editor to shape or transform data
 
-You can shape your data selection into a form that works best for your table using a
-Power Query editing experience, similar to the Power Query Editor in Power BI
-Desktop. To learn more about Power Query, see [Query overview in Power BI Desktop](/power-bi/desktop-query-overview).
+You can shape your data selection into a form that works best for your table using a Power Query editing experience, similar to the Power Query Editor in Power BI Desktop. To learn more about Power Query, see [Query overview in Power BI Desktop](/power-bi/desktop-query-overview).
 
-If you want to see the code that Query Editor is creating with each step, or
-if you want to create your own shaping code, you can use the advanced editor.
+If you want to see the code that Query Editor is creating with each step, or if you want to create your own shaping code, you can use the advanced editor.
 
 ![Advanced editor.](media/advanced-editor.png)
 
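As an illustration, the advanced editor shows Power Query M code along these lines (a hypothetical query with made-up file, table, and column names, not taken from this article):

```powerquery-m
let
    // Load a worksheet table from a workbook (hypothetical path and table name)
    Source = Excel.Workbook(File.Contents("C:\Data\Sales.xlsx"), null, true),
    Sales = Source{[Item = "Sales", Kind = "Table"]}[Data],
    // Each shaping step applied in the editor becomes one named M step
    ChangedType = Table.TransformColumnTypes(Sales, {{"Amount", type number}}),
    FilteredRows = Table.SelectRows(ChangedType, each [Amount] > 0)
in
    FilteredRows
```

Each step name on the left of the equals sign corresponds to one applied step in the Query Editor pane, so editing a step in the editor rewrites the matching line of M.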
 ## Dataflows and the Common Data Model
 
-Dataflows tables include new tools to easily map your business data to the
-Common Data Model, enrich it with Microsoft and non-Microsoft data, and gain simplified access to machine learning. These new capabilities can be leveraged to provide intelligent and actionable insights
-into your business data. Once you’ve completed any transformations in the edit
-queries step described below, you can map columns from your data source tables to standard
-table columns as defined by the Common Data Model. Standard tables have a
-known schema defined by the Common Data Model.
+Dataflow tables include new tools to easily map your business data to the Common Data Model, enrich it with Microsoft and non-Microsoft data, and gain simplified access to machine learning. You can use these new capabilities to gain intelligent and actionable insights into your business data. Once you’ve completed any transformations in the edit queries step described below, you can map columns from your data source tables to standard table columns as defined by the Common Data Model. Standard tables have a known schema defined by the Common Data Model.
 
 For more information about this approach, and about the Common Data Model, see [The Common Data Model](/common-data-model/).
 
@@ -128,62 +107,59 @@ To leverage the Common Data Model with your dataflow, select the **Map to Standa
 
 When you map a source column to a standard column, the following occurs:
 
-1. The source column takes on the standard column name (the column is renamed if
-the names are different).
+1. The source column takes on the standard column name (the column is renamed if
+   the names are different).
 
-2. The source column gets the standard column data type.
+2. The source column gets the standard column data type.
 
-To keep the Common Data Model standard table, all standard columns that aren't
-mapped get *Null* values.
+To keep the Common Data Model standard table, all standard columns that aren't mapped get *Null* values.
 
-All source columns that aren't mapped remain as is to ensure that the result
-of the mapping is a standard table with custom columns.
+All source columns that aren't mapped remain as is to ensure that the result of the mapping is a standard table with custom columns.
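Expressed in Power Query M terms, the mapping rules above amount to something like the following sketch (hypothetical source and standard column names, shown only to illustrate the behavior, not the service's actual implementation):

```powerquery-m
let
    // Hypothetical source table: two mappable columns and one custom column
    Source = #table({"CustomerName", "AcctNo", "Region"}, {{"Contoso", "A-100", "West"}}),
    // Rule 1: mapped source columns take on the standard column names
    Renamed = Table.RenameColumns(Source, {{"CustomerName", "name"}, {"AcctNo", "accountnumber"}}),
    // Rule 2: mapped columns get the standard column data types
    Typed = Table.TransformColumnTypes(Renamed, {{"name", type text}, {"accountnumber", type text}}),
    // An unmapped standard column (here "revenue") gets Null values;
    // the unmapped source column "Region" remains as is,
    // so the result is a standard table with custom columns
    WithNulls = Table.AddColumn(Typed, "revenue", each null, type number)
in
    WithNulls
```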
 
-Once you’ve completed your selections and your table and its data settings are
-complete, you’re ready for the next step, which is selecting the refresh frequency of your
-dataflow.
+Once you’ve completed your selections and your table and its data settings are complete, you’re ready for the next step, which is selecting the refresh frequency of your dataflow.
 
 ## Set the refresh frequency
 
-Once your tables have been defined, you should schedule the refresh
-frequency for each of your connected data sources.
+Once your tables have been defined, you should schedule the refresh frequency for each of your connected data sources.
 
-1. Dataflows use a data refresh process to keep data up to date. In the **Power Platform Dataflow authoring tool**, you can choose to refresh your dataflow manually or automatically on a scheduled
-interval of your choice. To schedule a refresh automatically, select **Refresh automatically**.
+1. Dataflows use a data refresh process to keep data up to date. In the **Power Platform Dataflow authoring tool**, you can choose to refresh your dataflow manually or automatically on a scheduled interval of your choice. To schedule a refresh automatically, select **Refresh automatically**.
 
 ![Refresh automatically.](media/refresh-automatically.png)
 
 2. Enter the dataflow refresh frequency, start date, and time, in UTC.
 
 3. Select **Create.**
 
-Some organizations might want to use their own storage for creation and management
-of dataflows. You can integrate dataflows with Azure Data Lake Storage Gen2 if
-you follow the requirements to set up the storage account properly. More information: [Connect Azure Data Lake Storage Gen2 for dataflow storage](/power-query/dataflows/connect-azure-data-lake-storage-for-dataflow)
+Some organizations might want to use their own storage for creation and management of dataflows. You can integrate dataflows with Azure Data Lake Storage Gen2 if you follow the requirements to set up the storage account properly. More information: [Connect Azure Data Lake Storage Gen2 for dataflow storage](/power-query/dataflows/connect-azure-data-lake-storage-for-dataflow)
 
 ## Troubleshooting data connections
 
-There might be occasions when connecting to data sources for dataflows runs into
-issues. This section provides troubleshooting tips when issues occur.
+There might be occasions when connecting to data sources for dataflows runs into issues. This section provides troubleshooting tips when issues occur.
 
-- **Salesforce connector.** Using a trial account for Salesforce with
-dataflows results in a connection failure with no information provided. To
-resolve this, use a production Salesforce account or a developer account for
+- **Salesforce connector.** Using a trial account for Salesforce with dataflows results in a connection failure with no information provided. To resolve this, use a production Salesforce account or a developer account for
 testing.
 
-- **SharePoint connector.** Make sure you supply the root address of the
-SharePoint site, without any subfolders or documents. For example, use a link
-similar to `https://microsoft.sharepoint.com/teams/ObjectModel`.
-
+- **SharePoint connector.** Make sure you supply the root address of the SharePoint site, without any subfolders or documents. For example, use a link similar to `https://microsoft.sharepoint.com/teams/ObjectModel`.
 
-- **JSON File connector.** Currently you can connect to a JSON file using
-basic authentication only. For example, a URL similar to `https://XXXXX.blob.core.windows.net/path/file.json?sv=2019-01-01&si=something&sr=c&sig=123456abcdefg` is currently not supported.
+- **JSON File connector.** Currently you can connect to a JSON file using basic authentication only. For example, a URL similar to `https://XXXXX.blob.core.windows.net/path/file.json?sv=2019-01-01&si=something&sr=c&sig=123456abcdefg` is currently not supported.
 
-- **Azure Synapse Analytics.** Dataflows don't currently support Microsoft Entra authentication for Azure Synapse Analytics. Use basic authentication for this scenario.
+- **Azure Synapse Analytics.** Dataflows don't currently support Microsoft Entra authentication for Azure Synapse Analytics. Use basic authentication for this scenario.
 
 > [!NOTE]
 > If you use data loss prevention (DLP) policies to block the **HTTP with Microsoft Entra (preauthorized)** connector then **SharePoint** and **OData** connectors will fail. The **HTTP with Microsoft Entra (preauthorized)** connector needs to be allowed in DLP policies for **SharePoint** and **OData** connectors to work.
 
+### Troubleshoot the error: Connection to Dataverse failed. Please check the link below on how to fix this issue
+
+Users might receive an error message if the connection they're using for export requires a fix. In this case, the user receives an error message that states **Connection to Dataverse failed. Please check the link below on how to fix this issue**.
+
+To resolve this issue:
+
+1. In Power Apps (make.powerapps.com), select **Connections** from the left navigation pane. [!INCLUDE [left-navigation-pane](../../includes/left-navigation-pane.md)]
+2. Locate the **Microsoft Dataverse (legacy)** connection.
+3. Select the **Fix connection** link in the **Status** column, and follow the instructions on your screen.
+
+After the fix completes, retry the export.
+
 ## Next steps
 
 The following articles are useful for further information and scenarios when using dataflows:

powerapps-docs/maker/data-platform/data-platform-import-export.md

Lines changed: 5 additions & 15 deletions
@@ -24,7 +24,9 @@ There are two ways to import data from Excel.
 - [Option 2: Import by bringing your own source file](#option-2-import-by-bringing-your-own-source-file)
 
 > [!IMPORTANT]
-> Import from Excel or CSV file using the **Import** > **Import data from Excel** command isn’t available in GCC, GCC High, and DoD environments. To work around this limitation, from the **Tables** area in Power Apps select **Import** > **Import data**, and then choose a data source, such as **Excel workbook** or **Text/CSV**.
+>
+> - To import or export data, you must have the **Environment Maker** security role.
+> - Import from Excel or CSV file using the **Import** > **Import data from Excel** command isn’t available in GCC, GCC High, and DoD environments. To work around this limitation, from the **Tables** area in Power Apps select **Import** > **Import data**, and then choose a data source, such as **Excel workbook** or **Text/CSV**.
 
 ### Option 1: Import by creating and modifying a file template
 
@@ -138,20 +140,8 @@ The following fields are system fields and aren't supported for import and expor
 
 Use a connector to import data from a selection of many different sources, such as Azure, SQL Server database, SharePoint, Access, OData, and more. More information: [Create and use dataflows in Power Apps](create-and-use-dataflows.md)
 
-## Troubleshoot connection issues
+## See also
 
-Users might receive an error message if the connection they're using for export requires a fix. In this case, the user receives an error message that states **Connection to Dataverse failed. Please check the link below on how to fix this issue**.
-
-To fix this issue:
-
-1. In Power Apps (make.powerapps.com), select **Connections** from the left navigation pane. [!INCLUDE [left-navigation-pane](../../includes/left-navigation-pane.md)]
-2. Locate the **Microsoft Dataverse (legacy)** connection.
-3. Select the **Fix connection** link in the **Status** column, and follow the instructions on your screen.
-
-After the fix completes, retry the export.
-
-## Permissions
-
-To import or export data, the user must have the **Environment Maker** security role.
+[Tables in Dataverse](entity-overview.md)
 
 [!INCLUDE[footer-include](../../includes/footer-banner.md)]
