Commit 4757a8d

Revised for style
1 parent 9361660 commit 4757a8d

File tree

1 file changed (+10, -11 lines)


powerapps-docs/maker/data-platform/export-to-data-lake-data-adf.md

Lines changed: 10 additions & 11 deletions
@@ -2,7 +2,7 @@
 title: "Ingest Microsoft Dataverse data with Azure Data Factory | MicrosoftDocs"
 description: Learn how to use Azure Data Factory to create dataflows, transform, and run analysis on Dataverse data
 ms.custom: ""
-ms.date: 07/29/2020
+ms.date: 03/22/2021
 ms.reviewer: "matp"
 author: sabinn-msft
 ms.service: powerapps
@@ -49,20 +49,19 @@ The user account that's used to sign in to Azure must be a member of the
 To view the permissions that you have in the subscription, go to the [Azure portal](https://portal.azure.com/), select your username in the upper-right corner, select **...**, and then select **My permissions**. If you have access to multiple subscriptions, select the appropriate one. To create and manage child resources for Data Factory in the Azure portal—including datasets, linked services, pipelines, triggers, and integration runtimes—you must belong to the *Data Factory Contributor* role at the resource group level or above.

 ### Export to data lake
-
-This guide assumes that you have already exported Dataverse data by using the [Export to Data Lake service](export-to-data-lake.md).
+This guide assumes that you've already exported Dataverse data by using the [Export to Data Lake service](export-to-data-lake.md).

 In this example, account table data is exported to the data lake.

 ### Azure Data Factory

-This guide assumes that you have already created a data factory under the same subscription and resource group as the storage account containing the exported Dataverse data.
+This guide assumes that you've already created a data factory under the same subscription and resource group as the storage account containing the exported Dataverse data.

 ## Set the Data Lake Storage Gen2 storage account as a source

-1. Open [Azure Data Factory](https://ms-adf.azure.com/en-us/datafactories) and select the data facotry that is on the same subscription and resource group as the storage account containing your exported Dataverse data. Then select **Create data flow** from the home page.
+1. Open [Azure Data Factory](https://ms-adf.azure.com/en-us/datafactories) and select the data factory that is on the same subscription and resource group as the storage account containing your exported Dataverse data. Then select **Create data flow** from the home page.

-2. Turn on **Data flow debug** mode and select your preferred time to live. This may take up to 10 minutes, but you
+2. Turn on **Data flow debug** mode and select your preferred time to live. This might take up to 10 minutes, but you
 can proceed with the following steps.

 ![Dataflow debug mode](media/data-flow-debug.png "Dataflow debug mode")
@@ -71,7 +70,7 @@ This guide assumes that you have already created a data factory under the same s

 ![Add source](media/add-source.png "Add source")

-4. Under **Source settings**, do the following<!--Suggested. It's "configure the following options" here and "select the following options" in the next procedure, but these are a combination of entering and selecting.-->:
+4. Under **Source settings**, do the following:

 - **Output stream name**: Enter the name you want.
 - **Source type**: Select **Common Data Model**.
@@ -88,7 +87,7 @@ This guide assumes that you have already created a data factory under the same s

 6. Check the **Projection** tab to ensure that your schema has been imported sucessfully. If you do not see any columns, select **Schema options** and check the **Infer drifted column types** option. Configure the formatting options to match your data set then select **Apply**.

-7. You may view your data in the **Data preview** tab to ensure the Source creation was complete and accurate.
+7. You can view your data in the **Data preview** tab to ensure the Source creation was complete and accurate.

 ## Transform your Dataverse data
 After setting the exported Dataverse data in the Data Lake Storage Gen2 storage account as a source in the Data Factory dataflow, there are many possibilities for transforming your data. More information: [Azure Data Factory](/azure/data-factory/introduction)
@@ -106,7 +105,7 @@ Follow these instructions to create a rank for the each row by the *revenue* of

 ![Configure the Rank settings tab](media/configure-rank.png "Configure the Rank settings tab")

-3. You may view your data in the **data preview** tab where you will find the new *revenueRank* column at the right-most position.
+3. You can view your data in the **data preview** tab where you will find the new *revenueRank* column at the right-most position.

 ## Set the Data Lake Storage Gen2 storage account as a sink
 Ultimately, you must set a sink for your dataflow. Follow these instructions to place your transformed data as a Delimited Text file in the Data Lake.
@@ -133,7 +132,7 @@ Ultimately, you must set a sink for your dataflow. Follow these instructions to

 3. On the **Optimize** tab, set the **Partition option** to **Single partition**.

-4. You may view your data in the **data preview** tab.
+4. You can view your data in the **data preview** tab.

 ## Run your dataflow

@@ -148,7 +147,7 @@ Ultimately, you must set a sink for your dataflow. Follow these instructions to

 4. Select **Debug** from the command bar.

-5. Let the dataflow run until the bottom view shows that is has been completed. This may take a few minutes.
+5. Let the dataflow run until the bottom view shows that is has been completed. This might take a few minutes.

 6. Go to the final destination storage container, and find the transformed table data file.

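The final step in the changed doc verifies the output through the portal. As an alternative quick check, a minimal Python sketch using the azure-storage-file-datalake and azure-identity packages can list the dataflow's output files in the Data Lake Storage Gen2 container; the account URL, container, and folder names below are illustrative placeholders, not values from the commit.

```python
# Minimal sketch: list the dataflow output files in the ADLS Gen2 container.
# Storage account, container, and folder names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

account_url = "https://<your-storage-account>.dfs.core.windows.net"  # placeholder account
service = DataLakeServiceClient(account_url, credential=DefaultAzureCredential())

file_system = service.get_file_system_client("dataflow-output")       # placeholder container
for path in file_system.get_paths(path="transformed-account-data"):   # placeholder folder
    if not path.is_directory:
        print(path.name, path.content_length)
```

Run it with an identity that has Storage Blob Data Reader (or higher) on the storage account; the delimited text file written by the sink should appear in the listing.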