---
title: Migrate data between Common Data Service environments using the dataflows OData connector
author: denisem-msft
ms.reviewer: "nabuthuk"
description: Migrate data between Common Data Service environments using the dataflows OData connector.
ms.date: 05/05/2020
ms.service: powerapps
ms.topic: "article"
search.app:
- PowerApps
---
# Migrate data between Common Data Service environments using the dataflows OData connector
Common Data Service [Web API](/powerapps/developer/common-data-service/webapi/overview) works with any technology that supports OData and OAuth. There are many options available to move data in and out of Common Data Service. The OData connector is one of the dataflows connectors, and it is designed to support migration and synchronization of large datasets in Common Data Service.
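Large entity sets are returned by the Web API in pages, with each page carrying an `@odata.nextLink` URL for the next page. The following is a minimal sketch of that paging pattern; the URLs and records are made-up examples, and the `fetch` callable stands in for a real authenticated HTTP GET.

```python
# Sketch of paging through an OData entity set via @odata.nextLink.
# The URLs and data below are hypothetical examples, not a live endpoint.

def read_all_pages(first_url, fetch):
    """Collect records from every page, following @odata.nextLink."""
    records = []
    url = first_url
    while url:
        page = fetch(url)                   # returns the parsed JSON body
        records.extend(page.get("value", []))
        url = page.get("@odata.nextLink")   # absent on the last page
    return records

# Fake two-page response for illustration (no network involved).
pages = {
    "https://example.crm.dynamics.com/api/data/v9.1/accounts": {
        "value": [{"name": "Contoso"}],
        "@odata.nextLink": "https://example.crm.dynamics.com/api/data/v9.1/accounts?page=2",
    },
    "https://example.crm.dynamics.com/api/data/v9.1/accounts?page=2": {
        "value": [{"name": "Fabrikam"}],
    },
}

rows = read_all_pages(
    "https://example.crm.dynamics.com/api/data/v9.1/accounts", pages.__getitem__
)
print([r["name"] for r in rows])  # ['Contoso', 'Fabrikam']
```

The dataflows OData connector handles this paging for you; the sketch only illustrates why large datasets transfer as a sequence of pages rather than one response.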
In this article, we walk you through how to migrate data between Common Data Service environments using the dataflows OData connector.
## Prerequisites
- System Administrator or System Customizer security role permission on both the source and target environments.
- Power Apps, Power Automate, or Common Data Service license (per app or per user).
- Two Common Data Service [environments with database](/power-platform/admin/create-environment#create-an-environment-with-a-database).
## Scenarios
- A one-time cross-environment or cross-tenant migration is needed (for example, geo-migration).
- A developer needs to update an app that is being used in production. Test data is needed in their development environment to easily build out changes.
## Step 1: Plan out the dataflow
1. When importing relationships, multiple dataflows are required.
Entities that are one (parent/independent) to many (children/dependent) require separate dataflows. Configure the parent dataflow to run before any child entities, since the data in the parent needs to be loaded first to correctly map to the fields in the corresponding child entities.
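The parent-before-child run order described above is a dependency ordering, which can be sketched as a topological sort. The entity names and dependency pairs below are hypothetical; substitute your own schema when planning dataflows.

```python
# Hedged sketch: ordering dataflow runs so parent (independent) entities
# load before child (dependent) entities. Entity names are examples only.

from graphlib import TopologicalSorter

# Map each entity to the set of parent entities its lookups depend on.
dependencies = {
    "account": set(),                       # parent entity, no dependencies
    "contact": {"account"},                 # contacts look up their account
    "opportunity": {"account", "contact"},  # depends on both
}

run_order = list(TopologicalSorter(dependencies).static_order())
print(run_order)  # parents always appear before their children
```

Running the dataflows in this order guarantees that every lookup field in a child entity can resolve to a row that already exists in the target environment.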
## Step 2: Get the OData endpoint
Common Data Service provides an OData endpoint that does not require additional configuration to authenticate with the dataflows connector. It is relatively easy to connect to the source environment.
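The service root URL you retrieve in this step typically looks like `https://<org>.api.crm.dynamics.com/api/data/v9.1/`. As a small illustration of how query URLs build on that root, here is a sketch; the org name, entity set, and field names are examples, and the API version in your environment may differ.

```python
# Sketch of composing an OData query URL from the environment's service
# root URL (the OData endpoint found under developer resources).
# The org name, version, entity set, and fields below are examples only.

def build_query(service_root, entity_set, select=(), top=None):
    """Build an OData query URL like .../accounts?$select=name&$top=3."""
    url = service_root.rstrip("/") + "/" + entity_set
    params = []
    if select:
        params.append("$select=" + ",".join(select))
    if top is not None:
        params.append("$top=" + str(top))
    return url + ("?" + "&".join(params) if params else "")

root = "https://contoso.api.crm.dynamics.com/api/data/v9.1/"  # example only
url = build_query(root, "accounts", select=("name", "revenue"), top=3)
print(url)
# https://contoso.api.crm.dynamics.com/api/data/v9.1/accounts?$select=name,revenue&$top=3
```

When you configure the connector, you paste only the service root URL; the connector and Power Query compose queries like this for you.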
This article will walk through how to set up a new dataflow with the OData connector. For information on connecting to all data sources supported by dataflows, see [Create and use dataflows](https://docs.microsoft.com/powerapps/maker/common-data-service/create-and-use-dataflows).
From the **source** environment, get the [OData endpoint](https://docs.microsoft.com/powerapps/developer/common-data-service/view-download-developer-resources) for that environment:
1. In the **Connection settings** dialog box, type the field values:
> [!div class="mx-imgBorder"]
> 
| Field | Description |
|--|--|
| URL | Provide the Service Root URL in the URL field of the connection settings. |
| Connection | Create new connection. This will be automatically chosen if you have not made an OData connection in dataflows before. |
| Connection name | Optionally, rename the connection; a value is automatically populated. |
| On-premises data gateway | None. An on-premises data gateway is not needed for connections to this cloud service. |
| Authentication kind | Organizational account. Select **Sign in** to open the sign-in dialog that authenticates the account associated with the connection. |
> [!IMPORTANT]
> Disable pop-up and cookie blockers in your browser to configure Azure AD authentication. This requirement applies whether you use the Common Data Service OData endpoint or any other OAuth-based data source.
1. Select **Next** in the lower right.
## Step 4: Select and transform data with Power Query
Use Power Query to select the tables and transform the data as needed.
First, select the entities that need to be transferred. You can browse all entities in the source environment and preview some of the data in each entity.
1. Select one or multiple entities as needed, and then select **Transform data**.
> [!NOTE]
> When importing relationships, remember that the parent entity dataflow needs to be imported before the child ones. The child dataflow requires the parent entity's data to be present so that lookups map correctly; otherwise, the refresh might throw an error.
1. In the **Power Query - Edit queries** window, you can transform the query before import.
> [!TIP]
> You can go back to choose more tables in the **Get data** ribbon option for the same OData connector.
1. Select **Next** in the lower right.
## Step 5: Configure target environment settings
This section describes how to define the target environment settings.
### Step 5.1: Map entities
For each entity chosen, select the behavior for importing that entity in these settings and then select **Next**.
- The dataflow syncs data from the source environment's entity to the target environment, and the same entity schema is already defined in the target environment.
- Ideally, use the same solution in both target and source environments to make data transfer seamless. Another advantage to having a predefined entity is more control over which solution the entity is defined in and the prefix.
- Choose **Delete rows that no longer exist in the query output**. This ensures that the relationships will map correctly because it maintains the values for the lookups.
- If the schema is identical in both source and target tables, you can select **Auto map** to quickly map the fields.
- Requires a key configuration in the target environment (as the unique identifier fields are not available to modify).
**Load to new entity (not recommended)**
- Ideally there should be an entity predefined in the target environment from the same solution import as the source environment. However, there are cases where this might not be feasible, so this is an option if there is no existing entity to load to.
- It creates a new custom entity in the target environment's default solution.
- There is also a **Do not load** option, but avoid including entities in the dataflow that will not be loaded. You can select **Back** from this menu to return to the Power Query menu and remove the entities that are not needed.
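Conceptually, the auto-map step above matches source fields to target fields by name. The sketch below illustrates that idea with case-insensitive name matching; the field names are hypothetical, and the real mapping UI may use different matching rules.

```python
# Illustrative sketch of name-based field mapping, similar in spirit to an
# "auto map" step: match source fields to target fields by lowercase name
# and report anything left unmapped. Field names here are examples only.

def auto_map(source_fields, target_fields):
    """Return (mapping, unmapped) for a case-insensitive name match."""
    targets = {f.lower(): f for f in target_fields}
    mapping = {s: targets[s.lower()] for s in source_fields if s.lower() in targets}
    unmapped = [s for s in source_fields if s.lower() not in targets]
    return mapping, unmapped

mapping, unmapped = auto_map(
    ["name", "accountnumber", "custom_field"],
    ["Name", "AccountNumber", "telephone1"],
)
print(mapping)   # {'name': 'Name', 'accountnumber': 'AccountNumber'}
print(unmapped)  # ['custom_field']
```

Fields reported as unmapped are the ones you would need to map manually, which is why an identical schema in source and target makes the transfer seamless.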
### Step 5.2: Refresh settings
Select **Refresh manually** since this is a one-time migration and then select **Create**.
## Step 6: Run the dataflow
The initial dataflow load initiates when you select **Create**.
You can manually initiate a dataflow by selecting **(...)** in the dataflows list. Make sure to run dependent dataflows after the parent flows have completed.
- Try out one entity first to walk through the steps, then build out all the dataflows.
- If there are entities that contain larger amounts of data, consider configuring separate dataflows for individual entities.
- One-to-many relationships require separate dataflows for each entity. Configure and run the parent (independent) entity dataflow before the child (dependent) entity dataflow.
- If there are errors with the dataflow refresh, you can view the refresh history in the **(...)** menu in the dataflows list and download each refresh log.
## Limitations
- Many-to-many relationship data imports are not supported.
- Parent dataflows must be manually configured to run before child dataflows.