Commit a04ecff

Merge branch 'live' into patch-1
2 parents: 8535bd9 + 9662bc3

9 files changed (+147 −104 lines)

.openpublishing.redirection.json

Lines changed: 7 additions & 2 deletions

@@ -1,5 +1,10 @@
 {
   "redirections": [
+    {
+      "source_path": "powerapps-docs/maker/data-platform/change-solution-publisher-prefix.md",
+      "redirect_url": "create-solution#solution-publisher",
+      "redirect_document_id": "false"
+    },
     {
       "source_path": "powerapps-docs/maker/canvas-apps/monitor.md",
       "redirect_url": "../monitor-canvasapps",
@@ -1874,7 +1879,7 @@
       "source_path": "powerapps-docs/maker/maker/canvas-apps/webinars-listing.md",
       "redirect_url": "https://powerusers.microsoft.com/t5/Samples-Learning-and-Videos/ct-p/PA_Comm_Galleries",
       "redirect_document_id": "false"
-    },
+    },
     {
       "source_path": "powerapps-docs/developer/common-data-service/org-service/index.md",
       "redirect_url": "/powerapps/developer/common-data-service/org-service/overview",
@@ -6099,7 +6104,7 @@
       "source_path": "powerapps-docs/developer/data-platform/webapi/index.md",
       "redirect_url": "/powerapps/developer/data-platform/webapi/overview",
       "redirect_document_id": "false"
-    },
+    },
     {
       "source_path": "powerapps-docs/developer/data-platform/org-service/index.md",
       "redirect_url": "/powerapps/developer/data-platform/org-service/overview",

powerapps-docs/maker/TOC.yml

Lines changed: 2 additions & 0 deletions

@@ -1954,6 +1954,8 @@
       href: ./data-platform/export-to-data-lake-data-adf.md
     - name: Analyze exported data with Power BI
       href: ./data-platform/export-to-data-lake-data-powerbi.md
+    - name: Export to data lake FAQ
+      href: ./data-platform/export-data-lake-faq.yml
     - name: Security in Dataverse
       href: /power-platform/admin/wp-security
     - name: Privileges for customization

powerapps-docs/maker/data-platform/change-solution-publisher-prefix.md

Lines changed: 0 additions & 57 deletions
This file was deleted.

powerapps-docs/maker/data-platform/configure-actions.md

Lines changed: 4 additions & 12 deletions

@@ -62,7 +62,7 @@ Like real-time workflow processes, actions have the following properties in the
 
   This property establishes that this is an action process. You can’t change this after you save the process.
 
-- **table**
+- **Entity**
 
   With action processes, you can select a table to provide context for the real-time workflow just like other types of processes, but you also have the option to choose **None (global)**. Use this if your action doesn’t require the context of a specific table. You can’t change this after you save the process.
 
@@ -93,16 +93,13 @@ When editing an action, you have the following options:
 
 - **Enable rollback**
 
-
-  <!-- from editor: The third sentence below is confusing. Can you take another pass at it? -->
-
-
-  Generally, processes that support transactions will “undo” (or roll back) the entire operation if any part of them fails. There are exceptions to this. Some actions developers might do in code initiated by the action might not support transactions. An example would be when the code perform actions in other systems that are beyond the scope of the transaction. Those can’t be rolled back by the action running in an app. Some messages in the platform don’t support transactions. But everything you can do just with the user interface of the action will support transactions. All the actions that are part of a real-time workflow are considered in transaction, but with actions you have the option to opt out of this.
+  Generally, processes that support transactions will “undo” (or roll back) the entire operation if any part of them fails. There are exceptions to this. Actions executed in developer code initiated by the action might not support transactions. An example would be when the code performs actions in other systems that are beyond the scope of the transaction. Those can’t be rolled back by the action running in an app. Some messages in the platform don’t support transactions. But everything you can do just with the user interface of the action will support transactions. All the actions that are part of a real-time workflow are considered in transaction, but with actions you have the option to opt out of this.
 
   You should consult with the developer who will use this message to determine whether it must be in transaction or not. Generally, an action should be in transaction if the actions performed by the business process don’t make sense unless all of them are completed successfully. The classic example is transferring funds between two bank accounts. If you withdraw funds from one account you must deposit them in the other. If either fails, both must fail.
 
   > [!NOTE]
-  > You can’t enable rollback if a custom action is invoked directly from within a workflow. You can enable rollback if an action is triggered by a Power Apps web services message.
+  > - You can’t enable rollback if a custom action is invoked directly from within a workflow. You can enable rollback if an action is triggered by a Power Apps web services message.
+  > - If the action defined doesn’t change data, but only retrieves data, there are certain situations where performance can be improved by disabling **Enable rollback**.
 
 - **Activate As**
 
@@ -114,11 +111,6 @@ When editing an action, you have the following options:
 
 - **Add Stages, Conditions and Actions**
 
-
-
-  <!-- from editor: The following link opens a section titled "Add stages and steps". Do you want to change this to match? -->
-
-
   Like other processes, you specify what actions to perform and when to perform them. More information: [Add stages, conditions and actions](configure-actions.md#BKMK_AddStagesConditionsAndActions)
 
 <a name="BKMK_DefineProcessArgs"></a>
powerapps-docs/maker/data-platform/export-data-lake-faq.yml

Lines changed: 46 additions & 0 deletions

@@ -0,0 +1,46 @@
+### YamlMime:FAQ
+metadata:
+  title: Frequently asked questions about exporting Dataverse table data to Azure data lake
+  description: Get answers to frequently asked questions about Power Apps export to data lake.
+  author: sabinn-msft
+  ms.service: powerapps
+  ms.search.keywords:
+  ms.date: 02/10/2021
+  ms.author: sabinn
+  ms.reviewer: matp
+title: Export to data lake FAQ
+summary: This article provides answers to frequently asked questions about exporting Dataverse table data to the Azure data lake.
+sections:
+  - name: General
+    questions:
+      - question: When should I use a yearly or monthly partition strategy?
+        answer: |
+          For Dataverse tables where data volume is high within a year, we recommend you use monthly partitions. Doing so results in smaller files and better performance. Additionally, if the rows in Dataverse tables are updated frequently, splitting the data into multiple smaller files helps improve performance in in-place update scenarios.
+      - question: When do I use Append only mode for a historical view of changes?
+        answer: |
+          Append only mode is the recommended option for writing Dataverse table data to the lake, especially when data volumes are high within a partition and the data changes frequently. It is a commonly used and highly recommended option for enterprise customers. Additionally, you can choose this mode for scenarios where the intent is to incrementally review changes from Dataverse and process the changes for ETL, AI, and ML scenarios. Append only mode provides a history of changes, instead of the latest change or an in-place update, and enables several time series AI scenarios, such as prediction or forecasting analytics based on historical values.
+      - question: What happens when I add a column?
+        answer: |
+          When you add a new column to a table in the source, it is also added at the end of the file in the destination in the corresponding file partition. While the rows that existed prior to the addition of the column won't show the new column, new or updated rows will show the newly added column.
+      - question: What happens when I delete a column?
+        answer: |
+          When you delete a column from a table in the source, the column is not dropped from the destination. Instead, the rows are no longer updated and are marked as null while the previous rows are preserved.
+      - question: What happens when I delete a row?
+        answer: |
+          Deleting a row is handled differently based on which data write option you choose:
+          - In-place update: This is the default mode. When you delete a table row in this mode, the row is also deleted from the corresponding data partition in the Azure data lake. In other words, data is hard deleted from the destination.
+          - Append-only: In this mode, when a Dataverse table row is deleted, it is not hard deleted from the destination. Instead, a row with isDeleted=True is added to the file in the corresponding data partition in the Azure data lake.
+      - question: What happens when I add a new table to sync to the data lake?
+        answer: |
+          Export to data lake processes a fixed set of tables for initial sync at any given time. When a new table is added to the export to data lake profile, the incremental or delta sync for the existing tables is paused. While the delta messages for existing tables continue to queue, export to data lake won't process any messages from the delta queue until the initial sync for the newly added table is complete. This is true whether you create a new linked lake, a new export to data lake profile, or add a table to an existing profile. For new or existing profiles, we recommend that you add no more than 5 tables to sync to the data lake at a time. Adding more tables won't expedite the sync process. While initial sync for the newly added tables is ongoing, any delta changes on those tables are queued into a single queue. Export to data lake won't process these delta changes until the initial sync for all previously added tables is complete.
+      - question: Does adding multiple Dataverse tables simultaneously to the lake impact performance?
+        answer: |
+          Adding multiple tables to export to data lake impacts performance in the following scenarios:
+          - Initial sync: When tables are first added, a fixed set of tables is synced for initial data at a time. We recommend adding no more than 5 tables at a time to a new or existing linked lake or export to data lake profile. Adding more than the recommended number of tables doesn't expedite the sync, because only the fixed set of tables is processed at a given time.
+          - Delta sync: After the initial sync of the tables is complete, any change in Dataverse for the selected tables is added to a common queue in export to data lake. The queue is processed with a fixed set of parallel listeners. If you add more tables to the queue, export to data lake will still process the messages, but a higher number of messages adds latency for data to sync to the data lake.
+
+additionalContent: |
+
+  ## See also
+
+  [Export table data to Azure Data Lake Storage Gen2](export-to-data-lake.md)
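The append-only answers above describe a log where each change, including a delete, is written as a new row with an isDeleted flag, and consumers derive the current state themselves. A minimal sketch of that derivation (the row shape below is hypothetical, not the actual CDM file format):

```python
# Hypothetical append-only change log: each entry is the full row state
# at the time of the change; a delete appends a row with isDeleted=True.
change_log = [
    {"id": "a1", "name": "Contoso", "isDeleted": False},
    {"id": "b2", "name": "Fabrikam", "isDeleted": False},
    {"id": "a1", "name": "Contoso Ltd", "isDeleted": False},  # update of a1
    {"id": "b2", "name": "Fabrikam", "isDeleted": True},      # delete of b2
]

def latest_state(log):
    """Collapse an append-only log into current rows (last write wins)."""
    current = {}
    for row in log:
        current[row["id"]] = row
    # Drop rows whose final version is a delete marker.
    return {k: v for k, v in current.items() if not v["isDeleted"]}

print(latest_state(change_log))
# → {'a1': {'id': 'a1', 'name': 'Contoso Ltd', 'isDeleted': False}}
```

In-place update mode would instead yield only this final dictionary in the destination, losing the intermediate versions that the time-series and audit scenarios in the FAQ rely on.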
