powerapps-docs/developer/data-platform/dependent-assembly-plugins.md (4 additions, 6 deletions)
@@ -1,13 +1,12 @@
  ---
  title: "Dependent Assembly plug-ins (preview) (Microsoft Dataverse) | Microsoft Docs" # Intent and product brand in a unique string of 43-59 chars including spaces
  description: "Learn how to include additional assemblies that your plug-in assembly can depend on." # 115-145 characters including spaces. This abstract displays in the search result.
- ms.date: 12/01/2022
+ ms.date: 01/05/2023
  ms.reviewer: jdaly
  ms.topic: article
  author: divkamath # GitHub ID
  ms.subservice: dataverse-developer
  ms.author: dikamath # MSFT alias of Microsoft employees only
- manager: sunilg # MSFT alias of manager or PM counterpart
  search.audienceType:
  - developer
  search.app:
@@ -307,10 +306,6 @@ More information:
  The following are known issues that should be resolved before dependent assemblies for plug-ins become generally available.

- ### Asynchronous plug-in steps do not work
-
- If you use dependent assemblies for a plug-in registered for an asynchronous step, an error with the message `Expected non-empty Guid.` will occur.
-
  ### Plug-in profiler

  You cannot use Plug-in Profiler to debug plug-ins that are part of a plug-in package. More information: [Use Plug-in profiler](debug-plug-in.md#use-plug-in-profiler)
@@ -329,6 +324,9 @@ You can manually edit this for each security role following the steps here: [Edi
  This .NET 6.0 sample demonstrates how to perform parallel data operations using the Dataverse Web API.

  This sample uses the common helper code in the [WebAPIService class library (C#)](webapiservice.md).
@@ -62,7 +61,7 @@ To encounter service protection limits with this sample you should raise the `nu
  This example uses the [Parallel.ForEachAsync Method](/dotnet/api/system.threading.tasks.parallel.foreachasync) introduced with .NET 6.0.

- This sample processes a list of requests to create account records, sending the requests in parallel and then uses the data returned to add requests to delete the created accounts to a [ConcurrentBag](/dotnet/api/system.collections.concurrent.concurrentbag-1?view=net-6.0). After the records are created, the number of seconds to create the records is displayed.
+ This sample processes a list of requests to create account records, sending the requests in parallel and then uses the data returned to add requests to delete the created accounts to a [ConcurrentBag](/dotnet/api/system.collections.concurrent.concurrentbag-1?view=net-6.0&preserve-view=true). After the records are created, the number of seconds to create the records is displayed.

  Then, the delete requests in the `ConcurrentBag` are processed and the time spent deleting the records is displayed.
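
To make the pattern described above concrete, here is a minimal sketch of how `Parallel.ForEachAsync` and a `ConcurrentBag<T>` can work together. This is an editorial illustration, not the sample's code: `CreateAccountAsync` is a hypothetical stand-in that simulates a Dataverse Web API create request.

```csharp
using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class ParallelCreateSketch
{
    // Hypothetical stand-in for a Dataverse Web API create request.
    static async Task<Guid> CreateAccountAsync(string name, CancellationToken token)
    {
        await Task.Delay(100, token); // simulate network latency
        return Guid.NewGuid();        // the service would return the new record's ID
    }

    static async Task Main()
    {
        var accountNames = Enumerable.Range(1, 100).Select(i => $"Account {i}");
        var recordsToDelete = new ConcurrentBag<Guid>(); // thread-safe collection shared by parallel tasks

        var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
        var stopwatch = Stopwatch.StartNew();

        // Send the create requests in parallel and remember each created record's ID.
        await Parallel.ForEachAsync(accountNames, options, async (name, token) =>
        {
            Guid id = await CreateAccountAsync(name, token);
            recordsToDelete.Add(id);
        });

        Console.WriteLine(
            $"Created {recordsToDelete.Count} records in {stopwatch.Elapsed.TotalSeconds:F1} seconds.");

        // A second pass over recordsToDelete would then issue the delete requests.
    }
}
```

The `ConcurrentBag<T>` matters here because the lambda runs on multiple threads at once; a plain `List<T>` would not be safe to `Add` to concurrently.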
powerapps-docs/maker/data-platform/azure-synapse-link-synapse.md (3 additions, 3 deletions)
@@ -2,7 +2,7 @@
  title: "Create an Azure Synapse Link for Dataverse with your Azure Synapse Workspace | MicrosoftDocs"
  description: "Learn how to export table data to Azure Synapse Analytics in Power Apps"
  ms.custom: ""
- ms.date: 09/30/2022
+ ms.date: 01/05/2023
  ms.reviewer: "Mattp123"
  ms.suite: ""
  ms.tgt_pltfrm: ""
@@ -39,7 +39,7 @@ You can use the Azure Synapse Link to connect your Microsoft Dataverse data to A
  ## Prerequisites

- - Azure Data Lake Storage Gen2: You must have an Azure Data Lake Storage Gen2 account and **Owner** and **Storage Blob Data Contributor** role access. Your storage account must enable **Hierarchical namespace** and it is recommended that replication is set to **read-access geo-redundant storage (RA-GRS)**.
+ - Azure Data Lake Storage Gen2: You must have an Azure Data Lake Storage Gen2 account and **Owner** and **Storage Blob Data Contributor** role access. Your storage account must enable **Hierarchical namespace** and **public network access** for both initial setup and delta sync. **Allow storage account key access** is required only for the initial setup. We recommend that replication is set to **read-access geo-redundant storage (RA-GRS)**.

  - Synapse workspace: You must have a Synapse workspace and the **Synapse Administrator** role access within the Synapse Studio. The Synapse workspace must be in the same region as your Azure Data Lake Storage Gen2 account with **public network access** enabled. The storage account must be added as a linked service within the Synapse Studio. To create a Synapse workspace, go to [Creating a Synapse workspace](/azure/synapse-analytics/get-started-create-workspace).
@@ -76,7 +76,7 @@ You can use the Azure Synapse Link to connect your Microsoft Dataverse data to A
  You can follow the steps above to create a link from one environment to multiple Azure Synapse Analytics workspaces and Azure data lakes in your Azure subscription by adding an Azure data lake as a linked service on a Synapse workspace. Similarly, you could create a link from multiple environments to the same Azure Synapse Analytics workspace and Azure data lake, all within the same tenant.

  > [!NOTE]
- > The data exported by the Azure Synapse Link service is encrypted at rest in Azure Data Lake Storage Gen2. Additionally, transient data in the blob storage is also encrypted at rest. Encryption in Azure Data Lake Storage Gen2 helps you protect your data, implement enterprise security policies, and meet regulatory compliance requirements. More information: [Azure Data Encryption-at-Rest](/azure/security/fundamentals/encryption-atrest)
+ > The data exported by the Azure Synapse Link service is encrypted in transit using Transport Layer Security (TLS) 1.2 or higher and encrypted at rest in Azure Data Lake Storage Gen2. Additionally, transient data in the blob storage is also encrypted at rest. Encryption in Azure Data Lake Storage Gen2 helps you protect your data, implement enterprise security policies, and meet regulatory compliance requirements. More information: [Azure Data Encryption-at-Rest](/azure/security/fundamentals/encryption-atrest)
  >
  > Currently, you can't provide public IPs for the Azure Synapse Link for Dataverse service that can be used in **Azure Data Lake firewall settings**. Public IP network rules have no effect on requests originating from the same Azure region as the storage account. Services deployed in the same region as the storage account use private Azure IP addresses for communication. Thus, you can't restrict access to specific Azure services based on their public outbound IP address range.
  More information: [Configure Azure Storage firewalls and virtual networks](/azure/storage/common/storage-network-security)
powerapps-docs/maker/data-platform/export-data-lake-faq.yml (19 additions, 4 deletions)
@@ -4,7 +4,7 @@ metadata:
  description: Get answers to frequently asked questions about Power Apps Azure Synapse Link for Dataverse.
  author: sabinn-msft
  ms.search.keywords:
- ms.date: 12/15/2022
+ ms.date: 01/05/2023
  ms.author: sabinn
  ms.reviewer:
  contributors: JasonHQX
@@ -34,6 +34,9 @@ sections:
  Deleting a row is handled differently based on which data write option you choose:
  - In-place update: This is the default mode. When you delete a table row in this mode, the row is also deleted from the corresponding data partition in the Azure Data Lake. In other words, data is hard deleted from the destination.
  - Append-only: In this mode, when a Dataverse table row is deleted, it is not hard deleted from the destination. Instead, a row is added and set as isDeleted=True in the file in the corresponding data partition in Azure Data Lake.
+ - question: Why does the Model.json file increase or change in length for the data types instead of keeping what is defined in Dataverse?
+   answer: |
+     Model.json keeps the database length for the column's size. Dataverse has a concept of database length for each column. If you create a column with a size of 200 and later reduce it to 100, Dataverse still allows your existing data to be present. It does this by keeping DBLength at 200 and MaxLength at 100. What you see in Model.json is DBLength, and if you use that value for downstream processes you will never provision less space than your Dataverse columns need.
  - question: What date and time formats can be expected in exported Dataverse tables?
    answer: |
      There are three date and time formats that can be expected in the exported Dataverse tables.
@@ -49,18 +52,30 @@ sections:
  - question: When should I use a yearly or monthly partition strategy?
    answer: |
      For Dataverse tables where data volume is high within a year, we recommend you use monthly partitions. Doing so results in smaller files and better performance. Additionally, if the rows in Dataverse tables are updated frequently, splitting the data into multiple smaller files helps improve performance in in-place update scenarios.
- - question: When do I use Append only mode for a historical view of changes?
+ - question: What is append only mode and what is the difference between append only and in-place update mode?
+   answer: |
+     In append only mode, incremental data from Dataverse tables is appended to the corresponding file partition in the lake. For more information: [Advanced Configuration Options in Azure Synapse Link](azure-synapse-link-advanced-configuration.md)
+ - question: When do I use append only mode for a historical view of changes?
    answer: |
      Append only mode is the recommended option for writing Dataverse table data to the lake, especially when data volumes are high within a partition and the data changes frequently. Again, this is a commonly used and highly recommended option for enterprise customers. Additionally, you can choose this mode for scenarios where the intent is to incrementally review changes from Dataverse and process the changes for ETL, AI, and ML scenarios. Append only mode provides a history of changes, instead of only the latest change or in-place update, and enables several time series AI scenarios, such as prediction or forecasting analytics based on historical values.
+ - question: How do I retrieve the most up-to-date row of each record and exclude deleted rows when I export data in append only mode?
+   answer: |
+     In append only mode, identify the latest version of each record (rows with the same ID) using **VersionNumber** and **SinkModifiedOn**, and then apply **isDeleted=0** to that latest version (see the sketch following this file's changes).
+ - question: Why do I see duplicated version numbers when I export data using append only mode?
+   answer: |
+     In append only mode, if Azure Synapse Link for Dataverse does not get an acknowledgement from the Azure data lake that the data has been committed, for any reason such as a network delay, Synapse Link retries and commits the data again. Downstream consumption should be made resilient to this scenario by filtering data using SinkModifiedOn.
  - question: Which Dataverse tables are not supported for export?
    answer: |
      Any table that does not have change tracking enabled is not supported, in addition to the following system tables:
      - Attachment
      - String Map
      - Calendar
- - question: Which Dataverse tables use Append only by default?
+ - question: Does Azure Synapse Link support calculated columns?
+   answer: |
+     Yes. In Dataverse, a calculated column keeps only the formula information, and its real value depends on the base entity column. The calculated column is therefore updated if and only if all data (base tables) related to the calculated column is exported via Azure Synapse Link.
+ - question: Which Dataverse tables use append only mode by default?
    answer: |
-     All tables that do not have a createdOn field will be synced using Append only mode by default. This includes relationship tables as well as the ActivityParty table.
+     All tables that do not have a createdOn field will be synced using append only mode by default. This includes relationship tables as well as the ActivityParty table.
  - question: Why does Azure Synapse Link for Dataverse require all resources to be in the same region and what can I do about it?
    answer: |
      To ensure high performance and low latency, in addition to preventing egress charges, Synapse Link requires all resources to be located in the same region. If you have a cross-region scenario, you can:
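
The new FAQ entry above about retrieving the most up-to-date row in append only mode packs the whole approach into one sentence. As an editorial illustration only (not part of this change), here is a minimal C# sketch of that filtering logic; the `ExportedRow` shape and its `Id` field are assumptions, while `VersionNumber`, `SinkModifiedOn`, and `isDeleted` come from the answer. In practice you would more likely express this as a query over the exported files.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical shape of a row exported in append only mode. Only VersionNumber,
// SinkModifiedOn, and IsDeleted come from the FAQ answer; the rest is assumed.
record ExportedRow(Guid Id, long VersionNumber, DateTime SinkModifiedOn, bool IsDeleted);

class AppendOnlyDedup
{
    // Keep only the latest version of each record and drop records whose latest version is a delete.
    static IEnumerable<ExportedRow> LatestActiveRows(IEnumerable<ExportedRow> rows) =>
        rows.GroupBy(r => r.Id)                                    // all versions of the same record
            .Select(g => g.OrderByDescending(r => r.VersionNumber) // highest version wins...
                          .ThenByDescending(r => r.SinkModifiedOn) // ...ties broken by the latest sync time
                          .First())
            .Where(latest => !latest.IsDeleted);                   // exclude records whose latest version is a delete

    static void Main()
    {
        var id = Guid.NewGuid();
        var rows = new List<ExportedRow>
        {
            new(id, 1, new DateTime(2023, 1, 1), IsDeleted: false),
            new(id, 2, new DateTime(2023, 1, 2), IsDeleted: false),            // latest active version
            new(Guid.NewGuid(), 1, new DateTime(2023, 1, 3), IsDeleted: true), // deleted record is excluded
        };

        foreach (var row in LatestActiveRows(rows))
        {
            Console.WriteLine($"{row.Id} v{row.VersionNumber} ({row.SinkModifiedOn:d})");
        }
    }
}
```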
powerapps-docs/maker/portals/component-rte-tutorial.md (3 additions, 3 deletions)
@@ -45,7 +45,7 @@ In this step, you'll create a new basic form in portals and then add the control
  ### Step 3.1. Create a new basic form

- 1. Open the [Portal Management app](../configure/configure-portal.md).
+ 1. Open the [Portal Management app](configure/configure-portal.md).

  1. On the left pane under **Content**, select **Basic Forms**.
@@ -65,7 +65,7 @@ In this step, you'll create a new basic form in portals and then add the control
  ### Step 3.2. Add the rich text editor control to the basic form

- 1. Open the [Portal Management app](../configure/configure-portal.md).
+ 1. Open the [Portal Management app](configure/configure-portal.md).

  1. On the left pane under **Content**, select **Basic Forms**.
@@ -153,7 +153,7 @@ For using and storing images in the rich text editor on the portal, you'll need
  > [!TIP]
  > If you don't see the form, select **Sync Configuration** to synchronize changes from Dataverse.

- 1. Under **Permissions**, select **Manage table permissions** and make sure that you have the appropriate [table permissions](../configure/assign-entity-permissions.md) and [web roles](../configure/create-web-roles.md) configured for the Dataverse table associated to the form.
+ 1. Under **Permissions**, select **Manage table permissions** and make sure that you have the appropriate [table permissions](configure/assign-entity-permissions.md) and [web roles](configure/create-web-roles.md) configured for the Dataverse table associated to the form.

  > [!NOTE]
  > By default, the **feedback** table has **create** permissions configured for the default web roles. For more information, go to [Contact us sample](contact-us-sample.md).