
Commit b369008: Live publish
Merged from 2 parents: 21fded8 + 0792be7

File tree: 8 files changed (+115, -104 lines)

powerapps-docs/developer/data-platform/dependent-assembly-plugins.md

Lines changed: 4 additions & 6 deletions
@@ -1,13 +1,12 @@
 ---
 title: "Dependent Assembly plug-ins (preview) (Microsoft Dataverse) | Microsoft Docs" # Intent and product brand in a unique string of 43-59 chars including spaces
 description: "Learn how to include additional assemblies that your plug-in assembly can depend on." # 115-145 characters including spaces. This abstract displays in the search result.
-ms.date: 12/01/2022
+ms.date: 01/05/2023
 ms.reviewer: jdaly
 ms.topic: article
 author: divkamath # GitHub ID
 ms.subservice: dataverse-developer
 ms.author: dikamath # MSFT alias of Microsoft employees only
-manager: sunilg # MSFT alias of manager or PM counterpart
 search.audienceType:
 - developer
 search.app:
@@ -307,10 +306,6 @@ More information:
 
 The following are known issues that should be resolved before dependent assemblies for plug-ins becomes generally available.
 
-### Asynchronous plug-in steps do not work
-
-If you use dependent assemblies for a plug-in registered for an asynchronous step an error with the message `Expected non-empty Guid.` will occur.
-
 ### Plug-in profiler
 
 You cannot use Plug-in Profiler to debug plug-ins that are part of a plug-in package. More information: [Use Plug-in profiler](debug-plug-in.md#use-plug-in-profiler)
@@ -329,6 +324,9 @@ You can manually edit this for each security role following the steps here: [Edi
 
 :::image type="content" source="media/set-pluginpackage-read-access.png" alt-text="Setting plugin package read access.":::
 
+### Custom API cannot use dependent assembly plug-ins
+
+The custom API will not work after being imported as part of a solution.
 
 ### See also
 
powerapps-docs/developer/data-platform/troubleshoot-plug-in.md

Lines changed: 23 additions & 23 deletions
Large diffs are not rendered by default.

powerapps-docs/developer/data-platform/webapi/samples/webapiservice-parallel-operations.md

Lines changed: 1 addition & 2 deletions
@@ -18,7 +18,6 @@ contributors:
 
 [!INCLUDE[cc-terminology](../../includes/cc-terminology.md)]
 
-
 This .NET 6.0 sample demonstrates how to perform parallel data operations using the Dataverse Web API.
 
 This sample uses the common helper code in the [WebAPIService class library (C#)](webapiservice.md).
@@ -62,7 +61,7 @@ To encounter service protection limits with this sample you should raise the `nu
 
 This example uses the [Parallel.ForEachAsync Method](/dotnet/api/system.threading.tasks.parallel.foreachasync) introduced with .NET 6.0.
 
-This sample processes a list of requests to create account records, sending the requests in parallel and then uses the data returned to add requests to delete the created accounts to a [ConcurrentBag](/dotnet/api/system.collections.concurrent.concurrentbag-1?view=net-6.0). After the records are created, the number of seconds to create the records is displayed.
+This sample processes a list of requests to create account records, sending the requests in parallel and then uses the data returned to add requests to delete the created accounts to a [ConcurrentBag](/dotnet/api/system.collections.concurrent.concurrentbag-1?view=net-6.0&preserve-view=true). After the records are created, the number of seconds to create the records is displayed.
 
 Then, the delete requests in the `ConcurrentBag` are processed and the time spent deleting the records is displayed.
 
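The changed paragraph above describes a create-then-delete pattern built on `Parallel.ForEachAsync` and `ConcurrentBag`. Here is a minimal, hedged sketch of that pattern; the `CreateAccountAsync` and `DeleteAccountAsync` helpers below are hypothetical stand-ins that only simulate latency, not the sample's actual WebAPIService methods:

```csharp
using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// Sketch: fan out create requests in parallel, remember each created id in a
// thread-safe ConcurrentBag, then process the collected deletes in parallel.
var names = Enumerable.Range(1, 100).Select(i => $"Account {i}").ToList();
var toDelete = new ConcurrentBag<Guid>();
var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };

var watch = Stopwatch.StartNew();
await Parallel.ForEachAsync(names, options, async (name, token) =>
{
    Guid id = await CreateAccountAsync(name, token); // send the create request
    toDelete.Add(id);                                // safe under concurrency
});
Console.WriteLine($"Created {toDelete.Count} records in {watch.Elapsed.TotalSeconds:F1}s.");

watch.Restart();
await Parallel.ForEachAsync(toDelete, options, async (id, token) =>
    await DeleteAccountAsync(id, token));            // send the delete request
Console.WriteLine($"Deleted {toDelete.Count} records in {watch.Elapsed.TotalSeconds:F1}s.");

// Hypothetical stand-ins; the real sample calls the Dataverse Web API
// through its WebAPIService helper classes.
async Task<Guid> CreateAccountAsync(string name, CancellationToken token)
{
    await Task.Delay(50, token);
    return Guid.NewGuid();
}

async Task DeleteAccountAsync(Guid id, CancellationToken token) =>
    await Task.Delay(50, token);
```

`ConcurrentBag` fits here because multiple worker tasks add delete targets concurrently; `MaxDegreeOfParallelism` caps concurrent requests, which also helps stay under Dataverse service protection limits.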
powerapps-docs/developer/data-platform/webapi/samples/webapiservice.md

Lines changed: 29 additions & 31 deletions
Large diffs are not rendered by default.

powerapps-docs/developer/data-platform/workflow/workflow-extensions.md

Lines changed: 33 additions & 32 deletions
Large diffs are not rendered by default.

powerapps-docs/maker/data-platform/azure-synapse-link-synapse.md

Lines changed: 3 additions & 3 deletions
@@ -2,7 +2,7 @@
 title: "Create an Azure Synapse Link for Dataverse with your Azure Synapse Workspace | MicrosoftDocs"
 description: "Learn how to export table data to Azure Synapse Analytics in Power Apps"
 ms.custom: ""
-ms.date: 09/30/2022
+ms.date: 01/05/2023
 ms.reviewer: "Mattp123"
 ms.suite: ""
 ms.tgt_pltfrm: ""
@@ -39,7 +39,7 @@ You can use the Azure Synapse Link to connect your Microsoft Dataverse data to A
 
 ## Prerequisites
 
-- Azure Data Lake Storage Gen2: You must have an Azure Data Lake Storage Gen2 account and **Owner** and **Storage Blob Data Contributor** role access. Your storage account must enable **Hierarchical namespace** and it is recommended that replication is set to **read-access geo-redundant storage (RA-GRS)**.
+- Azure Data Lake Storage Gen2: You must have an Azure Data Lake Storage Gen2 account and **Owner** and **Storage Blob Data Contributor** role access. Your storage account must enable **Hierarchical namespace** and **public network access** for both initial setup and delta sync. **Allow storage account key access** is required only for the initial setup. We recommend that replication is set to **read-access geo-redundant storage (RA-GRS)**.
 
 - Synapse workspace: You must have a Synapse workspace and the **Synapse Administrator** role access within the Synapse Studio. The Synapse workspace must be in the same region as your Azure Data Lake Storage Gen2 account with **public network access** enabled. The storage account must be added as a linked service within the Synapse Studio. To create a Synapse workspace, go to [Creating a Synapse workspace](/azure/synapse-analytics/get-started-create-workspace).
 
@@ -76,7 +76,7 @@ You can use the Azure Synapse Link to connect your Microsoft Dataverse data to A
 You can follow the steps above to create a link from one environment to multiple Azure Synapse Analytics workspaces and Azure data lakes in your Azure subscription by adding an Azure data lake as a linked service on a Synapse workspace. Similarly, you could create a link from multiple environments to the same Azure Synapse Analytics workspace and Azure data lake, all within the same tenant.
 
 > [!NOTE]
-> The data exported by Azure Synapse Link service is encrypted at rest in Azure Data Lake Storage Gen2. Additionally, transient data in the blob storage is also encrypted at rest. Encryption in Azure Data Lake Storage Gen2 helps you protect your data, implement enterprise security policies, and meet regulatory compliance requirements. More information: [Azure Data Encryption-at-Rest](/azure/security/fundamentals/encryption-atrest)
+> The data exported by the Azure Synapse Link service is encrypted in transit using Transport Layer Security (TLS) 1.2 or higher and encrypted at rest in Azure Data Lake Storage Gen2. Additionally, transient data in the blob storage is also encrypted at rest. Encryption in Azure Data Lake Storage Gen2 helps you protect your data, implement enterprise security policies, and meet regulatory compliance requirements. More information: [Azure Data Encryption-at-Rest](/azure/security/fundamentals/encryption-atrest)
 >
 > Currently, you can't provide public IPs for the Azure Synapse Link for Dataverse service that can be used in **Azure Data Lake firewall settings**. Public IP network rules have no effect on requests originating from the same Azure region as the storage account. Services deployed in the same region as the storage account use private Azure IP addresses for communication. Thus, you can't restrict access to specific Azure services based on their public outbound IP address range.
 More information: [Configure Azure Storage firewalls and virtual networks](/azure/storage/common/storage-network-security)

powerapps-docs/maker/data-platform/export-data-lake-faq.yml

Lines changed: 19 additions & 4 deletions
@@ -4,7 +4,7 @@ metadata:
   description: Get answers to frequently asked questions about Power Apps Azure Synapse Link for Dataverse.
   author: sabinn-msft
   ms.search.keywords:
-  ms.date: 12/15/2022
+  ms.date: 01/05/2023
   ms.author: sabinn
   ms.reviewer:
   contributors: JasonHQX
@@ -34,6 +34,9 @@ sections:
           Deleting a row is handled differently based on which data write options you choose:
           - In-place update: This is the default mode and when you delete a table row in this mode, the row is also deleted from the corresponding data partition in the Azure Data Lake. In other words, data is hard deleted from the destination.
           - Append-only: In this mode, when a Dataverse table row is deleted, it is not hard deleted from the destination. Instead, a row is added and set as isDeleted=True to the file in the corresponding data partition in Azure Data Lake.
+      - question: Why does the Model.json file increase or change in length for the data types and doesn't keep what is defined in Dataverse?
+        answer: |
+          Model.json keeps the database length for the column's size. Dataverse has a concept of database length for each column. If you create a column with a size of 200 and later reduce it to 100, Dataverse still allows your existing data to be present in Dataverse. It does that by keeping DBLength to 200 and MaxLength to 100. What you see in Model.json is DBLength and if you use that for downstream processes you will never provision lesser space for your Dataverse columns.
       - question: What date and time formats can be expected in exported Dataverse tables?
         answer: |
           There are three date and time formats that can be expected in the exported Dataverse tables.
@@ -49,18 +52,30 @@ sections:
      - question: When should I use a yearly or monthly partition strategy?
        answer: |
          For Dataverse tables where data volume is high within a year, we recommend you use monthly partitions. Doing so results in smaller files and better performance. Additionally, if the rows in Dataverse tables are updated frequently, splitting into multiple smaller files help improve performance in the case of in-place update scenarios.
-      - question: When do I use Append only mode for a historical view of changes?
+      - question: What is append only mode and what is the difference between append only and in-place update mode?
+        answer: |
+          In append only mode, incremental data from Dataverse tables are appended to the corresponding file partition in the lake. For more information: [Advanced Configuration Options in Azure Synapse Link](azure-synapse-link-advanced-configuration.md)
+      - question: When do I use append only mode for a historical view of changes?
        answer: |
          Append only mode is the recommended option for writing Dataverse table data to the lake, especially when the data volumes are high within a partition with frequently changing data. Again, this is a commonly used and highly recommended option for enterprise customers. Additionally, you can choose to use this mode for scenarios where the intent is to incrementally review changes from Dataverse and process the changes for ETL, AI, and ML scenarios. Append only mode provides a history of changes, instead of the latest change or in place update, and enables several time series from AI scenarios, such as prediction or forecasting analytics based on historical values.
+      - question: How do I retrieve the most up-to-date row of each record and exclude deleted rows when I export data in append only mode?
+        answer: |
+          In append only mode, you should identify the latest version of record with the same ID using **VersionNumber** and **SinkModifiedOn** then apply **isDeleted=0** on the latest version.
+      - question: Why do I see duplicated version numbers when I export data using append only mode?
+        answer: |
+          For append only mode, if Azure Synapse Link for Dataverse does not get an acknowledgement from the Azure data lake that the data has been committed due to any reason such as network delays, Synapse Link will retry in those scenarios and commit the data again. The downstream consumption should be made resilient to this scenario by filtering data using SinkModifiedOn.
      - question: Which Dataverse tables are not supported for export?
        answer: |
          Any table that does not have change tracking enabled will not be supported in addition to following system tables:
          - Attachment
          - String Map
          - Calendar
-      - question: Which Dataverse tables use Append only by default?
+      - question: Does Azure Synapse Link support calculated columns?
+        answer: |
+          Yes. In Dataverse, the calculated column keeps only the formula information and the real value depends on the base entity column so the calculated column will be updated if and only if all data (base tables) related to calculated columns are exported via Azure Synapse Link.
+      - question: Which Dataverse tables use append only mode by default?
        answer: |
-          All tables that do not have a createdOn field will be synced using Append only mode by default. This includes relationship tables as well as the ActivityParty table.
+          All tables that do not have a createdOn field will be synced using append only mode by default. This includes relationship tables as well as the ActivityParty table.
      - question: Why does Azure Synapse Link for Dataverse require all resources to be in the same region and what can I do about it?
        answer: |
          To ensure high performance and low latency in addition to preventing egress charges, Synapse Link requires all resources to be located in the same region. If you have a cross-region scenario, you can:
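To illustrate the two new append-only answers above (latest-version retrieval and duplicated version numbers), here is a hedged LINQ sketch that collapses an append-only export to its current rows. The `Row` record is a hypothetical shape mirroring only the columns the answers name (`VersionNumber`, `SinkModifiedOn`, and the export's `isDeleted` flag, surfaced as a bool):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical row shape; only the columns the FAQ answers name are assumed.
public record Row(Guid Id, long VersionNumber, DateTime SinkModifiedOn, bool IsDeleted);

public static class AppendOnlySnapshot
{
    // Collapse an append-only export to current rows: keep the highest
    // VersionNumber per record id, break ties (retried commits can write the
    // same version twice) by the most recent SinkModifiedOn, then drop rows
    // whose latest version is a delete marker (isDeleted=1).
    public static IEnumerable<Row> Current(IEnumerable<Row> rows) =>
        rows.GroupBy(r => r.Id)
            .Select(g => g.OrderByDescending(r => r.VersionNumber)
                          .ThenByDescending(r => r.SinkModifiedOn)
                          .First())
            .Where(r => !r.IsDeleted);
}
```

Ordering by `SinkModifiedOn` after `VersionNumber` also absorbs the duplicate version numbers that retried commits can produce, matching the resilience guidance in the second answer.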

powerapps-docs/maker/portals/component-rte-tutorial.md

Lines changed: 3 additions & 3 deletions
@@ -45,7 +45,7 @@ In this step, you'll create a new basic form in portals and then add the control
 
 ### Step 3.1. Create a new basic form
 
-1. Open the [Portal Management app](../configure/configure-portal.md).
+1. Open the [Portal Management app](configure/configure-portal.md).
 
 1. On the left pane under **Content**, select **Basic Forms**.
 
@@ -65,7 +65,7 @@ In this step, you'll create a new basic form in portals and then add the control
 
 ### Step 3.2. Add the rich text editor control to the basic form
 
-1. Open the [Portal Management app](../configure/configure-portal.md).
+1. Open the [Portal Management app](configure/configure-portal.md).
 
 1. On the left pane under **Content**, select **Basic Forms**.
 
@@ -153,7 +153,7 @@ For using and storing images in the rich text editor on the portal, you'll need
 > [!TIP]
 > If you don't see the form, select **Sync Configuration** to synchronize changes from Dataverse.
 
-1. Under **Permissions**, select **Manage table permissions** and make sure that you have the appropriate [table permissions](../configure/assign-entity-permissions.md) and [web roles](../configure/create-web-roles.md) configured for the Dataverse table associated to the form.
+1. Under **Permissions**, select **Manage table permissions** and make sure that you have the appropriate [table permissions](configure/assign-entity-permissions.md) and [web roles](configure/create-web-roles.md) configured for the Dataverse table associated to the form.
 
 > [!NOTE]
 > By default, the **feedback** table has **create** permissions configured for the default web roles. For more information, go to [Contact us sample](contact-us-sample.md).
