
Commit 6b9f2ff

Merge branch 'main' into winona-screenshot-updates
2 parents b3fe2a9 + 5168158 commit 6b9f2ff

9 files changed: +58 -63 lines

powerapps-docs/developer/data-platform/azure-integration.md

Lines changed: 25 additions & 28 deletions
Large diffs are not rendered by default.

powerapps-docs/developer/data-platform/configure-azure-integration.md

Lines changed: 6 additions & 7 deletions
```diff
@@ -1,7 +1,7 @@
 ---
 title: "Configure Azure integration (Microsoft Dataverse) | Microsoft Docs" # Intent and product brand in a unique string of 43-59 chars including spaces
 description: "Learn about configuring Azure integration with Microsoft Dataverse." # 115-145 characters including spaces. This abstract displays in the search result.
-ms.date: 03/22/2022
+ms.date: 07/19/2024
 ms.reviewer: "pehecke"
 ms.topic: "article"
 author: "jaredha" # GitHub ID
```
```diff
@@ -18,19 +18,19 @@ You can post the message request data for the current Dataverse core operation t

 ## Configure Azure For Dataverse integration

-Because you will use SAS for authorization, you need to configure the rules and issuers of your Azure solution to allow a listener application to read the Dataverse message posted to the Azure Service Bus. In addition, you must configure the service bus rules to accept the Dataverse issuer claim. The recommended method to configure Azure is to use the Plug-in Registration tool (PRT).
+Because you'll use SAS for authorization, you need to configure the rules and issuers of your Azure solution to allow a listener application to read the Dataverse message posted to the Azure Service Bus. In addition, you must configure the service bus rules to accept the Dataverse issuer claim. The recommended method to configure Azure is to use the Plug-in Registration tool (PRT).

 For instructions on configuring authorization, see [Tutorial: Configure Azure (SAS) for integration with Dataverse](walkthrough-configure-azure-sas-integration.md).

 ## Test Configuration

-After configuring Azure integration, you will need to perform these additional tasks.
+After configuring Azure integration, you'll need to perform these other tasks.

-1. Write and register a listener application with a Azure Service Bus solution endpoint. For more information, see the Azure Service Bus [documentation](/azure/service-bus-messaging/service-bus-messaging-overview).
-1. Register an Azure aware plug-in or a Azure-aware custom workflow activity with Dataverse. More information: [Tutorial: Register an Azure-aware plug-in using the Plug-in Registration tool](walkthrough-register-azure-aware-plug-in-using-plug-in-registration-tool.md)
+1. Write and register a listener application with an Azure Service Bus solution endpoint. For more information, see the Azure Service Bus [documentation](/azure/service-bus-messaging/service-bus-messaging-overview).
+1. Register an Azure-aware plug-in or an Azure-aware custom workflow activity with Dataverse. More information: [Tutorial: Register an Azure-aware plug-in using the Plug-in Registration tool](walkthrough-register-azure-aware-plug-in-using-plug-in-registration-tool.md)
 1. Perform the necessary Dataverse operation that triggers the plug-in or custom workflow activity to run.

-If all of the preceding steps were performed correctly, a message containing the Dataverse data context should be sent to a Azure queue or topic and ultimately received by the listener application. You can navigate to the System Jobs grid in the Dataverse web application and check the status of the related System Job to see if the post to the Azure Service Bus succeeded. In case of errors, the message section of the System Job displays the error details.
+If all of the preceding steps were performed correctly, a message containing the Dataverse data context should be sent to an Azure queue or topic and ultimately received by the listener application. You can navigate to the System Jobs grid in the Power Apps web application, under **Advanced settings**, and check the status of the related system job to see if the post to the Azure Service Bus succeeded. If errors occur, the message section of the system job displays the error details.

 ### See also

```
```diff
@@ -40,5 +40,4 @@ If all of the preceding steps were performed correctly, a message containing the
 [Write a listener application for an Azure solution](write-listener-application-azure-solution.md)<br />
 [What is Azure Service Bus?](/azure/service-bus-messaging/service-bus-messaging-overview)

-
 [!INCLUDE[footer-include](../../includes/footer-banner.md)]
```
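The listener application mentioned in step 1 of the test tasks above can take many forms; the following is a minimal sketch of a queue listener, assuming the modern `Azure.Messaging.ServiceBus` package rather than whatever transport the linked walkthrough uses. The connection string and queue name are placeholders, not values from this article.

```csharp
using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

class Listener
{
    static async Task Main()
    {
        await using var client = new ServiceBusClient("<namespace-connection-string>");
        await using ServiceBusProcessor processor = client.CreateProcessor("<queue-name>");

        // Dataverse posts the execution context of the triggering operation
        // as the message body; a real listener would deserialize it here.
        processor.ProcessMessageAsync += async args =>
        {
            Console.WriteLine($"Received message {args.Message.MessageId}");
            await args.CompleteMessageAsync(args.Message);
        };
        processor.ProcessErrorAsync += args =>
        {
            Console.WriteLine(args.Exception.ToString());
            return Task.CompletedTask;
        };

        await processor.StartProcessingAsync();
        Console.ReadKey(); // run until a key is pressed
        await processor.StopProcessingAsync();
    }
}
```

Registering this listener's endpoint with the PRT, as described above, is what lets the posted Dataverse messages reach it.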

powerapps-docs/developer/data-platform/transaction-currency-currency-entity.md

Lines changed: 2 additions & 3 deletions
```diff
@@ -2,9 +2,8 @@
 title: "Transaction Currency (currency) table (Microsoft Dataverse) | Microsoft Docs" # Intent and product brand in a unique string of 43-59 chars including spaces
 description: "Learn about transaction table, which is a multicurrency feature enabling users to perform financial transactions in multiple currencies. Multiple records in different transaction currencies can be aggregated, compared, or analyzed with regard to a single currency using the base currency." # 115-145 characters including spaces. This abstract displays in the search result.
 ms.custom: ""
-ms.date: 07/19/2021
+ms.date: 07/25/2024
 ms.reviewer: "pehecke"
-
 ms.topic: "article"
 author: "mayadumesh" # GitHub ID
 ms.subservice: dataverse-developer
```
```diff
@@ -32,7 +31,7 @@ Dataverse is a multicurrency system, in which each record can be associated with

 - Define product pricelists for each currency.

-To use multiple currencies, the base currency must be defined for an organization during server installation and organization setup. After the base currency is set for an organization, it cannot be changed. This value is stored in the `Organization.BaseCurrencyID` attribute.
+To use multiple currencies, the base currency must be defined for an organization during server installation and organization setup. This value is stored in the `Organization.BaseCurrencyID` attribute.

 Transaction currencies are defined as a part of the system settings. An unlimited number of transaction currencies can be defined. Transaction currencies are related to the base currency with the definition of a currency exchange rate.
```
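The exchange-rate relationship in the changed paragraph can be sketched as follows. This is a hypothetical example, assuming the conventional Dataverse behavior that base-currency (`_base`) money values are computed by dividing the transaction amount by the record's exchange rate; the currencies and rate are illustrative only.

```csharp
using System;

class CurrencyExample
{
    // Each transaction currency defines an exchange rate relative to the base
    // currency: units of transaction currency per one unit of base currency.
    static decimal ToBaseCurrency(decimal transactionAmount, decimal exchangeRate)
        => transactionAmount / exchangeRate;

    static void Main()
    {
        // Hypothetical org with USD as the base currency and a EUR transaction
        // currency at a 0.9 EUR-per-USD exchange rate: 90 EUR converts to USD.
        decimal totalAmountEur = 90m;
        Console.WriteLine(ToBaseCurrency(totalAmountEur, 0.9m));
    }
}
```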

powerapps-docs/developer/data-platform/work-data-azure-solution.md

Lines changed: 4 additions & 7 deletions
```diff
@@ -1,7 +1,7 @@
 ---
 title: "Work with Microsoft Dataverse data in your Azure solution (Microsoft Dataverse) | Microsoft Docs"
 description: "Provides an overview of passing data from Dataverse to an Azure cloud hosted solution."
-ms.date: 08/28/2023
+ms.date: 07/19/2024
 author: swylezol
 ms.author: swylezol
 ms.reviewer: pehecke
```
```diff
@@ -18,22 +18,19 @@ contributors:

 [!INCLUDE[cc-terminology](includes/cc-terminology.md)]

-An internal plug-in named `ServiceBusPlugin` is provided with Dataverse. The plug-in contains the business logic to post the Dataverse message execution context to the integrated Azure service. To use this plug-in, you need to register an Azure endpoint and a step for the plug-in. The step defines what message and table combination being processed by the core Dataverse operation should trigger the plug-in to execute. For more information, see [Walkthrough: Register an Azure-aware Plug-in using the Plug-in Registration Tool](walkthrough-register-azure-aware-plug-in-using-plug-in-registration-tool.md).
+A default Azure-aware plug-in is provided with Dataverse. The plug-in contains the business logic to post the Dataverse message execution context to the Azure Service Bus. To use this plug-in, you need to register a service endpoint and a step. The step defines which message and table combination processed by the core Dataverse operation triggers the plug-in to execute. For more information, see [Walkthrough: Register an Azure-aware Plug-in using the Plug-in Registration Tool](walkthrough-register-azure-aware-plug-in-using-plug-in-registration-tool.md).

-In addition, you can write a custom plug-in that includes the required lines of code to post to the Azure service. The plug-in is registered in a similar way, except that it must be registered in the sandbox. For more information on writing a custom plug-in that can post to the Azure services, see [Write a Custom Azure-aware Plug-in](write-custom-azure-aware-plugin.md).
+In addition, you can write a custom plug-in that includes the required lines of code to post to the Azure service. The plug-in is registered in a similar way, except that it must be registered in the sandbox. For more information on writing a custom plug-in that can post to the Azure Service Bus, see [Write a Custom Azure-aware Plug-in](write-custom-azure-aware-plugin.md).

 You can also write a custom workflow activity that can post the execution context to the Azure service and include this activity in your workflows. Sample code for a custom Azure-aware workflow activity is provided in [Sample: Azure aware custom workflow activity](org-service/samples/azure-aware-custom-workflow-activity.md).

 > [!NOTE]
 > Any service endpoint registered for a synchronous step will send the execution context data to the Azure service immediately. If an error occurs after the request was sent, the data operation will roll back, but the request sent to the Azure service cannot be recalled.

-
-
 ### See also

 [Writing a Plug-in](write-plug-in.md)<br/>
-[Event execution pipeline](event-framework.md#event-execution-pipeline)<br/>
+[Event execution pipeline](event-framework.md#event-execution-pipeline)<br/>
 [ServiceEndPoint Entity](reference/entities/serviceendpoint.md)<br/>

-
 [!INCLUDE[footer-include](../../includes/footer-banner.md)]
```

powerapps-docs/developer/data-platform/write-custom-azure-aware-plugin.md

Lines changed: 10 additions & 10 deletions
```diff
@@ -1,7 +1,7 @@
 ---
 title: "Write a custom Azure-aware plug-in (Microsoft Dataverse) | Microsoft Docs"
 description: "Learn how to write plug-in code that can post a message or the execution context of the current database transaction to the Azure Service Bus."
-ms.date: 06/19/2023
+ms.date: 07/19/2024
 author: divkamath
 ms.author: dikamath
 ms.reviewer: pehecke
```
```diff
@@ -21,17 +21,17 @@ Writing a plug-in that works with Azure is similar to writing any other Datavers

 ## Plug-in design considerations

-For a plug-in that executes synchronously, the recommended design is for the plug-in to send a message to Azure for the purpose of retrieving information from a listener application or other external service. Use of a two-way or REST contract on the Azure Service Bus endpoint allows a data string to be returned to the plug-in.
+For a plug-in that executes synchronously, the recommended design is for the plug-in to send a message to Azure to retrieve information from a listener application or other external service. Use of a two-way or REST contract on the Azure Service Bus endpoint allows a data string to be returned to the plug-in.

-It is not recommended that a synchronous plug-in use the Azure Service Bus to update data with an external service. Problems can arise if the external service becomes unavailable or if there is a lot of data to update. Synchronous plug-ins should execute fast and not hold up all logged in users of an organization while a lengthy operation is performed. In addition, if a rollback of the current core operation that invoked the plug-in occurs, any data changes made by the plug-in are undone. This could leave Dynamics 365 and an external service in an un-synchronized state.
+It isn't recommended that a synchronous plug-in use the Azure Service Bus to update data with an external service. Problems can arise if the external service becomes unavailable or if there's a large amount of data to update. Synchronous plug-ins should execute fast and not hold up all logged-in users of an organization while a lengthy operation is performed. In addition, if a rollback of the current core operation that invoked the plug-in occurs, any data changes made by the plug-in are undone. This rollback could leave Dataverse and an external service in an unsynchronized state.

-Note that it is possible for synchronous registered plug-ins to post the current transaction's execution context to the Azure Service Bus.
+It is possible for synchronously registered plug-ins to post the current transaction's execution context to the Azure Service Bus.

 <a name="bkmk_writing"></a>

 ## Write the plug-in code

-In the following sample plug-in code has been added to obtain the Azure service provider and initiate posting the execution context to the Service Bus by calling <xref:Microsoft.Xrm.Sdk.IServiceEndpointNotificationService.Execute(Microsoft.Xrm.Sdk.EntityReference,Microsoft.Xrm.Sdk.IExecutionContext)>. Tracing code has been added to facilitate debugging of the plug-in because the plug-in must run in the sandbox.
+In the following sample plug-in, code has been added to obtain the Azure service provider and initiate posting the execution context to the Service Bus by calling <xref:Microsoft.Xrm.Sdk.IServiceEndpointNotificationService.Execute(Microsoft.Xrm.Sdk.EntityReference,Microsoft.Xrm.Sdk.IExecutionContext)>. Tracing code has been added to facilitate debugging of the plug-in because the plug-in must run in the sandbox.

 > [!NOTE]
 > The `serviceEndpointId` passed into the constructor in this code is the one you get from creating a service endpoint as described in [Walkthrough: Configure Azure (SAS) for integration with Dataverse](walkthrough-configure-azure-sas-integration.md).
```
```diff
@@ -115,12 +115,12 @@ In your plug-in code, you can update the writeable data in the context before in

 ## Plug-in registration

-There are a few restrictions when you register a Azure-aware custom plug-in. The plug-in must be registered to execute in the sandbox. Because of this, the plug-in is limited to calling <xref:Microsoft.Xrm.Sdk.IOrganizationService> methods, Azure solution methods, or accessing a network using a web client. No other external access, such as access to a local file system, is allowed.
+There are a few restrictions when you register an Azure-aware custom plug-in. The plug-in must be registered to execute in the sandbox. Sandbox registration limits the plug-in to calling <xref:Microsoft.Xrm.Sdk.IOrganizationService> methods, Azure solution methods, or accessing a network using a web client. No other external access, such as access to a local file system, is allowed.

-For a plug-in registered to execute in asynchronous mode, this also means that the order of plug-in execution compared to other asynchronous plug-ins is not guaranteed. In addition, asynchronous plug-ins always execute after the Dynamics 365 core operation.
+For a plug-in registered to execute in asynchronous mode, the order of plug-in execution compared to other asynchronous plug-ins isn't guaranteed. In addition, asynchronous plug-ins always execute after the Dataverse core operation.

 <a name="bkmk_failure"></a>

 ## Handle a failed Service Bus post

 The expected behavior from a failed Service Bus post depends on whether the plug-in was registered for synchronous or asynchronous execution. For asynchronous plug-ins, the system job that actually posts the execution context to the service bus will retry the post. For a synchronously registered plug-in, an exception is returned. More information: [Management and Notification of Run-time Errors](azure-integration.md)
```
```diff
@@ -132,8 +132,8 @@ For a plug-in registered to execute asynchronously, the <xref:Microsoft.Xrm.Sdk.

 ### See also

-[Azure extensions for Dynamics 365](azure-integration.md)
-[Send Dynamics 365 data over the Microsoft Azure Service Bus](work-data-azure-solution.md)
+[Azure integration](azure-integration.md)
+[Work with Microsoft Dataverse data in your Azure solution](work-data-azure-solution.md)
 [Sample: Azure aware custom plug-in](org-service/samples/azure-aware-custom-plugin.md)
 [Write a plug-in](write-plug-in.md)
 [Event execution pipeline](event-framework.md)
```
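The sample plug-in this file describes, obtaining the service provider and posting the execution context with tracing, follows this general shape. This is a sketch written against the SDK interfaces named in the diff, not the verbatim sample from the article; the two-argument constructor and the use of the unsecure configuration string to carry `serviceEndpointId` are assumptions.

```csharp
using System;
using Microsoft.Xrm.Sdk;

public class AzureAwarePlugin : IPlugin
{
    private readonly Guid serviceEndpointId;

    public AzureAwarePlugin(string unsecureConfig, string secureConfig)
    {
        // Assumption: the service endpoint ID is supplied at registration time
        // via the unsecure configuration string.
        if (!Guid.TryParse(unsecureConfig, out serviceEndpointId))
            throw new InvalidPluginExecutionException(
                "The unsecure configuration must contain the service endpoint ID.");
    }

    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)
            serviceProvider.GetService(typeof(ITracingService));
        var notificationService = (IServiceEndpointNotificationService)
            serviceProvider.GetService(typeof(IServiceEndpointNotificationService));

        // Post the current execution context to the registered service endpoint.
        tracing.Trace("Posting the execution context to the service endpoint.");
        string response = notificationService.Execute(
            new EntityReference("serviceendpoint", serviceEndpointId), context);

        // A two-way or REST contract can return a data string to the plug-in.
        if (!string.IsNullOrEmpty(response))
            tracing.Trace("Service endpoint response: {0}", response);
    }
}
```

Because the plug-in must run in the sandbox, the `ITracingService` calls are the main debugging aid available at run time.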

powerapps-docs/maker/data-platform/azure-synapse-incremental-updates.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -2,7 +2,7 @@
 title: "Query and analyze the incremental updates | MicrosoftDocs"
 description: "Learn how to query and analyze the incremental updates made to Microsoft Dataverse data during a user-specified time interval with Power Apps and Azure Synapse Analytics"
 ms.custom: ""
-ms.date: 02/07/2023
+ms.date: 07/29/2024
 ms.reviewer: "matp"
 ms.suite: ""
 ms.tgt_pltfrm: ""
```
```diff
@@ -51,7 +51,7 @@ Azure Synapse Link for Dataverse. This guide assumes that you have already met t
 :::image type="content" source="media/azure-synapse-add-tables-settings.png" alt-text="Add tables settings":::

 > [!NOTE]
-> The minimum time interval is 15 minutes. That means the incremental update folder will be created every 15 minutes and contain the changes that occurred within the time interval. This setting is also configurable after the link creation via **Manage tables**
+> The minimum time interval is five minutes. That means the incremental update folder is created every five minutes and contains the changes that occurred within the time interval. This setting is also configurable after link creation via **Manage tables**.
 >
 > Ensure **Connect to your Azure Synapse workspace** is not checked on the first page of setup.
```

powerapps-docs/maker/data-platform/azure-synapse-link-delta-lake.md

Lines changed: 5 additions & 2 deletions
```diff
@@ -6,7 +6,7 @@ ms.author: jasonhuang
 ms.reviewer: matp
 ms.service: powerapps
 ms.topic: how-to
-ms.date: 06/18/2024
+ms.date: 07/29/2024
 ms.custom: template-how-to
 ---
 # Export Dataverse data in Delta Lake format
```
1212
# Export Dataverse data in Delta Lake format
```diff
@@ -59,7 +59,7 @@ When setting up an Azure Synapse Link for Dataverse, you can enable the **export
 - Dataverse: You must have the Dataverse **system administrator** security role. Additionally, tables you want to export via Azure Synapse Link must have the **Track changes** property enabled. More information: [Advanced options](create-edit-entities-portal.md#advanced-options)
 - Azure Data Lake Storage Gen2: You must have an Azure Data Lake Storage Gen2 account and **Owner** and **Storage Blob Data Contributor** role access. Your storage account must enable **Hierarchical namespace** and **public network access** for both initial setup and delta sync. **Allow storage account key access** is required only for the initial setup.
 - Synapse workspace: You must have a Synapse workspace, the **Owner** role in access control (IAM), and the **Synapse Administrator** role access within Synapse Studio. The Synapse workspace must be in the same region as your Azure Data Lake Storage Gen2 account. The storage account must be added as a linked service within Synapse Studio. To create a Synapse workspace, go to [Creating a Synapse workspace](/azure/synapse-analytics/get-started-create-workspace).
-- A Spark Pool in the connected Azure Synapse workspace with **Apache Spark Version 3.3** using this [recommended Spark Pool configuration](#recommended-spark-pool-configuration). For information about how to create a Spark Pool, go to [Create new Apache Spark pool](/azure/synapse-analytics/quickstart-create-apache-spark-pool-portal#create-new-apache-spark-pool).
+- An Apache Spark pool in the connected Azure Synapse workspace with **Apache Spark Version 3.3** using this [recommended Spark pool configuration](#recommended-spark-pool-configuration). For information about how to create a Spark pool, go to [Create new Apache Spark pool](/azure/synapse-analytics/quickstart-create-apache-spark-pool-portal#create-new-apache-spark-pool).
 - The Microsoft Dynamics 365 minimum version requirement to use this feature is 9.2.22082. More information: [Opt in to early access updates](/power-platform/admin/opt-in-early-access-updates#how-to-enableearly-access-updates)

 ### Recommended Spark Pool configuration
```
```diff
@@ -75,6 +75,9 @@ This configuration can be considered a bootstrap step for average use cases.
 - Dynamically allocate executors: Enabled
 - Default number of executors: 1 to 9

+> [!IMPORTANT]
+> Use the Spark pool exclusively for Delta Lake conversion operations with Azure Synapse Link for Dataverse. For optimal reliability and performance, avoid running other Spark jobs with the same Spark pool.
+
 ## Connect Dataverse to Synapse workspace and export data in Delta Lake format

 1. Sign in to [Power Apps](https://make.powerapps.com/?utm_source=padocs&utm_medium=linkinadoc&utm_campaign=referralsfromdoc) and select the environment you want.
```
