powerapps-docs/developer/data-platform/analyze-performance.md (13 additions, 3 deletions)
@@ -1,7 +1,7 @@
---
title: "Analyze plug-in performance (Microsoft Dataverse) | Microsoft Docs"
description: "Learn how to find and analyze performance data on plug-ins execution."
-ms.date: 02/23/2023
+ms.date: 02/24/2023
ms.reviewer: "pehecke"
ms.topic: "article"
author: "divkamath"
@@ -22,10 +22,19 @@ When you add business logic to your plug-in you should be aware of the impact yo
## Time and resource constraints

-There is a **2-minute time limit** for a Dataverse message operation to complete. This limit includes executing all registered synchronous plug-ins. There are also limitations on the amount of CPU and memory resources that can be used by extensions. If the limits are exceeded an exception is thrown and the operation will be cancelled.
+There is a hard **2-minute time limit** for a Dataverse message operation to complete. This limit includes executing the intended message operation and all registered synchronous plug-ins. There are also limitations on the amount of CPU and memory resources that extensions can use. If the limits are exceeded, an exception is thrown and the entire message operation is cancelled (rolled back).

If the time limit is exceeded, an <xref:System.TimeoutException> will be thrown. If any custom extension exceeds threshold CPU, memory, or handle limits or is otherwise unresponsive, that process will be killed by the platform. At that point any current extension in that process will fail with exceptions. However, the next time that the extension is executed it will run normally.

+> [!IMPORTANT]
+> You cannot control how long the message operation or other synchronous registered plug-ins take to execute. You can only control how long your own plug-in takes to execute, based on its design and coding.
+>
+> Our general recommendation is to limit the time your plug-in takes to execute to no more than 2 seconds.
+>
+> If your plug-in requires more time to execute, consider registering it for asynchronous rather than synchronous execution. In fact, asynchronous execution should always be considered first when possible because it results in better application responsiveness and system scalability.
+
+More information: [Best practices and guidance regarding plug-in and workflow development](best-practices/business-logic/index.md)
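To make the 2-second guidance actionable, here is a minimal sketch (not taken from the updated article) of a plug-in that times its own work and writes a warning through the `ILogger` telemetry service referenced in the See also links; the 2-second threshold check and the message text are illustrative assumptions.

```csharp
using System;
using System.Diagnostics;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.PluginTelemetry;

public class TimedPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        // Application Insights logger provided by the Dataverse plug-in runtime.
        var logger = (ILogger)serviceProvider.GetService(typeof(ILogger));
        var stopwatch = Stopwatch.StartNew();

        try
        {
            // ... your business logic here ...
        }
        finally
        {
            stopwatch.Stop();

            // Illustrative check against the 2-second budget recommended above.
            if (stopwatch.ElapsedMilliseconds > 2000)
            {
                logger.LogWarning(
                    $"Plug-in took {stopwatch.ElapsedMilliseconds} ms; consider asynchronous registration.");
            }
        }
    }
}
```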
## Monitor performance

Run-time information about plug-ins and custom workflow extensions is captured and stored in the [PluginTypeStatistic Table](reference/entities/plugintypestatistic.md). These records are populated within 30 minutes to one hour after the custom code executes. This table provides the following data points:
@@ -58,8 +67,9 @@ To access the dashboard, navigate to [Power Platform Admin Center](https://admin
## See also

[Use plug-ins to extend business processes](plug-ins.md)
+[Write Telemetry to your Application Insights resource using ILogger](application-insights-ilogger.md)
[Tutorial: Debug a plug-in](tutorial-debug-plug-in.md)
powerapps-docs/developer/data-platform/authenticate-dot-net-framework.md (2 additions, 2 deletions)
@@ -55,7 +55,7 @@ If you prefer to not have a dependency on any SDK assemblies, you can also use t
The SDK APIs available in [Microsoft.CrmSdk.XrmTooling.CoreAssembly](https://www.nuget.org/packages/Microsoft.CrmSdk.XrmTooling.CoreAssembly/) and other "crmsdk" owned NuGet packages do not support .NET Core code development.

-For .NET Core application development there is a `DataverseServiceClient` class, currently in preview release, that is patterned after the `CrmServiceClient` class mentioned previously. You can download the [Microsoft.PowerPlatform.Dataverse.Client](https://www.nuget.org/packages/Microsoft.PowerPlatform.Dataverse.Client/) package from Nuget.org to begin using this new service client class in your applications. Documentation and sample code for the `DataverseServiceClient` and related classes will be made available in a future documentation release.
+For .NET Core application development there is a `DataverseServiceClient` class that is patterned after the `CrmServiceClient` class mentioned previously. You can download the [Microsoft.PowerPlatform.Dataverse.Client](https://www.nuget.org/packages/Microsoft.PowerPlatform.Dataverse.Client/) package from NuGet.org to begin using this service client class in your applications. Documentation and sample code for the `DataverseServiceClient` and related classes will be made available in a future documentation release.

To update existing .NET Framework based application code that uses `CrmServiceClient`, begin by substituting the `DataverseServiceClient` class for `CrmServiceClient` in your code. You will need to set the project type to build a .NET Core application, remove any .NET Framework specific references and NuGet packages, and then add the Microsoft.PowerPlatform.Dataverse.Client package to the project.
@@ -65,4 +65,4 @@ To update existing .NET Framework based application code that uses `CrmServiceCl
powerapps-docs/developer/data-platform/dataverse-sql-query.md (30 additions, 7 deletions)
@@ -2,7 +2,7 @@
title: "Use SQL to query data (Microsoft Dataverse) | Microsoft Docs"# Intent and product brand in a unique string of 43-59 chars including spaces
description: "Learn how to query Microsoft Dataverse table data using SQL."# 115-145 characters including spaces. This abstract displays in the search result.
ms.custom: ""
-ms.date: 09/06/2022
+ms.date: 03/02/2023
ms.reviewer: "pehecke"

ms.topic: "article"
@@ -19,13 +19,13 @@ search.app:
# Use SQL to query data

-[This topic is pre-release documentation and is subject to change. Note that only the SQL data connection is in preview. Power BI is General Availability (GA)]
+[This topic is pre-release documentation and is subject to change. Note that only the SQL data connection through SQL Server Management Studio and .NET libraries is in preview. Power BI is General Availability (GA)]

-The Microsoft Dataverse business layer provides a Tabular Data Stream (TDS) endpoint that emulates a SQL data connection. The SQL connection provides read-only access to the table data of the target Dataverse environment thereby allowing you to execute SQL queries against the Dataverse data tables. No custom views of the data have been provided. The Dataverse endpoint SQL connection uses the Dataverse security model for data access. Data can be obtained for all Dataverse tables to which a user has access to.
+The Microsoft Dataverse business layer provides a Tabular Data Stream (TDS) endpoint that emulates a SQL data connection. The SQL connection provides read-only access to the table data of the target Dataverse environment, thereby allowing you to execute SQL queries against the Dataverse data tables. No custom views of the data have been provided. The Dataverse endpoint SQL connection uses the Dataverse security model for data access. Data can be obtained for all Dataverse tables to which a user has access.

## Prerequisites

-The **Enable TDS endpoint** setting must be enabled in your environment. More information: [Manage feature settings](/power-platform/admin/settings-features)
+The **Enable TDS endpoint** setting must be enabled in your environment. It is enabled by default. More information: [Manage feature settings](/power-platform/admin/settings-features)

## Applications support
@@ -34,10 +34,10 @@ TDS (SQL) endpoint applications support for Power BI and SQL Server Management S
### SQL Server Management Studio (Preview)

> [!NOTE]
-> A compatibility issue has been found with the SQL Server Management Studio 18.9.2 build. A fix is being investigated. Until the fix is available please use build [18.9.1 of SQL Server Management Studio](/sql/ssms/release-notes-ssms?view=sql-server-ver15#1891).
+> A compatibility issue has been found with the SQL Server Management Studio 19.0.1 build. A fix is being investigated. Until the fix is available please use build [18.12.1 of SQL Server Management Studio](/sql/ssms/release-notes-ssms?view=sql-server-ver15#1891).
> This note will be updated once a fix is available.

-You can also use [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms) (SSMS) version 18.4 or later with the Dataverse endpoint SQL connection. Examples of using SSMS with the SQL data connection are provided below.
+You can also use [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms) (SSMS) version 18.12.1 or later with the Dataverse endpoint SQL connection. Examples of using SSMS with the SQL data connection are provided below.
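Because the preview also covers .NET libraries, the same read-only connection can be opened from code. Below is a minimal sketch using Microsoft.Data.SqlClient; the environment URL, database name, port 5558, and the interactive Azure AD authentication mode are assumptions to adjust for your environment.

```csharp
using System;
using Microsoft.Data.SqlClient;

class TdsQuerySample
{
    static void Main()
    {
        // Assumed connection details; the TDS endpoint is read-only,
        // so only SELECT statements will succeed.
        string connectionString =
            "Server=yourorg.crm.dynamics.com,5558;" +
            "Authentication=Active Directory Interactive;" +
            "Database=yourorg;Encrypt=True;";

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        using var command = new SqlCommand(
            "SELECT TOP 10 name, revenue FROM account ORDER BY revenue DESC",
            connection);

        using SqlDataReader reader = command.ExecuteReader();
        while (reader.Read())
        {
            Console.WriteLine($"{reader["name"]}: {reader["revenue"]}");
        }
    }
}
```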
@@ -101,10 +101,32 @@ Dataverse choice columns are represented as \<choice\>Name and \<choice\>Label i
>[!TIP]
> After making changes to labels for a choice column, the table needs to have customizations published.

+> [!NOTE]
+> Including a large number of choice labels in your query will have a significant impact on performance. It is best to use fewer than 10 labels if possible. Because choice labels are localized, the localized string is more expensive to return.
+
### Reported SQL version
The Dataverse TDS endpoint emulates Microsoft SQL Server read-only query capabilities over the Dataverse business logic. Dataverse returns the current SQL Azure version 12.0.2000.8 for `select @@version`.

+## Performance guidance
+
+When retrieving data through the TDS endpoint, there are a few key query patterns that you should use. These patterns, described below, help manage performance and the size of result sets.
+
+### Only necessary columns
+
+When building a query, return only the necessary columns. This helps both the execution of the query and the transfer of the results back to the client application. In general, keeping a query under 100 columns is recommended.
+
+### Choice columns
+
+Choice columns have been flattened into two columns, which helps usability. However, it is important to apply any aggregates and filters against the value portion of the choice column. The value portion can have indexes and is stored in the base table. The label portion (the 'choicecolumn' name) is stored separately, which costs more to retrieve and can't be indexed. Using a significant number of choice label columns may generate a very slow query.
+
+### Use Top X
+
+It is very important to use a TOP clause in your queries to prevent returning the whole table of data. For example, `Select Top 1000 accountid,name From account Where revenue > 50000` limits the results to the first 1000 accounts.
+
+### Do not use NOLOCK
+
+When building queries, do not use the table hint NOLOCK. It prevents Dataverse from optimizing queries.
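Putting those patterns together, here is an illustrative query that limits rows with TOP, selects only a handful of columns, and filters on the value portion of a choice column rather than its label. It assumes an open `SqlConnection` like the one sketched earlier, and the `statuscode = 1` filter is an assumption based on the standard account table.

```csharp
// Assumes 'connection' is an open SqlConnection to the TDS endpoint
// (see the earlier connection sketch).
string sql = @"
    SELECT TOP 1000 accountid, name, revenue, statuscode
    FROM account
    WHERE statuscode = 1   -- filter on the choice value, not the label column
      AND revenue > 50000";

using var command = new SqlCommand(sql, connection);
using SqlDataReader reader = command.ExecuteReader();
while (reader.Read())
{
    Console.WriteLine($"{reader["name"]} ({reader["revenue"]})");
}
```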
## Limitations

There is an 80-MB maximum size limit for query results returned from the Dataverse endpoint. Consider using data integration tools such as [Azure Synapse Link for Dataverse](../../maker/data-platform/export-to-data-lake.md) and [dataflows](/power-bi/transform-model/dataflows/dataflows-introduction-self-service) for large data queries that return over 80 MB of data. More information: [Importing and exporting data](../../maker/data-platform/import-export-data.md)
@@ -188,8 +210,9 @@ This means the port has been blocked at the client.
### See also

+[How Dataverse SQL differs from Transact-SQL](./how-dataverse-sql-differs-from-transact-sql.md)
[Get started with virtual tables (entities)](./virtual-entities/get-started-ve.md)
-[Use FetchXML to construct a query](dataverse-sql-query.md)
+[Use FetchXML to construct a query](./use-fetchxml-construct-query.md)
powerapps-docs/developer/data-platform/org-service/execute-multiple-requests.md (5 additions, 5 deletions)
@@ -1,7 +1,7 @@
---
title: "Execute multiple requests using the Organization service (Microsoft Dataverse) | Microsoft Docs"# Intent and product brand in a unique string of 43-59 chars including spaces
description: "ExecuteMultipleRequest message supports higher throughput bulk message passing scenarios in Microsoft Dataverse."# 115-145 characters including spaces. This abstract displays in the search result.
-ms.date: 02/16/2023
+ms.date: 02/28/2023
ms.reviewer: pehecke
ms.topic: article
author: divkamath # GitHub ID
@@ -22,11 +22,11 @@ contributors:
The primary purpose of executing multiple requests is to improve performance in high-latency environments by reducing the total volume of data that is transmitted over the network.

-You can use the <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest> message to support higher throughput bulk message passing scenarios in Microsoft Dataverse. <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest> accepts an input collection of message <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest.Requests>, executes each of the message requests in the order they appear in the input collection, and optionally returns a collection of <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleResponse.Responses> containing each message’s response or the error that occurred. Each message request in the input collection is processed in a separate database transaction. <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest> is executed by using the <xref:Microsoft.Xrm.Sdk.IOrganizationService>.<xref:Microsoft.Xrm.Sdk.IOrganizationService.Execute%2a> method.
+You can use the <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest> message to support higher throughput bulk message passing scenarios in Microsoft Dataverse. <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest> accepts an input collection of message <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest.Requests>, executes each of the message requests in the order they appear in the input collection, and optionally returns a collection of <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleResponse.Responses> containing each message's response or the error that occurred. Each message request in the input collection is processed in a separate database transaction. <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest> is executed by using the <xref:Microsoft.Xrm.Sdk.IOrganizationService>.<xref:Microsoft.Xrm.Sdk.IOrganizationService.Execute%2a> method.

In general, <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest> behaves the same as if you executed each message request in the input request collection separately, except with better performance. Use of the <xref:Microsoft.Xrm.Sdk.Client.OrganizationServiceProxy.CallerId> parameter of the service proxy is honored and will apply to the execution of every message in the input request collection. Plug-ins and workflow activities are executed as you would expect for each message processed.

-Plug-ins and custom workflow activities are not blocked from using <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest>. However, this is not recommended. Any failures in the synchronous step must rollback all data operations to maintain data integrity. Each operation performed within `ExecuteMultiple`will not be rolled back. `ExecuteMultiple` also causes issues when the operations exceed the maximum plug-in timeout duration.
+Plug-ins and custom workflow activities are not blocked from using <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest>. However, this is not recommended. Any failures in the synchronous step must roll back all data operations to maintain data integrity. Each operation performed within `ExecuteMultiple` must be rolled back. `ExecuteMultiple` also causes issues when the operations exceed the maximum plug-in timeout duration.

More information: [Do not use batch request types in plug-ins and workflow activities](../best-practices/business-logic/avoid-batch-requests-plugin.md)
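A minimal sketch of the pattern described above: build the request collection, set the execution options, execute, and inspect each response item. The account creates are placeholder data, and `service` can be any `IOrganizationService` implementation.

```csharp
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

public static class ExecuteMultipleExample
{
    public static void CreateAccountsInBulk(IOrganizationService service)
    {
        var requestWithResults = new ExecuteMultipleRequest
        {
            // Settings apply to every request in the collection.
            Settings = new ExecuteMultipleSettings
            {
                ContinueOnError = false, // stop on the first error
                ReturnResponses = true   // return a response item per request
            },
            Requests = new OrganizationRequestCollection()
        };

        // Add a batch of create requests (placeholder data).
        for (int i = 0; i < 10; i++)
        {
            var account = new Entity("account") { ["name"] = $"Sample account {i}" };
            requestWithResults.Requests.Add(new CreateRequest { Target = account });
        }

        var responseWithResults =
            (ExecuteMultipleResponse)service.Execute(requestWithResults);

        // Each item pairs a request index with either a response or a fault.
        foreach (ExecuteMultipleResponseItem item in responseWithResults.Responses)
        {
            if (item.Fault != null)
            {
                Console.WriteLine(
                    $"Request {item.RequestIndex} failed: {item.Fault.Message}");
            }
        }
    }
}
```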
@@ -85,7 +85,7 @@ More information: [Sample: Execute multiple requests](samples/execute-multiple-
## Specify run-time execution options

-The <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest.Settings> parameter of <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest> applies to all of the requests in the request collection controlling execution behavior and results returned. Let’s take a look at these options in more detail.
+The <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest.Settings> parameter of <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest> applies to all of the requests in the request collection, controlling execution behavior and results returned. Let's take a look at these options in more detail.
@@ -120,7 +120,7 @@ There are several constraints related to the use of the <xref:Microsoft.Xrm.Sdk.
## Handle a batch size fault

-What should you do when your input request collection exceeds the maximum batch size? Your code can’t directly query the maximum batch size through the deployment web service unless it is run under an account that has the deployment administrator role.
+What should you do when your input request collection exceeds the maximum batch size? Your code can't directly query the maximum batch size through the deployment web service unless it is run under an account that has the deployment administrator role.

Fortunately, there is another method that you can use. When the number of requests in the input <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest.Requests> collection exceeds the maximum batch size allowed for an organization, a fault is returned from the <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest> call. The maximum batch size is returned in the fault. Your code can check for that value, resize the input request collection to be within the indicated limit, and re-submit the <xref:Microsoft.Xrm.Sdk.Messages.ExecuteMultipleRequest>. The following code snippet demonstrates some of this logic.
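The code snippet the paragraph refers to isn't shown in this excerpt; the sketch below outlines the recovery logic it describes. The `"MaxBatchSize"` key in the fault's `ErrorDetails` collection matches what the SDK sample uses, but treat the key name and the simple retry here as assumptions to verify against the full sample.

```csharp
using System;
using System.Linq;
using System.ServiceModel;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

public static class BatchSizeFaultExample
{
    public static ExecuteMultipleResponse ExecuteWithBatchResize(
        IOrganizationService service, ExecuteMultipleRequest request)
    {
        try
        {
            return (ExecuteMultipleResponse)service.Execute(request);
        }
        catch (FaultException<OrganizationServiceFault> ex)
            when (ex.Detail.ErrorDetails.ContainsKey("MaxBatchSize"))
        {
            // The fault reports the organization's maximum batch size.
            int maxBatchSize = Convert.ToInt32(ex.Detail.ErrorDetails["MaxBatchSize"]);

            // Trim the request collection to the allowed size and retry.
            // (A real implementation would submit the remainder in later batches.)
            var trimmed = new ExecuteMultipleRequest
            {
                Settings = request.Settings,
                Requests = new OrganizationRequestCollection()
            };
            foreach (OrganizationRequest item in request.Requests.Take(maxBatchSize))
            {
                trimmed.Requests.Add(item);
            }

            return (ExecuteMultipleResponse)service.Execute(trimmed);
        }
    }
}
```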
powerapps-docs/developer/data-platform/register-plug-in.md (2 additions, 2 deletions)
@@ -165,7 +165,7 @@ When you register a step, there are many options available to you, which depend
|**Event Handler**|This value will be populated based on the name of the assembly and the plug-in class.|
|**Step Name**|The name of the step. A value is pre-populated based on the configuration of the step, but this value can be overridden.|
|**Run in User's Context**|Provides options for applying impersonation for the step. The default value is **Calling User**. If the calling user doesn't have privileges to perform operations in the step, you may need to set this to a user who has these privileges. More information: [Impersonate a user](impersonate-a-user.md)|
-|**Execution Order**|Multiple steps can be registered for the same stage of the same message. The number in this field determines the order in which they'll be applied from lowest to highest. <br/> **Note**: You should set this to control the order in which plug-ins are applied in the stage. It not recommended to simply accept the default value. The actual execution order of the plugins with the same Execution Order value (for the same stage, table and message) isn't guaranteed and can be random.|
+|**Execution Order**|Multiple steps can be registered for the same stage of the same message. The number in this field determines the order in which they'll be applied from lowest to highest. <br/> **Note**: You should set this to control the order in which plug-ins are applied in the stage. It's not recommended to simply accept the default value. The actual execution order of the plug-ins with the same Execution Order value (for the same stage, table, and message) isn't guaranteed and can be random.|
|**Description**|A description for the step. This value is pre-populated but can be overwritten.|
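Because the actual order of steps that share an Execution Order value isn't guaranteed, it can be useful to review what's registered. Below is a hedged sketch that lists step registrations from the `sdkmessageprocessingstep` table, ordered by rank; the column logical names (`name`, `rank`, `stage`, `mode`) are assumptions drawn from the table reference and should be confirmed there.

```csharp
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class StepOrderExample
{
    public static void ListRegisteredSteps(IOrganizationService service)
    {
        // Read registered plug-in steps ordered by their Execution Order (rank).
        var query = new QueryExpression("sdkmessageprocessingstep")
        {
            ColumnSet = new ColumnSet("name", "rank", "stage", "mode")
        };
        query.AddOrder("rank", OrderType.Ascending);

        EntityCollection steps = service.RetrieveMultiple(query);
        foreach (Entity step in steps.Entities)
        {
            Console.WriteLine(
                $"{step.GetAttributeValue<string>("name")} " +
                $"rank: {step.GetAttributeValue<int>("rank")}");
        }
    }
}
```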
### Event Pipeline Stage of execution
@@ -335,4 +335,4 @@ You can also disable steps in the solution explorer using the **Activate** and *
[Tutorial: Update a plug-in](tutorial-update-plug-in.md)<br />