Where I work we make extensive use of both Azure Functions and console apps to manipulate data in Dataverse, and to integrate between Dataverse and other systems.
We have found that it’s very easy to run into Dataverse API service protection limits even with quite small datasets, so we’ve had to learn various techniques to keep our applications working.
This “project” will pull together an occasional series of posts documenting the different approaches we have tried, based on our own experience and a trawl through Microsoft example code.
Dataverse API Service Protection limits
These are enforced by Microsoft to ensure that no one consumer impacts the overall performance of the Dataverse platform for all consumers. At the time of writing the Dataverse service protection API limits are evaluated per user and per web server (see below), and are set at:
- A cumulative 6,000 requests in a 300-second sliding window
- A combined execution time of 1,200 seconds aggregated across requests in a 300-second sliding window
- A maximum of 52 concurrent requests per-user
These limits are enforced per web server; however, the number of web servers servicing a given Dataverse environment is opaque, so it is prudent to plan for a single server when considering limits.
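As a back-of-envelope check, the limits above work out to a sustained ceiling of roughly 20 requests per second and 4 seconds of execution time per second, per user, per web server. Here is a minimal sketch of that arithmetic (the function name and parameters are my own, for illustration only; it assumes requests are spread evenly over the elapsed time):

```python
# Dataverse service protection limits (per user, per web server),
# planning conservatively for a single web server.
WINDOW_SECONDS = 300
MAX_REQUESTS_PER_WINDOW = 6000
MAX_EXECUTION_SECONDS_PER_WINDOW = 1200

def fits_within_limits(total_requests: int,
                       avg_request_seconds: float,
                       elapsed_seconds: float) -> bool:
    """Rough check: does a workload spread evenly over elapsed_seconds
    stay under both the request-count and execution-time limits?"""
    if elapsed_seconds <= 0:
        return False
    # How much of the workload falls inside any one 300-second window.
    window = min(elapsed_seconds, WINDOW_SECONDS)
    requests_in_window = total_requests * (window / elapsed_seconds)
    execution_in_window = requests_in_window * avg_request_seconds
    return (requests_in_window <= MAX_REQUESTS_PER_WINDOW
            and execution_in_window <= MAX_EXECUTION_SECONDS_PER_WINDOW)

# 10,000 requests averaging 50 ms, spread over 10 minutes: within limits.
print(fits_within_limits(10_000, 0.05, 600))   # True
# The same 10,000 requests crammed into 2 minutes: exceeds 6,000 per 300 s.
print(fits_within_limits(10_000, 0.05, 120))   # False
```

The evenly-spread assumption is the important caveat: a bursty workload can blow through the sliding window even when its average rate looks safe.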
Impact of exceeding limits
Depending on which API you are using, the platform signals that limits have been exceeded in one of two ways. With the Dataverse SDK for .NET, an OrganizationServiceFault is thrown with one of three specific error codes; with the Web API, an HTTP 429 (Too Many Requests) response is returned carrying the equivalent hex code.
Each fault carries an error code (a numeric code from the SDK, or the equivalent hex code from the Web API) and one of these messages:
- Number of requests exceeded the limit of 6000 over time window of 300 seconds
- Combined execution time of incoming requests exceeded limit of 1,200,000 milliseconds over time window of 300 seconds. Decrease number of concurrent requests or reduce the duration of requests and try again later.
- Number of concurrent requests exceeded the limit of 52
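When the Web API throttles a request it returns HTTP 429 with a Retry-After header telling you how long to wait, and the SDK surfaces an equivalent delay on the fault. The sketch below illustrates the basic pattern of honouring that server-supplied delay; `send_request` is a hypothetical callable standing in for a real SDK or HTTP client call, so this shows the shape of the retry loop rather than any actual Dataverse client API:

```python
import time

def call_with_retry(send_request, max_attempts=5):
    """Invoke send_request(); on throttling (HTTP 429), wait for the
    server-supplied Retry-After delay, then try again.

    send_request is a hypothetical callable returning a
    (status_code, retry_after_seconds, body) tuple."""
    for _ in range(max_attempts):
        status, retry_after, body = send_request()
        if status != 429:
            return body
        # Honour the server's hint rather than guessing our own backoff.
        time.sleep(retry_after)
    raise RuntimeError(f"still throttled after {max_attempts} attempts")

# Usage with a stub that is throttled twice, then succeeds:
responses = iter([(429, 0.01, None), (429, 0.01, None), (200, 0, "ok")])
print(call_with_retry(lambda: next(responses)))  # ok
```

Respecting Retry-After matters because retrying immediately keeps the requests counting against the same sliding window, prolonging the throttled state.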
Areas I aim to cover
Our experiences include: simple console apps, typically used for data manipulation or bulk import; simple low-volume Azure Functions; and a couple of more complex Azure Functions apps that can scale out significantly and place a heavy parallel load on the Dataverse API.
In this series I aim to touch on all these scenarios and document the techniques we have found to work.
As I publish posts in this series they will be linked at the bottom of this post.
Example code can be found here.