The HTTP/1.1 specification (RFC 2616) advised that a client should open no more than 2 concurrent connections to the same server. The .NET Framework applies this limit by default to desktop applications, while ASP.NET applications default to 10 concurrent connections; .NET Core's HttpClient imposes no such limit by default. All of these limits can be changed.
This limit made sense for browsers years ago, but it is too restrictive for service-oriented applications. Modern browsers open around 6-8 concurrent connections per host, and service/REST clients can handle far more.
On the .NET Framework, ServicePointManager.DefaultConnectionLimit can be used to change the limit for the entire application, e.g.:
ServicePointManager.DefaultConnectionLimit = 100;
You can also specify a limit per HttpClient instance by using an HttpClientHandler with its MaxConnectionsPerServer property set to the desired limit:
var handler = new HttpClientHandler
{
    MaxConnectionsPerServer = 100,
    AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
};
HttpClient client = new HttpClient(handler);
This way you can set a different limit for each target service.
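For example, a sketch of the per-service pattern (the service hostnames and the specific limits here are hypothetical, chosen only for illustration): since the handler, and therefore the connection limit, belongs to the client, you create one HttpClient per target service.

```csharp
using System;
using System.Net.Http;

// One HttpClient per target service, each with its own connection limit.
// Hostnames and limits are illustrative, not recommendations.
var bulkServiceClient = new HttpClient(
    new HttpClientHandler { MaxConnectionsPerServer = 100 })
{
    BaseAddress = new Uri("https://bulk.example.com")
};

var fragileServiceClient = new HttpClient(
    new HttpClientHandler { MaxConnectionsPerServer = 8 })
{
    BaseAddress = new Uri("https://fragile.example.com")
};

// Requests made through each client now share that client's pool:
// await bulkServiceClient.GetAsync("/items");
// await fragileServiceClient.GetAsync("/booking");
```

Keeping the clients long-lived (rather than creating one per request) also lets the connection pools do their job.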
Don't rush to set the limit to a huge number. The target services may not be able to handle even 20 or 40 concurrent requests from a single client. Badly written services may crash or flood their server, and concurrent requests may block one another, reducing actual throughput. Well-written services may impose a per-client rate limit or queue incoming requests.
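If you raise the connection limit, you can still cap how many requests are actually in flight at once on the client side. A minimal sketch using SemaphoreSlim as a throttle (the limit of 10 and the helper name are arbitrary choices for illustration):

```csharp
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Raise the connection limit, but throttle in-flight requests separately.
var client = new HttpClient(
    new HttpClientHandler { MaxConnectionsPerServer = 100 });

// At most 10 requests in flight at any moment, regardless of how many
// callers fire requests concurrently.
var throttle = new SemaphoreSlim(10);

async Task<string> GetThrottledAsync(string url)
{
    await throttle.WaitAsync();
    try
    {
        return await client.GetStringAsync(url);
    }
    finally
    {
        throttle.Release();
    }
}
```

This keeps bursts from your own code from overwhelming a fragile target service, even when the connection pool could physically open far more connections.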
You'd be surprised how badly some supposedly high-traffic services behave. I've encountered airline services that would crash if they received more than 10 concurrent requests within a minute. Badly configured load balancers kept directing traffic to the crashed instances for another minute or two until they restarted, making retries pointless.
He who fights with dragons for too long becomes a dragon himself; gaze too long into the abyss, and the abyss gazes back into you…