I have a C# .NET 8 service. .NET 8 provides a built-in concurrency limiter to cap the number of concurrent requests made to the service, with an option to queue requests once that limit is exceeded. What I need to know is how to choose the ideal limits for my service.
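
For context, this is roughly the setup I mean, a minimal sketch using the ASP.NET Core rate-limiting middleware; the policy name "db-limit", the example endpoint, and the numbers are placeholders, not my real settings:

```csharp
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRateLimiter(options =>
{
    // Concurrency limiter: at most PermitLimit requests run at once,
    // up to QueueLimit additional requests wait in a queue.
    options.AddConcurrencyLimiter("db-limit", limiter =>
    {
        limiter.PermitLimit = 16;   // placeholder: max concurrent requests
        limiter.QueueLimit = 50;    // placeholder: requests held once the limit is hit
        limiter.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
    });
});

var app = builder.Build();
app.UseRateLimiter();

// Example endpoint protected by the policy.
app.MapGet("/items/{id:int}", (int id) => Results.Ok(id))
   .RequireRateLimiting("db-limit");

app.Run();
```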

So the first thing I did was load test the service (using JMeter, if it matters). It seems it can handle about 20,000 concurrent users with a ramp-up of 90 seconds. The service offers CRUD operations on a DB (let's divide them into two operations, set and get): the set operation takes about 245 ms to complete, while the get operation takes about 28 ms. About 80% of the requests are GET operations and 20% are SET.

Taking all the data above, I tried calculating it like this:

Requests per second = 20000 / 90 ≈ 222.22 req/sec
Average operation time = (80% * 28 ms) + (20% * 245 ms) = 71.4 ms
By Little's law, the expected concurrency is the product of the two:

L = 222.22 * 0.0714 ≈ 15.87, i.e. about 16 concurrent requests
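
The same arithmetic as a quick sanity check (just a throwaway snippet using the numbers from my load test, not part of the service):

```csharp
// Back-of-the-envelope check of the figures above.
double throughput = 20000.0 / 90.0;                   // λ ≈ 222.22 req/sec
double avgServiceTime = 0.80 * 0.028 + 0.20 * 0.245;  // W ≈ 0.0714 sec
double concurrency = throughput * avgServiceTime;     // Little's law: L = λ * W

Console.WriteLine($"λ = {throughput:F2} req/s, W = {avgServiceTime * 1000:F1} ms, L = {concurrency:F2}");
// λ = 222.22 req/s, W = 71.4 ms, L = 15.87
```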

Are my calculations correct? It seems odd that the service could only handle 16 concurrent requests at a time.
