Got into a conversation today with my brother (and tech-blog editor extraordinaire) Justin Smith about the performance of async vs. non-async handlers/pages. Thought it'd be interesting to test whether you only reap the benefits of async handlers/pages when you're passing work off to a web service or database. So, we're going to test the throughput and response time of async vs. synchronous Fibonacci calculations.
Note that all tests were run against my localhost with debug="false", which is very important: with debugging enabled, the results are completely different. Here are the tests and results:
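The result tables below are in ApacheBench's output format; the exact commands aren't shown, but an invocation along these lines (the handler URL is a placeholder) produces that kind of report:

ab -n 2000 -c 50 http://localhost/FibonacciAsync.ashx

The debug flag lives in web.config; only the debug attribute matters for these tests:

<system.web>
  <compilation debug="false" />
</system.web>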
IHttpAsyncHandler Example and Performance
The same TestMethods class was used in both tests.
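A minimal sketch of what TestMethods could look like, assuming a naive iterative Fibonacci used purely as CPU-bound busy work (the method name, signature, and the input value of 30 used below are assumptions, not the original code):

using System;

public static class TestMethods
{
    // Naive iterative Fibonacci: pure CPU work, no IO.
    public static long Fibonacci(int n)
    {
        long prev = 0, current = 1;
        for (int i = 0; i < n; i++)
        {
            long next = prev + current;
            prev = current;
            current = next;
        }
        return prev;
    }
}

And a sketch of the async handler under test, assuming the classic IHttpAsyncHandler pattern of pushing the work onto a delegate with BeginInvoke (class name and hard-coded input are likewise assumptions):

using System;
using System.Web;

public class FibonacciAsyncHandler : IHttpAsyncHandler
{
    private Func<int, long> _work;

    public bool IsReusable { get { return false; } }

    public IAsyncResult BeginProcessRequest(HttpContext context, AsyncCallback cb, object extraData)
    {
        // Hand the CPU-bound work to a ThreadPool thread via the delegate's BeginInvoke.
        _work = TestMethods.Fibonacci;
        return _work.BeginInvoke(30, cb, context);
    }

    public void EndProcessRequest(IAsyncResult result)
    {
        long value = _work.EndInvoke(result);
        HttpContext context = (HttpContext)result.AsyncState;
        context.Response.Write(value.ToString());
    }

    // Synchronous entry point required by IHttpHandler; not used on the async path.
    public void ProcessRequest(HttpContext context)
    {
        context.Response.Write(TestMethods.Fibonacci(30).ToString());
    }
}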
Performance results for 2000 requests at a concurrency level of 50:
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.3      0       1
Processing:     6   31   6.4     30      60
Waiting:        5   31   6.3     30      60
Total:          6   31   6.4     31      60

Percentage of the requests served within a certain time (ms)
  50%     31
  66%     32
  75%     35
  80%     36
  90%     39
  95%     42
  98%     47
  99%     48
 100%     60 (longest request)
Performance results for 5000 requests at a concurrency level of 50:
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.3      0       2
Processing:     5   29   5.7     30      81
Waiting:        5   29   5.7     29      79
Total:          5   30   5.7     30      81

Percentage of the requests served within a certain time (ms)
  50%     30
  66%     31
  75%     33
  80%     33
  90%     37
  95%     39
  98%     42
  99%     44
 100%     81 (longest request)
Performance results for 5000 requests at a concurrency level of 100:
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.3      0       4
Processing:    12   61  11.1     62      88
Waiting:       12   61  11.1     62      87
Total:         12   62  11.1     62      88

Percentage of the requests served within a certain time (ms)
  50%     62
  66%     66
  75%     68
  80%     69
  90%     75
  95%     80
  98%     83
  99%     85
 100%     88 (longest request)
IHttpHandler Example and Performance
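For comparison, a minimal sketch of the synchronous counterpart, assuming it simply runs the same Fibonacci call inline on the request thread (class name and input value are assumptions):

using System.Web;

public class FibonacciHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Same CPU-bound work, done directly on the request thread.
        context.Response.Write(TestMethods.Fibonacci(30).ToString());
    }
}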
Performance results for 2000 requests at a concurrency level of 50:
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.3      0       2
Processing:     6   23   4.6     23      55
Waiting:        6   22   4.6     22      55
Total:          6   23   4.6     23      55

Percentage of the requests served within a certain time (ms)
  50%     23
  66%     24
  75%     25
  80%     26
  90%     28
  95%     30
  98%     34
  99%     37
 100%     55 (longest request)
Performance results for 5000 requests at a concurrency level of 50:
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.3      0       4
Processing:     8   23   4.3     22      52
Waiting:        7   22   4.2     22      52
Total:          8   23   4.3     22      52

Percentage of the requests served within a certain time (ms)
  50%     22
  66%     24
  75%     24
  80%     25
  90%     27
  95%     30
  98%     36
  99%     40
 100%     52 (longest request)
Performance results for 5000 requests at a concurrency level of 100:
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.3      0       5
Processing:     7   47   6.9     45     138
Waiting:        7   46   6.8     45     114
Total:          7   47   6.9     45     138

Percentage of the requests served within a certain time (ms)
  50%     45
  66%     48
  75%     50
  80%     51
  90%     55
  95%     59
  98%     66
  99%     68
 100%    138 (longest request)
Analysis
So for CPU-bound work, synchronous seems like the way to go; there are no real gains from async. The next tests will be for IO-bound tasks (web/db calls).