A few years ago, I did some performance testing on my blog. I run this blog on a single hobby dyno on the Heroku Platform, and I have noticed some performance changes over the past few months. Folks have even been tweeting about performance increases on Heroku, so I wanted to revisit my non-scientific tests.
The blog runs on a single Heroku Hobby dyno with 512 MB of RAM. It runs nginx, and all of the files are static, generated by Jekyll.
## Testing
I ran ApacheBench (ab) against it with a concurrency of 20, and it handled the load very well.
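For reference, the invocation was roughly this — I'm reconstructing the flags from the report below (10,000 total requests, 20 concurrent, over TLS):

```shell
# My best guess at the original invocation: 10,000 requests, 20 at a time,
# against the site root over HTTPS.
ab -n 10000 -c 20 https://greg.nokes.name/
```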
Server Software: nginx
Server Hostname: greg.nokes.name
Server Port: 443
SSL/TLS Protocol: TLSv1.2,ECDHE-RSA-AES128-GCM-SHA256,2048,128
Server Temp Key: ECDH X25519 253 bits
TLS Server Name: greg.nokes.name
Document Path: /
Document Length: 82735 bytes
Concurrency Level: 20
Time taken for tests: 322.931 seconds
Complete requests: 10000
Failed requests: 6
(Connect: 3, Receive: 0, Length: 3, Exceptions: 0)
Total transferred: 829739899 bytes
HTML transferred: 827279899 bytes
Requests per second: 30.97 [#/sec] (mean)
Time per request: 645.862 [ms] (mean)
Time per request: 32.293 [ms] (mean, across all concurrent requests)
Transfer rate: 2509.18 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 288 32.0 285 1269
Processing: 257 356 199.5 280 9456
Waiting: 82 95 11.2 94 394
Total: 528 644 200.3 567 9456
Percentage of the requests served within a certain time (ms)
50% 567
66% 578
75% 814
80% 823
90% 835
95% 845
98% 862
99% 936
100% 9456 (longest request)
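ab's summary figures are all derived from the raw totals, which makes them easy to sanity-check. A quick sketch of the arithmetic for this run, with numbers copied from the report above:

```shell
# Requests per second = completed requests / total wall time
awk 'BEGIN { printf "%.2f req/s\n", 10000 / 322.931 }'           # 30.97

# Mean time per request = concurrency * total time / requests
awk 'BEGIN { printf "%.3f ms\n", 20 * 322.931 / 10000 * 1000 }'  # 645.862

# Transfer rate = total bytes / total time, in KB/s
awk 'BEGIN { printf "%.2f KB/s\n", 829739899 / 322.931 / 1024 }' # 2509.18
```

All three match the report, so the summary is internally consistent.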
I then decided to turn it up to 11 and ran with a concurrency of 100. Not a bad showing: 132 requests per second is really good for a single, small container.
Server Software: nginx
Server Hostname: greg.nokes.name
Server Port: 443
SSL/TLS Protocol: TLSv1.2,ECDHE-RSA-AES128-GCM-SHA256,2048,128
Server Temp Key: ECDH X25519 253 bits
TLS Server Name: greg.nokes.name
Document Path: /
Document Length: 82735 bytes
Concurrency Level: 100
Time taken for tests: 75.743 seconds
Complete requests: 10000
Failed requests: 4
(Connect: 2, Receive: 0, Length: 2, Exceptions: 0)
Total transferred: 829780642 bytes
HTML transferred: 827320642 bytes
Requests per second: 132.02 [#/sec] (mean)
Time per request: 757.432 [ms] (mean)
Time per request: 7.574 [ms] (mean, across all concurrent requests)
Transfer rate: 10698.42 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 345 36.4 343 1523
Processing: 262 401 200.2 318 10199
Waiting: 84 97 11.8 95 450
Total: 554 746 199.1 657 10199
Percentage of the requests served within a certain time (ms)
50% 657
66% 713
75% 928
80% 943
90% 963
95% 980
98% 998
99% 1021
100% 10199 (longest request)
## Results

Comparing the percentiles from this run against the same test from my earlier post:
Percentile | Old (ms) | New (ms) | Improvement
---|---|---|---
50% | 5290 | 657 | 88%
66% | 5383 | 713 | 87%
75% | 5446 | 928 | 83%
80% | 5571 | 943 | 83%
90% | 6174 | 963 | 84%
95% | 6718 | 980 | 85%
98% | 9109 | 998 | 89%
99% | 9852 | 1021 | 90%
Slowest | 13795 | 10199 | 26%
Mean | 7482 | 1934 | 74%
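The improvement column is just the relative drop at each percentile, rounded to the nearest percent. Spot-checking two rows:

```shell
# Improvement = (old - new) / old, as a percentage
awk 'BEGIN { printf "%.0f%%\n", (5290 - 657) / 5290 * 100 }'   # 88% (50th percentile)
awk 'BEGIN { printf "%.0f%%\n", (7482 - 1934) / 7482 * 100 }'  # 74% (mean)
```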
So it appears that I am seeing a big improvement in performance, with no real changes on my side.
I’ll attribute it to Faster Dynos for All.