Does the test server version affect the download speed?

TL;DR

The version of the test server software has no influence on the measured speed.

This test confirms that there are clear differences between the test servers used. These differences are significant, 13 times as large as the "normal" standard deviation, and cannot be explained by the version of the test server software used.

 

Contents

  1. Introduction
  2. Method of measurement
  3. Test servers to test
  4. The measurements
  5. Conclusions
 

Introduction

In this study we investigate whether there is a correlation between the version of software running on a test server and the speed measured on that same test server.

 

Method of measurement

We do this by comparing the results from Differences in Ookla test servers with the software version of the test server in question. We use Hi! to find out the version of each test server.

The results will be grouped by version number.
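In outline, the grouping looks like the sketch below. The host names and the exact location of the version string (here assumed to be an HTTP endpoint on port 8080 at /hi) are placeholders for this illustration; the reply format matches the strings listed under "The measurements".

```python
from collections import defaultdict
import urllib.request

# Placeholder host names; the study uses the servers from
# "Differences in Ookla test servers" that return a working Hi! response.
hosts = ["speedtest.example-one.nl", "speedtest.example-two.nl"]

def hi_version(host, port=8080):
    # Assumption for this sketch: the version string is served over HTTP on
    # port 8080 at /hi. The reply looks like
    # "hello 2.11 (2.11.0) 2023-11-29.2207.3251a05".
    with urllib.request.urlopen(f"http://{host}:{port}/hi", timeout=5) as resp:
        return resp.read().decode().strip()

# Group the measured download speeds (Mbps, from the earlier test) by version.
speeds = {"speedtest.example-one.nl": 100.49, "speedtest.example-two.nl": 101.91}
by_version = defaultdict(list)
for host in hosts:
    by_version[hi_version(host)].append(speeds[host])

for version, values in by_version.items():
    print(version, values)
```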

 

Test servers to test

All test servers used in Differences in Ookla test servers with a working Hi! response are included.

 

The measurements

  1. hello 2.11 (2.11.0) 2023-11-29.2207.3251a05
    1. Odido: 100.37 Mbps
    2. ServerRoom.net: 100.45 Mbps
    3. NFOrce Entertainment B.V.: 100.48 Mbps
    4. Usenet.Farm: 100.48 Mbps
    5. Clouvider Ltd: 100.49 Mbps
    6. HyperFilter DDoS Protection Solutions: 100.49 Mbps
    7. Melbicom: 100.49 Mbps
    8. WD6.net: 100.50 Mbps
    9. dstny: 100.51 Mbps
    10. Solcon Internetdiensten B.V.: 100.51 Mbps
    11. BlackHOST Ltd.: 100.52 Mbps
    12. Labixe Ltd: 100.52 Mbps
    13. Qonnected B.V.: 100.53 Mbps
    14. I4 Networks: 100.54 Mbps
    15. NewsXS B.V.: 100.57 Mbps
    16. XS News B.V.: 100.59 Mbps
    17. WARIAN: 100.65 Mbps
    18. Global Layer: 101.41 Mbps
    19. RETN: 101.71 Mbps
    20. Voiped Telecom: 101.71 Mbps
    21. VDSina: 101.85 Mbps
    22. Fiber NL B.V.: 101.89 Mbps
    23. PhoenixNAP Global IT Services: 101.89 Mbps
    24. Asimo Networks B.V.: 101.90 Mbps
    25. 31173 Services AB: 101.91 Mbps
    26. Nextpertise: 101.91 Mbps
    27. Kamatera, Inc: 101.94 Mbps
  2. hello 2.11 (2.11.1) 2024-02-13.1456.91c4f93
    1. TNGNET B.V.: 100.45 Mbps
    2. Eranium B.V.: 100.47 Mbps
    3. Sharktech Inc.: 100.48 Mbps
    4. CLOUD LEASE: 100.50 Mbps
    5. MIRHosting: 100.52 Mbps
    6. Hack The Box Ltd: 101.91 Mbps
    7. AVUR AS: 101.92 Mbps
 

Conclusions

  1. Semantic versioning is used (the version strings are parsed in the sketch after this list)
  2. The major version is 2 for all tested test servers
  3. The minor version is 11 for all tested test servers
  4. 27 of the 34 tested test servers (79%) have no patch installed; they use version 2.11.0
  5. 7 of the 34 tested test servers (21%) have patch 1 installed; they use version 2.11.1
  6. With version 2.11.0, measured speeds vary between 100.37 and 101.94 Mbps
  7. With version 2.11.1, measured speeds vary between 100.45 and 101.92 Mbps
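To make points 1 to 5 concrete, here is a small sketch that splits the reported version strings into semantic-versioning components. The string format is taken from the measurements above; the parsing itself is only an illustration.

```python
import re

# Version strings as reported by the test servers (see "The measurements").
reported = [
    "hello 2.11 (2.11.0) 2023-11-29.2207.3251a05",
    "hello 2.11 (2.11.1) 2024-02-13.1456.91c4f93",
]

for version_string in reported:
    # The full semantic version (major.minor.patch) is the part in parentheses.
    major, minor, patch = re.search(r"\((\d+)\.(\d+)\.(\d+)\)", version_string).groups()
    print(f"major={major}, minor={minor}, patch={patch}")

# Prints:
# major=2, minor=11, patch=0
# major=2, minor=11, patch=1
```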

In short, the version of the test server software has no influence on the measured speed.

This is not a normal deviation

The standard deviation of the measured speeds is: 0.65 Mbps.

When we test one server multiple times and calculate the standard deviation of those repeated measurements, we get these standard deviations:

  1. Odido: 100.49, 100.48, 100.56, 100.52 and 100.52 Mbps, standard deviation: 0.03 Mbps
  2. Global Layer: 101.40, 101.33, 101.35, 101.38 and 101.31 Mbps, standard deviation: 0.04 Mbps
  3. AVUR AS: 101.94, 101.97, 101.99, 101.95 and 101.86 Mbps, standard deviation: 0.05 Mbps

A normal standard deviation is therefore around 0.05 Mbps. The standard deviation we found is 0.65 / 0.05 = 13 times larger than you would expect.
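As a check, the sketch below reproduces the per-server standard deviations from the repeated measurements listed above, and the 13x ratio. Whether the original figures use the sample or population standard deviation is not stated; the sample standard deviation (statistics.stdev) reproduces the rounded values.

```python
from statistics import stdev

# Five repeated measurements per server, in Mbps (values listed above).
repeats = {
    "Odido":        [100.49, 100.48, 100.56, 100.52, 100.52],
    "Global Layer": [101.40, 101.33, 101.35, 101.38, 101.31],
    "AVUR AS":      [101.94, 101.97, 101.99, 101.95, 101.86],
}

for name, values in repeats.items():
    print(f"{name}: standard deviation {stdev(values):.2f} Mbps")
# Odido: 0.03, Global Layer: 0.04, AVUR AS: 0.05 Mbps

# Spread across servers (0.65 Mbps) versus a "normal" repeat spread (0.05 Mbps).
print(f"{0.65 / 0.05:.0f}x larger")  # 13x larger
```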

So there are clear differences between the test servers used. These differences are significant, 13 times as large as the "normal" standard deviation, and cannot be explained by the version of the test server software used.