But that hash is a regular, fast hash that takes on the order of 1µs to compute, right? Doesn't that get lost in network jitter? Wouldn't averaging the time it takes to run for(i=0;i<Math.pow(2,18);i++); over 10 runs be much more accurate? Or is this meant to spite the 0.01% of visitors who really try not to be tracked and have turned off JavaScript?
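The loop-timing approach described above could be sketched like this (a rough benchmark of my own, not anything from the article; `timeLoop` and `averageLoopTime` are made-up names):

```javascript
// Rough sketch: time a fixed busy loop and average over several runs
// to smooth out one-off scheduler noise.
function timeLoop() {
  const start = performance.now(); // global in browsers and in Node 16+
  for (let i = 0; i < 2 ** 18; i++); // the 2^18-iteration loop from above
  return performance.now() - start; // elapsed milliseconds
}

function averageLoopTime(runs = 10) {
  let total = 0;
  for (let r = 0; r < runs; r++) total += timeLoop();
  return total / runs;
}

console.log(averageLoopTime()); // varies per machine and browser
```

Even averaged, the number still moves with CPU load and frequency scaling, which is part of why any timing-based fingerprint stays noisy.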
It's intended to ensure that a CDN doesn't change the content it's serving to your users.
But it turns out you can approximate the speed at which a visiting browser computes those hashes, and fingerprint browsers that way, just by including some CSS on a page.