
The urgency comes from mobile networks and from how heavily user engagement depends on low latency.

Also, even if we assume that 50% of all internet users get access to the most modern protocols as soon as reasonably possible, we still care about how quickly the other half will upgrade.

And in a sense I agree with this vision of the internet. On one hand, the internet could be a highly optimized distributed communication network with a lot of embedded functionality (I really like the concept of content-centric/name-centric networks); on the other hand, it could be a dumb many-ended pipe (the IP protocol).

In terms of how I use the internet in practice, I always prefer the dumb pipe model.



The upgrade problem is something Google created by letting Android devices get stuck on old kernels; now they're trying to paper over that mess by moving things into userspace. That may solve this particular problem, but it creates a new maintenance nightmare: hundreds of different applications bringing their own congestion control and their own transport implementations on top of dozens of different TLS libraries, as sketched below. A little more patience, and ensuring that devices remain upgradable, could give us most of the advancements without the downsides.
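To make the contrast concrete: with kernel TCP, congestion control is one system-wide setting that every application inherits and that gets upgraded with the device, whereas a userspace transport is compiled into each application, which then makes its own choice. A minimal sketch (Linux-only, reading the standard /proc interface; "AppTransport" is purely hypothetical, standing in for whatever QUIC library an app bundles):

  # Kernel TCP: one congestion control setting shared by every
  # application on the machine (Linux /proc interface).
  def kernel_congestion_control() -> str:
      with open("/proc/sys/net/ipv4/tcp_congestion_control") as f:
          return f.read().strip()   # e.g. "cubic" or "bbr"

  # Userspace transport: each application ships its own stack and
  # picks its own algorithm independently of the kernel and of
  # every other app on the device.
  class AppTransport:
      def __init__(self, congestion_control: str = "cubic"):
          self.congestion_control = congestion_control  # per-app choice

  print("kernel-wide:", kernel_congestion_control())
  print("per-app:", AppTransport(congestion_control="bbr").congestion_control)

The first knob is fixed once per device; the second has to be fixed once per application, which is the maintenance burden the comment is pointing at.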


That is a possible source of error, but there is still the issue of network devices that implement ossified versions of internet standards. For example, when people tried to use different kinds of compression over unencrypted HTTP, it turned out that a lot of middleboxes would simply "fix" the Content-Encoding header to say gzip or deflate, regardless of the fact that this made the body undecodable.
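A concrete way to see the problem: if a middlebox rewrites the Content-Encoding header to gzip while leaving a differently compressed body intact, a naive client that trusts the header will fail to decode the response. A minimal sketch of a defensive decoder (names and structure are my own illustration, not any particular client's logic):

  import gzip, zlib

  def decode_body(body: bytes, content_encoding: str) -> bytes:
      enc = content_encoding.lower().strip()
      if enc == "gzip":
          # Check the gzip magic bytes before trusting the header.
          if body[:2] != b"\x1f\x8b":
              return body  # header lies: payload is not actually gzip
          return gzip.decompress(body)
      if enc == "deflate":
          try:
              return zlib.decompress(body)  # zlib-wrapped deflate
          except zlib.error:
              # Some servers send raw deflate without the zlib wrapper.
              return zlib.decompress(body, -zlib.MAX_WBITS)
      return body  # unknown or new encoding: pass through untouched

The takeaway is that once a header has been "fixed" in transit, the only recovery left is sniffing the payload itself.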



