
This can be said about every lossy compression technique. Why do quantization, which throws away detail, instead of sending a higher bitrate? Why do edge prediction, which can smudge things, instead of a higher bitrate? Why do inter-frame prediction, which can cause wobbly motion, instead of a higher bitrate?
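
For anyone unfamiliar with the first of those, here's a minimal sketch of uniform quantization in Python. The coefficients and step sizes are made up for illustration and don't come from any particular codec; real codecs quantize transform coefficients with per-frequency step sizes.

```python
import numpy as np

def quantize(coeffs: np.ndarray, step: float) -> np.ndarray:
    """Snap each coefficient to the nearest multiple of `step`.
    A larger step discards more detail but makes the values
    cheaper to entropy-code (fewer distinct, smaller symbols)."""
    return np.round(coeffs / step) * step

# Illustrative DCT-like coefficients (not from a real encoder)
coeffs = np.array([52.3, -18.7, 6.2, -1.4, 0.8, -0.3])
print(quantize(coeffs, step=4.0))   # coarse: [ 52. -20.   8.  -0.   0.  -0.]
print(quantize(coeffs, step=0.5))   # fine: much closer to the originals
```

The whole rate-distortion game is choosing where on that step-size axis to sit for each part of the image.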

The answer is always that the technique allows better use of bandwidth, so you get a better image without increasing bandwidth. Or, if you're able to increase the bandwidth, you get an even better picture with the technique than without it (until the bandwidth is so high that you can send the video uncompressed, but that's not happening anytime soon for video on the web).




Think of how much money Netflix saves by streaming movies to your TV at 5 Mbps instead of 10 Mbps. For a single user the cost difference is negligible, but across 120 million users it plausibly saves them millions in bandwidth costs (rough math in the sketch below).

I still buy Blu-rays though, so I'm a firm believer in the "just throw more bits at it" solution.
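
For a sense of scale, here's a back-of-envelope version of that savings claim. The hours watched per month and the per-GB delivery cost are made-up placeholders, not Netflix figures; real CDN economics (especially with Netflix's own Open Connect boxes) vary widely.

```python
# All inputs below are assumptions for illustration only.
users = 120_000_000                # user count from the comment above
hours_per_user_per_month = 30      # assumed viewing time
saved_mbps = 10 - 5                # streaming at 5 Mbps instead of 10
cost_per_gb = 0.001                # assumed $/GB delivered (hypothetical)

# Mbps * seconds = megabits; /8 -> megabytes; /1000 -> gigabytes
saved_gb = users * hours_per_user_per_month * saved_mbps * 3600 / 8 / 1000
monthly_savings = saved_gb * cost_per_gb
print(f"~{saved_gb:.3g} GB saved/month -> ~${monthly_savings:,.0f}/month")
# ~8.1e+09 GB saved/month -> ~$8,100,000/month
```

Even with these deliberately conservative placeholder numbers, the savings land comfortably in the millions per month, which is why halving bitrate at constant quality is worth so much engineering effort.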


That's a false dichotomy. At any bitrate Netflix chooses to stream at, this technique could improve perceived quality. Meanwhile, Blu-rays also use codecs with similarly advanced reconstructive techniques; even at the higher bitrate, those techniques are essential to maintaining high perceived quality.





