
JPEG and PNG are good enough. Encoders and decoders are heavily optimized and omnipresent. No patent issues (either patent-free or expired).

We should focus on other more pressing issues. This one is basically solved.



The trouble is that images are a large fraction of the page size even for well optimised websites. Using smaller images via changing format often makes a significant difference in the page load time.

It's also relatively easy for new formats to be adopted - the HTML `<picture>` element makes it really easy to serve the same image in multiple formats while keeping backward compatibility.
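For what it's worth, the same fallback logic can also be done server-side by inspecting the browser's Accept header. A minimal sketch in Python; the function name and the preference order are my own illustration, not anything standardized:

```python
# Pick the best image MIME type a browser advertises in its Accept header.
# The preference order (newest format first) is an assumption for this sketch.
PREFERENCE = ["image/jxl", "image/avif", "image/webp", "image/jpeg"]

def pick_format(accept_header: str) -> str:
    # strip quality parameters like ";q=0.8" and collect the bare types
    accepted = {part.split(";")[0].strip() for part in accept_header.split(",")}
    for fmt in PREFERENCE:
        if fmt in accepted or "*/*" in accepted:
            return fmt
    return "image/jpeg"  # safe fallback, decodable everywhere
```

A client sending `image/avif,image/webp,image/png` would get AVIF; one sending only `text/html` falls back to JPEG.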

And of course image decoders are easily distributed - JXL's encoder/decoder are open source.

The combination of those two factors probably means that this will never be a settled area now. It's fairly easy for websites to adopt a new format and there are strong reasons to. Means, motive and opportunity.

I reckon Chrome will cave in eventually. The pressure on browser performance is intense. Unless they want to see Safari faster in benchmarks of image heavy pages they will have to adopt it.


> The trouble is that images are a large fraction of the page size even for well optimised websites. Using smaller images via changing format often makes a significant difference in the page load time.

That's true, but even savvy people don't seem to care much about this. For example, (open) podcasts are still universally MP3-based even though it's been possible for many years to halve their size using modern, ubiquitous audio formats.

> I reckon Chrome will cave in eventually. The pressure on browser performance is intense. Unless they want to see Safari faster in benchmarks of image heavy pages they will have to adopt it.

What's curious about this to me is why Apple is single-handedly saving JPEG-XL from obscurity when AV1 is also a fine substitute for mainstream image compression use cases.


MP3 is also good enough. Modern compressors are on par with Opus, AAC or whatever is in fashion now. The format is sane, patents have expired, it runs everywhere.

It's not worth the trouble.


With Opus you'd get away with half the bitrate of MP3, or less, for a podcast. That gets to be somewhat significant given the length of many podcasts; those 64kbps or so that you save add up over time.
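Back-of-the-envelope, for a hypothetical one-hour episode (the bitrates are illustrative):

```python
# Rough savings from halving a podcast's bitrate (numbers illustrative).
duration_s = 60 * 60                          # one-hour episode
saved_kbps = 64                               # e.g. 128 kbps MP3 -> 64 kbps Opus
saved_mb = saved_kbps * duration_s / 8 / 1000 # kilobits -> megabytes
print(saved_mb)                               # 28.8 MB per episode
print(saved_mb * 10_000 / 1000)               # 288 GB across 10,000 downloads
```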

Though I agree, the "works everywhere" aspect is really the overriding factor: everyone can play it, all the services will accept it, etc. I think the only way you'd see it get used significantly is if a major service was transcoding to it, but I don't know that podcast bandwidth is a sufficient cost for anyone to bother.


HE-AAC is a bit better than Opus, plus has the benefit of MP3's "works everywhere" experience. I posted more detail elsewhere in the thread if you're interested.


xHE-AAC from 2016 (also known as USAC), yes. The older HE-AAC from 2003 and HE-AACv2 are not. The codecs have similar names, but they are different and were released at different times.


USAC doesn't meet the "ubiquitous" requirement for this use case.

Regarding your claim that Opus is better than HE-AAC, here's a "Quality vs Bitrate" chart from opus-codec.org, the home of Opus: https://opus-codec.org/static/comparison/quality.svg

Note that AAC (presumably they mean "Main Profile" rather than AAC-LC) has effectively the same efficiency as Opus. HE-AAC and HE-AACv2 have a higher efficiency than both Opus and AAC, and work great at lower bitrates in comparison to AAC.


> Regarding your claim that Opus is better than HE-AAC, here's a "Quality vs Bitrate" chart from opus-codec.org, the home of Opus: https://opus-codec.org/static/comparison/quality.svg

> Note that AAC (presumably they mean "Main Profile" rather than AAC-LC) has effectively the same efficiency as Opus. HE-AAC and HE-AACv2 have a higher efficiency than both Opus and AAC, and works great at lower bitrates in comparison to AAC.

This chart just roughly outlines (according to the feeling of the Opus developers at the time) what to expect from Opus - a wide range of useful bitrates. It's not anything that was actually measured, or something that conclusions can be drawn from. I mean - those nice smooth curves and the lack of any detail about the codecs used should give it away.

According to public (double-blind) listening tests performed by the Hydrogenaudio group, Opus does win over the best HE-AAC codecs available at the time the tests were performed - at both 64kbps and 96kbps bitrates [1] (Multiformat Tests).

[1] https://wiki.hydrogenaud.io/index.php?title=Hydrogenaudio_Li...


The podcast (and audio in general) thing annoys me way more than it should. FFS, just use Opus, it is supported on virtually all devices out there and is massively smaller


last I checked the iPhone podcasts app doesn't play opus. Have things changed?


it's not just the podcasts app. ios itself doesn't support opus. i've run into this problem with plenty of other apps in ios.


Well then fuck iOS


My guess is the fact that existing jpeg images can be transcoded to jpeg-xl losslessly is the driving feature. This gives an easy migration path for existing sites that don't have high resolution masters that can be re-encoded into AV1 or webp.
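As a sketch of what that migration looks like in practice, shelling out to the reference `cjxl` tool (assuming it is installed; the flag spelling is taken from libjxl's CLI, so treat it as an assumption to verify):

```python
import subprocess

def build_transcode_cmd(src: str, dst: str) -> list[str]:
    # cjxl recompresses .jpg input losslessly by default; the explicit flag
    # just documents the intent (flag name per libjxl's cjxl CLI).
    return ["cjxl", "--lossless_jpeg=1", src, dst]

def transcode(src: str, dst: str) -> None:
    # Run the transcode; the original JPEG bytes remain recoverable bit-exactly.
    subprocess.run(build_transcode_cmd(src, dst), check=True)
```

Running this over an archive of old camera JPEGs trades no quality at all for the size reduction, which is what makes the migration path so attractive.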

Also think of 20+ years of jpg digital camera photos that can now be transcoded to save space. For Apple that is also a huge win.


> What's curious about this to me is why Apple is single-handedly saving JPEG-XL from obscurity when AV1 is also a fine substitute for mainstream image compression use cases.

Apple does whatever they want. If they think something is better, they will go ahead and use it, the rest of the market be damned.


Tangentially related, some clients don't handle variable encodings well, putting another limit on efficiency improvements

indeed a pity progress is so slow even in these easier cases of widely available improvements


I didn't know that about podcasts and audio formats. What format would you recommend? It would need to be well-supported by podcast players.


> What format would you recommend? It would need to be well-supported by podcast players.

HE-AAC. Support for HE-AAC and HE-AAC v2 has been universal on modern media platforms and operating systems for well over a decade.¹

  • All versions of Android support HE-AAC v2 playback²
    - Google also added encoders in Android 4.1 (2012)
  • iOS introduced support for HE-AAC v2 playback in iOS 4 (released 2010)
  • macOS introduced support for HE-AAC v2 playback with iTunes 9.2 (released 2010)
  • I'm not sure when Windows added support, but it was available in Windows 8
  • All open source players support HE-AAC v2 playback via FAAD2
FWIW, I distributed a reasonably-popular podcast (1.2M downloads over its lifetime) using HE-AAC v2 several years ago, and never received a complaint, or found a player it didn't work on.

¹ I read the other comments recommending Opus before responding, and although Opus is very nice, it's not as efficient or as ubiquitous as HE-AAC.

² https://developer.android.com/guide/topics/media/media-forma...


FAAD2 is in non-free repositories; for open source, it presents a problem with distribution outside of freeworld.

Neither Opus nor mp3 has this problem. So to maximize compatibility, mp3 is still the best choice, given Apple's attitude toward Opus and other free codecs.


> FAAD2 is in non-free repositories; for open source, it presents a problem with distribution outside of freeworld.

I may be misspeaking about FAAD2, but I've never run into an open-source player (like VLC) or library (like ffmpeg) which hasn't supported at least HE-AAC decode for a decade or more. If that's wrong, I'd love to be corrected in the most detailed way possible.


FFmpeg is exactly the kind of application that, in a non-pruned configuration (i.e. with codecs like AAC), distributions cannot legally distribute binaries of in some countries.

E.g. for Fedora, it is in the rpmfusion repository and not in the distribution proper. Other distributions have similar arrangements for the license-is-OK-but-patents-are-a-problem situation. These servers are outside the US (or other countries that recognize software patents), and for US users, the issue of obtaining a license is up to them.

The situation is so bad that Fedora stopped shipping support for hardware acceleration of patented codecs (i.e. not complete codecs, but support for using the implementation in hardware, for example in your GPU), because they could be sued for contributory infringement.

Also note that binaries for VLC or ffmpeg for Windows or Mac are similarly distributed from non-US servers, so it's basically the same situation as rpmfusion.


> although Opus is very nice, it's not as efficient ... as HE-AAC

Uh, no, this is not correct. If you try to find a source, you'll easily find the opposite is very well established.


Here's a "Quality vs Bitrate" chart from opus-codec.org, the home of Opus: https://opus-codec.org/static/comparison/quality.svg

Note how AAC has effectively the same efficiency as Opus? HE-AAC and HE-AACv2 are notably better in comparison, and are usable at lower bitrates than AAC.

In cases where "the opposite is very well established", they're talking about AAC-LC. Citations that show otherwise are welcome! In any case, HE-AAC's universality is really beaten only by MP3 (which it trounces).


I'm assuming you are using HE-AAC at below 96kbps? Because above that, AAC-LC should perform better.


Opus is pretty widely supported (still some holes though, for example very few Adobe products support Opus), and it can sound better than MP3 at less than half the bitrate.


Normally aac and opus are used.


Opus


> The trouble is that images are a large fraction of the page size even for well optimised websites. Using smaller images via changing format often makes a significant difference in the page load time.

Pepperidge Farm remembers Google heavily pushing WebP for this reason. Look how much smaller it is! And Lighthouse giving demerits for every non-WebP image. Which, of course, was totally true; WebP images were almost always quite a bit smaller. They all looked like shit, too, of course.


> JPEG and PNG are good enough. Encoders and decoders are heavily optimized and omnipresent. No patent issues (either patent-free or expired).

I think you're getting downvotes for the sentiment, but I'm inclined to agree with both this comment, as well as the one you made a bit further in the thread.

> MP3 is also good enough. Modern compressors are on par with Opus, AAC or whatever is in fashion now. The format is sane, patents have expired, it run everywhere. It's not worth the trouble.

Something like Ogg/Vorbis would also be under my consideration for audio files, but I find these established and somewhat out-of-date technologies to be surprisingly stable and reliable. In the same way, for many folks out there the H.264 video codec will suffice for almost any use case (sadly Theora never got big, but there's VP8/VP9 too), and something like ZIP or 7z will be enough for most archiving needs.

I think there is no harm in using these options and not sweating about having to serve every asset in whichever modern format the user's browser might support (nor about chasing perfect quality, or converting every asset into a multitude of formats and including them all in the page), as long as 90% of your hosting costs aren't caused by this very choice. Who cares if a few pixels are a bit off because you saved your JPG with a quality setting of 75 instead of 80? As long as your application/site does what it should, that's hardly something most people will care about, or even notice.

> We should focus on other more pressing issues. This one is basically solved.

However, eventually something new is going to come out AND get stable enough to have the same widespread adoption as the current stable and dependable options. Whether that happens in a year, 5 years or a decade, only time will show (let's say, 95% of devices supporting AV1 and Opus). I think that then will be the right time to switch to these new technologies for most people, as opposed to being early adopters.


Yes, I’m in a bad mood and it probably came across a bit fatalistic.

> However, eventually something new is going to come out…

The thing is, human eyes and ears aren’t getting any better and computing performance seems to be plateauing. My guess is we have already reached the “good enough” point and I doubt we’ll get formats 3-4X as efficient in the future.


> The thing is, human eyes and ears aren’t getting any better and computing performance seems to be plateauing.

My guess is that we'll ideally end up with something that isn't hard to use copyright/patent wise and offers "optimal" quality but at smaller file sizes when compared with the prior technologies, which would be a good thing for both storing your own video files, as well as when handling them en masse in most businesses. After all, if someone could reduce the bandwidth usage of their video hosting or streaming site by 10%, they'd be saving millions in hardware/networking expenses.

Though I don't think we're at that point yet: many of these more recent technologies are not widely supported, or just not good enough.

For example, I rendered a ~6 minute 1080p video with Kdenlive with multiple codecs, to compare encode performance and resulting filesizes with the default settings:

  Codec  Time   Filesize
  H264   2:27   172 MB
  VP8    9:34   244 MB
  VP9    19:29  332 MB
  AV1    FAIL   FAIL
I probably should have figured out a way to get similar file sizes, but either way you can see that VP8 and VP9 take way longer to process when compared to the version of H264 that's available to me. So the old H264 is indeed a good enough choice for me, at least for now.
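One way to make that comparison fairer is to match bitrates rather than default settings. For example, computing the effective bitrate of the H264 result from the table above (the ~6-minute duration is approximate, so treat the number as a ballpark):

```python
# Effective bitrate of the H264 encode above (~6 min, 172 MB).
size_bytes = 172 * 1000 * 1000            # assuming decimal megabytes
duration_s = 6 * 60                       # approximate runtime
mbps = size_bytes * 8 / duration_s / 1_000_000
print(round(mbps, 2))                     # roughly 3.82 Mbit/s
```

Passing that figure as a target bitrate to the VP8/VP9/AV1 encoders would make the file sizes comparable, leaving encode time and quality as the variables.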


I have a strong suspicion you are using hardware acceleration for H264, while the VPx codecs are all being encoded in software.

IMHO the most pressing issue I have with h264 is that it creates files that are almost always larger than HEVC (or AV1), and everything I own handles H265 without any issues (I think it's also mandated by some DVB standard).


Just as ZIP is a mediocre compression algorithm that is supported basically everywhere.

Today we have larger storage and faster bandwidth than ever before so it’s easy to trade off compression for ease.

Same goes with mp3. Back in the day, Microsoft was touting that WMA could replicate the quality of MP3 128kbps at just 64kbps. I bought in and mass-converted my entire library because of the big space savings. Now… I have a library of music with so many artifacts in it that I can't listen to it. At the time, the space saving may have been worth it, but I now regret it.


Saying that DEFLATE is mediocre is absurd. There's a reason it's still in use after 30 years (and it's not merely inertia). Before zstandard arrived on the scene there was literally nothing able to achieve the same ratio as deflate without taking 10x longer.
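The ratio-vs-speed tradeoff inside DEFLATE itself is easy to see with Python's stdlib zlib on a toy payload (the data and levels here are illustrative):

```python
import zlib

# Highly repetitive payload so the compression effect is obvious.
data = b"the quick brown fox jumps over the lazy dog " * 2000

fast = zlib.compress(data, level=1)   # fastest setting, worse ratio
best = zlib.compress(data, level=9)   # slowest setting, best DEFLATE can do
print(len(data), len(fast), len(best))
```

Both levels crush this payload; level 9 spends more time searching for matches to shave the output further, which is exactly the knob zstandard later made dramatically cheaper.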

This is unlike WMA/WMV which, I agree, were pretty much already obsolete by the time they came out.


well, there was brotli (implemented in 2013, added to web stack in 2015) which was both faster on encode and decode

fastest brotli option encodes about 3x faster than fastest zlib option


You are overlooking that demand is still increasing, I believe. Faster bandwidth than ever before is met by higher demand than ever before, and it will continue like this for a while.

60fps HDR 4K content will become normal, eating away at bandwidth availability, meaning that any possible savings will remain relevant - for example in still images.


> ... PNG are good enough.

For a start it's 2023 and all browsers do support lossless webp [1].

Lossless webp (webp can be either lossy or lossless) is not only typically smaller than most optimized PNG files, but also compresses faster than the optimized PNG encoders.

So, basically, the main reason to use PNG in 2023 for anything Web-related is if you like wasting bandwidth. Something like 99.95% of PNG files (some huge fraction like that) are smaller when encoded with lossless webp.

Heck, even my Emacs supports webp images.

Then other formats are already knocking on the door and promise even better compression than webp.

[1] https://caniuse.com/webp


Not if you use MozJPEG and a decent PNG compressor, like ECT, OxiPNG, PNGCrush, etc.

Time and time again an article pops up showing how marginal the difference is, if at all.


if you use an optimal strategy to compress PNGs, WebP lossless is still 26 % more dense

if you take usual PNGs from the wild, WebP lossless is 42-45 % more dense

disclaimer: I'm a co-author of Zopfli and ZopfliPNG, and usually best PNG optimizers are based on Zopfli. I'm also the author of WebP lossless.


Unless you have a large image in which case you can’t even store it in webp because of its abysmal max resolution. PNG continues to be the format of record for lossless image storage. Hopefully JPEG XL can replace it.


JPEG literally can't show most clear blue skies in pictures without banding, due to it only being 8-bit.

The fact I need to worry about going back to re-edit them and losing quality is silly.

So no, not good enough.


Learn about noise and dithering. You can have seamless blue skies with 8 bits that way.
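A minimal sketch of why dithering kills banding: quantizing a smooth gradient to a handful of levels produces long flat runs (the bands), while adding up to one quantization step of noise first breaks them up. Eight levels instead of 256 here, purely to exaggerate the effect:

```python
import random

LEVELS = 8  # exaggeratedly coarse so the banding is obvious

def quantize(x: float) -> int:
    # plain truncation: a smooth input becomes visible bands
    return min(LEVELS - 1, int(x * LEVELS))

def quantize_dithered(x: float, rng: random.Random) -> int:
    # add up to one quantization step of uniform noise before truncating
    noise = (rng.random() - 0.5) / LEVELS
    return min(LEVELS - 1, max(0, int((x + noise) * LEVELS)))

gradient = [i / 999 for i in range(1000)]   # a smooth "clear sky"
rng = random.Random(0)
banded = [quantize(x) for x in gradient]
dithered = [quantize_dithered(x, rng) for x in gradient]

runs = lambda xs: sum(a != b for a, b in zip(xs, xs[1:]))
print(runs(banded), runs(dithered))  # a handful of hard edges vs. many tiny ones
```

The banded version has only seven hard edges across the whole gradient, which the eye picks out immediately; the dithered one has hundreds of single-step flips that average out to the original smooth ramp.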

It’s more a fault of the camera applying excessive noise reduction than the format.

Besides, if we’re worried about file sizes, 16 bit files are humongous.


I know about adding noise, but to call that a solution is silly. You’re purposely degrading the image to get around the technical limitations of a 30 year old format.

I disagree about the noise reduction part too. Modern (larger; not sure about cell phone) sensors have basically no noise at ISO 64-100 or so. It's just an inherent limitation of only having 255 steps of brightness.


Dithering is a feature. It’s used all the time in audio and image. And a little bit of noise is a good thing. Two of my favorite sources on the subject:

https://youtu.be/cIQ9IXSUzuM?t=700

https://theonlinephotographer.typepad.com/the_online_photogr...


jpegli is an even better option -- it achieves 10+ bits of precision in the old JPEG format in its '8-bit mode'


High-quality HDR at reasonable bitrates is solved?


Ah yes, the abundant fiber in rural areas around the globe.

You do understand that this is not for you if you have high bandwidth and low latency, right?


PNG only produces reasonable image sizes for a very restricted category of image. And while JPEG is decent for photos, it does a pretty bad job compressing images which contain sharp edges or text.

So neither is a good choice, unless you know a priori that all your images will fall into one of the categories where these formats do a decent job.


You are free to focus on whatever issues you find pressing.



