Very unlikely. They'd be the first to ship consumer PC parts with AV1 decode support. But the next generation almost certainly will. Same goes for the other vendors: Nvidia post-Ampere, Intel post-Ice Lake/Tiger Lake/whatever the next one is nowadays, etc.
Otherwise, there are AV1 decode IPs available, including a SoC or two. Plus some very recently announced set-top boxes and TVs. So you'll definitely be seeing some hardware with AV1 support shipping this year.
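For what it's worth, if you want to see which AV1 decode paths your own machine already exposes, here's a rough Python sketch that just asks ffmpeg (assumes ffmpeg is on your PATH; the parsing is illustrative, and a hardware entry showing up only means the wrapper was compiled in, not that your GPU can actually use it):

```python
# Rough check of which AV1 decoders the local ffmpeg build exposes.
# Hardware entries (av1_cuvid, av1_qsv, ...) only indicate the wrapper is
# compiled in; actual decode still depends on your GPU and driver.
import subprocess

def list_av1_decoders() -> list[str]:
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-decoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    names = []
    for line in out.splitlines():
        parts = line.split()
        if len(parts) >= 2 and "av1" in parts[1]:
            names.append(parts[1])
    return names

if __name__ == "__main__":
    print(list_av1_decoders())  # e.g. ['libdav1d', 'libaom-av1', 'av1_cuvid']
```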
AV1 is inferior to h265, and the successor to h265 should come soon, rekt AV1, and no longer have big royalty issues.
BTW the hardware cost of not using h265 far outweighs the royalty cost. It has always been pure economic nonsense for Google (YouTube) to exclude h265.
Could you elaborate on the weaknesses of AV1? I hadn't heard that. Is it because of the wavelet-based i-frames that h265 uses? Or a bunch of little things?
I think he may be referring to AV1's economic weakness, not the actual quality itself. As in the cost to bring quality to this level while disregarding encoder and decoder complexity. Kostya has a rant about it here [1]. He was the guy that made RealVideo work across all platforms.
Personally I am giving the industry the benefit of the doubt and one last chance on VVC / H.266.
> As in the cost to bring quality to this level while disregarding encoder and decoder complexity.
AV1 is amazing for YouTube, Netflix and torrent-scale videos currently.
I can't find the reference for this, but early on the codec developers and the big companies had meetings. The big companies' answer for how much of an encode-time increase would be acceptable was 100x to 1000x over the then-current standard. Hence the design.
Of course the encode time problem will be solved for regular users too once AV1 encode ASICs for consumer hardware enter the market in 3–5 years. There are already a few solutions offering cloud FPGA encoding alongside beefy servers. If streaming bandwidth costs are a significant issue for you, then you can easily afford that.
That's not why next-generation codecs keep being made.
I remember back in the '90s / '00s when a CPU would stutter doing MPEG-2 decode, and later with the XviD or DivX (MPEG-4 ASP) codecs. Eventually, CPUs got faster, and guess what?
Faster CPUs mean you can "afford" better compression / decompression routines. No one uses XviD or DivX anymore, because CPUs are so fast that we want better compression (not faster compression).
H264 was then popularized, and I remember in the late '00s / early '10s when "netbooks" would stutter decoding H264. Fixed-function ASIC decoders were added to handle H264 more efficiently. But guess what? Today, CPUs / GPUs are so much faster that H264 decode is now easy, and we desire even better compression.
Repeat the story again for H265 and VP9, the next generation (with the royalty-free VP9 becoming more popular).
AV1 is designed to be the next step. It's still too slow for most people, but ASICs are being developed so that cell phones can run the AV1 decoder. Eventually, computers overall will get much faster, and everyone can then move to AV1.
-----------
Eventually, computers will be so fast that even AV1 is "too quick", and a new, slower, better form of compression will be invented.
We move to newer codecs because our technology continues to change. Now maybe AV1 will be the be-all and end-all of codecs, but it will be a sad day if that is true, because codec progress ending would imply that CPUs have stopped improving.
Moore's law seems to be dying, and later nodes (5nm, 3nm, etc. etc.) are taking longer and longer to research-and-develop. Maybe our computers will stop improving soon...
The speed of CPUs doesn't really matter for codecs these days. Only for adoption during the first few years until fixed-function hardware starts shipping.
Besides, most of the processing burden is on the encoding side. Better codecs haven't come from raw CPU performance so much as from advances in encoding methods. AV1 encoding is around 100x slower than previous codecs, but that's fine because of the bandwidth savings at scale.
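To put a rough number on the "savings at scale" point, here's a back-of-envelope sketch. Every figure in it is an assumed placeholder (compute price, per-view size, the ~30% bitrate saving), not real pricing; the point is only that the one-time encode cost is amortized over every view:

```python
# Back-of-envelope: a one-time, much more expensive encode vs. a per-view
# bandwidth saving. All numbers are made-up placeholders -- plug in your own.
EXTRA_ENCODE_HOURS = 100.0   # extra CPU-hours vs. the older codec (the "100x")
CPU_HOUR_COST      = 0.05    # assumed $/CPU-hour
GB_PER_VIEW_OLD    = 1.5     # assumed delivered GB per view with the older codec
BITRATE_SAVING     = 0.30    # assumed ~30% bitrate reduction from AV1
COST_PER_GB        = 0.01    # assumed CDN egress $/GB

extra_encode_cost = EXTRA_ENCODE_HOURS * CPU_HOUR_COST               # one-time
saving_per_view   = GB_PER_VIEW_OLD * BITRATE_SAVING * COST_PER_GB   # per view
break_even_views  = extra_encode_cost / saving_per_view

print(f"extra encode cost : ${extra_encode_cost:.2f}")
print(f"saving per view   : ${saving_per_view:.4f}")
print(f"break-even views  : {break_even_views:,.0f}")  # ~1,100 with these numbers
```

With numbers like these the extra encode pays for itself after a thousand-odd views, which is nothing at YouTube or Netflix scale; for a video almost nobody watches, it never does.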