Is AV1 going to be the next H.264 (i.e., long-lasting), or is Google going to screw us like they did with VP9 by releasing yet another, more computationally expensive format just a few years later that saves maybe 15% bandwidth for web servers but runs your i7 on Turbo Boost, sounding like a jet engine, just to decode a 480p YouTube video because no CPU/GPU or other device has hardware decoding support? Even Intel's fricking 11th-gen i3 (e.g. i3-1125G4) iGPUs or any Nvidia or AMD GPUs still don't have AV1 hardware decoding support, yet all these major video streamers are pushing AV1 as the default on everyone, and braincell-killers like Twitch and Facebook said they're switching in 2022. There's not going to be an AV2 soon, is there? Inb4 people finally get laptops with AV1 hardware decoding in 2022 and Google announces the release of AV2 in Q1 2023, including for YouTube.
AV2 is going to be released by the end of 2023. That's something you could have googled. There will never be another H.264; it sits at a unique inflection point between compression and complexity. It will still be used in 30 years.
Which means that it will start to be used in 2024 and AV1 will be around for a few more years after that, like VP8 and VP9. Even if not, 2019-2024 would make for 5 years of active use, which isn't that bad for a codec like this.
kek
no one uses AV1 and there's already a successor to be released soon
Netflix encoded their entire library in AV1 lol
did they use m1?
M1 can't encode for shit; it lacks the x86-64 SIMD extensions the encoders are optimized for. If you were talking about something other than the processor, please ignore.
got any tests on m1 vs x86 av1 encoding speeds?
Soon(tm)
LOLNO, that goes to HEVC, which now has state-of-the-art hardware encoders so batshit optimized that GPU video encoding is now officially better than slow-preset x264, which is fricking bananas. You can still eke out ~20-40% higher compression from SW HEVC encoding using 2-pass and 10-bit encoder precision, but we've finally gotten to a point where GPU video encoding doesn't need 5-10X higher bitrate for the same quality as SW video encoding. AV1 isn't going to have that kind of magic; it'll just be a freetard video codec corporations use to avoid paying the MPEG LA toll.
>You can still eke out ~20-40% higher compression from SW HEVC encoding using 2-pass and 10-bit encoder precision but we've finally gotten to a point where GPU video encoding doesn't need 5-10X higher bitrate for the same quality as SW video encoding.
Those are both true independently, but putting them in the same sentence as if you can get both at once is disingenuous. You still need the slow hardware preset to match the quality of the slow software preset at the same bitrate.
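For reference, the "2-pass + 10-bit" software pipeline being discussed looks roughly like this. A sketch only: filenames and bitrate are placeholders, the flags assume an ffmpeg build with libx265 and NVENC, and the script just prints the commands instead of running them.

```shell
# Sketch: 2-pass 10-bit software HEVC encode vs a one-pass NVENC hardware
# encode. input.mkv and 2500k are made-up placeholders.
IN=input.mkv
BITRATE=2500k

# Pass 1: analysis only, output discarded; writes rate-control stats
CMD_PASS1="ffmpeg -y -i $IN -c:v libx265 -pix_fmt yuv420p10le -b:v $BITRATE -x265-params pass=1 -an -f null /dev/null"
# Pass 2: reuses the pass-1 stats for better bit allocation
CMD_PASS2="ffmpeg -i $IN -c:v libx265 -pix_fmt yuv420p10le -b:v $BITRATE -x265-params pass=2 -c:a copy sw_hevc.mkv"
# Hardware path: single pass, much faster, less efficient per bit
CMD_HW="ffmpeg -i $IN -c:v hevc_nvenc -preset slow -b:v $BITRATE hw_hevc.mkv"

echo "$CMD_PASS1"
echo "$CMD_PASS2"
echo "$CMD_HW"
```

The two software passes cost roughly double the encode time, which is exactly the "can't be used on the fly" trade-off brought up below.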
the practicality is different tho.
the presets used for higher quality can't be used on the fly, and people use these encoders for streaming.
if you're able to fiddle with the codec's configuration you can get good results faster than a CPU could, but they'll tank your performance.
gaming is different from production.
AV1 has none of this and probably never will. That's not a good thing if you want longevity for a codec.
but... do we want AV1 to begin with?
if it's not open I don't care.
sticking with 264 just because it works.
>if it's not open I don't care.
It being open and royalty free was the entire point.
It's to stop the cuck licensing shit for h264/h265.
>open
I thought it was the opposite, just because making it work was always a pain in the ass.
like some fricking company pushing a shit proprietary codec... but it actually was open source all along.
I always felt like you had to do some weird shit to install it and get it to decode, even worse to get fricking GPU acceleration working, then wait eons for browsers to support it.
it took me too much time to make this shit work with "no" issues.
whatever.
based
forget about AV1
how come VP9
>crf 28, lag-in-frames 0
gives the same result at the same filesize and 1/3 of the encoding time as
>crf 32, lag-in-frames 25
why would someone make a lookahead-like option that hurts filesize so much it can just be replaced with a lower crf?
Who cares about a greater filesize if the quality-to-filesize ratio is even greater?
(CRF is not the only option that affects quality, anon. Consequently, the other options are not for the sole purpose of fitting the same quality in smaller filesizes.)
lag-in-frames should be decreasing the filesize, not increasing it. Although it would only matter either way at pretty high crf.
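For anyone following along, the two settings being compared map onto libvpx-vp9 in ffmpeg roughly like this. A sketch under assumptions: the input filename is a placeholder, and the script only prints the commands (note that constant-quality VP9 in ffmpeg needs `-b:v 0` alongside `-crf`).

```shell
# Sketch: the two libvpx-vp9 configurations from the posts above.
IN=input.mkv

# Faster: lower crf (higher quality target), lookahead disabled
FAST="ffmpeg -i $IN -c:v libvpx-vp9 -crf 28 -b:v 0 -lag-in-frames 0 fast.webm"
# Slower: higher crf, 25-frame lookahead so the encoder can build alt-ref frames
SLOW="ffmpeg -i $IN -c:v libvpx-vp9 -crf 32 -b:v 0 -lag-in-frames 25 -auto-alt-ref 1 slow.webm"

echo "$FAST"
echo "$SLOW"
```

With lag-in-frames 0 the encoder can't use alt-ref frames at all, which is why lookahead would normally be expected to *improve* compression, not hurt it.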
>Twitch
Yeah right, they said they would add VP9 years ago and never did. Their clientele will not be happy that they can't hardware encode.
>You can think of VP9 as H264+ and AV1 as H265+
Not even kind of. They work on fundamentally different mathematical theories, and all the good ones are patented and only in the MPEG codecs. That's why the google codecs suck so much, they're not allowed to use the good stuff since it's patented.
https://openbenchmarking.org/test/pts/aom-av1-2.4.0 see how the M1 isn't listed? That's because it gets about 0.3fps on this benchmark
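The benchmark linked above runs libaom at several speed levels; via ffmpeg those map onto `-cpu-used` (0 = slowest/best compression, 8 = fastest). A sketch with placeholder filenames; the script only prints the command.

```shell
# Sketch: constant-quality libaom-av1 encode via ffmpeg.
# -cpu-used trades encode speed for compression efficiency;
# -row-mt 1 enables row-based multithreading so more cores help.
IN=input.mkv
CMD="ffmpeg -i $IN -c:v libaom-av1 -crf 30 -b:v 0 -cpu-used 4 -row-mt 1 out.mkv"
echo "$CMD"
```

On most current CPUs even `-cpu-used 4` is far from realtime at 1080p, which is the point being made about the M1 numbers.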
>only matter either way at pretty high crf
lower crf values are affected the same way, but there the gap would be more like crf 10 with lag-in-frames 0 vs crf 10.5 with lag-in-frames 25, with the same encoding-time advantage for crf 10.
>lag-in-frames should be decreasing the filesize
never experienced it
are you using two passes?
>Who cares about a greater filesize if the quality-to-filesize ratio is even greater?
read again
what I said is that the option is fricking pointless and only slows down encoding
>CRF is not the only option that affects quality, anon.
I'm talking about how those two affect quality, since higher lag-in-frames values yielded worse results in all two-pass quality modes.
If lag-in-frames results in worse quality AND greater filesize… well, looks like a bug.
Report this to the developers of the encoder.
> Their clientele will not be happy that they can't hardware encode
>what is AVC fallback
I didn't fact-check it; I only said Twitch because OP did, putting my faith in him. The fact of the matter is Amazon owns Twitch, and Amazon is a member of the consortium, which is why they would use it, if OP was accurate.
>any Nvidia or AMD GPUs still don't have AV1 hardware decoding support
Is this copypasta or something? the RTX 30 series and RX 6000 series both have AV1 decode support
>is Google going to screw us like they did with VP9 by releasing yet another more computationally expensive format just a few years later
Google was the sole developer of VP9
Google is not the sole developer of AV1; it's a consortium, which Amazon and Facebook are members of, hence why Twitch and Facebook plan to use it. Do you think Google is just that good at marketing and somehow convinced everyone to use it? Because they aren't; companies are switching because they're involved too.
>Google announces the release of AV2 in Q1 2023 including for YouTube.
should have proofread my post, that last quote was supposed to go below the second one
My SteamDeck will have decoding support
What's the difference between AV1 and VP9?
AV1 is more efficient but not as mature as VP9. You can think of VP9 as H264+ and AV1 as H265+.
VP10 was going to be Google's successor to VP9; they scrapped that and used it as a basis for AV1, then created a consortium to get other companies involved in the creation process, probably because VP9 wasn't very popular.
> i7 sounds like a jet engine just to decode
You seriously need to change the thermal paste on your CPU, anon.
It's taking so damn long to get hardware support, I don't get it.
Do you mean encode or just decode? Intel, AMD, and Nvidia, all have decode out now
https://en.wikipedia.org/wiki/AV1#Hardware
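If you want to check what your own setup exposes rather than trusting the wiki table, ffmpeg can list it. A sketch; this script only prints the commands to run (software AV1 decoders like libdav1d show up under `-decoders`, hardware decode paths under `-hwaccels`).

```shell
# Sketch: commands to inspect AV1 decode support in an ffmpeg build.
LIST_DECODERS="ffmpeg -hide_banner -decoders"
LIST_HWACCELS="ffmpeg -hide_banner -hwaccels"

# Grep the decoder list for av1 entries (libdav1d, libaom-av1, av1_qsv, ...)
echo "$LIST_DECODERS | grep -i av1"
# Hardware acceleration backends (vaapi, d3d11va, nvdec, ...)
echo "$LIST_HWACCELS"
```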