AV2, a generation leap in open video coding and the answer to the world’s growing streaming demands, delivers significantly better compression performance than AV1. AV2 provides enhanced support for AR/VR applications, split-screen delivery of multiple programs, improved handling of screen content, and an ability to operate over a wider visual quality range. AV2 marks a milestone on the path to an open, innovative future of media experiences.
yooo, this is so amazing!!
Can’t wait to try out AV2. I’m not sure there’s a usable encoder yet, but I’m really excited!
FOSS & royalty free for the win 🔥
(I sound like an AI bot but I’m really just excited 🥲)
So what makes AV2 better than AV1?
Well, it is meant to COMPLETELY DESTROY THE DREAMS OF VVC (H.266) SUPPORTERS- I mean, it’s 30% more efficient than AV1, as others pointed out.
Also, this means we’re approaching a moment where most of our devices decode AV1 natively without needing dav1d (although I still use it, since hardware decoders sometimes mess shit up for some reason).
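To put that 30% in concrete terms, here’s a quick back-of-envelope in Python (the bitrates here are just assumed numbers for illustration, not official figures):

```python
# Back-of-envelope: file size at a given bitrate, assuming AV2 needs
# ~30% less bitrate than AV1 for the same quality (the figure quoted above).
def file_size_gb(bitrate_mbps: float, duration_s: float) -> float:
    """Size in gigabytes of a stream at bitrate_mbps lasting duration_s seconds."""
    return bitrate_mbps * duration_s / 8 / 1000  # Mbit -> MB -> GB

movie_s = 2 * 3600          # a two-hour movie
av1_mbps = 5.0              # assumed AV1 bitrate for decent 1080p
av2_mbps = av1_mbps * 0.7   # ~30% less bitrate for the same quality

print(f"AV1: {file_size_gb(av1_mbps, movie_s):.2f} GB")  # 4.50 GB
print(f"AV2: {file_size_gb(av2_mbps, movie_s):.2f} GB")  # 3.15 GB
```

Roughly 1.35 GB back per movie at those assumed settings, which adds up fast across a big library.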
https://aomedia.org/press-releases/AOMedia-Announces-Year-End-Launch-of-Next-Generation-Video-Codec-AV2-on-10th-Anniversary/
AV2 vs AV1: What the Next-Gen Video Codec Brings to the Table
Or an older but more comprehensive article: AV1 vs AV2 Video Codec: 7 Key Differences You Need to Know. Its key takeaways:
Ah, but split-screen delivery may mean Plex content on an Apple Vision Pro or something VR-related, right?
So AV2 in a homelab seems grossly inappropriate, as this looks geared more toward server-farm scale. Odds are most homelabs aren’t rocking that kind of hardware, and there’s an energy-consumption and hardware-cost factor to weigh against how much someone is really saving compared to plain AV1, which an Intel Arc card can handle. You’d probably need a homelab with hundreds of TB of stuff to even come close to wanting AV2 hardware and the codec?
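That trade-off can be sketched as simple break-even arithmetic. Every number below is made up for illustration; plug in your own library size, drive prices, power draw, and electricity rate:

```python
# Back-of-envelope break-even: storage saved by re-encoding a library to AV2
# vs. the electricity burned doing the re-encode. All inputs are assumptions.
def breakeven(library_tb: float, savings_frac: float,
              storage_cost_per_tb: float,
              encode_kw: float, encode_hours: float, kwh_cost: float) -> float:
    """Net saving in currency units (positive = worth it on storage alone)."""
    storage_saved = library_tb * savings_frac * storage_cost_per_tb
    energy_cost = encode_kw * encode_hours * kwh_cost
    return storage_saved - energy_cost

# 100 TB library, 30% smaller files, $15/TB of disk, a 0.3 kW machine
# grinding for 2000 hours at $0.15/kWh:
net = breakeven(100, 0.30, 15, 0.3, 2000, 0.15)
print(f"net saving: ${net:.2f}")  # storage $450 - power $90 = $360
```

With a 10 TB library the same assumed numbers come out negative, which matches the intuition above: the codec jump only starts paying off at large scale (and this ignores the cost of the encoding hardware itself).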
How do you know you’re not an LLM? Can you prove it?
I don’t drink enough water, unlike AI, which “drinks” way too much water.
Ask them how many Rs are in strawberry.
there are infinitely many 👍
Solipsism FTW.
I mean, unless you’ve got a hardware accelerator (which won’t exist until years after AV2’s release, and possibly longer given its complexity), it will be measurably much worse than current codecs.
I like torturing my CPU.
Jokes aside, although it will likely inherit the slow encoding of its older sibling, I think something like av1an can help speed it up nicely. Not only that, if an SVT-AV2 shows up, that will likely speed up the process too.
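The trick av1an uses — split the video at scene cuts and encode the chunks in parallel — can be sketched like this (a toy illustration with a stand-in encode function over lists of ints, not av1an’s actual code; real workers would each spawn an encoder subprocess):

```python
# Toy sketch of chunked parallel encoding, the approach av1an takes:
# split at scene boundaries, encode each chunk concurrently, stitch in order.
from concurrent.futures import ThreadPoolExecutor

def encode_chunk(chunk: list[int]) -> list[int]:
    """Stand-in for a slow per-chunk encoder invocation."""
    return [frame * 2 for frame in chunk]  # pretend 'encoding'

def chunked_encode(frames: list[int], scene_cuts: list[int]) -> list[int]:
    # Split the frame list at scene-cut indices.
    bounds = [0, *scene_cuts, len(frames)]
    chunks = [frames[a:b] for a, b in zip(bounds, bounds[1:])]
    # Encode all chunks in parallel, then concatenate in original order.
    with ThreadPoolExecutor() as pool:
        encoded = list(pool.map(encode_chunk, chunks))
    return [f for chunk in encoded for f in chunk]

print(chunked_encode(list(range(6)), [2, 4]))  # [0, 2, 4, 6, 8, 10]
```

Because the chunks are independent after the scene split, throughput scales with worker count, which is why chunking helps so much with slow encoders.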