I’ve wanted to buy an upgrade to my RX580 for years now, but I’d really like AV1 encoding support. With OBS finally supporting AV1 on all platforms (?), this actually makes sense. But I’m once again reminded how bad the used market for GPUs is in my country atm, so I’ll wait for a while longer.
Now we just have to wait until platforms like Twitch support the codec too. It’ll be a huge leap when they do.
YouTube already has it; I wouldn’t hold my breath for Twitch. They still don’t support H.265, and it’s not like that’s brand new or anything.
Isn’t that because the codec requires royalties? Half the reason there’s such a big push towards AV1 instead of H.265 is that no royalties are involved.
That’s because H.265 is patent encumbered. Firefox doesn’t support H.265 at all and Chrome only supports it if the hardware does. In order to accept H.265 input from streamers, Twitch would basically have to pony up the compute resources for full-res realtime transcoding of every H.265 stream to H.264 – either that or put up with a lot of bad press surrounding people not being able to stream at full res anymore.
AV1 would introduce a similar hardware requirement, because not everyone even has AV1 decode, and even fewer have AV1 encode. AV1 encode is only available to people with latest-generation GPUs, which rules out anyone buying previous-generation stuff (so no AMD RX 6000 or older, no Nvidia RTX 3000 or older, and no Intel products other than Arc).
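To put that transcode burden in perspective, here’s a rough sketch of the per-stream job Twitch would be signing up for: a realtime H.265-to-H.264 re-encode with ffmpeg, driven from Python. File names and bitrates are made up; the point is that this runs once, at full res, for every incoming H.265 stream.

```python
# Hypothetical sketch of server-side H.265 -> H.264 realtime transcoding.
import subprocess

cmd = [
    "ffmpeg",
    "-re",                             # read input at native frame rate (simulates a live ingest)
    "-i", "incoming_h265_stream.mp4",  # stand-in for the streamer's H.265 feed
    "-c:v", "libx264",                 # re-encode to H.264 so every browser can play it
    "-preset", "veryfast",             # realtime-friendly preset; a quality/CPU trade-off
    "-b:v", "6M", "-maxrate", "6M", "-bufsize", "12M",  # illustrative bitrate caps
    "-c:a", "copy",                    # audio passes through untouched
    "h264_out.mp4",
]
subprocess.run(cmd, check=True)
```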
All (recent) major browsers I’m aware of have software AV1 decode as standard, so the receiving end wouldn’t be a problem apart from higher CPU usage. As for encode, obviously this wouldn’t be universal – just streamers who had the computing power (hardware or software) for realtime AV1 encode would be able to take advantage of that on Twitch.
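If you want a feel for that CPU overhead, here’s a rough way to measure software AV1 decode using ffmpeg’s -benchmark flag. The sample clip is hypothetical, and this assumes an ffmpeg build with dav1d (the decoder modern browsers bundle for software AV1 playback).

```python
# Sketch: time a decode-only run of an AV1 clip, discarding the output frames.
import subprocess

result = subprocess.run(
    [
        "ffmpeg",
        "-c:v", "libdav1d",      # force software decode (assumes ffmpeg was built with dav1d)
        "-i", "sample_av1.mkv",  # hypothetical AV1 test clip
        "-benchmark",            # print CPU/wall time stats when done
        "-f", "null", "-",       # throw away the decoded frames; we only want the timing
    ],
    capture_output=True, text=True,
)
# ffmpeg prints a line like "bench: utime=...s stime=...s rtime=...s" on stderr
print([line for line in result.stderr.splitlines() if "bench" in line])
```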
The browsers have the software decoder, but not the hardware decode step.
Software decode, especially for mobile, would be battery draining, and no streaming service would realistically use it without the userbase having hardware decode support.
For PCs, AV1 hardware decode means AMD RX 6000 or newer GPUs, AMD Phoenix APUs, Nvidia RTX 3000 or newer GPUs, or 11th Gen Intel CPUs or newer (rough lookup sketch below).
For mobile, it’s only a small portion of the phones released in the past year and a half or so.
For iPhone, the list is basically the iPhone 15 Pro and Pro Max; for Apple’s other devices, things using the M3.
As long as the world has a mobile-first mindset, there’s no way they’re going to ask everyone on mobile to take a significant battery hit just for a higher-resolution stream.
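To make the PC list above concrete, here’s a toy lookup based only on what’s listed in this comment. It’s illustrative, not an exhaustive or authoritative compatibility table.

```python
# Toy lookup of the AV1 hardware *decode* support listed above.
AV1_DECODE_SUPPORT = {
    "amd_radeon_rx": 6000,       # RX 6000 series or newer
    "nvidia_geforce_rtx": 3000,  # RTX 3000 series or newer
    "intel_core_gen": 11,        # 11th Gen Intel CPUs or newer
}

def has_av1_decode(family: str, model_or_gen: int) -> bool:
    """Return True if the given family/generation meets the minimums above."""
    minimum = AV1_DECODE_SUPPORT.get(family)
    return minimum is not None and model_or_gen >= minimum

print(has_av1_decode("amd_radeon_rx", 5500))       # False: RX 5500 predates the RX 6000 series
print(has_av1_decode("nvidia_geforce_rtx", 4070))  # True
```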
Isn’t H.265 proprietary? Maybe they just didn’t want to pay license fees.
This is the best summary I could come up with:
The cross-platform OBS software that is popular with game streamers and others live-recording their desktops has finally landed support for AV1 video encoding using Linux’s Video Acceleration API (VA-API) interface.
A merge request adding AV1 support via VA-API to the OBS FFmpeg code was opened last May.
As of Tuesday evening that code was merged.
The code has been successfully tested for VA-API AV1 encoding using the Mesa drivers.
VA-API AV1 encoding is available with AMD Radeon RX 7000 series graphics and Intel Arc graphics for those with open-source Mesa driver support.
It’s unfortunate that it has taken until 2024 to get this code merged, but it’s nevertheless exciting for the next OBS feature release.
The original article contains 118 words, the summary contains 118 words. Saved 0%. I’m a bot and I’m open source!
Good try bot! You did your best
Maybe don’t put a comment if you save under 20%
Man people be hatin but thank you bot
So my RX 5500 will work, right?
Is the performance hit from streaming with this codec less noticeable?
Your GPU has a dedicated ASIC that can do the encoding simultaneously. On NVIDIA (not relevant in this case) that would be your NVENC encoder.
AMD and Intel have their own ASIC IP blocks that do encode/decode as part of the GPU “SoC”, but they don’t consume GPU compute resources (e.g. CUs). That’s how you see people already using GPU encode with OBS (non-AV1 codecs) while gaming, and it’s how people like me use Sunshine/Parsec on a host PC for “remote” gaming (mostly for remoting into a Windows machine for the one game that cannot be run on Linux nor in a VM due to anti-cheat). The only GPU resources you’re using are PCIe bandwidth and perhaps some VRAM? But I wouldn’t call it just dumping the work from the CPU onto the GPU: you have an ASIC that handles the brunt of the workload. AV1 with Sunshine has been amazing, and using it for recording my gameplay vids will hopefully be better than H.264 (lower bitrates and hence smaller file sizes).
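For the curious, here’s a minimal sketch of the same idea outside OBS: AV1 encode through VA-API with ffmpeg, driven from Python. It assumes an RX 7000 / Arc GPU with Mesa drivers, and that /dev/dri/renderD128 is the right render node on your box; the input file and bitrate are made up.

```python
# Sketch: hardware AV1 encode via VA-API, offloading the work to the GPU's ASIC.
import subprocess

cmd = [
    "ffmpeg",
    "-vaapi_device", "/dev/dri/renderD128",  # hand ffmpeg the VA-API render node
    "-i", "gameplay.mkv",                    # hypothetical capture to re-encode
    "-vf", "format=nv12,hwupload",           # upload frames to GPU memory for the encoder
    "-c:v", "av1_vaapi",                     # ffmpeg's VA-API AV1 encoder
    "-b:v", "4M",                            # illustrative bitrate; AV1 holds up at lower rates than H.264
    "gameplay_av1.mkv",
]
subprocess.run(cmd, check=True)
```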