What codec is used and how to change it?

I’m testing the Nextcloud Spreed signaling.
I’m not able to figure out which codec is being used (VP8, H.264, VP9).
Also, is it possible to change it?
I’d like to test the available codecs to see the performance differences.


VP8 is used. There is an optional setting to tell browsers to use H.264, but it does not work with the High-performance backend, due to the nature of things.

VP8 is a must-have codec per the WebRTC specs, so if one participant is on a device that only supports VP8, everyone must at least send their data in VP8. Since traffic is the biggest problem in most setups, sending only VP8 is then the best thing you can do. With the HPB, each participant doesn’t know the codec set of the other participants, only that of the server in the middle; that’s why VP8 has to be used in this case.
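If you want to verify which codec was actually negotiated, one approach is to look at the SDP (for example from chrome://webrtc-internals, or `RTCPeerConnection.localDescription.sdp` in the console). A minimal sketch of pulling the video codec names out of an SDP blob — the sample SDP below is hand-written for illustration, not real Talk output:

```javascript
// Extract the codec names listed in the video section of an SDP blob.
// Codecs appear as "a=rtpmap:<payload> <codec>/<clockrate>" lines
// under the "m=video" media description.
function videoCodecs(sdp) {
  const codecs = [];
  let inVideo = false;
  for (const line of sdp.split(/\r?\n/)) {
    if (line.startsWith('m=')) inVideo = line.startsWith('m=video');
    if (!inVideo) continue;
    const m = line.match(/^a=rtpmap:\d+ ([^/]+)\//);
    if (m && !codecs.includes(m[1])) codecs.push(m[1]);
  }
  return codecs;
}

// Hand-written sample offer (illustrative only).
const sampleSdp = [
  'v=0',
  'm=audio 9 UDP/TLS/RTP/SAVPF 111',
  'a=rtpmap:111 opus/48000/2',
  'm=video 9 UDP/TLS/RTP/SAVPF 96 98',
  'a=rtpmap:96 VP8/90000',
  'a=rtpmap:98 H264/90000',
].join('\r\n');

console.log(videoCodecs(sampleSdp)); // [ 'VP8', 'H264' ]
```

The SDP only tells you what was offered and answered; to see what is actually flowing mid-call, `RTCPeerConnection.getStats()` on the outbound/inbound RTP streams reports the codec in use.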

Any specific reason why you would like to change it?

No specific need.
I’d like to test any available option to see what’s possible after reading this article.

Thanks anyway.

Interesting article…

Brought back memories about the high-definition war: HD-DVD vs. Blu-ray; Toshiba vs. Sony, VC1 vs. H.264… AVSforum had hundreds of pages of “discussions” before the first discs were even released…

And then Blu-ray won, and despite their best efforts, it was hacked a year later.
Soon after, streaming started taking off and studios gave up on the idea of making it unhackable…

Around that time, 2-pass encoding a 100-minute high-def movie could take 10+ hours on a decent PC.
Today the same is accomplished in real time or faster…

The main point is:

  1. VP9 requires more CPU than VP8 or H.264, and there are complaints about CPU use in WebRTC with these codecs already, so VP9 won’t help alleviate that problem – only worsen it

Yes, of course; this was probably one of the explanations for why adoption was so slow for 7 years.

But the curious thing is that CPUs are now much faster and more powerful than 7 years ago (CPU power increases by about an order of magnitude every 10 years), and VP9 has a better compression rate that could reduce channel occupancy without losing quality.

However, as I said, it was only curiosity: to test whether the better compression rate is worth the higher CPU use, if it were available.
I guess for standard needs VP8 is surely sufficient.

That’s the theory. The main issue is the adoption rate. Especially in the enterprise world, older machines are still in use. My 2016 laptop had trouble from 4 videos onwards if I tried to do anything alongside the call, like LibreOffice + mail + normal browsing; on my 2019 desktop that’s barely 10% CPU usage. But we need to cover a big enough share of the target group, and so VP8 it is.