Hi.

I was trying to find a way to prevent my server from crashing every time I stream.

Basically, I have a popular movie website, and every time I release an episode I get a minimum of 30k live viewers. Whenever I pass 8k live viewers, my 20 Gbit of bandwidth becomes useless. I occasionally tried a 5 Mbps per-stream bandwidth limit to prevent the crash, but it didn’t do much. And I don’t want to rent 100 Gbit of network bandwidth every time I release an episode. So my question is: is there a way to handle 30k to 60k live viewers with only a 20 Gbit network, or do I just need to rent 100 Gbit occasionally?

Thank you!

  • klauskinski79@alien.topB · 9 months ago

    50,000 users at 20 Gbit is 400 kbit per second per user? That ain’t gonna cut it even for potato quality. The recommended bitrate for 480p is about 5 times higher, and for 720p about 10 times higher. So you maxing out at 5k users makes sense.

    Now I am astonished a single server can service 5,000 streams at the same time. That’s some scalable server software.
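    To sanity-check the math above, here’s a quick back-of-the-envelope script (the per-stream bitrates are rough ballpark figures, not measurements from OP’s setup):

```python
# Back-of-the-envelope capacity math: how many concurrent viewers
# a fixed uplink can serve at a given per-stream bitrate.
# Bitrate figures below are rough ballpark numbers, not benchmarks.

def max_viewers(uplink_gbit: float, stream_mbit: float) -> int:
    """Concurrent streams that fit in the uplink, ignoring protocol overhead."""
    return int(uplink_gbit * 1000 / stream_mbit)

uplink = 20  # Gbit/s, as in the original post

# 20 Gbit/s split across 50k viewers leaves only 0.4 Mbit/s each
per_viewer = 20 * 1000 / 50_000
print(f"{per_viewer} Mbit/s per viewer")  # 0.4 -- potato quality

# Assumed rough recommended bitrates per resolution
for label, mbit in [("480p ~2 Mbit/s", 2.0), ("720p ~4 Mbit/s", 4.0)]:
    print(f"{label}: ~{max_viewers(uplink, mbit):,} viewers max")
```

    At ~4 Mbit/s per 720p stream, 20 Gbit/s tops out around 5,000 viewers, which lines up with where OP sees things fall over.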

  • autogyrophilia@alien.topB · 9 months ago

    You can always look into implementing WebTorrent. It doesn’t work everywhere, but where it does, it works very well.

    Basically, how it works is that you have a streamable file, you make a torrent of it, and a JavaScript client in the browser is capable of downloading and streaming it.

    https://webtorrent.io/

  • globor@alien.topB · 9 months ago

    I wouldn’t expect a server crash from bandwidth limitations. A few questions: do you have metrics on bandwidth saturation and server resources? Have you confirmed that the issue is actually bandwidth?

    • lunimater@alien.topOPB · 9 months ago

      Every time the crash happens, bandwidth usage maxes out and the website stops responding.

  • Lenin_Lime@alien.topB · 9 months ago

    Use a more efficient video/audio codec. Going from basic hardware-encoded H.264 to something like x264 can increase efficiency greatly (I have no idea what your setup is). Or move to VP9/AV1/H.265.
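    As a rough illustration of what codec efficiency buys on a fixed pipe (the savings factors for VP9/H.265/AV1 versus H.264 at similar quality are common ballpark claims, not benchmarks of OP’s content):

```python
# How codec efficiency translates into viewer capacity on a fixed uplink.
# Savings factors are rough, assumed ballpark figures, not measurements.
UPLINK_MBIT = 20_000  # 20 Gbit/s uplink, as in the original post

h264_720p = 4.0  # Mbit/s, rough assumed bitrate for 720p H.264
codecs = {
    "H.264": 1.00,  # baseline
    "VP9":   0.65,  # ~35% smaller at similar quality (assumed)
    "H.265": 0.50,  # ~50% smaller (assumed)
    "AV1":   0.50,  # ~50% smaller (assumed)
}

for name, factor in codecs.items():
    bitrate = h264_720p * factor
    print(f"{name}: {bitrate:.1f} Mbit/s -> ~{int(UPLINK_MBIT / bitrate):,} viewers")
```

    Halving the bitrate roughly doubles how many viewers fit in the same 20 Gbit, which is the whole appeal of switching codecs before renting more bandwidth.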

    • vasveritas@alien.topB · 9 months ago

      H.264 and x264 are the same thing: H.264 is the name of the math standard, and x264 is open-source software that implements that math.

  • notjfd@alien.topB · 9 months ago

    Out-of-the-box solution: implement a queue on your website. As soon as it hits 7k viewers, new viewers wait in a queue until older sessions close. Popular MMOs do the same thing.
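    A minimal sketch of that admission-queue idea (the 7k cap and the session handling are illustrative, not a production implementation):

```python
from collections import deque


class AdmissionQueue:
    """Caps concurrent viewers; extras wait in FIFO order."""

    def __init__(self, max_active: int = 7_000):
        self.max_active = max_active
        self.active: set[str] = set()
        self.waiting: deque[str] = deque()

    def join(self, session_id: str) -> bool:
        """Return True if admitted immediately, False if queued."""
        if len(self.active) < self.max_active:
            self.active.add(session_id)
            return True
        self.waiting.append(session_id)
        return False

    def leave(self, session_id: str) -> None:
        """Free a slot and admit the longest-waiting viewer, if any."""
        self.active.discard(session_id)
        if self.waiting and len(self.active) < self.max_active:
            self.active.add(self.waiting.popleft())
```

    In a real deployment you’d also want session timeouts, so viewers who close the tab without a clean disconnect don’t hold a slot forever.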

  • Syntaxvgm@alien.topB · 9 months ago

    On the live video delivery side, this is definitely above my expertise, though it sounds like a fun problem to have. Suffering from success and all that.

    The one bit I can contribute: automated deployment is your friend. Many providers offer very easy scripted deployment. I don’t know what kind of compute power you need to push, but a VPS with a (nominal) 1 Gb pipe is very cheap, and a lot of providers allow hourly billing for bigger VPSes, some even for bare metal. If you need a lot of compute, go bare metal: some providers will not take kindly to you pounding a shared compute instance at 100% CPU.

    What you should look into is automated deployment, maybe k8s, maybe something else. Before you start a stream that might have that many viewers, you can just deploy, say, 40 instances. You don’t even have to worry about bandwidth caps with most providers since you’re only running for a short time. When your stream is over, all the instances are destroyed and billing stops at the nearest hour.

    You can seriously do what you need for a couple of hours for a few bucks if you don’t need much compute, more if you do, but still very reasonable. Non-compute-heavy instances with 1 Gb/s connections go for less than a cent an hour, and you can spread them across multiple providers to cover more geographic locations.

    The one caution I’d give with this method: before you invest time in making a platform work for your setup, grab an instance manually and benchmark the connection. Some are nominally 1 Gb/s but throttle during peak times, so test over time too. Know what you’re dealing with and how many instances you need.
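    To size that kind of burst deployment, a quick calculation helps (the per-stream bitrate and per-instance throughput here are placeholder assumptions to replace with your own benchmark numbers):

```python
import math


def instances_needed(viewers: int, stream_mbit: float,
                     instance_gbit: float, headroom: float = 0.7) -> int:
    """VPS count needed to serve `viewers`, using only `headroom`
    of each instance's nominal pipe as margin for throttling/overhead."""
    usable_mbit = instance_gbit * 1000 * headroom
    return math.ceil(viewers * stream_mbit / usable_mbit)


# e.g. 60k viewers at an assumed ~3 Mbit/s each,
# on nominal 1 Gbit/s instances used at 70% capacity
print(instances_needed(60_000, 3.0, 1.0))
```

    Swap in the throughput you actually measured per instance; if peak-time throttling cuts a “1 Gb/s” box to half that, your instance count doubles.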