Designing infrastructure for video streaming – software

By Patrycja Dziedzic, October 8, 2021

If we want to design an efficient streaming infrastructure for our website or application, we must first understand what it should consist of. The breakdown I present here concerns primarily the software layer, which in turn is scaled across the hardware layer; we will discuss the hardware layer later.

Streaming application

The entire streaming process begins with the streaming application, installed on an ordinary phone, a PC/Mac computer, or a professional workstation. For some protocols, such as WebRTC, the streaming application can simply be the web browser itself. There are also many programs that offer the required functionality, such as OBS (Open Broadcaster) and XSplit, or, on mobile devices, Larix. The job of the streaming application is to process the video signal and send it to the streaming server via a dedicated protocol.
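As a rough illustration of the WebRTC case, the sketch below publishes a camera stream straight from the browser. The ingest URL and the simple SDP offer/answer exchange over HTTP (WHIP-style) are assumptions; the actual signaling mechanism depends on the streaming server in use.

```typescript
// Minimal sketch: publishing a camera stream from the browser via WebRTC.
// INGEST_URL and the SDP-over-HTTP exchange are assumptions -- the real
// signaling flow depends on the streaming server you are targeting.
const INGEST_URL = "https://streaming.example.com/ingest";

async function startPublishing(): Promise<void> {
  // Capture camera and microphone from the local device.
  const media = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  const pc = new RTCPeerConnection();
  media.getTracks().forEach((track) => pc.addTrack(track, media));

  // Create an SDP offer and send it to the server; its answer completes the handshake.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  const response = await fetch(INGEST_URL, {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: offer.sdp,
  });
  await pc.setRemoteDescription({ type: "answer", sdp: await response.text() });
}
```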

Streaming Server

The second element of the infrastructure is, of course, the streaming server. Its task is to receive the signal from the streaming application. Such a server may fulfill several roles itself or distribute them among other instances. One example is transcoding: converting the original source signal into additional variants with lower resolution and bitrate. This process is essential if we want to offer high-quality material, because not all viewers will be able to watch it in its original form. For video material to count as FullHD, it should be broadcast at 1920×1080 resolution, which requires roughly 5-7 Megabits per second of download speed on the side of the target user (our viewer). On fiber-optic Internet such values are not uncommon, but a user in a remote area, on a mobile connection far from the base station, may not reach even 2 Megabits per second.
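A transcoder therefore typically produces a small "ladder" of renditions. The sketch below shows one hypothetical ladder; the exact resolutions and bitrates are illustrative assumptions, not values prescribed by any particular server.

```typescript
// A hypothetical rendition ladder: the source plus lower-quality variants the
// transcoder would produce so viewers on slow links can still watch.
interface Rendition {
  name: string;
  width: number;
  height: number;
  videoBitrateKbps: number;
}

const ladder: Rendition[] = [
  { name: "source", width: 1920, height: 1080, videoBitrateKbps: 6000 },
  { name: "720p",   width: 1280, height: 720,  videoBitrateKbps: 3000 },
  { name: "360p",   width: 640,  height: 360,  videoBitrateKbps: 1500 },
];
```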

Transcoding

In this case, we can scale the stream down to, for example, 640×360 at roughly 1.5 Megabits per second, which will let such a viewer watch the stream reasonably comfortably. Unfortunately, nothing comes for free. Transcoding is very demanding in terms of computing power: each video frame has to be decoded, downscaled, and re-encoded, and even high-performance multi-socket, multi-core machines will not manage more than a handful of such processes simultaneously. If we expect only a few simultaneous streams, this is not a big problem, but if we assume there will be hundreds of them, it becomes a huge challenge.
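A single transcode step of this kind might look like the following Node.js sketch, which shells out to the ffmpeg CLI. The input and output URLs are placeholders, and this is only one way to run a transcode; dedicated streaming servers usually handle this step internally.

```typescript
// Minimal sketch of one transcode step using the ffmpeg CLI from Node.js.
// The stream URLs are placeholders; ffmpeg must be installed on the machine.
import { spawn } from "node:child_process";

function transcodeRendition(inputUrl: string, outputUrl: string): void {
  const args = [
    "-i", inputUrl,           // original (e.g. 1920x1080) stream
    "-vf", "scale=640:360",   // downscale every decoded frame
    "-c:v", "libx264",        // re-encode video at a lower bitrate
    "-b:v", "1500k",
    "-c:a", "aac",            // re-encode audio as well
    "-b:a", "96k",
    "-f", "flv",              // container for an RTMP-style output
    outputUrl,
  ];
  const ffmpeg = spawn("ffmpeg", args, { stdio: "inherit" });
  ffmpeg.on("exit", (code) => console.log(`ffmpeg exited with code ${code}`));
}

// Example: produce a 360p variant next to the original stream.
transcodeRendition("rtmp://localhost/live/source", "rtmp://localhost/live/source_360p");
```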

End server

The end (edge) servers are the third element of the streaming infrastructure. Once the stream has been received from the streaming application and, optionally, transcoded into several qualities, we need to distribute the signal to end-users. The task of the edge servers is to deliver the stream directly to viewers, first copying the signal from the main streaming server. The number of these servers effectively defines the maximum throughput of the entire infrastructure.
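To get a feel for that throughput limit, here is a back-of-the-envelope estimate. All numbers (viewer bitrate, edge uplink, audience size) are illustrative assumptions.

```typescript
// Rough capacity estimate for the edge layer -- illustrative numbers only.
const viewerBitrateMbps = 5;          // FullHD rendition, as discussed above
const edgeServerUplinkMbps = 10_000;  // assume one edge server has a 10 Gbps uplink
const expectedViewers = 20_000;

const viewersPerEdge = Math.floor(edgeServerUplinkMbps / viewerBitrateMbps); // 2000
const edgeServersNeeded = Math.ceil(expectedViewers / viewersPerEdge);       // 10

console.log(`${viewersPerEdge} viewers per edge server, ${edgeServersNeeded} edge servers needed`);
```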

Single server streaming architecture

The vast majority of streaming servers, such as Wowza, RED5, or the Storm Streaming Server, offer all of the above functionality within a single application. The configuration of each instance determines whether it receives the stream, transcodes it, delivers it to viewers, or performs all of these tasks at once. This is a very important feature of this kind of software, because it allows us to start with literally a single server and smoothly scale the whole setup to serve tens of thousands of viewers. You can read about how to configure scaling in the Storm Streaming Server here.
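Conceptually, assigning roles to instances could be modelled like the hypothetical sketch below. This is not the actual configuration format of Wowza, RED5, or the Storm Streaming Server; it only illustrates how one binary can take on a single role or all of them at once.

```typescript
// Hypothetical role assignment for streaming server instances -- an
// illustration of the idea, not any real server's configuration schema.
type Role = "ingest" | "transcoder" | "edge";

interface InstanceConfig {
  name: string;
  roles: Role[];       // a single server can hold every role at once
  originUrl?: string;  // edges pull the stream from the origin/ingest server
}

// One server doing everything -- how a small service might start out.
const singleServer: InstanceConfig = {
  name: "all-in-one",
  roles: ["ingest", "transcoder", "edge"],
};

// A dedicated edge added later, pulling from the origin (placeholder URL).
const edgeOnly: InstanceConfig = {
  name: "edge-01",
  roles: ["edge"],
  originUrl: "rtmp://origin.example.com/live",
};
```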


Tags: end server, larix, obs, open broadcaster, red5, transcoding, WebRTC, wowza, xsplit