A Conditional-Probability-Distribution Model for Bandwidth
Estimation with Application in Live Video Streaming
Weijia Zheng1
1Department of Information Engineering, CUHK
1Department of Mathematics, CUHK
1wjzheng@link.cuhk.edu.hk
Abstract
The experience of live video streaming can be improved if the video uploader has more accurate knowledge of the future available bandwidth, since such knowledge tells the uploader what sizes to encode the frames to in an ever-changing network. Researchers have developed several throughput-prediction algorithms in the literature, some of which are simple and hence practical. However, a limitation remains: most current bandwidth-prediction methods produce a single value, or point estimate, of the future bandwidth. In many practical scenarios, however, it is desirable to control performance toward given targets, e.g., keeping the video delivery rate above a target percentage, which cannot easily be achieved with most current methods.
In this work, we propose using a probability distribution to model future bandwidth. Specifically, we model future bandwidth using past data-transfer measurements and then derive a probability model for use in the application. This turns parameter selection in the application into a probabilistic decision, so that a given target performance can be achieved in the long run. Within our model, we use the conditional-probability method to correlate past and future bandwidth and thereby further improve estimation performance.
Keywords
Network throughput estimation, conditional probability,
relative frequency, live video streaming, empirical method
1 Introduction
With the development of smartphones and high-speed mobile data networks such as 3G and 4G/LTE, live video streaming has long been part of our lives for entertainment and casual use. In recent years, it has played a vital role in the workplace as well. As we have all experienced first-hand, the year 2020 witnessed a significant increase [1] in the use of Zoom for remote work, distance education, and online social interaction.
Given the importance of video streaming and the high peak bandwidth of today's mobile data networks, bandwidth fluctuation remains a challenge: it is unpredictable by its wireless nature, and it may degrade the clients' quality of experience. Several prediction methods have been established or adopted in the literature, such as the arithmetic mean (AM), multiple linear regression (MLR), ARIMA, and LSTM. However, most of them predict a point estimate of the future bandwidth, or give a confidence interval under further assumptions, whose accuracy is hard to guarantee to a specified level in practice.
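For concreteness, the simple point-estimate predictors named above, AM and a sliding-window linear fit in the spirit of MLR, might be sketched as follows; the window length and the throughput history are illustrative assumptions, not values from this paper:

```python
# Two simple point-estimate bandwidth predictors: the arithmetic mean
# (AM) and a least-squares linear trend over a sliding window of past
# throughput samples. The window size is an illustrative choice.

def predict_am(history, window=5):
    """Predict the next bandwidth as the mean of the last `window` samples."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def predict_linear(history, window=5):
    """Predict the next bandwidth by extrapolating a least-squares line."""
    recent = history[-window:]
    n = len(recent)
    x_mean = (n - 1) / 2
    y_mean = sum(recent) / n
    cov = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(recent))
    var = sum((x - x_mean) ** 2 for x in range(n))
    slope = cov / var if var else 0.0
    # Extrapolate one step beyond the end of the window.
    return y_mean + slope * (n - x_mean)

# Hypothetical throughput history in Mbps:
history = [4.2, 3.8, 4.5, 4.1, 3.9, 4.3]
print(predict_am(history))      # mean of the last five samples
print(predict_linear(history))  # one-step linear extrapolation
```

Both return a single number per prediction, which is exactly the limitation discussed above: a point estimate carries no information about how likely the realized bandwidth is to fall below it.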
Motivated by the idea of using an empirical conditional probability for prediction in the financial [2] and transportation [3] fields, this work contributes a conditional-probability model for bandwidth prediction. In addition, a simulator of the uploading part of live video streaming, referencing the one in [4], is implemented to demonstrate the feasibility of the method through simulation. To mimic a real network environment, the simulator uses packet-level trace data with timestamps measured from real-world 3HK 4G network sources.
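To illustrate the relative-frequency idea, a minimal empirical conditional-probability model might bucket the most recent bandwidth sample into bins and estimate the distribution of the next sample within each bin; the bin width, target probability, and trace below are illustrative assumptions, not the paper's actual construction:

```python
# Sketch of an empirical conditional-probability bandwidth model built
# from relative frequencies: condition on the bin of the current
# bandwidth sample and look at the empirical distribution of the next.
from collections import defaultdict

def build_model(trace, bin_width=1.0):
    """Map each past-bandwidth bin to the sorted bandwidths observed next."""
    model = defaultdict(list)
    for prev, nxt in zip(trace, trace[1:]):
        model[int(prev // bin_width)].append(nxt)
    for samples in model.values():
        samples.sort()
    return model

def conditional_quantile(model, current, p, bin_width=1.0):
    """A rate b with empirical P(next >= b | current's bin) >= p."""
    samples = model.get(int(current // bin_width))
    if not samples:
        return None  # no past observations to condition on in this bin
    # Index of the (1 - p) empirical quantile: a fraction p of the
    # observed next-step bandwidths lie at or above this sample.
    idx = int((1 - p) * len(samples))
    return samples[min(idx, len(samples) - 1)]

# Hypothetical throughput trace (Mbps):
trace = [3.2, 3.6, 3.1, 3.8, 3.4, 3.0, 3.5, 3.9, 3.3, 3.7]
model = build_model(trace)
# A rate that, by relative frequency, the next step sustains 90% of the time:
safe_rate = conditional_quantile(model, current=3.4, p=0.9)
```

Encoding at `safe_rate` then meets the target delivery fraction in the long run, provided the conditional relative frequencies in the trace remain stable.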
The rest of the paper is organized as follows. Section 2 introduces the problem setting. Some analyses and related work are presented in Section 3. In Section 4 we derive the proposed encoding scheme in detail. Numerical results on different network environments are shown in Section 5, before conclusions are outlined in Section 6.
In addition, unless otherwise stated, all bitrate variables in this paper are in Mbps and all time variables are in seconds.
2 Problem Background
In our study, we consider the uplink part exclusively, with some further simplifications; the downlink part of the streaming process is not modeled in the simulator. The general scenario can be described as follows.
An uploader generates frames one by one with an equal time difference (i.e., 1/FPS second) in between. Assuming TCP is used, once a frame is sent from the uploader side, the uploader no longer considers it and only takes care of later frames. The uploader then starts transmitting the newest possible frame as buffer time
arXiv:2210.01652v1 [cs.MM] 16 Apr 2022
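The send-and-forget uploader behaviour described above can be sketched as a tiny simulation loop; the frame sizes, link rate, and FPS below are illustrative assumptions, not values from this paper:

```python
# Minimal sketch of the uploader side: frames are generated every
# 1/fps seconds, and once a frame is handed to the network the
# uploader forgets it and always moves on to the newest frame
# generated so far, skipping frames that became stale meanwhile.

def run_uploader(frame_sizes_mbit, send_rate_mbps, fps=30):
    """Simulate send-and-forget uploading on a fixed-rate link.

    Returns (frame index, finish time) for each frame actually sent.
    """
    n = len(frame_sizes_mbit)
    clock = 0.0
    sent = []
    next_frame = 0
    while next_frame < n:
        # If the link is idle, wait for the next unsent frame to exist.
        clock = max(clock, next_frame / fps)
        # Transmit the newest frame generated by now, skipping older ones.
        newest = min(n - 1, max(next_frame, int(clock * fps)))
        clock += frame_sizes_mbit[newest] / send_rate_mbps
        sent.append((newest, round(clock, 4)))
        next_frame = newest + 1
    return sent

# Five 0.1-Mbit frames over a 2 Mbps link: each transmission takes
# 0.05 s, longer than the 1/30 s generation interval, so the uploader
# falls behind and skips a stale frame rather than sending it late.
print(run_uploader([0.1] * 5, 2.0))
```

When the link is faster than the frame generation rate, every frame is sent; when it is slower, older frames are dropped in favour of the newest one, mirroring the behaviour described in the scenario.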