SHANNON ENTROPY: AN ECONOPHYSICAL APPROACH TO CRYPTOCURRENCY PORTFOLIOS
Noé Rodriguez-Rodriguez
Facultad de Ciencias
Universidad Nacional Autónoma de México
Ciudad de México
noe.rdz@ciencias.unam.mx
Octavio Miramontes
Departamento de Sistemas Complejos
Instituto de Física
Universidad Nacional Autónoma de México
Ciudad de México
octavio@fisica.unam.mx
ABSTRACT
Cryptocurrency markets have attracted much interest from global investors because of their novelty, wide online availability, increasing capitalization and potential profits. In the econophysics tradition we show that many of the most widely available cryptocurrencies have return statistics that do not follow Gaussian distributions but heavy-tailed distributions instead. Entropy measures are also applied, showing that portfolio diversification is a reasonable practice for decreasing return uncertainty.
Keywords Cryptocurrencies · econophysics · entropy · portfolio uncertainty · heavy-tailed distributions
1 Introduction
Financial mathematics and financial economics are two scientific disciplines that together form the backbone of modern financial theory. Both disciplines make extensive use of their own models and theories, coming from mathematics and traditional economics, and are aimed mostly at the predictive analysis of markets. On the other hand, physics historically had an important influence on economic theory when the formalism of thermal equilibrium inspired the development of the theory of economic general equilibrium [1]. In recent years there has been renewed interest in the view that economic phenomena share many aspects with physical systems and are therefore amenable to study under the science of complex systems [2, 3, 4] and statistical physics [5], giving rise to novel research fields such as econophysics [6].
One of the first successful econophysical findings was the discovery that stock market fluctuations are not Gaussian but heavy-tailed. Mandelbrot came to this conclusion when investigating cotton prices [7], and later Mantegna when characterizing the Milan stock market [8]. This discovery forced a serious rethinking of the view introduced by Louis Bachelier in the early twentieth century, stating that price variations are random and statistically independent [9]; while historically important, that view is not entirely appropriate. Prices are the outcome of the concurrent non-linear action of many economic agents [10], and so the fluctuations are usually correlated, meaning that the process is not a random Brownian walk at all. In this article we explore fluctuations in the cryptocurrency market and show, for the first time, that all of the most common coins analyzed fail a Shapiro–Wilk normality test and are best explained by heavy-tailed statistical models.
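As a concrete illustration of such a test, the following is a minimal sketch, not the code used for this paper, of how daily returns can be checked for normality with a Shapiro–Wilk test in scipy; the file name and column names are hypothetical.

```python
import pandas as pd
from scipy import stats

# Hypothetical CSV of daily open/close prices for one coin
prices = pd.read_csv("btc_daily.csv")  # assumed columns: "open", "close"

# Daily returns computed from opening and closing prices
returns = (prices["close"] - prices["open"]) / prices["open"]

# Shapiro-Wilk test: the null hypothesis is that the sample is Gaussian
stat, p_value = stats.shapiro(returns)
print(f"W = {stat:.4f}, p = {p_value:.2e}")
if p_value < 0.05:
    print("Normality rejected at the 5% level")
```

A p-value below the chosen significance level rejects normality, which is the outcome reported here for every coin analyzed.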
On the other hand, entropy is a fundamental quantity in physics, both in thermodynamics and in information theory. It is related to disorder and diversity and has previously been used in finance, specifically in portfolio selection theory [11]. The acceptance of entropy in economic theory has historically met some reticence, for instance that expressed by Paul Samuelson, a seminal figure in twentieth-century economics [12]. However, it has been gaining wide acceptance lately [13, 14], especially with the development of non-equilibrium thermodynamics and complex systems theory [15, 16, 17, 18]. It is well known that portfolio diversification is a good strategy to minimize specific risks, and so we explore the use of entropy in the cryptocurrency market as a measure of return uncertainty and risk minimization [11, 19, 17].
2 Methods, data and analysis
In this article we used the historical daily prices of 18 cryptocurrencies, as provided by CryptoDataDownload [20], spanning at least 3 years. These are: Basic Attention Token (BAT), Bitcoin Cash (BCH), Bitcoin (BTC), Dai (DAI), Eidoo (EDO), Eos (EOS), Ethereum Classic (ETC), Ethereum (ETH), Metaverse ETP (ETP), Litecoin (LTC), Neo (NEO), OMG Network (OMG), Tron (TRX), Stellar (XLM), Monero (XMR), Verge (XVG), Ripple (XRP), and Zcash (ZEC) [21, 22].
The opening and closing prices of the cryptocurrencies, quoted in dollars, from 10/16/2018 to 12/31/2021 were used to calculate daily returns, giving a total of 1172 observations per cryptocurrency. First, the distributions of the daily returns of each cryptocurrency were statistically characterized through normality tests and parametric fits of heavy-tailed distributions. Subsequently, an analysis was carried out in which entropy functions are derived similarly as in Dionisio et al. (2006) [11], Ormos and Zibriczky (2014) [17] and Mahmoud and Naoui (2017) [23]. From the set of 18 cryptocurrencies, assets were randomly selected to compose investment portfolios, where the only premise was that the proportion invested in each asset is $1/N$, with $N$ the number of assets in the portfolio. To compare entropy to standard deviation consistently, the normal entropy was used since it is a function of the variance.
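The portfolio construction just described can be sketched as follows; this is our reconstruction for illustration, not the paper's code, and the data frame layout and file name are assumptions.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)

def random_equal_weight_portfolio(returns: pd.DataFrame, n_assets: int) -> pd.Series:
    """Select n_assets coins at random and invest a fraction 1/N in each.

    `returns` is assumed to hold daily returns, one column per coin.
    With equal weights the daily portfolio return is just the row-wise
    mean of the selected columns.
    """
    chosen = rng.choice(returns.columns.to_numpy(), size=n_assets, replace=False)
    return returns[list(chosen)].mean(axis=1)

# Hypothetical usage:
# returns = pd.read_csv("daily_returns.csv", index_col=0)
# port = random_equal_weight_portfolio(returns, n_assets=5)
```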
In the following subsections we review some useful concepts regarding entropy functions: the discrete entropy function, the continuous entropy function, entropy as a measure of uncertainty, the comparison between entropy and variance, and investment portfolios.
2.1 Discrete entropy function
Let $X$ be a discrete random variable, $\{A_1, A_2, A_3, \ldots, A_n\}$ a set of possible events and $p_X(x_i) = \Pr(X = A_i)$ the corresponding probabilities, with $p_X(x_i) \geq 0$ and $\sum_{i=1}^{n} p_X(x_i) = 1$. The generalized discrete entropy function or Rényi entropy [24, 17] for the variable $X$ is defined as

$$H_\alpha(X) = \frac{1}{1-\alpha} \log\left( \sum_{i=1}^{n} p_X(x_i)^\alpha \right) \qquad (1)$$
where $\alpha$ is the order of the entropy and $\alpha \geq 0$. This order can be considered as a bias parameter: $\alpha < 1$ favors rare events and $\alpha > 1$ favors common events [25]. The base of the logarithm is 2.
The value $\alpha = 1$ is a special case of the generalized entropy that assumes ergodicity and independence, which the generalized case does not. However, substituting $\alpha = 1$ into Eq. (1) results in division by zero. By means of L'Hôpital's rule it can be shown that, as $\alpha$ tends to 1, we obtain the Shannon entropy

$$H_1(X) = -\sum_{i=1}^{n} p_X(x_i) \log\big(p_X(x_i)\big) \qquad (2)$$
Shannon entropy produces exponential equilibrium distributions, while generalized entropy produces power law
distributions.
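To make Eqs. (1) and (2) concrete, here is a minimal sketch, written for this illustration rather than taken from the paper, of both entropies for an empirical probability vector:

```python
import numpy as np

def renyi_entropy(p, alpha: float) -> float:
    """Rényi entropy of order alpha in bits (base-2 logarithm), Eq. (1).

    As alpha -> 1 the formula degenerates to 0/0; the L'Hopital limit
    is the Shannon entropy of Eq. (2), handled explicitly here.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # zero-probability events contribute nothing
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

# For a uniform distribution over 8 events every order gives log2(8) = 3 bits
p_uniform = np.full(8, 1 / 8)
print([renyi_entropy(p_uniform, a) for a in (0.5, 1.0, 2.0)])
```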
2.2 Continuous entropy function
Let $X$ be a continuous random variable taking values in $\mathbb{R}$ and let $p_X(x)$ be the density function of the random variable. The continuous entropy is defined as

$$H_\alpha(X) = \frac{1}{1-\alpha} \ln \int p_X(x)^\alpha \, dx \qquad (3)$$

Note that the logarithm bases of Eqs. (1) and (3) are different. Although the entropy depends on the base, it can be shown that the value of the entropy changes only by a constant coefficient for different bases.

When $\alpha = 1$, we have the Shannon entropy for the continuous case

$$H_1(X) = -\int p_X(x) \ln\big(p_X(x)\big) \, dx \qquad (4)$$
The properties of discrete and differential entropy are similar. The differences are that the discrete entropy is invariant under changes of variable while the continuous entropy is not necessarily so; furthermore, the continuous entropy can take negative values.
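A worked example of Eq. (4), relevant to the methods above where the normal entropy serves as a variance-based benchmark: for a Gaussian density $p_X(x) = (2\pi\sigma^2)^{-1/2} e^{-(x-\mu)^2/2\sigma^2}$, the integral evaluates in closed form (a standard result),

$$H_1(X) = \int p_X(x)\left[\frac{1}{2}\ln(2\pi\sigma^2) + \frac{(x-\mu)^2}{2\sigma^2}\right] dx = \frac{1}{2}\ln(2\pi\sigma^2) + \frac{1}{2} = \frac{1}{2}\ln\!\left(2\pi e\,\sigma^2\right),$$

so the Shannon entropy of a normal random variable is a monotone function of its variance alone, which is what makes it directly comparable to the standard deviation.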
2.3 Entropy as a measure of uncertainty
According to Shannon (1948) [26], an uncertainty measure $H(p_X(x_1), p_X(x_2), \ldots, p_X(x_n))$ must satisfy:

1. $H$ must be continuous in the $p_X(x_i)$, with $i = 1, \ldots, n$.
2. If $p_X(x_i) = \frac{1}{n}$, $H$ must be monotonically increasing as a function of $n$.
3. If an option is split into two successive options, the original $H$ must be the weighted sum of the individual values of $H$.

Shannon showed that the only measure satisfying all these properties is Eq. (2) multiplied by a positive constant (the constant just sets the unit of measure). Among the properties that make it a good choice of uncertainty measure are the following (a numerical check is sketched after the list):
1. $H(X) = 0$ if and only if all but one of the $p_X(x_i)$ are zero.
2. When $p_X(x_i) = \frac{1}{n}$, i.e. when the discrete probability distribution is uniform, $H(X)$ is maximal and equal to $\log(n)$.
3. $H(X, Y) \leq H(X) + H(Y)$, where equality holds if and only if $X$ and $Y$ are statistically independent, i.e. $p(x_i, y_j) = p(x_i)\,p(y_j)$.
4. Any change towards the equalization of the probabilities $p_X(x_i)$ increases $H$.
5. $H(X, Y) = H(X) + H(Y|X)$, so the uncertainty of the joint event $(X, Y)$ is the uncertainty of $X$ plus the uncertainty of $Y$ when $X$ is known.
6. $H(Y) \geq H(Y|X)$, which implies that the uncertainty of $Y$ is never increased by knowledge of $X$: it decreases unless $X$ and $Y$ are independent, in which case it is unchanged.
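The following sketch, with a made-up joint distribution chosen only for illustration, verifies properties 3, 5 and 6 numerically:

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits of a probability vector, Eq. (2)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical joint distribution of two binary variables X and Y
joint = np.array([[0.30, 0.20],
                  [0.10, 0.40]])
px, py = joint.sum(axis=1), joint.sum(axis=0)  # marginals

h_joint = shannon(joint.ravel())
print(h_joint <= shannon(px) + shannon(py))    # property 3: True
h_y_given_x = h_joint - shannon(px)            # property 5 rearranged
print(shannon(py) >= h_y_given_x)              # property 6: True
```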
2.4 Comparison between Entropy and Variance
Ebrahimi et al. (1999) [27] showed that entropy can be related to higher-order moments of a distribution, and thus it can offer a better characterization of $p_X(x)$ because it uses more information about the probability distribution than the variance (which is only related to the second moment of a probability distribution).

The entropy measures the disparity of the density $p_X(x)$ from the uniform distribution. That is, it measures uncertainty in the sense of using $p_X(x)$ instead of the uniform distribution [23], whereas the variance measures an average of distances from the mean of the probability distribution. According to Ebrahimi et al. [27], both measures reflect concentration, but they use different metrics: the variance measures the concentration around the mean, while the entropy measures the diffusion of the density regardless of the location of the concentration. Statistically speaking, entropy is not a mean-centered measure but takes into account the entire empirical distribution without concentrating on a specific moment; in this way the entire distribution of returns is considered without focusing on one particular moment [28]. The discrete entropy is positive and invariant under transformations, but the variance is not. In the continuous case neither the entropy nor the variance is invariant under one-to-one transformations [11, 23]. According to Pele et al. (2017) [29], entropy is strongly related to the tails of a distribution; this feature is important for distributions with heavy tails or with an infinite second-order moment, where the variance is no longer informative. Furthermore, the entropy can be estimated for any distribution, without prior knowledge of its functional form. These authors found that heavy-tailed distributions generate low entropy levels, while light-tailed distributions generate high entropy values.
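This last point can be checked directly for two textbook densities; the following sketch, an illustration under our own choice of distributions rather than an analysis from the paper, compares the differential entropy of a Gaussian with that of a heavy-tailed Student-t rescaled to the same unit variance:

```python
import numpy as np
from scipy import stats

df = 3.0
# A Student-t with df degrees of freedom has variance df/(df-2);
# rescale it so both distributions have unit variance.
t_scale = 1.0 / np.sqrt(df / (df - 2.0))

h_normal = stats.norm.entropy(loc=0.0, scale=1.0)    # 0.5*ln(2*pi*e) ≈ 1.419 nats
h_student = stats.t.entropy(df, loc=0.0, scale=t_scale)

print(h_normal, h_student)  # ≈ 1.419 vs ≈ 1.224: the heavy-tailed law has lower entropy
```

Since the Gaussian maximizes entropy at fixed variance, any heavy-tailed alternative with the same variance necessarily has lower differential entropy, in line with the observation above.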
2.5 Investment Portfolios
A portfolio, or investment portfolio, is simply a collection of assets, characterized by the value invested in each asset. Let $w_i$ be the fraction invested in asset $i$, with $i = 1, 2, \ldots, n$; the required constraint is that the fractions sum to one, $\sum_{i=1}^{n} w_i = 1$.