COMPARING PERMUTATION ENTROPY ...

COMPARING PERMUTATION ENTROPY FUNCTIONS TO DETECT STRUCTURAL CHANGES IN TIME SERIES

J.S. CÁNOVAS, G. GARCÍA-CLEMENTE AND M. MUÑOZ-GUILLERMO

Abstract. Entropy can be taken as a measure of complex dynamical behavior. In this paper, we consider different entropy functions together with permutation symbolic dynamics and apply them to find structural changes in time series. We analyze which entropy functions are more suitable to show changes in simulated time series where the structural changes are known. Applications to real seismic data and to economic data series illustrate how this type of tool can be used.

1. Introduction

Different measures of complexity have been used to compare time series and to distinguish between regular and chaotic behavior. The complexity of heart and brain data can distinguish healthy and sick subjects and sometimes even predict heart attacks or epileptic seizures, [4, 13]. Other natural processes, like earthquakes, are complex systems by nature: the number of variables to take into account is large enough to make modeling them, as well as predicting their behavior, difficult. Entropy-based indicators are useful to study systems that evolve in time with some degree of disorder or chaoticity, and time series often appear as the only source of information when other approaches are not possible. Measuring the degree of disorder of such systems is a non-trivial problem. In this frame, permutation entropy can be used as a measure of the chaoticity of the system, see [1, 2, 4, 20]. In [9], permutations are used to encode the information of a data series and to detect structural changes using the number of admissible permutations. On the other hand, permutations jointly with the limit of the Rényi entropy functions as the parameter goes to infinity were used in [21] to detect dynamical changes in time series. Finally, Shannon permutation entropy has been used to detect changes in time series (see [10, 19]).

In this paper, following [9], permutations are used to codify the information contained in a data series in order to find structural changes, but different entropy measures (see [14]) are used to detect them. The main motivation for this paper is to analyze which entropy measure shows structural changes in data series best. Of course, this problem is quite difficult to analyze because the families of entropy functions depend on real parameters. All of them include Shannon entropy as a limit case, which splits the parameter set into two disjoint subsets. Our study considers one parameter value in each of these disjoint subsets; after normalization, the chosen parameters give representative entropy functions for each subset. As usual, we first make experiments with simulated time series where the structural changes are known, and then we apply the techniques to real data time series. In particular, we consider real seismic series, see [3, 6, 17], and apply the analysis of entropy measures to that seismic data. In addition, two different economic data series are analyzed as well.


The paper is organized as follows. Section 2 is devoted to introducing the basic notation and definitions. We highlight the way in which the time series is encoded and describe the entropy-like measures that will be used in the analysis. In Section 3, different time series are generated and the entropy study is applied to them, following the sketch of [9]. Several applications to real data series are given in Section 4. The situation of the village of Lorca (Spain) in the days before the earthquake of 11th of May of 2011, an earthquake of catastrophic consequences, is studied. We make a thorough analysis; in particular, we are interested in facts that might indicate precursory behavior. Additionally, the existence of changes in economic data series is also addressed.

2. Permutations and entropy functions

In order to make this paper self-contained, basic notation and definitions are introduced in this section. More details can be obtained in [9] and [14].

Definition 1. Let $(x_n)_{n=1}^T$, $T \in \mathbb{N}$, be a time series, where $x_n \in \mathbb{R}$ for each $n \in \{1, \ldots, T\}$. A sliding window from $(x_n)_{n=1}^T$ is given by $x^k(l) = (x_l, x_{l+1}, \ldots, x_{l+k-1})$, for $1 \le l \le T - k + 1$, where $k$ is a fixed natural number called the embedding dimension.

Observe that the embedding dimension agrees with the number of terms of the sliding window. The adjective "sliding" refers to the fact that the window changes its position: it is slid until it covers the entire series or a subwindow, and this cover can be done with or without overlapping between the windows.

Let $m$ be a fixed natural number and $S_m$ the group of permutations of length $m$. The cardinality of $S_m$ is $|S_m| = m!$, where $|A|$ denotes the cardinality of a set $A$.

Definition 2. Let $m \in \mathbb{N}$ be a fixed natural number and $\pi = (i_1, i_2, \ldots, i_m) \in S_m$ a permutation. The sliding window $x^m(l) = (x_l, x_{l+1}, \ldots, x_{l+m-1})$ is said to be of $\pi$-type if $\pi$ is the unique permutation such that the following conditions hold:
(1) $x_{l+i_1} \le x_{l+i_2} \le \ldots \le x_{l+i_m}$;
(2) $i_{s-1} < i_s$ if $x_{l+i_{s-1}} = x_{l+i_s}$.

Definition 3 ([4]). Let $(x_n)_{n=1}^T$ be a time series, $m \in \mathbb{N}$ and $\pi \in S_m$. The relative frequency of $\pi$, denoted by $p(\pi)$, is given by
$$p(\pi) = \frac{|\{j : x^m(j) \text{ is of } \pi\text{-type},\ j = 1, 2, \ldots, T - m + 1\}|}{T - m + 1}.$$
A permutation $\pi \in S_m$ such that $p(\pi) > 0$ is called an admissible permutation of order $m$. The set $A_m \subset S_m$ is the set of admissible permutations of order $m$. The number of admissible permutations of order $m$, that is, the cardinality of $A_m$, is denoted by $NAP(m)$.
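The encoding of Definitions 2 and 3 is straightforward to implement. The following is a minimal sketch in Python (the function names are our own illustrative choices, not code from any of the cited references); it computes the π-type of each sliding window and the relative frequencies p(π):

```python
# Sketch of Definitions 2 and 3: ordinal patterns and their frequencies.
from collections import Counter

def ordinal_pattern(window):
    # The permutation pi = (i_1, ..., i_m) with x_{l+i_1} <= ... <= x_{l+i_m},
    # breaking ties by position (condition (2) of Definition 2).
    return tuple(sorted(range(len(window)), key=lambda i: (window[i], i)))

def pattern_frequencies(series, m):
    # Relative frequency p(pi) over the T - m + 1 sliding windows of length m.
    windows = [series[j:j + m] for j in range(len(series) - m + 1)]
    counts = Counter(ordinal_pattern(w) for w in windows)
    return {pi: c / len(windows) for pi, c in counts.items()}
```

Under this sketch, the admissible permutations of order m are the keys of the returned dictionary, and NAP(m) is its length.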

In [5], the measure $H_T(m) = \log(NAP(m))$ was considered, and in [9] it was used to detect structural changes in time series. In this paper, instead of $H_T$, we use different entropy notions to detect structural changes. The entropy-like measures that we are going to consider are summarized in [14]; here we show the notions suitable for data series.

Definition 4. Let $(x_n)_{n=1}^T$ be a time series and $m \in \mathbb{N}$ the embedding dimension.
(1) Shannon entropy, [16], is given by
$$H^S(m) = -\sum_{\pi \in A_m} p(\pi) \log(p(\pi)). \tag{1}$$


Figure 1. Entropy functions for two permutations with frequencies $x$ and $1-x$: (a) Rényi entropies, (b) Tsallis entropies and (c) Gaussian entropies, each for parameter values 1/5, 1/2, 1 (Shannon entropy), 2 and 5. Shannon entropy divides the parameter region into $q, r > 1$ and $q, r < 1$.

(2) Tsallis entropy, [18], is given by
$$H_q^T(m) = \sum_{\pi \in A_m} p(\pi) \ln_q^T\!\left(\frac{1}{p(\pi)}\right). \tag{2}$$

(3) Rényi entropy, [15], is defined as
$$H_r^R(m) = \frac{\log\left(\sum_{\pi \in A_m} p(\pi)^r\right)}{1-r}. \tag{3}$$

(4) Gaussian entropy (also nonextensive Gaussian entropy), [11], is defined in terms of the relative frequencies by
$$H_q^G(m) = \ln_q^T\!\left(\prod_{\pi \in A_m} \left(\frac{1}{p(\pi)}\right)^{p(\pi)}\right), \tag{4}$$

where $q$ and $r$ are parameters and $\ln_q^T(x)$ is the deformed logarithm defined by Tsallis and collaborators as
$$\ln_q^T(x) := \frac{x^{1-q} - 1}{1-q},$$
where $x > 0$ and $\lim_{q \to 1} \ln_q^T(x) = \log(x)$.

It is remarkable that Shannon entropy appears as a limit case of the Tsallis, Rényi and Gaussian entropies, dividing the parameter space into two disjoint subsets. In Figure 1 we show the graphs of the above entropies for two permutations with frequencies $x$ and $1-x$. It can be seen that for the Rényi entropy the inequalities $H_{r_1}^R(2) \le H_{r_2}^R(2) \le H^S(2) \le H_{r_3}^R(2) \le H_{r_4}^R(2)$ hold for $r_1 \ge r_2 > 1 \ge r_3 \ge r_4$ and any fixed value $x \in [0,1]$. The same chain of inequalities holds for the Tsallis and Gaussian entropies when the parameters satisfy $q_1 \ge q_2 > 1 \ge q_3 \ge q_4$. Finally, to compare the entropy-like measures we have to normalize the obtained results. To do this, we divide the entropy values by the maximal entropy value obtained along the windows, as indicated in the next section.
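For completeness, here is a direct transcription of formulas (1)-(4), a minimal sketch assuming that the input is the list of frequencies of the admissible permutations (so all values are strictly positive) and that q, r ≠ 1:

```python
import numpy as np

def shannon(p):
    p = np.asarray(p)
    return -np.sum(p * np.log(p))                  # formula (1)

def ln_q(x, q):
    # Deformed logarithm of Tsallis: (x**(1-q) - 1) / (1 - q), q != 1.
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def tsallis(p, q):
    p = np.asarray(p)
    return np.sum(p * ln_q(1.0 / p, q))            # formula (2)

def renyi(p, r):
    p = np.asarray(p)
    return np.log(np.sum(p ** r)) / (1.0 - r)      # formula (3)

def gaussian(p, q):
    p = np.asarray(p)
    return ln_q(np.prod((1.0 / p) ** p), q)        # formula (4)
```

As q, r → 1 all three generalized entropies tend to shannon(p), in agreement with the limit case discussed above.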


3. Comparing the entropy-like measures in dynamical systems with structural changes

In this section we describe the numerical computation of the normalized entropy-like measures that will be applied to dynamical systems with structural changes, in order to observe the behavior of the entropy measures. Given a time series $(x_n)_{n=1}^T$, the sketch of the process for computing its entropy is as follows. We distinguish two different levels of sliding windows. The first level covers the entire data series with windows of fixed length $k = 5m!$, where $m$ is the embedding dimension (also the permutation length). The second level of sliding windows is given by windows of length $m$ covering each first-level window. More precisely:

(1) We fix the embedding dimension $m$. In [4] it is recommended to use low values of $m$ because the computation time increases with $m$.

(2) We fix the overlapping term $s$ of the sliding windows that cover the total series in the first level. The data series is partitioned into blocks of length $k = 5m!$, $B_k(l) = (x_l, x_{l+1}, \ldots, x_{l+k-1})$, $l = 1 + s \cdot j$, $0 \le j \le (T-k)/s$. Observe that if $s < k$, then two consecutive windows overlap. High values of $s$ are good for big data series because the computing time decreases as $s$ increases.

(3) We fix a new overlapping parameter $i$. For each block $B_k(l) = (x_l, x_{l+1}, \ldots, x_{l+k-1})$ we consider the sliding windows of length $m$ given by $(x_{l+j}, x_{l+j+1}, \ldots, x_{l+j+m-1})$, $j = i \cdot n$, $0 \le n \le (k-m)/i$, and their admissible permutations as in Definition 2. We denote by $A_m(l, i)$ the set of admissible permutations of the block $B_k(l)$ with overlapping parameter $i$.

(4) We fix the parameter $r$ for the Rényi entropy and $q$ for the Tsallis and Gaussian entropies and, taking the frequencies of the elements of $A_m(l, i)$, we compute the entropy measures for each block $B_k(l)$.

Finally, notice that for comparison purposes we have to normalize: we divide our computations by the maximal entropy value over the blocks. Often, we make pictures to visualize the results. The following example illustrates our computation of blocks, admissible permutations and frequencies (a code sketch follows the example). Assume $T = 14$, $m = 2$, $k = 10$, $s = 2$, $i = 1$ and the time series $(0.1, 0.2, 0.25, 0.43, 0.01, 0.03, 0.02, 0.21, 0.23, 0.04, 0.022, 0, 0.12, 1)$. Then
$B_{10}(1) = (0.1, 0.2, 0.25, 0.43, 0.01, 0.03, 0.02, 0.21, 0.23, 0.04)$,
$B_{10}(3) = (0.25, 0.43, 0.01, 0.03, 0.02, 0.21, 0.23, 0.04, 0.022, 0)$,
$B_{10}(5) = (0.01, 0.03, 0.02, 0.21, 0.23, 0.04, 0.022, 0, 0.12, 1)$.
Note that $A_2(l, 1) = \{(0,1), (1,0)\}$ for $l = 1, 3, 5$, with relative frequencies $2/3$ and $1/3$ for $A_2(1,1)$, and $5/9$ and $4/9$ for $A_2(3,1)$ and $A_2(5,1)$.
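Steps (1)-(4) can be condensed into a short routine. The following sketch reuses ordinal_pattern and Counter from the sketch in Section 2 (again with illustrative names of our own) and returns the normalized entropy value of each first-level block:

```python
import math

def entropy_profile(series, m, s, i, entropy, **params):
    # First-level blocks B_k(l) of length k = 5*m!, slid with overlap term s;
    # second-level windows of length m inside each block, slid with step i.
    T, k = len(series), 5 * math.factorial(m)
    values = []
    for l in range(0, T - k + 1, s):                  # l = 1 + s*j in the text
        block = series[l:l + k]
        windows = [block[j:j + m] for j in range(0, k - m + 1, i)]
        counts = Counter(ordinal_pattern(w) for w in windows)
        freqs = [c / len(windows) for c in counts.values()]
        values.append(entropy(freqs, **params))
    return [v / max(values) for v in values]          # normalize by block maximum
```

For instance, entropy_profile(x, m=4, s=10, i=1, entropy=renyi, r=0.5) would produce the kind of Rényi profiles plotted in the figures below.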

3.1. Simulated data. The objective of this section is to show that the procedure described above can be useful to detect structural changes in time series. To this end, we consider simulated data in which we introduce a change in the data-generation mechanism. The change can be introduced in a discontinuous or in a continuous way. Let us start with the first case.


Now let us consider different time series $(x_n)_{n=1}^T$ generated following the pattern
$$x_{n+1} = \chi_{[0, T/2)}(n)\, f_1(x_n) + \chi_{[T/2, \infty)}(n)\, f_2(x_n), \tag{5}$$
where
$$\chi_{[a,b)}(x) = \begin{cases} 1 & \text{if } x \in [a,b), \\ 0 & \text{otherwise.} \end{cases}$$
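A sketch of this data generation, under the assumption that the initial condition is drawn uniformly at random (the seed is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_switch(T, f1, f2):
    # Pattern (5): iterate f1 for n < T/2 and f2 afterwards.
    x = np.empty(T)
    x[0] = rng.uniform()
    for n in range(T - 1):
        x[n + 1] = f1(x[n]) if n < T / 2 else f2(x[n])
    return x

# Example 1 below: logistic map followed by i.i.d. uniform noise.
x = simulate_switch(1200, lambda t: 4 * t * (1 - t), lambda t: rng.uniform())
```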

Our aim will be to detect the transition from $f_1$ to $f_2$ using the entropy measures and to check which of them are the most suitable. Some examples are taken from [9], although we consider only selected ones.

Example 1. We use the functions $f_1(x) = 4x(1-x)$ and $f_2 = \epsilon$, where $\epsilon$ denotes a uniform random distribution on $[0,1]$. There is a structural change, although it is not apparent from a look at the time series graph. Clearly, the function $f_2$ has a higher degree of disorder than $f_1$, and therefore the permutation entropies should reveal this change. We take $T = 1200$, $m = 4$, $k = 120$, $s = 10$ and $i = 1$. The results are shown in Figure 2. We check that the Rényi permutation entropy with parameter $r = 1/2$ and the Tsallis permutation entropy with parameter $q = 1/2$ are the entropy functions that show the change in the data structure best. On the other hand, the map $f_1$ admits only 12 permutations out of 24; in other words, there are forbidden patterns, and this is reflected in the measure of the dynamical behavior.

In addition, this example is quite useful to analyze the parameter values of the entropy functions. The number of permutations appearing for $T < 6000$ is at most 12, while close to 24 more or less uniformly distributed permutations may appear for $T > 6000$. Then we can easily measure the jump in the entropy values for several parameter values: the jump is the difference between the mean entropy values for $T > 6120$ and those for $T < 5980$. In addition, we consider the Pearson coefficient of variation $CV = \sigma/\mu$, where $\sigma$ is the standard deviation and $\mu$ is the mean. The variation comes from the jump, since the changes in the mean are not big enough to be the main reason for the obtained values. We will proceed in this way for the other examples. In this case, the jump for the Shannon permutation entropy is 0.1717, while the coefficient of variation is 0.0772 and the mean 0.9298. The results are shown in Table 1. Although this table does not constitute a mathematical proof, from it we can check that the Tsallis and nonextensive Gaussian entropies are not good candidates when $q > 1$. The variation of the nonextensive Gaussian entropy with parameter values $q < 1$ improves on Shannon entropy, but not by much. Tsallis entropy with $q < 1$ and the Rényi entropies seem to be more suitable to show the structural changes.

In addition, in practice the Rényi entropy is not good for high values of $r$ because, for $x \in (0,1)$, $x^r$ tends to zero as $r$ increases, so finite-precision computations may treat $x^r$ as zero. For instance, for $r = 1000$ we cannot compute the value of the Rényi entropy. By the same argument, if $q$ is big enough, then $x^{q-1}$ is computationally treated as zero, and then
$$\sum_{\pi \in A_m} p(\pi) \ln_q^T\!\left(\frac{1}{p(\pi)}\right) \approx \sum_{\pi \in A_m} p(\pi)\, \frac{1}{q-1} = \frac{1}{q-1}.$$

So, one may expect that for $q$ big enough the Tsallis entropy is not good for measuring structural changes in data, as all the examples in this paper show.

Finally, note that time series are finite. The parameter values $m$, $s$ and $i$ are then restricted by the data length, but also by the computing time, which increases when $m$ increases and decreases when the overlapping parameters $s$ and $i$ increase. The size $k = 5m!$ is used because it is the usual size in chi-square statistical tests (see e.g. [1]). Then, only some values of $m$ are possible for a given data length $T$. For instance, for this example $T = 1200$ and $m = 5$ give $k = 600$, which is half of the whole data, so $m = 5$ is not suitable and only $m = 3$ and $m = 4$ can be taken. Note that for $m = 2$ we have only two symbols, and consequently we do not consider this case. Table 2 shows the results. In our experiments, variations of the overlapping parameters $s$ and $i$ do not produce significant changes in the results provided $s \le k$ and $i \le m$. The size of $m$ depends on two factors: the data length and, as Example 6 shows, the degree of disorder of the data.
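The jump and the Pearson coefficient of variation used in the tables can be computed directly from an entropy profile. A minimal sketch, assuming the index of the first block after the change is known:

```python
import numpy as np

def jump_and_cv(values, change_block):
    # Jump: difference between the mean entropy after and before the change.
    # CV: Pearson coefficient of variation sigma/mu over all blocks.
    v = np.asarray(values)
    jump = v[change_block:].mean() - v[:change_block].mean()
    return jump, v.std() / v.mean()
```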


Figure 2. For T = 1200, m = 4, k = 120, s = 100 and i = 1: (a) the data series of Example 1; (b) Shannon permutation entropy; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2; (e)-(f) Tsallis permutation entropy with parameters q = 1/2 and q = 2; (g)-(h) Gaussian permutation entropy with parameters q = 1/2 and q = 2. We check that the Rényi permutation entropy with parameter r = 2 and the Tsallis permutation entropy with parameter q = 1/2 are the entropy functions that show the change in the data structure best. On the other hand, the map f1 admits only 12 permutations out of 24; in other words, there are forbidden patterns, and this is reflected in the complexity. Note that the Y-axis in panel (c) differs from the other panels.


Rényi entropy:
  r        Jump    CV      Mean      r      Jump      CV        Mean
  1/2      1.032   0.7679  0.5837    2      0.140     0.0574    0.9428
  1/3      0.417   0.2174  0.8352    3      0.182     0.0712    0.9260
  1/4      0.343   0.1735  0.8655    4      0.205     0.0777    0.9162
  1/5      0.314   0.1570  0.8776    5      0.220     0.0811    0.9096
  1/6      0.298   0.1484  0.8842    6      0.230     0.0831    0.9048
  1/7      0.287   0.1430  0.8884    7      0.238     0.0843    0.9013
  1/8      0.282   0.1394  0.8912    8      0.244     0.0851    0.8985
  1/9      0.277   0.1368  0.8933    9      0.248     0.0856    0.8963
  1/10     0.273   0.1348  0.8949    10     0.251     0.0860    0.8945
  1/25     0.255   0.1254  0.9026    25     0.265     0.0882    0.8841
  1/100    0.225   0.1215  0.9060    100    0.269     0.0895    0.8781
  1/1000   0.246   0.1204  0.9069    1000   *         *         *

Tsallis entropy:
  q        Jump    CV      Mean      q      Jump      CV        Mean
  1/2      0.374   0.1948  0.8447    2      0.009     0.0030    0.9970
  1/3      0.438   0.2387  0.8190    3      1.95e-04  5.46e-05  0.9999
  1/4      0.468   0.2601  0.8078    4      4.02e-06  8.54e-07  1
  1/5      0.484   0.2726  0.8017    5      8.51e-08  1.29e-08  1
  1/6      0.495   0.2808  0.7979    6      4.17e-11  2.01e-10  1
  1/7      0.503   0.2866  0.7954    7      9.53e-13  3.22e-12  1
  1/8      0.508   0.2909  0.7935    8      2.19e-14  5.31e-14  1
  1/9      0.512   0.2942  0.7921    9      4.44e-16  9.03e-16  1
  1/10     0.516   0.2968  0.7910    10     5.89e-16  0         1
  1/25     0.533   0.3107  0.7856    25     6.66e-16  0         1
  1/100    0.541   0.3174  0.7832    100    6.66e-16  1.28e-16  1
  1/1000   0.544   0.3194  0.7825    1000   9.99e-16  1.81e-16  1

Nonextensive Gaussian entropy:
  q        Jump    CV      Mean      q      Jump      CV        Mean
  1/2      0.191   0.0870  0.9215    2      0.136     0.0599    0.9449
  1/3      0.198   0.0904  0.9186    3      0.105     0.0455    0.9577
  1/4      0.201   0.0922  0.9172    4      0.080     0.0339    0.9682
  1/5      0.204   0.0932  0.9163    5      0.059     0.0248    0.9766
  1/6      0.205   0.0939  0.9157    6      0.043     0.0179    0.9831
  1/7      0.206   0.0944  0.9153    7      0.031     0.0127    0.9880
  1/8      0.207   0.0948  0.9149    8      0.022     0.0089    0.9916
  1/9      0.207   0.0951  0.9147    9      0.015     0.0062    0.9942
  1/10     0.208   0.0954  0.9145    10     0.011     0.0042    0.9960
  1/25     0.210   0.0967  0.9134    25     2.15e-05  7.75e-06  1
  1/100    0.211   0.0973  0.9129    100    0         0         1
  1/1000   0.212   0.0975  0.9127    1000   0         0         1

Table 1. Jumps, Pearson coefficient of variation and mean of the permutation entropies for the time series of Example 1 (an asterisk marks values that could not be computed in finite precision).

                      m = 3                       m = 4
                   Jump    CV      Mean       Jump    CV      Mean
Shannon            0.1309  0.0317  0.9637     0.1756  0.0783  0.9209
Rényi (r = 1/2)    0.2914  0.0233  0.8987     1.0325  0.8248  0.5418
Rényi (r = 2)      0.1012  0.0661  0.9710     0.1464  0.0583  0.9347
Tsallis (q = 1/2)  0.2658  0.074   0.923      0.3753  0.1976  0.8287
Tsallis (q = 2)    0.0062  0.0012  0.9987     0.0092  0.0032  0.9965
NeG (q = 1/2)      0.1346  0.0327  0.9626     0.1956  0.0882  0.9115
NeG (q = 2)        0.1237  0.0298  0.9659     0.1393  0.0609  0.9378

Table 2. Jumps, Pearson coefficient of variation and mean of the permutation entropies for the time series of Example 1 when m changes and s = i = 1. With m = 4, the jump and coefficient-of-variation values are higher. Note that the different values of the means are not the main reason for the different values of the coefficient of variation, even for Rényi with parameter value r = 1/2, where we find the most significant difference among all the mean values.


Figure 3. For T = 6000, m = 5, k = 600, s = 100 and i = 1: (a) the data series of Example 2; (b) Shannon permutation entropy; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2; (e)-(f) Tsallis permutation entropy with parameters q = 1/2 and q = 2; (g)-(h) Gaussian permutation entropy with parameters q = 1/2 and q = 2. We check that the Rényi permutation entropy with parameter r = 1/2 and the Tsallis permutation entropy with parameter q = 1/2 are the entropy functions that show the change in the data structure best.

Example 2. We use the functions $f_1(x) = 0.5x + \epsilon$ and $f_2 = \epsilon$, where $\epsilon$ denotes a normal distribution with zero mean and variance 1. The function $f_2$ seems to have a higher degree of disorder than $f_1$, and therefore the permutation entropies should reveal this change. We take $T = 6000$, $m = 5$, $k = 600$, $s = 100$ and $i = 1$. We show the results in Figure 3. We check that the Rényi permutation entropy with parameter $r = 1/2$ and the Tsallis permutation entropy with parameter $q = 1/2$ are the entropy functions that show the change in the data structure best. Again, we can investigate the role of the parameters by measuring the jumps along the data series. The jump for the Shannon permutation entropy is 0.0305, while the coefficient of variation is 0.0101 and the mean 0.9860. The results are shown in Table 3; we find the same situation as in Example 1. Table 4 shows the jumps in the entropy values for different values of m, as we did in Example 1.


Rényi entropy:
  r        Jump    CV      Mean      r      Jump      CV        Mean
  1/2      0.053   0.0178  0.9704    2      0.039     0.0131    0.9821
  1/3      0.028   0.0091  0.9876    3      0.059     0.0201    0.9725
  1/4      0.023   0.0070  0.9906    4      0.073     0.0257    0.9649
  1/5      0.020   0.0060  0.9917    5      0.084     0.0302    0.9588
  1/6      0.019   0.0054  0.9924    6      0.092     0.0337    0.9537
  1/7      0.018   0.0050  0.9928    7      0.099     0.0366    0.9496
  1/8      0.017   0.0047  0.9931    8      0.107     0.0387    0.9462
  1/9      0.016   0.0045  0.9934    9      0.113     4.6e-16   0.9435
  1/10     0.016   0.0043  0.9936    10     0.118     0.0417    0.9412
  1/25     0.014   0.0035  0.9945    25     0.138     0.0469    0.9296
  1/100    0.013   0.0032  0.9950    100    0.143     0.0479    0.9255
  1/1000   0.013   0.0031  0.9770    1000   *         *         *

Tsallis entropy:
  q        Jump    CV      Mean      q      Jump      CV        Mean
  1/2      0.064   0.0217  0.9704    2      4.13e-04  1.34e-04  0.9998
  1/3      0.067   0.0218  0.9706    3      1.54e-06  5.02e-07  1
  1/4      0.066   0.0208  0.9722    4      4.64e-09  1.52e-09  1
  1/5      0.066   0.0199  0.9726    5      1.36e-11  4.51e-12  1
  1/6      0.065   0.0192  0.9731    6      4.24e-14  1.39e-14  1
  1/7      0.064   0.0186  0.9734    7      2.77e-15  6.19e-16  1
  1/8      0.064   0.0181  0.9738    8      1.33e-15  4.77e-16  1
  1/9      0.063   0.0178  0.9740    9      1.66e-15  6.73e-16  1
  1/10     0.063   0.0175  0.9743    10     1.77e-15  5.75e-16  1
  1/25     0.061   0.0158  0.9758    25     2.55e-15  5.68e-16  1
  1/100    0.059   0.0150  0.9767    100    1.77e-15  6.45e-16  1
  1/1000   0.058   0.0148  0.9807    1000   1.77e-15  -         1

Nonextensive Gaussian entropy:
  q        Jump    CV      Mean      q      Jump      CV        Mean
  1/2      0.036   0.0120  0.9835    2      0.021     0.0071    0.9902
  1/3      0.039   0.0126  0.9826    3      0.014     0.0047    0.9935
  1/4      0.039   0.0130  0.9821    4      0.009     0.0031    0.9958
  1/5      0.039   0.0132  0.9818    5      0.006     0.0019    0.9974
  1/6      0.040   0.0133  0.9816    6      0.004     0.0012    0.9984
  1/7      0.040   0.0134  0.9815    7      0.002     6.89e-04  0.9991
  1/8      0.040   0.0135  0.9814    8      0.001     4.01e-04  0.9994
  1/9      0.041   0.0136  0.9813    9      7.01e-04  2.29e-04  0.9997
  1/10     0.041   0.0136  0.9812    10     3.96e-04  1.29e-04  0.9998
  1/25     0.041   0.0139  0.9809    25     3.56e-08  1.15e-08  1
  1/100    0.042   0.0140  0.9807    100    0         0         1
  1/1000   0.042   0.0140  0.9807    1000   0         0         1

Table 3. Jumps, Pearson coefficient of variation and mean of the permutation entropies for the time series of Example 2 (an asterisk marks values that could not be computed in finite precision).

                      m = 3                        m = 4
                   Jump     CV       Mean      Jump     CV       Mean
Shannon            0.0655   0.0078   0.9911    0.0419   0.0078   0.9873
Rényi (r = 1/2)    0.0960   0.0101   0.9824    0.2687   0.0425   0.8421
Rényi (r = 2)      0.0504   0.0069   0.9910    0.0456   0.0081   0.9862
Tsallis (q = 1/2)  0.1623   0.0181   0.9824    0.1019   0.0179   0.9732
Tsallis (q = 2)    6.9e-04  8.4e-05  0.9999    5.6e-04  9.2e-05  0.9999
NeG (q = 1/2)      0.0660   0.0079   0.991     0.0433   0.008    0.9869
NeG (q = 2)        0.0645   0.0077   0.9912    0.0392   0.0072   0.9882

                      m = 5                        m = 6
                   Jump     CV       Mean      Jump     CV       Mean
Shannon            0.0315   0.01     0.9856    0.0237   0.0073   0.9903
Rényi (r = 1/2)    0.0598   0.0176   0.9741    0.0174   0.0054   0.9930
Rényi (r = 2)      0.0395   0.0128   0.9817    0.0376   0.0117   0.9844
Tsallis (q = 1/2)  0.0724   0.0215   0.9682    0.0525   0.0165   0.9787
Tsallis (q = 2)    4.2e-04  1.3e-04  0.9998    3.2e-04  9.7e-05  0.9999
NeG (q = 1/2)      0.0371   0.0118   0.983     0.0553   0.0173   0.9773
NeG (q = 2)        0.022    0.007    0.99      0.0016   4.8e-04  0.9994

Table 4. Jumps, Pearson coefficient of variation and mean of the permutation entropies for the time series of Example 2 when m ranges from 3 to 6, s = 10 and i = 1.

Example 3. We use the functions $f_1(x) = 0.9x + 0.5$ and $f_2 = 0.1x + \epsilon$, where $\epsilon$ denotes a normal distribution with zero mean and variance 1. Again, the function $f_2$ has a higher degree of disorder than $f_1$, and the permutation entropies show this situation. We take $T = 6000$, $m = 5$, $k = 600$, $s = 100$ and $i = 1$.


Figure 4. For T = 6000, m = 5, k = 600, s = 100 and i = 1: (a) the data series of Example 3; (b) Shannon permutation entropy; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2; (e)-(f) Tsallis permutation entropy with parameters q = 1/2 and q = 2; (g)-(h) Gaussian permutation entropy with parameters q = 1/2 and q = 2. We check that the Rényi permutation entropy with parameter r = 1/2 and the Tsallis permutation entropy with parameter q = 1/2 are the entropy functions that show the change in the data structure best.

We show the results in Figure 4 and Table 5. We check that the Rényi permutation entropy with parameter r = 1/2 and the Tsallis permutation entropy with parameter q = 1/2 are the entropy functions that show the change in the data structure best.

       Shannon  Rényi (r=1/2)  Rényi (r=2)  Tsallis (q=1/2)  Tsallis (q=2)  NeG (q=1/2)  NeG (q=2)
Jump   0.0678   0.1180         0.0900       0.1374           0.0013         0.0794       0.0481
CV     0.0258   0.0465         0.0334       0.0550           4.2e-04        0.0303       0.0181
Mean   0.9726   0.9510         0.9649       0.9418           0.9996         0.9678       0.9807

Table 5. Jumps, Pearson coefficient of variation and mean of the permutation entropies for the time series of Example 3.


Now we consider a different time series generation, where the structural change is produced continuously. Hence, we construct the sequence $(x_n)_{n=1}^T$ by
$$x_{n+1} = \left(1 - \cos(1 - 2n\pi/T)^2\right) f_1(x_n) + \cos(1 - 2n\pi/T)^2\, f_2(x_n). \tag{6}$$
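A sketch of the continuous-transition generation (6), analogous to the one given for (5) (the initial condition x0 is an illustrative choice):

```python
import numpy as np

def simulate_smooth(T, f1, f2, x0=0.5):
    # Pattern (6): the weight on f2 is cos(1 - 2*n*pi/T)**2, so the mixture
    # between f1 and f2 varies continuously with n.
    x = np.empty(T)
    x[0] = x0
    for n in range(T - 1):
        w = np.cos(1 - 2 * np.pi * n / T) ** 2
        x[n + 1] = (1 - w) * f1(x[n]) + w * f2(x[n])
    return x
```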

This time, we consider the same functions as in Examples 1 and 3 to produce the following examples.

Example 4. We use the functions $f_1(x) = 4x(1-x)$ and $f_2 = \epsilon$, where $\epsilon$ denotes a uniform random distribution on $[0,1]$. There is a structural change, although it is not apparent from a look at the time series graph. Clearly, the function $f_2$ has a higher degree of disorder than $f_1$, and therefore the permutation entropies should reveal this change. We take $T = 1200$, $m = 4$, $k = 120$, $s = 10$ and $i = 1$. We show the results in Figure 5 and Table 6. We check that the Rényi permutation entropy with parameter $r = 1/2$ and the Tsallis permutation entropy with parameter $q = 1/2$ are the entropy functions that show the change in the data structure most clearly.

       Shannon  Rényi (r=1/2)  Rényi (r=2)  Tsallis (q=1/2)  Tsallis (q=2)  NeG (q=1/2)  NeG (q=2)
Jump   0.1521   0.9033         0.1252       0.3348           0.0073         0.1699       0.1199
CV     0.0584   0.4827         0.0472       0.1372           0.0025         0.0659       0.0452
Mean   0.9321   0.6515         0.9385       0.8635           0.9969         0.9238       0.9470

Table 6. Jumps, Pearson coefficient of variation and mean of the permutation entropies for the time series of Example 4.

Example 5. We use the functions $f_1(x) = 0.9x + 0.5$ and $f_2 = 0.1x + \epsilon$, where $\epsilon$ denotes a normal distribution with zero mean and variance 1. Again, the function $f_2$ has a higher degree of disorder than $f_1$, and the permutation entropies show this situation. We take $T = 6000$, $m = 5$, $k = 600$, $s = 100$ and $i = 1$. We show the results in Figure 6 and Table 7. We check that the Rényi permutation entropies with parameters $r = 1/2$ and $r = 2$ and the Tsallis permutation entropy with parameter $q = 1/2$ are the entropy functions that show the change in the data structure best.

       Shannon  Rényi (r=1/2)  Rényi (r=2)  Tsallis (q=1/2)  Tsallis (q=2)  NeG (q=1/2)  NeG (q=2)
Jump   0.0566   0.1045         0.0758       0.1223           0.0010         0.0663       0.0400
CV     0.0176   0.0309         0.0241       0.0368           3e-04          0.0207       0.0123
Mean   0.9784   0.9632         0.9702       0.9558           0.9997         0.9746       0.9849

Table 7. Jumps, Pearson coefficient of variation and mean of the permutation entropies for the time series of Example 5.

As the above examples show, the entropy functions that best reflect the structural changes are the Tsallis function with parameter q = 1/2 and the Rényi function with parameters r = 1/2 and r = 2. Of course, our approach is not completely rigorous, in the sense that we do not provide a mathematical proof of that fact, which is of course a very good question. However, the above analysis is enough for our aims in this paper. On the other hand, this method can only distinguish time series with different dynamical behavior, and sometimes the permutation length is not enough to show the changes. For instance, consider the maps $f_1 = g^4$, where $g(x) = 4x(1-x)$, and $f_2 = \epsilon$, a uniformly distributed random variable on $[0,1]$; we refer to this as Example 6. We construct the data series as in Example 1. Figure 7 shows that the entropy functions that, according to our experiments, best show the structural changes are unable to detect them in this example.


Figure 5. For T = 1200, m = 4, k = 120, s = 100 and i = 1: (a) the data series of Example 4; (b) Shannon permutation entropy; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2; (e)-(f) Tsallis permutation entropy with parameters q = 1/2 and q = 2; (g)-(h) Gaussian permutation entropy with parameters q = 1/2 and q = 2. We check that the Rényi permutation entropy with parameter r = 2 and the Tsallis permutation entropy with parameter q = 1/2 are the entropy functions that show the change in the data structure best. Note that the Y-axis in panel (c) differs from the other panels.

4. Application to real data

4.1. Seismic time series (Lorca, 2011). After the previous results, the natural question that arises is what happens with real time series. In this section we apply the experiments above to a real seismic series corresponding to the data obtained at the station EMUR (National Geographic Institute of Spain, IGN), located in La Murta (Murcia, Spain). We concentrate our analysis on the days before 11th of May of 2011, when a catastrophic earthquake happened in the village of Lorca. The size of the series is 8,640,000 samples per day, which is a large enough quantity to make a descriptive analysis.

Figure 6. For T = 6000, m = 5, k = 600, s = 100 and i = 1: (a) the data series of Example 5; (b) Shannon permutation entropy; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2; (e)-(f) Tsallis permutation entropy with parameters q = 1/2 and q = 2; (g)-(h) Gaussian permutation entropy with parameters q = 1/2 and q = 2. We check that the Rényi permutation entropies with parameters r = 1/2 and r = 2 and the Tsallis permutation entropy with parameter q = 1/2 are the entropy functions that show the change in the data structure best.

According to our previous experiments, we consider the Rényi entropies with parameters r = 2 and r = 1/2 and the Tsallis entropy with parameter q = 1/2. It is worth mentioning that peaks in the seismic time series do not imply changes in the entropy or an earthquake: Figure 8 shows a peak on a day for which no earthquakes are registered in the area, and we can see that the entropy does not show any significant change. Thus, a relevant fact for us will be a peak in the time series together with a change in the entropy; this is the type of event that we are looking for. Such events occur when an earthquake happens, but we are interested in them even when an earthquake does not occur at the same time. Table 8 shows the seismic activity in Lorca on the day of the earthquake and the following ones. Previously, there was no significant seismic activity.


Figure 7. For T = 100000, m = 7, k = 25200, s = 1000 and i = 1: (a) the data series of Example 6; (b) Shannon permutation entropy; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2; (e)-(f) Tsallis permutation entropy with parameters q = 1/2 and q = 2; (g)-(h) Gaussian permutation entropy with parameters q = 1/2 and q = 2. We check that the change in the data structure is not found. The topological entropy of f1 is log 4, hence there exists a 16-horseshoe and a symbolic dynamics with 16 symbols (see e.g. [7]), so permutations of length at least 17 are necessary to find forbidden patterns. Additionally, the frequency of the permutations seems to be as regular as for an i.i.d. noise. Notice that the Y-axes differ between panels in order to show the best range in which to find the change, which nevertheless is not exhibited.

We are going to analyze the previous days in order to detect possible structural changes in the time series which might indicate precursory behavior in the seismic series. Thus, we analyze the previous days and the target day, 11th of May, taking m = 6, q = 0.5, k = 3600, s = 20000 and i = 1. We normalize the computations of the entropy to get values in the interval [0, 1]. Figures 8, 9, 10, 11 and 12, that is, the 5th, 6th, 7th, 8th and 9th of May, do not show any significant change in the entropy. During the previous day (10th of May of 2011, Figure 13), changes in the entropy are detected, and they do not correspond to any earthquake.
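As a usage sketch, one day of the EMUR record can be processed with the routine of Section 3. The file name and loading step below are illustrative assumptions (the actual IGN data format is not specified here):

```python
import numpy as np

# Hypothetical input: one day of EMUR data (8,640,000 samples) as a 1-D array.
day = np.loadtxt("emur_2011-05-10.txt")

# Normalized Tsallis profile with the parameters used in the text:
# m = 6 (hence k = 5 * 6! = 3600), s = 20000, i = 1, q = 0.5.
profile = entropy_profile(day, m=6, s=20000, i=1, entropy=tsallis, q=0.5)
```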


Date        Time (UTC)  Coordinates                MW   Location
11/05/2011  15:05:13    37°25'55''N  1°24'56''W    4.5  Lorca (NE)
11/05/2011  15:21:01    37°40'16''N  1°39'02''W    2.6  Lorca (E)
11/05/2011  16:47:25    37°26'01''N  1°24'40''W    5.1  Lorca (NE)
11/05/2011  16:53:15    37°39'50''N  1°38'01''W    2.8  Lorca (E)
11/05/2011  19:28:18    37°43'21''N  1°39'51''W    2.9  Lorca (NE)
11/05/2011  20:37:45    37°41'38''N  1°39'10''W    3.9  Lorca (NE)
11/05/2011  20:44:06    37°41'46''N  1°36'35''W    2.7  Lorca (E)
13/05/2011  21:08:37    37°41'07''N  1°39'48''W    2.6  Lorca (E)
14/05/2011  00:49:32    37°41'37''N  1°39'57''W    2.8  Lorca (NE)
14/05/2011  21:10:25    37°39'56''N  1°40'13''W    2.9  Lorca (SE)
14/05/2011  21:54:35    37°40'27''N  1°39'51''W    2.7  Lorca (E)
15/05/2011  00:03:03    37°39'53''N  1°39'59''W    2.8  Lorca (SE)

Table 8. Earthquakes in the village of Lorca.

Finally, during the fateful day (Figure 14), we observe that the different earthquakes also produce entropy changes.

4.2. Exchange rates: US dollar vs. gold time series. Next, we consider the daily exchange rates between gold and the US dollar. We have about 7500 points from the day-by-day official changes taken from the Bank of England at the web site http://www.bankofengland.co.uk/. We consider both the daily exchange rates and the returns, that is, the time series $y_n = x_n - x_{n-1}$ of daily variations of the exchange rates. Since permutations measure qualitative rather than quantitative properties of time series, it is not necessary to consider log-returns, as is usual for financial time series. As in the previous case, we consider the Rényi entropies with parameters r = 1/2 and r = 2 and the Tsallis entropy with parameter q = 1/2, for m = 4, k = 120, s = 10 and i = 1. The results are shown in Figure 15.

4.3. NASDAQ time series. Finally, we consider the daily closing NASDAQ index time series taken from https://es.finance.yahoo.com. We have about 12000 points in the time series, and we consider both the index and the returns. As in the previous case, we consider the Rényi entropies with parameters r = 1/2 and r = 2 and the Tsallis entropy with parameter q = 1/2, for m = 5, k = 600, s = 10 and i = 1. The results are shown in Figure 16. It is remarkable that NASDAQ changed from public to private management around the year 2000, and this change made the entropy values increase.

5. Conclusions

Different entropy-like measures have been applied to time series to detect structural changes produced by changes in the dynamical behavior of the data. Symbolic permutation coding and different entropy functions are considered in order to detect the data changes, not only when they are visible but also when they are not perceptible in the plotted series. It can be seen that the entropy functions that best reflect the structural changes are the Rényi and Tsallis entropies with parameter smaller than 1, in our examples 1/2. In addition, the Rényi entropy with parameter r = 1/2 shows quite well the transition between a deterministic chaotic time series and an i.i.d. noise, although it does not work well with the real application we have chosen in this paper.

As a real application, we analyze the structural changes of a seismic time series. We choose a seismic phenomenon that has been previously analyzed, in order to compare the results with ours.


Figure 8. 5th of May of 2011 (6 days before the earthquake): (a) seismic data of May 5, 2011; (b) Tsallis permutation entropy with parameter q = 1/2; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2. During this day no earthquakes were registered in the area. The seismic data and the permutation entropies are shown for m = 6, k = 600, s = 3600, i = 1. A peak is detected in the data series, but no significant changes are detected in the entropy.

In [3], the earthquake of Lorca was analyzed using a multifractal dimensional dependence analysis based on Tsallis mutual information. Making a different analysis, the authors show that visible changes in the dynamics during the previous days are detected. In this paper, we encode the seismic series and use permutation entropies to study the structural changes.


Figure 9. 6th of May of 2011 (5 days before the earthquake): (a) seismic data of May 6, 2011; (b) Tsallis permutation entropy with parameter q = 1/2; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2. During this day no earthquakes were registered in the area. Permutation entropies are shown for m = 6, k = 3600, s = 3600, i = 1.

6. Acknowledgments

We would like to thank the Instituto Geográfico Nacional (IGN), Spain, for providing the data that have been used, in particular J.M. Alcalde, R. Antón and A. Crespo. We also wish to thank the reviewers of this paper for their interesting comments and advice.


Figure 10. 7th of May of 2011 (4 days before the earthquake): (a) seismic data of May 7, 2011; (b) Tsallis permutation entropy with parameter q = 1/2; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2. During this day no earthquakes were registered in the area. Permutation entropies are shown for m = 6, k = 3600, s = 3600, i = 1.

This work has been supported by the grant MTM2014-52920-P from Ministerio de Economía y Competitividad (Spain).

References

[1] J. M. Amigó, Permutation Complexity in Dynamical Systems: Ordinal Patterns, Permutation Entropy and All That, Springer Series in Synergetics (2010).


Figure 11. 8th of May of 2011 (3 days before the earthquake): (a) seismic data of May 8, 2011; (b) Tsallis permutation entropy with parameter q = 1/2; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2. During this day no earthquakes were registered in the area. Permutation entropies are shown for m = 6, k = 3600, s = 3600, i = 1.

[2] J. M. Amigó and K. Keller, Permutation entropy: One concept, two approaches, Eur. Phys. J. Spec. Top. 222 (2013) 263-273.
[3] J. M. Angulo and F. J. Esquivel, Multifractal Dimensional Dependence Assessment Based on Tsallis Mutual Information, Entropy 17 (2015) 5382-5401.
[4] C. Bandt and B. Pompe, Permutation entropy: a natural complexity measure for time series, Phys. Rev. Lett. 88 (2002) 174102.
[5] C. Bandt, G. Keller and B. Pompe, Entropy of interval maps via permutations, Nonlinearity 15 (2002) 1595-1602.


Figure 12. 9th of May of 2011 (2 days before the earthquake): (a) seismic data of May 9, 2011; (b) Tsallis permutation entropy with parameter q = 1/2; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2. During this day no earthquakes were registered in the area. Permutation entropies are given for m = 6, k = 3600, s = 3600, i = 1. Our results show a large variation in time of the entropy values with no apparent variation in the structure of the data.

[6] A. Batou and C. Soize, Generation of accelerograms compatible with design specifications using information theory, Bull. Earthquake Eng. 12 (2014) 769-794.
[7] L. S. Block and W. A. Coppel, Dynamics in One Dimension, Lecture Notes in Math. 1513, Springer, Berlin (1992).
[8] R. Bowen, Entropy for group endomorphisms and homogeneous spaces, Trans. Amer. Math. Soc. 153 (1971) 401-414.


Figure 13. 10th of May of 2011 (1 day before the earthquake): (a) seismic data of May 10, 2011; (b) Tsallis permutation entropy with parameter q = 1/2; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2. During this day no earthquakes were registered in the area. Permutation entropies are shown for m = 6, k = 3600, s = 3500, i = 1, detecting some changes in the data series.

[9] J. S. Cánovas, A. Guillamón and M. C. Ruíz, Using permutations to find structural changes in time series, Fluctuation and Noise Letters 10 (2011) 13-30.
[10] Y. Cao, W. Tung, J. B. Gao, V. A. Protopopescu and L. M. Hively, Detecting dynamical changes in time series using the permutation entropy, Phys. Rev. E 70 (2004) 046217.
[11] T. D. Frank and A. Daffertshofer, Exact time-dependent solutions of the Renyi Fokker-Planck equation and the Fokker-Planck equations related to the entropies proposed by Sharma and Mittal, Physica A 285 (2000) 351-366.


Figure 14. 11th of May of 2011 (the day of the earthquake): (a) seismic data of May 11, 2011; (b) Tsallis permutation entropy with parameter q = 1/2; (c)-(d) Rényi permutation entropy with parameters r = 1/2 and r = 2. Seismic movements (structural changes) are detected and correspond to changes in the entropy. Permutation entropies are shown for m = 6, k = 3600, s = 3600, i = 1. Note that for Rényi with parameter r = 2 the earthquake is not detected.

[12] K. Keller, A. M. Unakafov and V. A. Unakafova, On the relation of KS entropy and permutation entropy, Physica D 241 (2012) 1477-1481.
[13] K. Lehnertz et al., Chaos in Brain? Nonlinear Analysis of Physiological Data, H. Kantz et al. (eds.), World Scientific, Singapore (1999).
[14] T. Oikonomou and U. Tirnakli, Generalized entropic structures and non-generality of Jaynes' Formalism, Chaos, Solitons and Fractals 42 (2009) 3027-3034.
[15] A. Rényi, Probability Theory, North-Holland, Amsterdam (1970).


Figure 15. (a) Data series of the daily exchange rate between the US dollar and gold; (b) data series of the daily returns; (c)-(d) Rényi permutation entropy with parameter r = 1/2 of series (a) and (b); (e)-(f) Rényi permutation entropy with parameter r = 2 of series (a) and (b); (g)-(h) Tsallis permutation entropy with parameter q = 1/2 of series (a) and (b). Parameters m = 4, k = 120, s = 10 and i = 1 are considered. Our results show that around n = 4000 and n = 5800 the permutation entropies decrease for both time series. The first decrease happens at the beginning of 1990 and could be motivated by the early 1990s recession. The second one is close to the 11th of September of 2001 and could be motivated by the crisis after that date.

[16] C. E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal 27 (1948) 379-423.
[17] L. Telesca, M. Lovallo, A. El-Ela Amin Mohamed, M. ElGabry, S. El-hady, K. M. Abou Elenean and R. Elshafey Fat ElBary, Informational analysis of seismic sequences by applying the Fisher Information Measure and the Shannon entropy: An application to the 2004-2010 seismicity of Aswan area (Egypt), Physica A 391 (2012) 2889-2897.
[18] C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988) 479-487.
[19] R. Yan, Y. Liu and R. X. Gao, Permutation entropy: A nonlinear statistical measure for status characterization of rotary machines, Mechanical Systems and Signal Processing 29 (2012) 474-484.


Figure 16. (a) Data series of the daily NASDAQ closing index; (b) data series of the daily returns of the NASDAQ closing index; (c)-(d) Rényi permutation entropy with parameter r = 1/2 of series (a) and (b); (e)-(f) Rényi permutation entropy with parameter r = 2 of series (a) and (b); (g)-(h) Tsallis permutation entropy with parameter q = 1/2 of series (a) and (b). Parameters m = 4, k = 120, s = 10 and i = 1 are considered. Around n = 7000 NASDAQ changed from public to private management, and the entropy started to increase by that time. The change in the entropy values is clearer for the NASDAQ index than for the returns time series.

[20] M. Zanin, L. Zunino, O. A. Rosso and D. Papo, Permutation Entropy and Its Main Biomedical and Econophysics Applications: A Review, Entropy 14 (2012) 1553-1577.
[21] L. Zunino, F. Olivares and O. A. Rosso, Permutation min-entropy: An improved quantifier for unveiling subtle temporal correlations, EPL 109 (2015) 10005, doi: 10.1209/0295-5075/109/10005.

Address: Departamento de Matemática Aplicada y Estadística, Universidad Politécnica de Cartagena, Antiguo Hospital de Marina, 30202 Cartagena, Murcia, Spain.
Emails: [email protected], [email protected], [email protected].