MATHEMATICS IN ENGINEERING, SCIENCE AND AEROSPACE MESA - www.journalmesa.com Vol. 3, No. 1, pp. 31-40, 2012 © CSP - Cambridge, UK; I&S - Florida, USA, 2012

Cryptography using chaotic discrete-time delayed Hopfield neural networks

Mirela Darau 1,2, Eva Kaslik 1,3, Stefan Balint 1

1 Department of Mathematics and Computer Science, West University of Timişoara, Bd. C. Coposu nr. 4, 300223, Timişoara, Romania
2 Technische Universiteit Eindhoven, Den Dolech 2, 5612 AZ Eindhoven, Netherlands
3 Institute e-Austria Timisoara, Romania

⋆ Corresponding Author. E-mail: [email protected]

Abstract. A cryptosystem is proposed, based on a class of chaotic discrete-time delayed Hopfield neural networks of two non-identical neurons with no self-connections previously analyzed by E. Kaslik and S. Balint [Journal of Nonlinear Science, 18(4): 415-432, 2008]. The security of this cryptosystem is discussed and some simulations are presented which illustrate its efficiency.

2010 Mathematics Subject Classification:
Keywords: cryptography; chaos; neural network; discrete-time.

1 Introduction

Before modern times, cryptography dealt only with message confidentiality: the conversion of messages from a comprehensible form into an incomprehensible one, and back again. Since then, the actual and potential applications of cryptography have expanded to include many other areas, such as log-in protocols, electronic payment, democratic voting schemes, distributed management of databases, and others.

Recent research has shown that chaotic communication systems provide a suitable environment for cryptography because of properties such as ergodicity, sensitive dependence on initial conditions and control parameters, and high speed of information transmission. An increasing number of researchers work in this field, which has resulted in a variety of designs of cryptosystems based on chaotic systems [2, 3, 4, 5, 6, 7]. Another advantage of chaotic cryptosystems is the ease of their implementation. Conventional chaos-based cryptosystems, such as the well-known Baptista-type schemes [2, 3], have shown some inherent drawbacks [8], in particular security weaknesses even when the chaotic dynamics is completely hidden, and slow performance due to analytical floating-point computation and a small key space.


In [4] the authors propose a cryptosystem based on continuous Hopfield neural networks with delays that addresses all of the above-mentioned drawbacks. The distribution of the ciphertext is flat and the information entropy is high, which means that little information about the plaintext is revealed. However, as computers work in discrete time only, a certain difficulty arises from the transformation of the continuous neural network into a discrete system.

In this paper, we propose a cryptosystem based on the chaotic discrete-time Hopfield neural networks with delays presented and analyzed in [1]. This type of network is used to generate binary sequences that serve to permute each plaintext block, to encrypt it, and to choose the trajectory of the next iteration. Another novelty of this work is that the trajectory may be switched for every term of the binary sequence, providing a higher degree of randomness. We have based our conclusions on the study of image encryption, since it is known that, due to some inherent features of images, such as bulk data capacity and high correlation among pixels, image encryption is more complex than usual text encryption.

The paper is organized as follows. In Section 2 we describe the class of Hopfield neural networks used for encryption-decryption. In Section 3 we present the proposed encryption algorithm and show that decryption fully recovers the plaintext. In Section 4 we exemplify the scheme on a particular Hopfield neural network with delays. Section 5 is devoted to the security analysis, where we check the resistance to statistical and differential attacks, compute the information entropy, and analyze the key security. We finally draw some conclusions in Section 6.

2 Discrete-time chaotic Hopfield neural networks with delays

Consider the classical discrete-time Hopfield neural network of two neurons with two delays and no self-connections:
$$\begin{cases} x_{n+1} = a_1 x_n + T_{12}\, g_2(y_{n-k_2}) \\ y_{n+1} = a_2 y_n + T_{21}\, g_1(x_{n-k_1}) \end{cases} \qquad \forall n \geq \max(k_1, k_2) \qquad (2.1)$$

In this system $a_1, a_2 \in (0,1)$ are the internal decays of the neurons, $T_{12}$ and $T_{21}$ are the interconnection coefficients, $g_i : \mathbb{R} \to \mathbb{R}$ represent the neuron input-output activations and $k_1, k_2 \geq 0$ represent the delays. The activation functions $g_i$ are of class $C^3$ and satisfy $g_i(0) = 0$.

In [1] the following result concerning the chaotic behavior of (2.1) is proved:

Theorem 2.1. Suppose that the activation functions satisfy the following conditions:
i. $g_i'(0) \neq 0$, $i = 1, 2$;
ii. there exists $i \in \{1, 2\}$ and $s \neq 0$ such that $g_i(s) = 0$ and $g_i'(s) \neq 0$.
If $|T_{12}|$ and $|T_{21}|$ are sufficiently large, then system (2.1) is chaotic.

This class of discrete-time delayed neural networks constitutes the basis of the cryptosystem described in the following.

3 Encryption Algorithm

In order to encrypt a given plaintext we will use the chaotic trajectories of the two neurons of system (2.1) to generate some basic binary sequences, as follows.


For any $t$ in the interval $I = [d, e]$, the value $(t-d)/(e-d) \in [0,1]$ can be expressed in the binary representation
$$\frac{t-d}{e-d} = 0.b_1(t)\, b_2(t) \ldots b_i(t) \ldots, \qquad t \in [d,e], \quad b_i(t) \in \{0,1\}. \qquad (3.1)$$

The $i$th bit $b_i(t)$ can be represented as
$$b_i(t) = \sum_{r=1}^{2^i - 1} (-1)^{r-1}\, \Theta_{(e-d)(r/2^i)+d}(t), \qquad (3.2)$$

where $\Theta_\tau(t)$ is the threshold function defined by
$$\Theta_\tau(t) = \begin{cases} 0, & t < \tau \\ 1, & t \geq \tau \end{cases} \qquad (3.3)$$
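For illustration, the following Python sketch (not part of the original paper, whose experiments were done in Mathematica) computes $b_i(t)$ both directly from the binary expansion (3.1) and from the threshold sum (3.2), so that the two expressions can be checked against each other; the function names and the sample interval are our own choices.

```python
# Illustrative sketch of equations (3.1)-(3.3); names and interval are assumed, not from the paper.

def theta(tau, t):
    """Threshold function (3.3): 0 if t < tau, 1 otherwise."""
    return 1 if t >= tau else 0

def bit_from_threshold_sum(t, i, d, e):
    """i-th bit of (t-d)/(e-d) via the alternating threshold sum (3.2)."""
    return sum((-1) ** (r - 1) * theta((e - d) * (r / 2 ** i) + d, t)
               for r in range(1, 2 ** i))

def bit_from_expansion(t, i, d, e):
    """i-th binary digit of (t-d)/(e-d), read off directly from (3.1)."""
    x = (t - d) / (e - d)
    return int(x * 2 ** i) % 2

if __name__ == "__main__":
    d, e, i = -2.0, 2.0, 4            # assumed trajectory bounds and bit index
    for t in (-1.37, 0.25, 1.9):
        assert bit_from_threshold_sum(t, i, d, e) == bit_from_expansion(t, i, d, e)
```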

By equation (3.2), a binary sequence $B_i^k = \{b_i(t_k)\}_{k=0}^{\infty}$ is obtained, where $t_k$ is the $k$th iterate of $x$ or $y$.

We now give the main algorithm (a modification of the one described in [4]):

Step 1. Take an initial condition $x_0, x_1, x_2$ and $y_0, y_1, y_2$, and iterate the neural network $N_0$ times (in order for its dynamics to become chaotic). Take as starting point the last of these transient iterations, $x_0 = x_{N_0}$.

Step 2. Divide the message $m$ into subsequences $P_j$ of length $l$ bytes (in our scheme $l = 4$):
$$m = \underbrace{p_0, p_1, \ldots, p_{l-1}}_{P_0}, \underbrace{p_l, p_{l+1}, \ldots, p_{2l-1}}_{P_1}, p_{2l}, \ldots$$

Step 3. Compute the binary sequences $A_j = B_i^1 B_i^2 \ldots B_i^{32}$ and $A_j^1 = B_i^{33} B_i^{34} \ldots B_i^{37}$, all supplied by the fourth bits, i.e. $i = 4$ in equation (3.2). An integer $D_j$ is computed as the decimal value of $A_j^1$; this value will be used to iterate the neural network successively $D_j$ times after the current block has been encrypted. The choice between the two neurons of the network is made by the following rule: if the first byte of the message sequence is being encrypted, choose $x$; otherwise, choose the neuron according to the previously generated bit $b_i$: if $b_i = 0$ choose $x$, otherwise use $y$.

Step 4. Permute the message block $P_j$ by a left cyclic shift of $D_j$ bits, and the block $A_j$ by a right cyclic shift of $D_j$ bits. Thus the blocks $P_j'$ and $A_j'$ are obtained.

Step 5. Generate $C_j$ from $P_j'$ and $A_j'$ according to the rule
$$C_j = P_j' \oplus A_j',$$
where $\oplus$ is the XOR operation. Thus we obtain the ciphertext $C_j$ for the message block $P_j$. Dividing the ciphertext $C_j$ into 8-bit partitions, we obtain the ciphertext bytes $c_j, c_{j+1}, c_{j+2}, c_{j+3}$ corresponding to the plaintext bytes $p_j, p_{j+1}, p_{j+2}, p_{j+3}$, respectively.

Step 6. If all plaintext blocks have been encrypted, the encryption process is completed. Otherwise, let $N_0 = N_0 + D_j$ (if $N_0$ is larger than a threshold, reset $N_0$ to 1), let $x_0$ be the $N_0$th iterate of the neuron selected by the last computation made at Step 3, and go to Step 2.

Fig. 7 is a block diagram of the proposed encryption scheme.

The decryption process is the same as the encryption one; we only need the equation
$$P_j' = C_j \oplus A_j',$$


from which we can obtain the permuted message block $P_j'$. By the inverse permutation, based on the value of $D_j$, we obtain the message block $P_j$. Separating the message block into bytes, the plaintext is recovered. Note that the ciphertext has the same size as the plaintext.

The key of this algorithm is given by the initial conditions, the activation functions $g_1, g_2$, the parameters $a_1, a_2, T_{12}, T_{21}$, the delays $k_1, k_2$, and the number of transient iterations $N_0$.
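To make Steps 3-5 concrete, here is a minimal Python sketch (our own illustrative code, not the authors' implementation) of encrypting and decrypting one 4-byte block, assuming the 37 keystream bits $B_4^1 \ldots B_4^{37}$ for the block have already been produced from the chaotic trajectory; a sketch of generating such bits is given in Section 4. The big-endian packing of bits and bytes is an assumption on our part.

```python
# Sketch of one block of the scheme (Steps 3-5); keystream bits are assumed given.

def bits_to_int(bits):
    """Interpret a list of 0/1 values as a big-endian integer."""
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value

def rol32(x, n):
    """Left cyclic shift of a 32-bit word by n bits."""
    n %= 32
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def ror32(x, n):
    """Right cyclic shift of a 32-bit word by n bits."""
    return rol32(x, (32 - n) % 32)

def encrypt_block(plain_block, keystream_bits):
    """plain_block: 4 bytes; keystream_bits: 37 bits from the chaotic trajectory."""
    a_j = bits_to_int(keystream_bits[:32])        # A_j, 32 bits
    d_j = bits_to_int(keystream_bits[32:37])      # D_j, decimal value of A_j^1 (5 bits)
    p_j = int.from_bytes(plain_block, "big")
    c_j = rol32(p_j, d_j) ^ ror32(a_j, d_j)       # C_j = P_j' xor A_j'
    return c_j.to_bytes(4, "big"), d_j            # D_j also drives the extra iterations

def decrypt_block(cipher_block, keystream_bits):
    a_j = bits_to_int(keystream_bits[:32])
    d_j = bits_to_int(keystream_bits[32:37])
    c_j = int.from_bytes(cipher_block, "big")
    p_j = ror32(c_j ^ ror32(a_j, d_j), d_j)       # P_j' = C_j xor A_j', then undo the shift
    return p_j.to_bytes(4, "big")

if __name__ == "__main__":
    ks = [1, 0] * 18 + [1]                        # 37 dummy keystream bits, for testing only
    ct, _ = encrypt_block(b"abcd", ks)
    assert decrypt_block(ct, ks) == b"abcd"
```

Since XOR and cyclic shifts are exactly invertible, decrypt_block recovers the block whenever the same keystream bits are reproduced on the receiving side, which is what makes exact decryption possible.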

4 Experimental results

In this section, some experiments are presented to evaluate the performance of the proposed algorithm. We have based our evaluations on a particular cryptosystem, built on the following Hopfield neural network:
$$\begin{cases} x_{n+1} = \frac{1}{4} x_n + \sin(y_{n-2}) \\ y_{n+1} = \frac{3}{4} y_n + b \tanh(x_{n-1}) \end{cases} \qquad \forall n \geq 2 \qquad (4.1)$$

In [1] it has been shown that (4.1) exhibits hyperchaotic behavior in a neighborhood of the trivial solution for $b = -2$; therefore we have chosen this value of $b$ in our computations. When iterating (4.1) we have chosen $x_0 = x_1 = x_2 = y_0 = y_1 = 0.01$ as initial conditions, and the transient time $N_0 = 10000$.

Experimental analysis of the algorithm presented in this paper has been done with an image. Fig. 1 shows the encryption and the decryption of a color image of size 240 × 320. It can be seen that the decrypted image is clear and correct, without any distortion. The experiments were performed with Mathematica 6.0.
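A minimal Python sketch of iterating system (4.1) with $b = -2$ and of extracting keystream bits from the trajectory is given below; the normalization interval $[d, e]$ is not specified at this level of detail in the paper and is an assumed choice here, as are the function names.

```python
import math

# Sketch of iterating system (4.1) with b = -2 and reading off the 4th bit of each iterate.
# The interval [d, e] used to normalize the trajectory is an assumed choice.

def iterate(x, y, steps, b=-2.0):
    """Advance the delayed system (4.1) 'steps' times.
    x = [x_{n-2}, x_{n-1}, x_n], y = [y_{n-2}, y_{n-1}, y_n]."""
    for _ in range(steps):
        x_next = 0.25 * x[-1] + math.sin(y[-3])        # x_{n+1} = (1/4) x_n + sin(y_{n-2})
        y_next = 0.75 * y[-1] + b * math.tanh(x[-2])   # y_{n+1} = (3/4) y_n + b tanh(x_{n-1})
        x = x[1:] + [x_next]
        y = y[1:] + [y_next]
    return x, y

def fourth_bit(t, d=-3.0, e=3.0):
    """b_4(t) of (t-d)/(e-d), cf. equations (3.1)-(3.2); [d, e] is an assumed bound."""
    frac = (t - d) / (e - d)
    return int(frac * 16) % 2

if __name__ == "__main__":
    x, y = [0.01] * 3, [0.01] * 3                      # initial conditions as in Section 4
    x, y = iterate(x, y, 10000)                        # N_0 transient iterations
    bits = []
    for _ in range(37):                                # one block's worth of keystream bits
        x, y = iterate(x, y, 1)
        bits.append(fourth_bit(x[-1]))                 # always the x neuron here, for brevity
    print(bits)
```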

Fig. 1 (a) Original image. (b) Encrypted image. (c) Decrypted image.

5 Security analysis

Security is a major goal for a cryptosystem designer. In this section we analyze in detail the performance of the proposed encryption scheme, through several tests checking the security of the cryptosystem: statistical tests (histogram analysis and computation of the correlation coefficients of adjacent pixels), a security test against differential attacks (computation of NPCR and UACI), information entropy evaluation, and security key analysis.

5.1 Statistical analysis

According to Shannon's theory, a good encryption scheme should have excellent performance in resisting statistical attacks, hence the ciphertext has to possess certain randomness properties. The distribution of the ciphertext is very important because it has to hide the redundancy of the plaintext and


should not leak any information about the plaintext or any relation between the ciphertext and the plaintext. We have made our tests on a JPEG file.

Histograms. The histograms in Fig. 2 represent the distribution of the averaged (grey) values of the pixels. One can see that the distribution for the cipher-image is approximately Gaussian. The same distribution of the pixel values in the ciphertext is observed in the case of a black or a white image; thus, a cryptanalyst gets no information about whether an image is being sent or not. The Gaussian shape appears because we take a weighted average (Grayscale = 0.3 · R + 0.59 · G + 0.11 · B) of the three color components (RGB). If we calculate separately the histograms of the red, green, and blue components of the cipher-image, we obtain the uniform distributions shown in Fig. 4, versus the red, green and blue distributions of the plain-image (Fig. 3). After encryption, the distributions of the ciphertext components are flat, no matter what the original image was, and hence provide no clue for statistical attacks.

Fig. 2 Grayscale histogram of (a) the original image, (b) the encrypted image.

Fig. 3 Histogram of the original image in the (a) red, (b) green, (c) blue components.

Fig. 4 Histogram of the encrypted image in the (a) red, (b) green, (c) blue components.


Correlation of adjacent pixels. It is well known that adjacent image pixels are highly correlated in the horizontal, vertical and diagonal directions. This high correlation can be quantified by means of the correlation coefficient
$$\rho = \frac{\mathrm{cov}(p,q)}{\sqrt{D(p)}\,\sqrt{D(q)}},$$
where
$$D(p) = \frac{1}{N}\sum_{i=1}^{N}(p_i - \bar{p})^2, \qquad \mathrm{cov}(p,q) = \frac{1}{N}\sum_{i=1}^{N}(p_i - \bar{p})(q_i - \bar{q}).$$
Here $p_i$ and $q_i$ denote two adjacent pixels (either horizontal or vertical), $N$ is the total number of adjacent pixel pairs obtained from the image, and $\bar{p}$ and $\bar{q}$ are the mean values of $p_i$ and $q_i$, respectively.

Table 1 shows the correlation coefficients of the image considered in this study and of its encrypted counterpart. In the experiment we randomly selected 1000 pairs of horizontally (respectively vertically) adjacent pixels (gray level) from the plain-image and from the cipher-image. In Fig. 5(a), (c), the points are distributed about the line of perfect equality, which shows that adjacent pixels in the plain-image are linearly correlated. In Fig. 5(b), (d) the points are dispersed, which indicates a low correlation of adjacent pixels in the cipher-image. The computed correlation coefficients of the cipher-image are very small, indicating that adjacent pixels of the cipher-image are practically uncorrelated.

Table 1 Correlation coefficients of adjacent pixels in the two images

  Direction     Original image    Encrypted image
  Horizontal    0.964699          0.0362374
  Vertical      0.952944          -0.0105557
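A short sketch of the adjacent-pixel correlation test, with the sample of 1000 random pairs mentioned above and NumPy as an assumed tool, is given below.

```python
import numpy as np

def adjacent_correlation(gray, n_pairs=1000, direction="horizontal", rng=None):
    """Correlation coefficient rho of randomly chosen adjacent gray-level pixel pairs."""
    rng = np.random.default_rng(rng)
    h, w = gray.shape
    if direction == "horizontal":
        rows = rng.integers(0, h, n_pairs)
        cols = rng.integers(0, w - 1, n_pairs)
        p, q = gray[rows, cols], gray[rows, cols + 1]
    else:                                           # vertical
        rows = rng.integers(0, h - 1, n_pairs)
        cols = rng.integers(0, w, n_pairs)
        p, q = gray[rows, cols], gray[rows + 1, cols]
    cov = np.mean((p - p.mean()) * (q - q.mean()))
    return cov / (p.std() * q.std())                # D(.) is the population variance

if __name__ == "__main__":
    demo = np.random.default_rng(0).integers(0, 256, (240, 320)).astype(float)
    print(adjacent_correlation(demo, direction="horizontal"))
```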

5.2 Differential attack

As a general requirement for all image encryption schemes [7], the encrypted image should be greatly different from its original form. Such a difference can be measured by means of two criteria, namely the number of pixels change rate (NPCR) and the unified average changing intensity (UACI). The $\mathrm{NPCR}_{R,G,B}$ measures the change rate of the pixels of the ciphered image when one pixel of the plain-image is changed (for the red, green and blue components, respectively). Let $S(i,j)$ and $S'(i,j)$ be the $(i,j)$th pixels of two images $S$ and $S'$, respectively. Then $\mathrm{NPCR}_{R,G,B}$ can be defined as
$$\mathrm{NPCR}_{R,G,B} = \frac{\sum_{i,j} D_{R,G,B}(i,j)}{N} \times 100\%,$$
where $N$ is the total number of pixels in the image and $D_{R,G,B}(i,j)$ is defined as
$$D_{R,G,B}(i,j) = \begin{cases} 0 & \text{if } S_{R,G,B}(i,j) = S'_{R,G,B}(i,j) \\ 1 & \text{if } S_{R,G,B}(i,j) \neq S'_{R,G,B}(i,j) \end{cases}$$


Fig. 5 Correlation of two adjacent pixels. The first two frames show the distribution of two horizontally adjacent pixels in the plain and encrypted image, respectively. The last two frames show the distribution of two vertically adjacent pixels in the plain and the encrypted image, respectively.

where $S_{R,G,B}(i,j)$ and $S'_{R,G,B}(i,j)$ are the values of the corresponding color component red (R), green (G) or blue (B) in the two images. For example, for two random images with $512 \times 512$ pixels and 24-bit true color, $\mathrm{NPCR}_R = \mathrm{NPCR}_G = \mathrm{NPCR}_B = 99.609375\%$.

The second criterion, $\mathrm{UACI}_{R,G,B}$, measures the average intensity of the differences between the plain-image and the ciphered image, and can be defined as
$$\mathrm{UACI}_{R,G,B} = \frac{1}{N} \sum_{i,j} \left( \frac{|S_{R,G,B}(i,j) - S'_{R,G,B}(i,j)|}{2^{B_{R,G,B}} - 1} \right) \times 100\%,$$
where $B_{R,G,B}$ is the number of bits used to represent the red, green or blue color component, respectively. In the case of two random images, the expected value of $\mathrm{UACI}_{R,G,B}$ is $\mathrm{UACI}_R = \mathrm{UACI}_G = \mathrm{UACI}_B = 33.46354\%$.

We have computed $\mathrm{NPCR}_{R,G,B}$ and $\mathrm{UACI}_{R,G,B}$ for the proposed cryptosystem to assess the influence of changing a single pixel of the original image on the encrypted image. We have found
$$\mathrm{NPCR}_R = 99.6146\%, \quad \mathrm{NPCR}_G = 99.6055\%, \quad \mathrm{NPCR}_B = 99.6094\%,$$
all over 99%, and
$$\mathrm{UACI}_R = 38.9223\%, \quad \mathrm{UACI}_G = 40.3828\%, \quad \mathrm{UACI}_B = 41.9202\%,$$
all over 33%, showing thereby that the encryption scheme is very sensitive with respect to small changes in the plaintext.
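The NPCR and UACI computations are straightforward to express in code. The following NumPy sketch (illustrative, with assumed names) takes two cipher-images of one color component, such as those obtained before and after changing a single pixel of the plain-image.

```python
import numpy as np

def npcr_uaci(c1, c2, bits=8):
    """NPCR and UACI (in %) between two cipher-images of one color component,
    given as integer arrays of the same shape."""
    c1 = c1.astype(np.int64)
    c2 = c2.astype(np.int64)
    n = c1.size                                    # total number of pixels N
    npcr = np.count_nonzero(c1 != c2) / n * 100.0
    uaci = np.abs(c1 - c2).sum() / (n * (2 ** bits - 1)) * 100.0
    return npcr, uaci

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    a = rng.integers(0, 256, (512, 512))           # two random 8-bit "images"
    b = rng.integers(0, 256, (512, 512))
    print(npcr_uaci(a, b))                         # expected around (99.61, 33.46)
```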


5.3 Information entropy analysis

The entropy $H(m)$ of a message source $m$ can be calculated as
$$H(m) = \sum_{i=0}^{2^B - 1} p(m_i) \log \frac{1}{p(m_i)},$$
where $B$ is the number of bits used to represent a symbol $m_i \in m$, $p(m_i)$ is the probability of the symbol $m_i$, and $\log$ is the base-2 logarithm, so that the entropy is expressed in bits. For a purely random source emitting $2^B$ symbols, the entropy is $H(m) = B$. For encrypted messages, the entropy should ideally be $H(m) = B$ (in our case, as the representation is on 8 bits, $H(m) = 8$). When a cipher emits symbols with entropy less than $B$, there exists a certain degree of predictability, which threatens its security.

Let us consider the ciphertext of our image, encrypted using the proposed scheme. The number of occurrences of each ciphertext pixel value $m_i$ is recorded and the probability of occurrence is computed for each of the three image color components (R, G, B). The entropies of the three color components are:
$$H_R(m) = 7.98765 \approx 8, \quad H_G(m) = 7.98795 \approx 8, \quad H_B(m) = 7.98844 \approx 8.$$
This means that information leakage in the encryption process is negligible and the encryption system is secure against entropy attacks.

5.4 Security key analysis

Chaotic behavior of a map means that the map is sensitive to initial conditions. We have changed the initial conditions of our experimental Hopfield neural network, considering $x_0 = x_1 = x_2 = 0.018$, $y_0 = y_1 = 0.016$ instead of $x_0 = x_1 = x_2 = y_0 = y_1 = 0.01$. The result of decrypting with these slightly different initial conditions is shown in Fig. 6, which indicates that our cryptosystem is key-sensitive with respect to the initial conditions.

Fig. 6 Result of decrypting the encrypted image using slightly different initial conditions.

A secure cryptosystem also requires a large key space. In our example, the key space may be considered infinitely large because of the large number of key parameters, which include the activation functions themselves.


Fig. 7 Block diagram of the proposed encryption scheme


6 Conclusion

In this paper, we proposed a novel encryption approach based on chaotic discrete-time Hopfield neural networks with delays. The chaotic neural network is used to generate binary sequences which are employed for masking the plaintext, for permuting it, and for choosing the iterations of the trajectory. Simulation results show that the distribution of the ciphertext is flat. Moreover, the security analysis tests, together with the difficulty of synchronizing chaotic neural networks with time delays, indicate that the resulting cryptosystem is secure.

Acknowledgements

This work has been supported by CNCSIS-UEFISCSU under the contracts PN-II-11028/14.09.2007 (NatComp - New Natural Computing Models in the Study of Complexity) and PN-II-RU-PD-145/2010 (Advanced impulsive and fractional-order neural network models).

References

[1] E. Kaslik, St. Balint. Chaotic dynamics of a delayed discrete-time Hopfield network of two nonidentical neurons with no self-connections. Journal of Nonlinear Science, 18(4): 415-432, 2008.
[2] M. S. Baptista. Cryptography with chaos. Physics Letters A, 240: 50-54, 1998.
[3] K.-W. Wong, S.-W. Ho, C.-K. Yung. A chaotic cryptography scheme for generating short ciphertext. Physics Letters A, 310: 67-73, 2003.
[4] W. Yu, J. Cao. Cryptography based on delayed chaotic neural networks. Physics Letters A, 356: 333-338, 2006.
[5] H. Yang, X. Liao, K. Wong, W. Zhang, P. Wei. A new cryptosystem based on chaotic map and operations algebraic. Chaos, Solitons & Fractals, DOI 10.1016/j.chaos.2007.10.046, in press, 2007.
[6] S. Behnia, A. Akhshani, H. Mahmodi, A. Akhavan. A novel algorithm for image encryption based on mixture of chaotic maps. Chaos, Solitons & Fractals, 35: 408-419, 2008.
[7] R. Rhouma, S. Meherzi, S. Belghith. OCML-based colour image encryption. Chaos, Solitons & Fractals, DOI 10.1016/j.chaos.2007.07.083, in press, 2007.
[8] G. Jakimoski, L. Kocarev. Analysis of some recently proposed chaos-based encryption algorithms. Physics Letters A, 291: 381-384, 2001.
