Pixel Correlation Behavior in Different Themes

Alaa A. Jabbar¹, Shahrin Bin Sahib¹, and Mazdak Zamani²

¹ Faculty of Information & Communication Technology, Universiti Teknikal Malaysia Melaka, 76100 Melaka, Malaysia
² Advanced Informatics School, Universiti Teknologi Malaysia, 54100 Kuala Lumpur, Malaysia
[email protected], [email protected], [email protected]

Abstract. This paper intends to study pixel correlation behavior of photos in different themes based on statistical analysis of pixel correlation evaluation.

Keywords: Pixel Correlation, Steganography, Watermarking, Steganalysis, Photo Themes.

1 Introduction

The statistical analysis conducted with the proposed pixel correlation evaluation revealed that photos in different themes show different pixel correlation behavior. Therefore, for more accurate steganalysis it is very important to analyze a photo based on its content. To do so, original photos and their stego copies must be analyzed to discover the relations and patterns [1-4]. The known photo themes are solid, smooth, and edgy photos. Furthermore, the results of the pixel correlation analysis show that there is a difference between photos with a wide smooth area and photos with a partial smooth area. Thus, the four photo themes analyzed for extracting patterns are wide flat region, wide smooth region, partial smooth region, and edgy photos [5-9].

2 Correlativity Analysis of Pixels

The layer is the main concept for correlativity analysis in the offered method. All the pixels located in the same layer of neighbors have the same importance, and if even one of the pixels in the layer is correlated to the central pixel then that layer and all outer layers are counted as correlated. To confirm the correlativity of a layer, at least one of the pixels in that layer or in one of the inner layers should be correlated to the central pixel [10-14]. Figure 1 illustrates the layer concept by separating the pixels of each layer in different colors. The central pixel is a single pixel colored in red. All the pixels next to the central pixel constitute the first layer, and the pixels adjacent to the outside of the first layer constitute the second layer.


Fig. 1. Illustration of layer concept

White pixels are in the third layer, and the pixels numbered 4 are located in the fourth layer. The pixels of the last (fifth) layer are colored in violet and numbered 5. If any correlativity exists in one of the layers, then every layer from that particular layer up to the fifth layer is counted as correlated. For instance, assume no correlativity exists between the central pixel and the pixels in the first layer, but there is a correlated pixel in the second layer. The second, third, fourth, and fifth layers are then seen as correlated to the central pixel, as at least one correlated pixel has been found in their internal territory. Since there is no correlated pixel in the first layer, and it has no inner layer containing a correlated pixel, the first layer is not correlated. Pixel color matching and channel color matching are the two proposed correlativity evaluation methods for discovering the correlativity of a pixel with the neighbor pixels in different layers. Since in pixel color matching all three color channels of Red, Green, and Blue (RGB) must be the same between the central pixel and one of the neighbor pixels, this comparison can logically also be called AND correlativity analysis. In the second method, if either the Red, Green, or Blue value is the same between the central pixel and one of the neighbor pixels, they are considered correlated pixels. Because the neighbor pixels are analyzed up to five layers, the first five rows and columns of pixels from the top and left, and the last five rows and columns from the bottom and right side of the photo, are ignored in the analysis process as they do not have enough neighbors. In Figure 2, the pixels in the small yellow, blue, and white squares have enough neighbors to be considered for analysis, while the small violet square has only four layers of neighbors on its left and bottom sides and is therefore not qualified for analysis.
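As an informal illustration of the layer concept and of the analyzable region, the following Python sketch enumerates the pixel offsets that form each layer and the coordinates that have a complete five-layer neighborhood. The helper names (layer_offsets, analyzable_pixels) are illustrative and not part of the paper.

def layer_offsets(layer):
    # Coordinate offsets of the pixels forming the given layer: the square ring
    # at Chebyshev distance `layer` from the central pixel. Layer k holds 8*k pixels.
    offsets = []
    for dy in range(-layer, layer + 1):
        for dx in range(-layer, layer + 1):
            if max(abs(dy), abs(dx)) == layer:  # the ring itself, not the inner layers
                offsets.append((dy, dx))
    return offsets

def analyzable_pixels(height, width, margin=5):
    # Coordinates that have a full five-layer neighborhood; the first and last
    # `margin` rows and columns are skipped, as described in the text.
    for y in range(margin, height - margin):
        for x in range(margin, width - margin):
            yield y, x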

2.1 AND Correlativity Analysis

Pixel color matching, also referred to as AND correlativity analysis, examines the correlativity of the central pixel with its neighbors in different layers: the red, green, and blue channels of both pixels must all be equal. The analysis starts by comparing the central pixel with the first neighbor pixel in the first layer. The process then continues by comparing the color channels of the central pixel with the remaining neighbors in the same layer, and then with the neighbors in the next layers up to the fifth layer [15-19]. If there is a correlated pixel within the boundary of the first layer, that correlativity also exists within the boundaries of the second, third, fourth, and fifth layers. Therefore there is no need to continue the pixel color matching process for outer layers once a matching pixel is found in an inner layer, which also speeds up the correlativity analysis [20-24]. When a pixel with the same color channels as the central one is found, the comparison stops and the flags of finding a correlated pixel are set to true from that particular layer up to the last layer. For each layer, the proportion of correlated pixels to the total number of analyzed pixels gives the correlativity percentage of the photo for that layer [25-31]. The following pseudo code shows the AND correlativity analysis of a single pixel. The variables used are:

Found: a flag to verify whether the loop should continue or not; a value of true means that correlativity has been found and there is no need to continue the process.
Layer_Flag_i: indicates whether any correlated pixel has been found for layer i.
Layer: identifier of the layer, varying from 1 to 5.
Layer.n: number of pixels in the particular layer (not the whole territory of the layer).

Found = false
For i = 1 to 5 do Layer_Flag_i = false
For Layer = 1 to 5 do
Begin
  For Neighbor = 1 to Layer.n do
  Begin
    If (CentralPixel.Red = Layer.Neighbor.Red) and
       (CentralPixel.Green = Layer.Neighbor.Green) and
       (CentralPixel.Blue = Layer.Neighbor.Blue) then
    Begin
      For i = Layer to 5 do Layer_Flag_i = true
      Found = true
    End
    If Found then break
  End
  If Found then break
End
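A rough Python counterpart of this pseudo code (an illustrative sketch, not the paper's implementation) could use the layer_offsets helper sketched earlier and treat the photo as a NumPy H×W×3 integer array:

def and_correlated_layers(img, y, x, layers=5):
    # Per-layer flags for AND (pixel color matching) correlativity of the pixel
    # at (y, x): layer k is flagged once a pixel in layer k or in an inner layer
    # has identical Red, Green, and Blue values.
    flags = [False] * layers
    center = tuple(img[y, x])
    for layer in range(1, layers + 1):
        for dy, dx in layer_offsets(layer):
            if tuple(img[y + dy, x + dx]) == center:
                for k in range(layer, layers + 1):
                    flags[k - 1] = True  # correlativity is inherited by all outer layers
                return flags  # no need to examine further pixels
    return flags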


Fig. 2. Pixel analysis area

Figure 3 illustrates how the pixels are analyzed in AND correlation analysis. The upper half shows a photo, and the lower half zooms in on an 11×11-pixel rectangle of it. The numbers in the red square show the red, green, and blue values of the central pixel. The 8 pixels between the red and green squares are the first-layer neighbors. The 16 pixels between the blue and green squares are the second-layer neighbors, and this order continues until the last layer, which consists of the 40 pixels between the orange and black squares. The RGB values of each pixel are written in top-down order in yellow. The RGB values of the central pixel, and the matching values in other pixels, are highlighted in white. As is visible in Figure 3, there is no value matching the central pixel in the first layer. In the second layer only the red value of one pixel is the same, but since in AND pixel correlativity analysis all three values should be the same, there is no correlated pixel in the second layer. In the third layer three pixels have the same red, green, or blue value, but only one of them has the same red, green, and blue codes. Therefore in layer three there is a pixel correlated to the central pixel, which means layers three, four, and five are correlated to the central pixel, as there is at least one correlated pixel in their territories. After finding a correlated pixel in the third layer there is no need to continue the RGB comparison for the fourth and fifth layers; they are considered correlated to the central pixel even though no correlated pixel lies within those particular layers themselves. To calculate the percentage of correlated pixels of this photo, both layer one and layer two have the value of zero, and the third, fourth, and fifth layers have the value of one. The total number of pixels that are RGB correlated in layer (i), multiplied by 100 and divided by the total number of analyzed pixels, gives the correlativity percentage of that layer:

Layer (i) R|G|B pixel correlativity = [total number of layer (i) R|G|B correlated pixels × 100] / [total number of analyzed pixels] %
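Putting the pieces together, a small aggregation routine (again an illustrative sketch built on the helpers above, not the paper's code) could compute one correlativity percentage per layer for a whole photo:

def layer_correlativity_percentages(img, layers=5):
    # Aggregate the per-layer AND correlativity over all analyzable pixels and
    # return one percentage per layer, following the formula above.
    height, width = img.shape[0], img.shape[1]
    counts = [0] * layers
    total = 0
    for y, x in analyzable_pixels(height, width, margin=layers):
        total += 1
        for k, flag in enumerate(and_correlated_layers(img, y, x, layers)):
            counts[k] += int(flag)
    if total == 0:
        return [0.0] * layers  # photo too small to contain any analyzable pixel
    return [100.0 * c / total for c in counts]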


Fig. 3. AND pixel correlativity analysis illustration

2.2 OR Correlativity Analysis

Channel color matching, or in other words OR correlativity analysis, measures the correlativity of the central pixel with the neighbor pixels up to five layers: if the red, green, or blue channel value of the central pixel is the same as that of one of the neighbors, then that layer and every layer up to the fifth are considered OR correlated [25-31].


The comparison process starts from the first pixel in the first layer and continues in a spiral until the last pixel of the fifth layer. Since the inner layers are located inside the territory of the outer layers, as soon as the first identical color channel between the central pixel and one of the neighbors is found, the analysis stops and the flags of finding a correlated pixel of that layer and all outer layers are set to true. For each layer, the proportion of correlated pixels to the total number of analyzed pixels gives the correlativity percentage of that layer. The following pseudo code shows the color channel correlativity analysis of a single pixel. The variables used are:

Found: a flag to verify whether the loop should continue or not; a value of true means that correlativity has been found and there is no need to continue the process.
Layer_Flag_i: indicates whether any correlated pixel has been found for layer i.
Layer: identifier of the layer, varying from 1 to 5.
Layer.n: number of pixels in the particular layer (not the whole territory of the layer).

Found = false
For i = 1 to 5 do Layer_Flag_i = false
For Layer = 1 to 5 do
Begin
  For Neighbor = 1 to Layer.n do
  Begin
    If (CentralPixel.Red = Layer.Neighbor.Red) or
       (CentralPixel.Green = Layer.Neighbor.Green) or
       (CentralPixel.Blue = Layer.Neighbor.Blue) then
    Begin
      For i = Layer to 5 do Layer_Flag_i = true
      Found = true
    End
    If Found then break
  End
  If Found then break
End
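The AND sketch shown earlier can be adapted to this OR check simply by relaxing the comparison to a single matching channel; a hypothetical Python variant:

def or_correlated_layers(img, y, x, layers=5):
    # Like and_correlated_layers, but layer k is flagged as soon as any neighbor
    # in layer k or in an inner layer shares at least one channel value.
    flags = [False] * layers
    r, g, b = img[y, x]
    for layer in range(1, layers + 1):
        for dy, dx in layer_offsets(layer):
            nr, ng, nb = img[y + dy, x + dx]
            if nr == r or ng == g or nb == b:
                for k in range(layer, layers + 1):
                    flags[k - 1] = True
                return flags
    return flags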

Figure 4 illustrates the process of color channel correlativity analysis. The upper half of the figure shows a photo, and an 11×11 square of its pixels is zoomed in the lower half for color channel (OR) correlativity analysis. The central pixel is highlighted by a red square, and layers one to five are outlined in blue, green, white, orange, and black squares, respectively. The pixels of each layer are located between the squares, but to consider a layer as correlated it is enough to find a correlated pixel either in that layer or in one of the inner layers.


Fig. 4. OR pixel correlativity analysis illustration

The numbers of pixels in layers one to five are 8, 16, 24, 32, and 40, respectively. The numbers of pixels in the whole territories of layers one to five are 8, 24, 48, 80, and 120, respectively. This means, for example, that for the fifth layer, if any of the 120 pixels is OR correlated to the central one then this layer is considered a correlated layer, regardless of the exact location of that pixel. In Figure 4, the RGB values of the central pixel are 104, 112, and 61, respectively. In the first and second layers there is no channel with the same value as the central channels, so the first and second layers are not correlated. In the third layer the red value of one of the pixels, highlighted in white, is the same as the red value of the central pixel. Hence layer three and the outer layers are considered correlated layers, and there is no need to continue the analysis after finding the first matching channel.

Layer (i) R|G|B pixel correlativity = [total number of layer (i) R|G|B correlated pixels × 100] / [total number of analyzed pixels] %
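The layer sizes quoted above follow directly from the ring geometry: layer k contains 8k pixels, and its territory (all layers up to k) contains (2k + 1)² − 1 pixels. A quick sanity check using the illustrative layer_offsets helper from earlier:

for k in range(1, 6):
    assert len(layer_offsets(k)) == 8 * k
    assert sum(len(layer_offsets(j)) for j in range(1, k + 1)) == (2 * k + 1) ** 2 - 1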

3 Correlativity Patterns

The pixels of photos are naturally correlated to their neighbor pixels. Photos with wide flat regions have the strongest correlativity, followed by wide smooth region, partial smooth region, and edgy photos. When an external process manipulates the RGB values of the pixels to embed external data, the correlativity of the pixels is affected, and for higher embedding bit-rates the correlativity is degraded more. The distance of a photo's pixel correlativity from its average value in natural photos can be used as a steganalysis measurement, both for distinguishing stego images from original ones and, at a more advanced stage, for predicting the ratio of embedded data. To derive these steganalysis measurements, the correlativity of stego images should be compared with the average correlativity of original images in that particular photo theme. The proposed pixel color and channel color correlativity examination methods can be used to define a pattern for each photo theme. In addition to the original photos, stego photos will also have patterns different from the original ones, as their correlativity is affected by the insertion of data. In more depth, stego photos with different bit-rates and different photo themes will have correlativity patterns different from other types. Therefore, to obtain an accurate steganalysis method based on the pixel color and channel color correlativity of pixels, stego photos in different photo themes with different embedding rates should be analyzed to extract the patterns. After extraction, the patterns can be used for analyzing photos, firstly to distinguish stego objects from original images and secondly to predict the bit-rate of the data embedded in the object.

3.1 Determining Layer Examination Threshold

Correlativity analysis examines the pixels in five layers, but for higher performance and better speed of the steganalysis system, the best threshold for the number of analyzed layers should be determined based on experimental results. Moving from the first layer toward the fifth layer, the likelihood of a layer being correlated increases, as there are more pixels: this increases the chance of finding a correlation, and the correlation of inner layers is automatically inherited by the outer layers. Therefore the probability of correlated layers is higher for outer layers, and they overlap each other. Experimental correlation analysis on the data set can determine the best analysis threshold layer.

3.2 Pattern Matching System

After determining the best threshold for correlativity analysis, the patterns of the different original photo themes and their embedded copies must be extracted. Since some samples of the data set may produce correlativity analysis results far from the normal behavior of their category, the generated results must be purified through a statistical process. The standard deviation can be used to filter abnormal results out of the generated analysis results.


The standard deviation of a data set, statistical population, random variable, or probability distribution can be calculated by computing the square root of its variance. Unlike the variance, an important property of the standard deviation is that it is expressed in the same unit as the data. It is commonly used to measure margins of error and the confidence of statistical conclusions; the reported margin of error is normally about twice the standard deviation. In research, the standard deviation of the experimental data is generally reported, and data located outside of this range are either ignored or treated as not statistically significant. The standard deviation thus distinguishes normal error or variation from unusual variation. To calculate the standard deviation, firstly the average of the data set values must be computed, and then the mean is used to calculate the variance; the square root of the variance is the standard deviation. The variables used are as follows:

X_i = item of the data set
f_i = frequency of the item in the data set
N = number of items in the data set
X̄ = arithmetic mean
S² = variance
S = standard deviation

X̄ = (1/N) Σ_{i=1..K} f_i X_i

S² = (1/N) Σ_{i=1..K} f_i (X_i − X̄)²

S = √(S²)
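For reference, these quantities can be computed directly; a small NumPy sketch (illustrative, not from the paper) for a frequency-weighted data set:

import numpy as np

def weighted_mean_std(values, freqs):
    # Arithmetic mean and standard deviation of a frequency-weighted data set,
    # following the formulas above (dividing by N, the sum of the frequencies).
    values = np.asarray(values, dtype=float)
    freqs = np.asarray(freqs, dtype=float)
    n = freqs.sum()
    mean = (freqs * values).sum() / n
    variance = (freqs * (values - mean) ** 2).sum() / n
    return mean, np.sqrt(variance)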

The standard deviation gives the range of the most reliable data around the mean. Therefore values on both the right and left sides of the average, within the range of one standard deviation, are regarded as reliable data. Values located beyond the border of the mean minus the standard deviation toward the minimum, and beyond the mean plus the standard deviation toward the maximum, lose their importance and reliability. Therefore, to extract the pattern of each category in the data set, the mean, standard deviation, minimum, and maximum values are required. After extracting the patterns, a marking system should measure new samples to find the pattern most similar to the correlativity behavior of the analyzed photo. Since results within the boundaries of the standard deviation are considered genuine values, they should have maximum worth in the evaluation system, and from those boundaries toward the maximum and minimum the analysis results should lose their importance. If the results in the reliable area get the full mark, then results outside the boundaries of the standard deviation around the mean should lose importance based on their distance to the genuine area. Therefore, as shown in Figure 5, analysis results in the genuine area get the full mark of one, and other results, based on their distance, get a value between 0 and 1.


Fig. 5. Pattern analysis marking system
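One way to realize this marking for a single layer is sketched below. The linear falloff outside the genuine area is an assumption made for illustration; the paper only states that the mark decreases with distance from the reliable region, and the function and variable names are hypothetical.

def layer_mark(value, mean, std, minimum, maximum):
    # Full mark inside [mean - std, mean + std]; outside, the mark decreases
    # linearly toward 0 at the pattern's minimum or maximum value.
    low, high = mean - std, mean + std
    if low <= value <= high:
        return 1.0
    if value < low:
        span = low - minimum
        return max(0.0, 1.0 - (low - value) / span) if span > 0 else 0.0
    span = maximum - high
    return max(0.0, 1.0 - (value - high) / span) if span > 0 else 0.0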

Analyzing a photo from the first layer to the last layer generates a separate result for each layer. Comparing these results with each of the extracted patterns yields a separate matching mark for each pattern. The sum of the similarity marks over the layers determines the similarity of the results to that particular pattern. By comparing the results with each of the patterns and sorting the marks in descending order, the marking system finds the most similar pattern and determines whether the photo is original or, if recognized as a stego image, with which embedding bit-rate it was produced.
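A compact way to express this comparison, reusing the illustrative layer_mark function above (the pattern structure and names are assumptions, not the paper's):

def match_patterns(layer_results, patterns):
    # `layer_results` holds one correlativity percentage per layer; `patterns`
    # maps a pattern name (photo theme or embedding rate) to per-layer statistics
    # (mean, std, minimum, maximum). Returns patterns sorted by descending similarity.
    scored = []
    for name, layer_stats in patterns.items():
        total = sum(layer_mark(result, *stats)
                    for result, stats in zip(layer_results, layer_stats))
        scored.append((name, total))
    return sorted(scored, key=lambda item: item[1], reverse=True)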

4 Conclusion

This paper elaborated the details of the proposed pixel color and color channel correlativity analysis methods. Based on the proposed method, each type of photo and its embedded copies will have two correlativity patterns, which should be extracted. The extracted patterns can later be utilized to distinguish original photos from stego objects and also to predict the embedding rate. To assess photos, a proposed pattern matching system based on the standard deviation examines the similarity of the given object with the extracted patterns, firstly to recognize stego files and secondly to predict the embedding rate if the given file is a stego object.

References

1. Zamani, M., Manaf, A.B.A., Ahmad, R.B., Jaryani, F., Chaeikar, S.S., Zeidanloo, H.R.: Genetic audio watermarking. In: Das, V.V., Vijayakumar, R., Debnath, N.C., Stephen, J., Meghanathan, N., Sankaranarayanan, S., Thankachan, P.M., Gaol, F.L., Thankachan, N. (eds.) BAIP 2010. CCIS, vol. 70, pp. 514–517. Springer, Heidelberg (2010)
2. Zamani, M., Manaf, A.B.A., Abdullah, S.M., Chaeikar, S.S.: Mazdak Technique for PSNR Estimation in Audio Steganography. In: Applied Mechanics and Materials, vol. 229-231, pp. 2798–2803. Trans Tech Publications, Switzerland (2012)
3. Zamani, M., Manaf, A.A., Ahmad, R., Zeki, A., Abdullah, S.: A Genetic-Algorithm-Based Approach for Audio Steganography. In: World Academy of Science, Engineering and Technology, June 24-26, pp. 355–358 (2009)


4. Zamani, M., Taherdoost, H., Manaf, A.A., Ahmad, R.B., Zeki, A.M.: An Artificial-Intelligence-Based Approach for Audio Steganography. Journal of Open Problems in Science and Engineering 1(1), 64–68 (2009)
5. Zamani, M., Manaf, A.A., Ahmad, R., Jaryani, F., Taherdoost, H., Chaeikar, S.S., Zeidanloo, H.R.: A Novel Approach for Genetic Audio Watermarking. Journal of Information Assurance and Security 5, 102–111 (2010)
6. Zamani, M., Manaf, A.A., Ahmad, R., Jaryani, F., Taherdoost, H., Chaeikar, S.S., Zeidanloo, H.R.: Genetic Audio Steganography. International Journal on Recent Trends in Engineering & Technology 3(2), 89–91 (2010) ISSN: 2158-5563
7. Abdullah, S.M., Manaf, A.A., Zamani, M.: Recursive Reversible Image Watermarking Using Enhancement of Difference Expansion Techniques. Journal of Information Security Research 1(2), 64–70 (2010)
8. Zamani, M., Manaf, A.A., Zeidanloo, H.R., Chaeikar, S.S.: Genetic Substitution-Based Audio Steganography for High-Capacity Applications. International Journal for Internet Technology and Secured Transactions 3(1), 97–110 (2011)
9. Zeki, A.M., Manaf, A.A., Ibrahim, A.A., Zamani, M.: A Robust Watermark Embedding in Smooth Areas. Research Journal of Information Technology 3(2), 123–131 (2011)
10. Zamani, M., Manaf, A.A., Ahmad, R.: Knots of Substitution Techniques of Audio Steganography. In: International Proceedings of Computer Science and Information Technology, vol. 2, pp. 370–374. IACSIT Press, Singapore (2011)
11. Zamani, M., Manaf, A.B.A., Abdullah, S.M.: An Overview on Audio Steganography Techniques. Journal of Digital Content Technology and its Applications. Advanced Institute of Convergence Information Technology (2012)
12. Zamani, M., Manaf, A.A.: Genetic algorithm for fragile audio watermarking. In: Telecommunication Systems, Special Issue on Innovations in Emerging Multimedia Communication Systems. Springer (in press) ISSN: 1018-4864
13. Zamani, M., Manaf, A.A., Ahmad, R.: Knots of Substitution Techniques of Audio Steganography. In: The 2009 International Conference on Telecom Technology and Applications, Manila, Philippines, June 6-8, pp. 415–419 (2009)
14. Zamani, M., Manaf, A.A., Ahmad, R.: Current Problems of Substitution Technique of Audio Steganography. In: International Conference on Artificial Intelligence and Pattern Recognition, Orlando, Florida, USA, July 13-16, pp. 154–160 (2009)
15. Zamani, M., Manaf, A.A., Ahmad, R., Zeki, A., Abdullah, S.: Genetic Algorithm as an Approach to Resolve the Problems of Substitution Techniques of Audio Steganography. In: The 2009 International Conference on Genetic and Evolutionary Methods, Las Vegas, Nevada, USA, July 13-16, pp. 170–175 (2009)
16. Zamani, M., Manaf, A.A.: Azizah's Method to Measure the Efficiency of Steganography Techniques. In: 2nd International Conference on Information and Multimedia Technology (ICIMT 2010), Hong Kong, China, December 28-30, pp. 385–389 (2010)
17. Zamani, M., Manaf, A.A., Ahmad, R., Zeki, A.: An Approach to Improve the Robustness of Substitution Techniques of Audio Steganography. In: 2nd IEEE International Conference on Computer Science and Information Technology, Beijing, China, August 8-11, vol. 2, pp. 5–9 (2009)
18. Zamani, M., Taherdoost, H., Manaf, A.A., Ahmad, R., Zeki, A.: Robust Audio Steganography via Genetic Algorithm. In: Third International Conference on Information & Communication Technologies, Karachi, Pakistan, August 15-16, pp. 149–153 (2009)
19. Zamani, M., Manaf, A.A., Ahmad, R., Zeki, A., Magalingam, P.: A Novel Approach for Audio Watermarking. In: Fifth International Conference on Information Assurance and Security, Xi'an, China, August 18-20, pp. 83–86 (2009)


20. Zamani, M., Manaf, A.A., Ahmad, R., Jaryani, F., Taherdoost, H., Zeki, A.: A Secure Audio Steganography Approach. In: The 4th International Conference for Internet Technology and Secured Transactions, November 9-12, pp. 501–506 (2009)
21. Zeki, A.M., Manaf, A.A., Zamani, M.: Bit-Plane Model: Theory and Implementation. In: Engineering Conference 2010, Kuching, Malaysia, April 14-16 (2010)
22. Abokhdair, N.O., Manaf, A.B.A., Zamani, M.: Integration of Chaotic Map and Confusion Technique for Color Medical Image Encryption. In: 6th International Conference on Digital Content, Multimedia Technology and its Applications, Seoul, Korea, August 16-18, pp. 20–23 (2010)
23. Abdullah, S.M., Manaf, A.A., Zamani, M.: Capacity and Quality Improvement in Reversible Image Watermarking Approach. In: 6th International Conference on Networked Computing and Advanced Information Management, Seoul, Korea, August 16-18, pp. 81–85 (2010)
24. Zamani, M., Manaf, A.A.: Mazdak's Method for PSNR Estimation in Audio Steganography. In: International Conference on Computer and Computational Intelligence, Nanning, China, December 25-26, pp. 574–577 (2010)
25. Zamani, M., Manaf, A.B.A., Abdullah, S.M., Chaeikar, S.S.: Correlation between PSNR and Bit per Sample Rate in Audio Steganography. In: 11th International Conference on Signal Processing, Saint Malo, Mont Saint-Michel, France, April 2-4, pp. 163–168 (2012)
26. Zamani, M., Manaf, A.B.A., Abdullah, S.M.: Correlation between PSNR and Size Ratio in Audio Steganography. In: 11th International Conference on Telecommunications and Informatics, France, April 2-4, pp. 82–87 (2012)
27. Zamani, M., Manaf, A.A., Shahidan, M.A.: Efficient Embedding for Audio Steganography. In: 2nd International Conference on Environment, Economics, Energy, Devices, Systems, Communications, Computers, Mathematics, April 2-4, pp. 195–199 (2012)
28. Zamani, M., Manaf, A.A., Daruis, R.: Azizah Technique for Efficiency Measurement in Steganography. In: 8th International Conference on Information Science and Digital Content Technology, Jeju, Korea, June 26-28, vol. 3, pp. 480–484 (2012)
29. Jabbar, A.A., Sahib, S.B., Zamani, M.: An Introduction to Image Steganography Techniques. In: International Conference on Advanced Computer Science Applications and Technologies (ACSAT 2012), November 26-28 (2012)
30. Jabbar, A.A., Sahib, S.B., Zamani, M.: Multimedia Data Hiding Evaluation Metrics. In: 7th WSEAS International Conference on Computer Engineering and Applications (CEA 2013), Milan, Italy, January 9-11 (2013)
31. Jabbar, A.A., Sahib, S.B., Zamani, M.: An Introduction to Watermarking Techniques. In: 12th WSEAS International Conference on Applications of Computer Engineering (ACE 2013), Cambridge, MA, USA, January 30-February 1 (2013)
