ISSN: 1748-0345 (Online)
www.tagajournal.com
Prediction of Surface Roughness in Turning Process by Using an Artificial Neural Network Model
B. Radha Krishnan, R. Giridharan, S. Rajaram, R. Sanjeevi, M. Ramesh
Assistant Professors, Department of Mechanical Engineering, K. Ramakrishnan College of Engineering
[email protected]
Abstract
This paper proposes a method for predicting surface roughness using a machine vision system coupled with an artificial neural network (ANN). Images of the turned workpiece are captured, and frequency-domain features are extracted by Fourier transformation: the major peak frequency (F1) and the principal component magnitude squared (F2). The experimental cutting parameters (spindle speed, depth of cut, feed rate) and the extracted image features (F1, F2, and the gray level) are fed to the ANN as inputs, with the surface roughness measured by a stylus probe as the target output. The ANN is trained on these input and output values and then tested on further inputs; the machine vision roughness values it predicts are finally compared with the stylus probe values to assess prediction accuracy.
1. Introduction
Surface roughness is a key quality inspection parameter in many industries, and selecting optimum cutting parameters (tool feed rate, spindle speed, depth of cut) minimizes its value. Artificial neural networks are a common first choice for researchers predicting surface roughness; when test results are compared with stylus probe values, a high level of accuracy is attained [1]. Computer vision offers a new approach to inspection. Machine vision has been applied not only to surface roughness but also to many other fields, such as welding inspection and shop-floor monitoring. In this approach, the workpiece image is captured by the vision system and encoded, for example as a gray level co-occurrence matrix, from which the surface roughness is predicted; compared with manual methods, this suits rapid manufacturing [2]. Manual measurement has been used for the past half-century, and because of high production rates conventional systems are still widely used in production industries. Their main disadvantage is that the probe tip is damaged by continuous contact during inspection, and manual methods cannot keep pace with recent automation developments, especially rapid manufacturing. This is why most researchers now concentrate on machine vision inspection systems to replace manual inspection and measurement [3]. Machine vision systems first appeared in this research area more than fifteen years ago, when the cost of equipment hindered wider implementation; at that time the cost of robot welding, for example, was very high. After successful research and implementation, machine vision is now used to improve production throughput in automotive and other applications [4]. Sub-pixel values extracted from the gray-scale image have also been used to predict the gray level (Ga).
In the present work, a DSLR camera was used to capture images of the specimens. The spindle speed, feed rate, depth of cut, and frequency values were fed as inputs to the ANN, and the stylus probe roughness value was used as the training output. Finally, the test outputs were compared with the machine vision roughness values to analyze the accuracy level.
© 2018 SWANSEA PRINTING TECHNOLOGY LTD
2823
TAGA JOURNAL VOL. 14
2. Methodology
a) System configuration
The machine vision system consists of a positioning mechanism, backlighting, and an 18-megapixel DSLR camera (Canon EOS 1300D) with a picture resolution of 3,872 × 2,592 pixels. The camera was fitted with a 6X close-up lens, connected to the computer through a USB cable, and mounted on an X-Y-axis motion camera mount; image quality and focus are ensured by adjusting the X-Y position. The machine tool used is an Optimum Vario D320 × 630 conventional lathe, and the workpiece material is Al 6063 of 14 mm diameter. The camera was set to its maximum shutter speed of 1/4000 s so that blurring caused by the rotation of the workpiece is prevented. The workpiece was produced by turning at a feed rate of 0.1 mm/revolution and a depth of cut of 0.5 mm. After machining, the surface roughness of the workpiece was measured by the stylus probe method. The image was then captured by the DSLR camera and imported into MATLAB for gray-scale conversion, since the camera output is in pixels while the surface roughness of the workpiece must be expressed in micrometres.
b) Cutting parameters and feature extraction
Constant cutting parameters were selected for the Al 6063 material: cutting speed 2000 rpm, feed rate 0.1 mm/revolution, and depth of cut 0.5 mm. The average surface roughness Ra is the output parameter most widely used in industry. Fig. 1 below shows the image captured by the Canon 1300D DSLR camera for surface feature extraction.
Fig. 1. Machined surface of the turning workpiece
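The gray-scale conversion step described above is performed in MATLAB in this work; as an illustrative sketch only (not the authors' code), the same step in Python with NumPy, using the standard BT.601 luma weights (similar to MATLAB's rgb2gray), looks like:

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an (H, W, 3) RGB image array to a 2-D uint8 gray-level
    array using the ITU-R BT.601 luma weights."""
    weights = np.array([0.299, 0.587, 0.114])
    # Weighted sum over the channel axis, rounded to the nearest level
    return np.rint(rgb[..., :3] @ weights).astype(np.uint8)

def mean_gray_level(gray):
    """Average gray level (Ga) of a surface image, later an ANN input."""
    return float(gray.mean())
```

A uniform mid-gray test image maps to the same gray value, which gives a quick sanity check of the weights (they sum to 1).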
Two frequency features are extracted:
1. Major peak frequency (F1)
2. Principal component magnitude squared (F2)
The major peak frequency is derived from the machined surface image of the turning workpiece in Fig. 1. The data and the gray levels of each surface image are analyzed; the image texture is characterized by F1 and F2, and the average gray level of each surface texture is then obtained by calculation. In total, 40 training data sets were fed as inputs and outputs to the artificial neural network.
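The paper does not list its extraction code. The sketch below assumes, consistent with the description above, that F1 is the dominant (non-DC) frequency bin of a 1-D gray-level profile taken across the feed marks, and F2 is the squared magnitude of that component:

```python
import numpy as np

def fft_features(profile):
    """Extract (F1, F2) from a 1-D gray-level profile:
    F1 = index of the major peak in the magnitude spectrum (DC excluded),
    F2 = squared magnitude of that principal component."""
    spectrum = np.fft.rfft(profile - np.mean(profile))  # remove DC offset
    power = np.abs(spectrum) ** 2
    f1 = int(np.argmax(power[1:]) + 1)  # skip bin 0
    return f1, float(power[f1])
```

In the experiments these two values, together with the gray level Ga, form the image-derived inputs of the ANN.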
Table 1. Training database

S.NO   Speed (rpm)   Depth of Cut (mm)   Feed Rate (mm/rev)   Gray Level (Ga)   F1       F2      Stylus Probe Ra
1      2000          0.5                 0.1                  125.236           121.56   32.65   0.869
2      2000          0.5                 0.1                  134.564           65.89    38.45   0.759
3      2000          0.5                 0.1                  164.466           64.89    39.46   1.236
4      2000          0.5                 0.1                  94.464            102.65   40.56   0.987
5      2000          0.5                 0.1                  123.445           73.58    41.16   0.852
6      2000          0.5                 0.1                  100.564           95.66    38.45   0.875
7      2000          0.5                 0.1                  105.464           90.65    37.49   0.863
8      2000          0.5                 0.1                  105.464           85.45    33.29   1.895
9      2000          0.5                 0.1                  104.44            100.85   35.08   1.987
10     2000          0.5                 0.1                  115.564           118.65   36.89   0.876
11     2000          0.5                 0.1                  112.754           95.87    41.89   0.985
12     2000          0.5                 0.1                  135.454           62.32    40.89   0.963
13     2000          0.5                 0.1                  118.645           74.23    40.89   0.987
14     2000          0.5                 0.1                  114.454           88.55    36.48   0.856
15     2000          0.5                 0.1                  115.845           75.44    34.89   0.845
16     2000          0.5                 0.1                  118.454           95.11    32.78   0.962
17     2000          0.5                 0.1                  117.842           75.49    36.48   1.358
18     2000          0.5                 0.1                  95.275            65.11    35.98   1.025
19     2000          0.5                 0.1                  98.844            88.44    40.48   1.269
20     2000          0.5                 0.1                  126.445           86.56    33.25   0.987
21     2000          0.5                 0.1                  99.425            64.89    34.89   0.852
22     2000          0.5                 0.1                  109.454           97.55    39.75   0.965
23     2000          0.5                 0.1                  104.421           67.49    37.29   1.265
24     2000          0.5                 0.1                  105.874           109.4    32.56   0.896
25     2000          0.5                 0.1                  125.942           100.48   35.69   0.965
26     2000          0.5                 0.1                  118.845           98.45    36.59   0.875
27     2000          0.5                 0.1                  108.854           83.15    42.56   0.856
28     2000          0.5                 0.1                  101.154           94.22    34.98   0.836
29     2000          0.5                 0.1                  100.541           97.46    37.59   1.165
30     2000          0.5                 0.1                  95.788            94.56    31.59   1.036
31     2000          0.5                 0.1                  121.45            90.65    38.69   1.115
32     2000          0.5                 0.1                  120.545           89.46    42.15   1.111
33     2000          0.5                 0.1                  114.856           80.46    35.25   0.987
34     2000          0.5                 0.1                  102.864           84.46    36.33   0.874
35     2000          0.5                 0.1                  107.858           73.6     37.98   0.875
36     2000          0.5                 0.1                  100.965           79.49    33.65   0.965
37     2000          0.5                 0.1                  98.121            74.46    37.45   0.987
38     2000          0.5                 0.1                  100.884           69.4     35.69   1.125
39     2000          0.5                 0.1                  135.484           67.49    41.22   0.965
40     2000          0.5                 0.1                  128.875           66.48    35.99   1.104
In the experiments, a single specimen was machined at constant cutting parameters. Table 1 above shows the data used to train the neural network, and the architecture of the artificial neural network is shown in Fig. 2. A back-propagation method is used to train the network for surface roughness prediction, which fixes the number of inputs, hidden layers, and neurons. The network model can be modified according to the requirements and desired accuracy level; the structure of the ANN used here is 6-10-1 (six inputs, ten hidden neurons, one output).
Fig. 2. Structure of the ANN
During execution of the training procedure, the validation performance is analyzed using the performance graph; the artificial neural network performance graph is shown in Fig. 3. Real-time surface roughness estimation can then be performed by the trained network.
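The 6-10-1 network is trained in MATLAB in this work. As an illustrative sketch only (not the authors' code, and using random stand-in data rather than Table 1), a minimal NumPy implementation of the same back-propagation scheme is:

```python
import numpy as np

rng = np.random.default_rng(0)

class MLP:
    """6-10-1 feed-forward network trained by gradient-descent
    back-propagation on the mean squared error."""

    def __init__(self, n_in=6, n_hidden=10, lr=0.05):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)
        self.lr = lr

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)  # hidden layer
        return self.h @ self.W2 + self.b2        # linear output (Ra)

    def train_step(self, X, y):
        """One batch gradient step; returns the pre-update MSE."""
        y_hat = self.forward(X)
        err = y_hat - y.reshape(-1, 1)
        dW2 = self.h.T @ err / len(X)
        db2 = err.mean(axis=0)
        dh = (err @ self.W2.T) * (1.0 - self.h ** 2)  # tanh derivative
        dW1 = X.T @ dh / len(X)
        db1 = dh.mean(axis=0)
        self.W2 -= self.lr * dW2; self.b2 -= self.lr * db2
        self.W1 -= self.lr * dW1; self.b1 -= self.lr * db1
        return float((err ** 2).mean())
```

In practice the inputs (speed, depth of cut, feed rate, Ga, F1, F2) should be normalized before training, since their numeric ranges differ by orders of magnitude.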
Fig. 3. Performance of ANN (6-10-1)
3. Experimental verification
The trained neural network was tested to identify the surface roughness of a single turned specimen machined at constant cutting parameters. The ANN inputs are the spindle speed, tool feed rate, depth of cut, major peak frequency F1, F2, and gray level (Ga); the output roughness value for each set of test inputs is read from the MATLAB workspace. Table 2 below compares the manual (stylus probe) and machine vision surface roughness results.

Table 2. Experimental testing results

S.NO   Speed (rpm)   Depth of Cut (mm)   Feed Rate (mm/rev)   Gray Level (Ga)   F1       F2      Stylus Probe Ra   Machine Vision Ra
1      2000          0.5                 0.1                  114.854           80.46    34.88   1.103             1.084
2      2000          0.5                 0.1                  110.454           97.49    31.99   0.978             1.104
3      2000          0.5                 0.1                  104.754           108.49   33.45   1.045             1.104
4      2000          0.5                 0.1                  98.456            88.49    38.46   1.236             1.051
5      2000          0.5                 0.1                  97.205            90.49    41.33   0.963             1.022
6      2000          0.5                 0.1                  100.201           98.95    40.88   0.875             1.064
7      2000          0.5                 0.1                  104.568           90.46    42.69   0.852             1.019
8      2000          0.5                 0.1                  107.894           97.46    32.59   0.963             1.114
9      2000          0.5                 0.1                  132.845           99.56    33.45   1.364             1.022
10     2000          0.5                 0.1                  128.892           106.49   36.38   1.045             1.033
The percentage of error is calculated as:
% Error = ((Predicted value − Experimental value) / Experimental value) × 100
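As a worked check of this formula (values taken from the first row of Table 2):

```python
def pct_error(predicted, experimental):
    """Percentage error of a predicted Ra against the stylus (experimental) Ra."""
    return (predicted - experimental) / experimental * 100.0

# First test row of Table 2: machine vision 1.084 vs. stylus 1.103
# gives an error of about -1.72 %.
```

The reported average error is taken over the absolute percentage errors of all ten test sets.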
Fig. 4. Comparison between stylus probe and machine vision roughness values
The machine vision surface roughness values were evaluated against the manual stylus probe values for the test sets. The comparison between the manual and machine vision results is shown in Fig. 4; from this comparison, the percentage of error and the accuracy of the vision system are obtained. The roughness values predicted through machine vision were validated against the ten sets of experimental surface roughness data from the turning tests. The results show that the average error of the ANN surface roughness prediction in turning is 2.46%, i.e., the accuracy is 97.54%.
4. Conclusion
The proposed method is not limited to surface roughness prediction; it can also be applied to welding and damage inspection. Increasing the number of extracted features and testing samples improves the accuracy of the machine vision system. In this paper, the machine vision system achieved an average error of 2.46%, i.e., 97.54% accuracy. Apart from machining, machine vision can also be utilized in assembly, medical care, and automotive applications.

References
1. Ranganath MS, Vipin, Mishra RS (2013) "Application of ANN for prediction of surface roughness in turning process: a review". International Journal of Advance Research and Innovation 1(3):229-233.
2. Gadelmawla ES (2004) "A vision system for surface roughness characterization using the gray level co-occurrence matrix". NDT&E International 37:577-588.
3. Kiran MB, Ramamoorthy B, Radhakrishnan B (1998) "Evaluation of surface roughness by vision system". Int J Mach 38(5-6):685-690.
4. Wilson M (1999) "Vision systems in the automotive industry". Industrial Robot: An International Journal 26(5):354-357.
5. Kumar BM, Ratnam MM (2015) "Machine vision method for non-contact measurement of surface roughness of a rotating workpiece". Sensor Review 35(1):10-19.
6. Damodarasamy S, Raman S (1991) "Texture analysis using computer vision". Comput Ind 16:25-34.
7. Gupta M, Raman S (2001) "Machine vision assisted characterization of machined surfaces". Int J Prod Res 39(4):759-784.
8. Vorburger TV, Rhee H-G, Renegar TB, Song J-F, Zheng A (2007) "Comparison of optical and stylus methods for measurement of surface texture". Int J Adv Manuf Technol 33:110-118.
9. Younis MA (1998) "On line surface roughness measurements using image processing towards an adaptive control". Comput Ind Eng 35(1-2):49-52.
10. Galante G, Piacentini M, Ruisi VF (1991) "Surface roughness detection by tool image processing". Wear 148:211-220.
11. Choudhury IA, El-Baradie MA (1997) "Surface roughness in the turning of high-strength steel by factorial design of experiments". J Mater Process Technol 67:55-61.
12. Dimla E, Dimla S (1999) "Application of perceptron neural network to tool-state classification in a metal-turning operation". Eng Appl Artif Intell 12:471-477.
13. Al-Kindi GA, Baul RM, Gill KF (1992) "An application of machine vision in the automated inspection of engineering surfaces". Int J Prod Res 30(2):241-253.
14. Luk F, Huynh V (1987) "A vision system for in-process surface quality assessment". In: Proceedings of the Vision '87 SME Conference, Detroit, Michigan, vol 12, pp 43-58.
15. Venkata Ramana K, Ramamoorthy B (1996) "Statistical methods to compare the texture features of machined surfaces". Pattern Recognit 29(9):1447-1459.