sensors Article

Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing

Lingling Wu 1, Guojun Wen 1,*, Yudan Wang 1, Lei Huang 2 and Jiang Zhou 1

1 School of Mechanical & Electronic Information, China University of Geosciences, Wuhan 430074, China; [email protected] (L.W.); [email protected] (Y.W.); [email protected] (J.Z.)
2 Shandong Institute of Space Electronic Technology, Yantai 264670, China; [email protected]
* Correspondence: [email protected]; Tel.: +86-027-6788-3086

Received: 18 December 2017; Accepted: 8 February 2018; Published: 15 February 2018

Abstract: Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economical, enhanced automated optical guidance system, based on optimization research on a light-emitting diode (LED) light target and five automated image-processing bore-path deviation algorithms. The LED target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. An image preprocessing algorithm, feature extraction algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computation and judgment. After multiple indoor experiments, this guidance system was applied in a hot-water pipeline installation project, with accuracy controlled within 2 mm over a 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system.

Keywords: trenchless; image processing; guidance system; auto-focus

1. Introduction

Horizontal auger boring (HAB) is a common trenchless construction method for installing gravity or pressure pipes, including sewer and hot-water pipelines. It has the advantages of low disruption to the environment, a small footprint, a small earthwork quantity, and little external interference [1–4]. Before drilling, two pits, a “jacking pit” and a “reception pit”, are excavated. A drilling rig is installed inside the jacking pit. During drilling, the rig pushes the pipelines towards the reception pit in the designated direction. The spoil is removed with helical screw conveyors [5]. Owing to differing soil conditions and the influence of gravity, the drilling track often skews. Therefore, a guidance system is required to monitor the line and grade in real time, so that corresponding control measures can be taken.

Common guidance for monitoring line and grade is divided into two categories: the electromagnetic method and the optical method. The electromagnetic method comprises a receiver, a display, and a transmitter. The transmitter is installed underground at the end of the drill bit, whereas the receiver is held by an operator on the ground. The system requires one operator to scan the ground, decreasing efficiency [6]. With this method, the monitoring error increases with the depth of the transmitter, and the guidance precision barely meets the requirements of gravity and pressure pipelines. Additionally, the electromagnetic signal is highly vulnerable to underground pipes and the geomagnetic field [7]. The optical methods have the advantages of high precision and immunity to the geomagnetic field. They can be further divided into two types, based on the optical target: laser and LED guidance systems. Normally, a laser guidance system is employed when the project is large-diameter and long-distance.

Sensors 2018, 18, 595; doi:10.3390/s18020595

www.mdpi.com/journal/sensors

The LED guidance system employs a target made of LEDs and is used in small-diameter, short-distance projects. However, its guidance accuracy greatly depends on operator control, because one operator is required to observe the image using a charge-coupled device (CCD) camera and to manually estimate the deflection, resulting in an elusive operating standard [8].

This article presents an enhanced automated deflection detection method based on image processing to achieve high guidance-system accuracy for HAB. Emphasis is placed on efforts to enhance image quality, including optimization of the LED target, image preprocessing, and feature extraction. Further pixel calculations are applied to the preprocessed images. An auto-focus algorithm maintains the clarity of the images. Additionally, concrete values of the drill-bit deflection and rotation angle are calculated and displayed on a personal computer, freeing operators from manual estimation during drilling. The proposed guidance system can overcome many limitations of the current manual evaluation and can improve the level of automation.

2. Components of the Enhanced Automated Guidance System

The enhanced automated guidance system consists of an LED target, a personal computer, and a theodolite, as shown in Figure 1. The LED target acts as the signal, following the drill bit. A CCD camera, mounted on the theodolite, shoots images of the LED target. The personal computer is employed to preprocess images, extract features, calculate angles and deflection, and display results.

Figure 1. Main components of the proposed guidance system.

The components are set up as shown in Figure 2. The LED target (Figure 2, part 4), installed at the end of a drill bit (Figure 2, part 5), advances with the drill bit. The theodolite (Figure 2, part 2) is set in the jacking pit, at the same grade as the center of the drill rod (Figure 2, part 3). It continuously monitors the LED target through the vacant drill string during drilling. At set intervals, images of the LED target are shot and immediately transmitted to the personal computer (Figure 2, part 1). Subsequently, these images are processed to obtain the deflection and rotation angle, which are also displayed on the monitor.

Figure 2. Relative position of components in the guidance system: 1, monitor; 2, charge-coupled device (CCD) camera; 3, drill rod; 4, light-emitting diode (LED) target; and 5, drill bit.
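The interval-based monitoring workflow described above can be sketched as a short processing loop. This is an illustrative sketch only: the helper names and the toy "frame" of LED pixel coordinates are invented for this example, not taken from the authors' software.

```python
# One monitoring cycle of the guidance workflow: shoot a frame, preprocess it,
# extract the LED features, and compute the deflection for display.
# Every helper below is an invented stand-in, not the authors' software.

def preprocess(frame):
    # Stand-in for Section 4.1: grayscale conversion, enhancement, segmentation.
    return frame

def extract_features(frame):
    # Stand-in for feature extraction: returns detected LED pixel coordinates.
    return frame

def calculate_deflection(leds, rod_center):
    # The target center O is taken as the centroid of the detected LEDs; the
    # deflection is its offset from the drill-rod center, in pixels.
    cx = sum(x for x, _ in leds) / len(leds)
    cy = sum(y for _, y in leds) / len(leds)
    return (cx - rod_center[0], cy - rod_center[1])

def monitoring_cycle(frame, rod_center):
    leds = extract_features(preprocess(frame))
    return calculate_deflection(leds, rod_center)

# Four detected LED pixels centred at (11, 10); drill-rod centre at (10, 10):
offset = monitoring_cycle([(10, 9), (12, 9), (10, 11), (12, 11)], (10, 10))
print(offset)  # (1.0, 0.0): the target sits one pixel off the rod centre
```

In the real system this loop would repeat at the set drilling intervals, with the frame coming from the CCD camera and the result shown on the monitor.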

3. Design and Optimization of the LED Target

The LED target contains a filter plate, a fixing plate, a printed circuit board (PCB), and LEDs, as shown in Figure 3. The filter plate is employed to filter the LED light. The PCB is used to simplify the electric circuit. All LEDs are laid out in two circles and one long line. The two circles show the center of the target, whereas the long line indicates direction. The target is customized for a drill rod with an inside diameter of 60 mm; thus, the diameter is designed as 60 mm.

Figure 3. Constitution of the designed LED target: 1, filter plate; 2, fixing plate; 3, printed circuit board (PCB); and 4, LEDs.

3.1. Optimization of the LED Target

LED target image clarity is very important in calculating the deflection and drill-bit angle. The LED color, filter plate, light intensity, and distance between LEDs may all affect image quality. Therefore, experiments are conducted to optimize the LED target.

3.1.1. LED Color Selection

The target is used in a dark environment; thus, a proper light color will benefit image quality. Short-wavelength light is more easily scattered. Red light, common among LED colors, has the longest wavelength. Therefore, we use red LEDs to build the target. To confirm that red LEDs work best, experiments are conducted with differently colored targets (i.e., red, pink, yellow, blue, green, and white). Testing the images at 100 m, the red LEDs are the clearest, as expected, for distinguishing the center and direction of the target.

3.1.2. Filter Plate Color Selection

To improve the clarity of the image, we install a filter plate in front of the LED target to improve light purity. Four filter plates (i.e., red, yellow, green, and blue) are fabricated and installed in front of the target. Checking the images at 100 m, the blue filter, combined with the red LED target, gives the best image quality.

3.1.3. Resistance Selection

Proper light intensity is important to image quality. LED light intensity can be changed with the circuit resistance. Therefore, a slide rheostat is connected in series with the LED target. We change the slide rheostat every 20 Ω and observe the image. When the resistance lies between 140 Ω and 180 Ω, the images are relatively clear. Therefore, we employ a resistance of 160 Ω in the electrical circuit.

3.1.4. Distance between LEDs Selection

The distance between LEDs affects image quality. If the distance is large, it is easy to distinguish every LED in the image, but the size of the target increases. Otherwise, the outlines of adjacent LEDs are too close to be distinguished. Therefore, we find the optimum distance between LEDs to clarify the outline of every diode. Figure 4 displays the LED target used in the distance selection experiment. The LEDs on the target are densely installed. We successively turn off LEDs in different circles and at different intervals to contrast the outlines. As before, we evaluate the images at 100 m. When the distance between LEDs is 11 mm, the outline of every LED is the clearest.


Figure 4. LED target for the distance selection experiment.
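The resistance choice in Section 3.1.3 can be sanity-checked with Ohm's law. A minimal sketch, assuming a typical red-LED forward drop of about 2.0 V (a value not stated in the paper) and treating the target as a single LED branch powered by the two 1.5-V dry batteries:

```python
# Rough sanity check of the 160-ohm series resistance via Ohm's law.
# The 2.0-V red-LED forward drop is an assumed typical value, and the target
# is simplified to a single LED branch; neither detail is from the paper.
V_SUPPLY = 3.0    # two 1.5-V dry batteries in series
V_FORWARD = 2.0   # assumed forward voltage of a red LED
R_SERIES = 160.0  # ohms, the adopted slide-rheostat setting

current_mA = (V_SUPPLY - V_FORWARD) / R_SERIES * 1000.0
print(f"branch current ~= {current_mA:.2f} mA")  # ~= 6.25 mA
```

A few milliamps is a plausible drive level for an indicator-class LED, consistent with the observation that 140–180 Ω gives clear images.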

Figure 4. LED target for distance experiment. Based the experiments above, the parameters for theselection LED target are obtained. The target requires a 65-mm diameter, red LED, blue filter plate, 11-mm distance, 160-Ω and target two 1.5-V dry Based the experiments above, the parameters for the LED target areresistance, obtained. The requires Based the experiments above, the parameters for the LED target are obtained. The target requires batteries. a 65-mm diameter, red LED, blue filter plate, 11-mm distance, 160-Ω resistance, and two 1.5-V dry a 60-mm diameter, red LED, blue filter plate, 11-mm distance, 160-Ω resistance, and two 1.5-V batteries. dry Verification batteries. of LED Target 3.2. 3.2. Verification of LED Target To verify the effect of the optimized LED target, an indoor experiment is conducted to shoot 3.2. Verification of LED Target images. In this test, our optimized LED target is installed the endexperiment of each drillisrod, whose length is To verify the effect of the optimized LED target, anatindoor conducted to shoot To verify the effect of the optimized LED target, an indoor experiment is conducted to shoot about 9 In m.this Figure shows the images the is target shot at bythe a CCD Therod, first imagelength is shot images. test,5our optimized LED of target installed end ofcamera. each drill whose is images. this test,is5our optimized is installed at endimage ofcamera. eachisdrill whose is when installed m LED fromtarget string. shot when the length target is about the 9 In m.camera Figure shows the1 images ofthe thedrill target shotThe bythe anext CCD Therod, first image is shot about 9 m. Figure 5 shows the images of the target shot by a CCD camera. Figure 5a is shot when the installed the endisofinstalled the first 1drill rod, and forth. By checking images in Figure wetarget can see when theatcamera m from the so drill string. The nextthe image is shot when6,the is camera installed mthe from the drillrod, string. 
5b By iscircle, shot when theimages target isininstalled the47.7 endsee of that the isoutline of 1each LED, even those in Figure the inner is clearly distinguished within m. installed at the end of first drill and so forth. checking the Figure 6,atwe can the rod, so LED, forth. By able checking Figure 6,the weouter can see that outline of each Otherwise, the inner LEDs are not to be discerned. However, ones arethe stillwithin clear enough thatfirst the drill outline ofand each even those inthe theimages inner in circle, is clearly distinguished 47.7 m. LED, even those in the inner circle, is clearly distinguished within 47.7 m. Beyond that, the inner LEDs to be discriminated. are still useful for pointing theones direction, even atenough 98.8 m. Otherwise, the innerAdditionally, LEDs are notLED able images to be discerned. However, the outer are still clear are hardly not able LED to be target, discerned. However, the outer ones are still clearthe enough to beeven discriminated. Our designed at LED long distance, following characteristics: simple fabrication, to be discriminated. Additionally, images arehas stillthe useful for pointing direction, at 98.8 m. Additionally, LED images areand still useful for pointing thefollowing direction,characteristics: even at 98.8 m.simple fabrication, ease of installation, low cost, image quality. Our designed LED target, at good long distance, has the ease of installation, low cost, and good image quality.

(a) 1.0 m; (b) 10.0 m; (c) 19.3 m; (d) 28.6 m; (e) 37.9 m; (f) 47.7 m; (g) 56.7 m; (h) 65.7 m; (i) 75.3 m


(j) 84.8 m; (k) 91.8 m; (l) 98.8 m

Figure 5. Images of LED target captured by CCD camera at different distances.

4. Software System Design

For HAB, if the drill bit advances along the predesigned track, the center of the LED target (i.e., point O) coincides with the center of the drill rod (i.e., point O'), because these images are shot through a vacant drill string, as shown in Figure 6a. Otherwise, the drill rod and the LED target are not concentric, as illustrated in Figure 6b. Therefore, monitoring the target state is important for obtaining the drilling situation.


Figure 6. Schematic diagram of target state: (a) target coinciding with drill rod; (b) target deviating from drill rod (drill bit deviating from planned track). The blue reference lines show the center of the drill rod. The red lines show the layout of LEDs.

Once deviation occurs, an operator must immediately stop boring and steer the drill bit through a certain angle (i.e., the rotation angle) to make the line OCBA run through point O'. Then, the forward direction can be changed by thrusting the drill bit, because it has a slanted face that enables it to move in any desired direction at any time [9]. Thrusting should be stopped only when point O meets point O', meaning the drill bit has returned to the predefined track. Thus, the rotation angle and deflection are significant for correction. In the image, they appear as the angle between line OO' and line OCBA, and the length of line OO'.

Image processing technology is the core of the software design, which can extract the characteristic information of the target. Image preprocessing is employed to obtain suitable images for feature extraction and pixel calculation (i.e., angle measurement and deflection detection). Additionally, auto-focus is achieved by a mechatronic design based on the processed images. The software system design block diagram is given in Figure 7. All image processing is conducted in MATLAB (R2017b, MathWorks®, Natick, MA, USA).
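The two correction quantities defined above (the deflection as the length of line OO', and the rotation angle as the angle between OO' and the direction line OCBA) reduce to a small pixel-space calculation. A minimal sketch with invented coordinates, not the authors' MATLAB code:

```python
import math

def deflection_and_rotation(o, o_prime, line_vec):
    """Deflection = length of line OO' (pixels); rotation angle = angle
    between line OO' and the LED direction line OCBA (degrees).
    o and o_prime are pixel coordinates; line_vec is the direction of OCBA."""
    dx, dy = o_prime[0] - o[0], o_prime[1] - o[1]
    deflection = math.hypot(dx, dy)
    rotation = math.degrees(
        math.atan2(dy, dx) - math.atan2(line_vec[1], line_vec[0])
    )
    return deflection, (rotation + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)

# O = target centre, O' = drill-rod centre, LED line along +x in the image:
d, r = deflection_and_rotation((100, 100), (103, 104), (1, 0))
print(f"deflection = {d:.1f} px, rotation angle = {r:.1f} deg")
# deflection = 5.0 px, rotation angle = 53.1 deg
```

Converting the pixel deflection into millimetres would additionally require the image scale at the target distance, which the camera calibration provides.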

Image Preprocessing → Feature Extraction → Angle Measurement → Deflection Detection → Auto-focus

Figure 7. Software system design block diagram of image processing.

4.1. Image Preprocessing Algorithm

Image preprocessing is important for promoting the efficiency of feature extraction, pixel calculation (i.e., angle measurement and deflection detection), and auto-focus. The flowchart of image preprocessing is shown in Figure 8. The corresponding target images of each preprocessing procedure are shown in Figure 9.

Image Acquisition → Image Enhancement → Image Sharpening → Image Segmentation → Connected-Component Analysis → Image Display

Figure 8. Flowchart of image preprocessing.


Figure 9. Target images in each processing procedure: (a) original image; (b) gray image; (c) image after histogram equalization; (d) smoothed image; (e) sharpened image; (f) binary image; (g) final image.

The color image (Figure 9a) captured by the camera, termed the original image, must first be converted to a grayscale image in order to reduce storage space and promote operational efficiency. The conversion from RGB to gray can be achieved using Equation (1).

converted to a grayscale order to reduceusing storage space and The conversion from RGBimage to grayincan (1). promote operation efficiency. (1) Ybeachieved 0.3R  0.59G  Equation 0.11B The conversion from RGB to gray can be achieved using Equation (1).  0.3 R  0.59ofGthe  0.11 B respectively. Figure 9b shows (1) where R, G, B are the red, green andY blue components RGB image the gray image after conversion. Y = 0.3R + 0.59G + 0.11B (1) where R, G, B are the red, green and blue components of the RGB image respectively. Figure 9b shows Image enhancement can highlight the region-of-interest (ROI) contrast against the background. the gray image afterorconversion. It also eliminates to make images more suitable special applications [10–shows where R, G,weakens B are the red, green interference and blue components of the RGB imagefor respectively. Figure 9b Image enhancement can highlight the region-of-interest (ROI) contrast against the background. 12]. To enhance target images, techniques, including gray histogram, image smoothing, and the gray image afterLED conversion. It also weakens or eliminates interference to make images more suitable for special applications [10– image sharpening, are implemented on images. Image enhancement can highlight the region-of-interest (ROI) contrast against the background. The histogram of a digital image can reflectincluding an overview of histogram, this image. Itimage is a practical and and 12]. To enhance LED target images, techniques, gray smoothing, It also weakens or eliminates interference to make images more suitable for special applications [10–12]. effective way to are enhance the image on using histogram equalization, which expands the selected range image sharpening, implemented images. To enhance LEDThe target images, techniques, including gray histogram, image smoothing, and image of intensity. 
target image, processed by reflect histogram shown in Figure The histogram of a digital image can an equalization, overview ofis this image. It is9c.a practical and sharpening, arehistogram implemented on images. After imagehistogram clarity improves. Concurrently, image noise increases.range effective way to enhance equalization, the image using equalization, which expands the selected Image smoothing [13] can be used to compress image noise. In this paper, we choose the smoothing of intensity. The target image, processed by histogram equalization, is shown in Figure 9c. linear filters (i.e., averaging filters) for their real-time performance and reliable image quality, After histogram equalization, image clarity improves. Concurrently, image noise increases. compared to several other linear and non-linear filters. Averaging filters can be noted by the Image smoothing expression (2). [13] can be used to compress image noise. In this paper, we choose the smoothing linear filters (i.e., averaging filters) for their real-time performance and reliable image quality,


Sensors 2018, 18, 595


The histogram of a digital image reflects an overview of the image. A practical and effective way to enhance the image is histogram equalization, which expands the selected range of intensity. The target image processed by histogram equalization is shown in Figure 9c.


After histogram equalization, image clarity improves; concurrently, image noise increases. Image smoothing [13] can be used to suppress image noise. In this paper, we choose smoothing linear filters (i.e., averaging filters) for their real-time performance and reliable image quality compared to several other linear and non-linear filters. Averaging filters can be noted by the expression (2).

g(x, y) = (1/M) ∑_(m,n)∈S f(m, n)   (2)

where M is the total number of coordinate points within the collection, f(m, n) is the input image, g(x, y) is the processed image, and S is a collection of coordinate points in the (x, y) domain, excluding the point (x, y) itself. The post-smoothing image is shown in Figure 9d.

Smoothing makes the image blurred again. Thus, image sharpening [14] is required to enhance edges and grayscale transitions. A Gaussian filter is a high-pass filter widely used to sharpen images. The transfer function of the nth Gaussian high-pass filter with cut-off frequency D0 is formulated as Equation (3).

H(u, v) = e^(−[D0/D(u, v)]^n)   (3)

where D0 is the cut-off frequency, D(u, v) = [u² + v²]^(1/2), and n controls the increase rate of H(u, v). Figure 9e shows the image after sharpening.

Image segmentation is used to classify image features to highlight and extract ROIs [15]. Image thresholding enjoys a central position among image segmentation applications because of its intuitive properties and implementation simplicity. A thresholded image, g(x, y), is defined as the expression (4).

g(x, y) = 0, f(x, y) < T
g(x, y) = 1, f(x, y) ≥ T   (4)

where f(x, y) is the input image and T is the threshold. Pixels labeled 1 correspond to objects, whereas pixels labeled 0 correspond to the background. A conventional gray-level format with an 8-bit depth is employed here; the brightness range is 0–255 [16]. A proper threshold is crucial to the result of threshold segmentation. The segmentation point is obtained with an analysis of the intensity histogram. Figure 10 shows the intensity histogram of a sharpened target image. It contains three dominant modes, amongst which the brightest domain represents the illuminating diodes in the target. There is an obvious valley of intensity, which can be chosen as a threshold to segment the image. In this paper, T = 235, and the resulting binary image is shown in Figure 9f.

Figure 10. Intensity histogram of a sharpened image (Figure 6e).

After the binary image is obtained, connected-component analysis is applied to further separate objects using the binary large object (blob) tool in MATLAB (R2017b, MathWorks®, Natick, MA, USA) [17]. The final preprocessed image is shown in Figure 9g.
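The preprocessing chain above (grayscale conversion per Equation (1), averaging-filter smoothing per Equation (2), and thresholding per Equation (4)) can be sketched compactly. The paper's implementation uses MATLAB; the NumPy version below is only an illustrative sketch, and the toy image is invented for demonstration.

```python
import numpy as np

def preprocess(rgb, T=235):
    """Sketch of the preprocessing chain: Eq. (1) grayscale,
    Eq. (2) 3x3 averaging filter, Eq. (4) thresholding."""
    # Eq. (1): weighted sum of the R, G, B channels
    gray = 0.3 * rgb[..., 0] + 0.59 * rgb[..., 1] + 0.11 * rgb[..., 2]

    # Eq. (2): mean of the 8 neighbours of each pixel
    # (the centre pixel itself is excluded, as in the paper)
    p = np.pad(gray, 1, mode="edge")
    neigh = sum(np.roll(np.roll(p, dy, 0), dx, 1)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0))
    smooth = (neigh / 8.0)[1:-1, 1:-1]

    # Eq. (4): global threshold chosen from the intensity histogram
    binary = (smooth >= T).astype(np.uint8)
    return gray, smooth, binary

# Toy image: a bright 3x3 'LED' blob on a dark background
img = np.zeros((9, 9, 3))
img[3:6, 3:6] = 255
gray, smooth, binary = preprocess(img)
print(int(binary.sum()))  # 1: only the blob centre keeps all 8 bright neighbours
```

Note that smoothing shrinks the bright region before thresholding, which is why the paper applies sharpening between these two steps on real images.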


4.2. Feature Extraction Algorithm

The points O and A in Figure 6 are features of the target image, whose extraction algorithm is based on calculations over the pixels of the preprocessed image. Algorithms are programmed to sum up the pixels of each connected component and to calculate their mean value as the center point (i.e., point O) of the target image. To locate the direction, the outermost component can be recognized as a direction point, whose center is also calculated by averaging. Then a line joins these two points to precisely show the direction. The final processed result is shown in Figure 11.

Figure 11. Target image with direction location. (The upper red dot marks the center of the target image, while the lower one marks the center of the direction point. The blue line joining these two points shows the direction of the drill bit.)
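The paper performs this step with MATLAB's blob tool; as an illustration only, the center/direction extraction can be sketched in Python with a BFS connected-component pass. The "farthest centroid" rule used here for picking the direction point is an assumption about what "outermost component" means, and the toy image is invented.

```python
from collections import deque
import numpy as np

def blob_centroids(binary):
    """Label 4-connected components with BFS and return one
    (row, col) centroid per blob -- a stand-in for the MATLAB
    blob analysis used in the paper."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    centroids = []
    for r in range(h):
        for c in range(w):
            if binary[r, c] and not seen[r, c]:
                q, pix = deque([(r, c)]), []
                seen[r, c] = True
                while q:
                    y, x = q.popleft()
                    pix.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                centroids.append(np.mean(pix, axis=0))
    return np.array(centroids)

def target_center_and_direction(binary):
    """Point O = mean of the blob centroids; the direction point is
    taken as the centroid farthest from O (the outermost component)."""
    cents = blob_centroids(binary)
    O = cents.mean(axis=0)
    direction = cents[np.argmax(np.linalg.norm(cents - O, axis=1))]
    return O, direction

# Two single-pixel 'LEDs' plus one outlying direction LED
img = np.zeros((10, 10), dtype=np.uint8)
img[4, 4] = img[4, 6] = img[0, 5] = 1
O, d = target_center_and_direction(img)
print(O, d)  # centre near (2.67, 5.0); direction point at (0.0, 5.0)
```

The line through O and the direction point is then drawn onto the displayed image, as in Figure 11.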

4.3. Angle Measurement Algorithm

To avoid visual evaluation of the direction and to give operators precise guidance, it is necessary to calculate the exact values of the face angle θ and the rotation angle β, as shown in Figure 12.

Figure 12. Schematic diagram of the target for calculating deflection and angle. (The blue circles show the layout of the LEDs. The red reference lines show the center of the drill rod. The line AO shows the direction of the drill bit, while the line OO′ shows its deviation.)

The face angle θ can be described as two kinds of status in the expression (5).

θ > 0, y1 − y0 > 0
θ ≤ 0, y1 − y0 ≤ 0   (5)

It is calculated with the expression (6).

θ = arctan[(y1 − y0)/(x1 − x0)]   (6)

From expressions (5) and (6), we obtain the face angle, θ, as the expression (7).

θ = arctan[(y1 − y0)/(x1 − x0)], y1 − y0 > 0
θ = −arctan[(y1 − y0)/(x1 − x0)], y1 − y0 ≤ 0   (7)
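A literal transcription of expressions (5)–(7) into Python follows (the paper's code is in MATLAB; the sample coordinates are invented for illustration):

```python
import math

def face_angle(x0, y0, x1, y1):
    """Face angle per expressions (5)-(7): magnitude from the arctan
    of the slope of line OA, sign branch taken from y1 - y0."""
    theta = math.atan((y1 - y0) / (x1 - x0))
    return theta if y1 - y0 > 0 else -theta

# Direction point 30 px right of and 30 px above the centre:
# slope 1, so the face angle is 45 degrees
print(round(math.degrees(face_angle(0, 0, 30, 30)), 1))  # 45.0
```

Note the sketch assumes x1 ≠ x0, as the expressions do; a production implementation would use `math.atan2` to handle a vertical line OA.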

According to the coordinates of point O, (x0, y0), point A, (x1, y1), and point O′, (x′, y′), the rotation angle, β, is calculated with the expression (8).

β = arccos{[(x′ − x0)·(x1 − x0) + (y′ − y0)·(y1 − y0)] / [√((x′ − x0)² + (y′ − y0)²) · √((x1 − x0)² + (y1 − y0)²)]}   (8)

The variate V, as an indicator of the rotating direction, can also be calculated with the expression (9).

V = (y′ − y0)·x1 + (x′ − x0)·y1 + (x0·y′ − x′·y0)   (9)
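Expressions (8) and (9) can be checked numerically; this is a Python sketch (the paper's implementation is in MATLAB) with invented sample points.

```python
import math

def rotation_angle(O, A, Op):
    """Rotation angle beta per expression (8): the angle at O between
    the rays O->O' and O->A, from the normalised dot product."""
    (x0, y0), (x1, y1), (xp, yp) = O, A, Op
    num = (xp - x0) * (x1 - x0) + (yp - y0) * (y1 - y0)
    den = math.hypot(xp - x0, yp - y0) * math.hypot(x1 - x0, y1 - y0)
    return math.acos(num / den)

def side_indicator(O, A, Op):
    """Variate V per expression (9); its sign tells on which side of
    the line OO' the direction point A lies."""
    (x0, y0), (x1, y1), (xp, yp) = O, A, Op
    return (yp - y0) * x1 + (xp - x0) * y1 + (x0 * yp - xp * y0)

# O at the origin, direction point A along +x, target point O' along +y:
# the two rays are perpendicular, so beta is 90 degrees
beta = rotation_angle((0, 0), (10, 0), (0, 5))
print(round(math.degrees(beta), 1))  # 90.0
print(side_indicator((0, 0), (10, 0), (0, 5)))  # 50
```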

If V ≥ 0, it means that point A is on the right side of the line OO′, as shown in Figure 12. Operators can steer the drill bit counterclockwise by the angle β to make line AO run through point O′. Otherwise, the drill bit is rotated clockwise.

4.4. Deflection Detection Algorithm

Drilling deflection is defined as the distance between the drill bit and the pre-defined track. In the image, it is the distance between the center of the LED target and the center of the drill rod, i.e., the length of OO′, as shown in Figure 11. The horizontal alignment is obtained by the expression (10).

∆x = x0 − x′   (10)

The vertical alignment is denoted as the expression (11).

∆y = y0 − y′   (11)

where x0, x′, y0, and y′ are the coordinates of points O and O′. The deflection is automatically calculated by the expression (12).

L = √[(x0 − x′)² + (y0 − y′)²]   (12)
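Expressions (10)–(12) reduce to a few lines; a minimal Python sketch (the paper uses MATLAB, and the sample pixel coordinates are invented):

```python
import math

def deflection(O, Op):
    """Alignments and deflection per expressions (10)-(12), in pixels;
    converting the pixel value to millimetres still requires the
    image magnification."""
    (x0, y0), (xp, yp) = O, Op
    dx = x0 - xp                       # horizontal alignment, Eq. (10)
    dy = y0 - yp                       # vertical alignment, Eq. (11)
    return dx, dy, math.hypot(dx, dy)  # total deflection, Eq. (12)

dx, dy, L = deflection((120.0, 95.0), (117.0, 91.0))
print(dx, dy, L)  # 3.0 4.0 5.0
```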

It is theoretically possible to calculate the real deflection by multiplying L by the magnification of the CCD camera. However, in practice, the size of the image decreases with the length of drilling. For this reason, another algorithm is supplied to convert a single pixel size to its real size on the target. Only when the magnification is calculated can the real deflection be computed and displayed on the interface.

4.5. Auto-Focus Algorithm

The LED target gets increasingly farther away from the camera as the drill bit advances. Thus, images captured by the camera become increasingly blurred. An operator is required to continuously tune the lens of the camera to adjust the focus distance and maintain image clarity. However, manual focus requires an operator to keep watching the camera; this is a waste of manpower and fails to improve efficiency [18]. To achieve true auto-focus, a motor is employed to


adjust the camera's focus distance via a gear train. Thus, the relationship between the tuning angle and the object distance needs to be investigated.

Images of the target are shot from the near to the distant, and the angle tuned for the lens increases accordingly. Concrete results are shown in Table 1: the first row gives the object distance, and the second row gives the angle tuned for the lens to restore focus at that distance.

Table 1. Angles tuned for the lens at different object distances in the verification experiment.

Distance (m):  0    1     10    100    +∞
Angle:         0    π/2   3π    7π/2   4π

The results in Table 1 can be processed by CFTOOL, a curve fitting toolbox in MATLAB (R2017b, MathWorks®, Natick, MA, USA) [19,20]. Angles are the independent variable, x, and object distances are the dependent variable, y. Drawing these points on an axis, it is easy to find that they roughly obey an exponential law. Therefore, the exponential function f(x) = a·exp(b·x) + c·exp(d·x) in MATLAB is employed to fit them [21]. The fitted curve is shown in Figure 13, wherein a = 3.244 × 10⁻¹⁵, b = 3.437, c = 0.4169, and d = 0.3331. The focusing rule of the theodolite is denoted as the expression (13).

y = 3.244 × 10⁻¹⁵ · e^(3.437x) + 0.4169 · e^(0.3331x)   (13)

In the curve fitting, there are two main indices to evaluate the fitting effect: the sum of squares for error (SSE) and the coefficient of multiple determination R² [22]. SSE measures the deviation of the responses from their fitted values; a value closer to 0 indicates a better fit. R-square measures how successful the fit is in explaining the variation of the data; a value closer to 1 indicates a better fit. In our fitting, SSE = 0.2617 and R² = 1, from which we can ensure that the focusing rule of the theodolite is reliable.

Figure 13. Curve of the exponential fit of the focusing experimental results.
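The fitted rule of Equation (13) can be checked numerically against Table 1. This Python sketch uses the coefficient a = 3.244 × 10⁻¹⁵ as listed with the fit (the equation as printed is consistent only up to a digit transposition in a, which is negligible at small angles):

```python
import math

# Fitted coefficients reported with Figure 13
A, B, C, D = 3.244e-15, 3.437, 0.4169, 0.3331

def focus_distance(angle):
    """Object distance predicted by the fitted focusing rule, Eq. (13)."""
    return A * math.exp(B * angle) + C * math.exp(D * angle)

# At small tuning angles the first term is negligible; at large angles
# it dominates, reproducing the steep growth toward 100 m and beyond.
for angle in (0.0, 3 * math.pi, 3.5 * math.pi):
    print(round(focus_distance(angle), 2))
```

Evaluating the rule at 3π and 7π/2 recovers roughly 10 m and 100 m respectively, in line with the Table 1 measurements and the small residual (SSE = 0.2617).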

In this paper, the length of OO' is used as the criterion for the clarity of image. When the length 0 is used as the criterion for the clarity of image. When the length is In this paper, the length of Othreshold), is short (i.e., within a certain the image is considered “unclear,” so that focusing is short (i.e., within a certain threshold), the image is considered that focusing is required. required. Otherwise the image is clear enough, and no further“unclear,” focusing issoneeded. Otherwise the image is clear enough, and no further focusing is needed. For instance, Figure 14b shows the result processed from Figure 14a. It clearly indicates the Forand instance, Figure 14btarget. shows Currently, the result processed frombetween Figure 14a. It clearly indicates thepoint centeris center direction of the the distance the center and direction and direction of the target. Currently, the distance between the center and direction point is relatively relatively long, indicating that there is no need to focus. During manual estimation, the image of the long, that there nounclear. need to focus. During estimation, image is of still the LED target LED indicating target in Figure 14cis is Although the manual proposed guidancethesystem capable of in Figure 14cthe is center unclear. the proposed guidance system is still capable of calculating the calculating of Although the LED target, as shown in Figure 14d, the distance between the center of center of the LEDand target, as shown in Figure theTo distance thesystem center judgment, of the LED target and the LED target direction point is too14d, short. avoidbetween incorrect a focusing direction point is too short. To avoid incorrect system judgment, a focusing operation is performed. operation is performed.
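The clarity criterion reduces to a distance test. A minimal Python sketch follows; the 20-pixel threshold and the sample coordinates are illustrative assumptions, not values from the paper:

```python
def needs_focus(center, direction_point, min_pixels=20):
    """Sketch of the clarity criterion: when the detected centre and
    direction point collapse toward each other, the image is treated
    as blurred and a motorised focusing pass is triggered.
    min_pixels is an assumed threshold for illustration."""
    dy = direction_point[0] - center[0]
    dx = direction_point[1] - center[1]
    return (dx * dx + dy * dy) ** 0.5 < min_pixels

print(needs_focus((100, 100), (160, 100)))  # False: distance 60, image clear
print(needs_focus((100, 100), (108, 100)))  # True: distance 8, refocus
```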



Figure 14. Results of different original images. (a) Clear original image; (b) clear image after processing; (c) unclear original image; (d) unclear image after processing. (The upper red dot in (b) marks the center of the target image, while the lower one marks the center of the direction point; the blue line joining these two points shows the direction of the drill bit. The red dot in (d) marks the center of the target image calculated by the software system.)

5. Field Trial

The guidance system was used for an installation of hot water pipelines under Jin Ye Road and Zhang Ba Liu Road in Xi'an city, China. The contractor employed the HAB method because of its better adaptation to the soil conditions and the convenience of removing spoil. In this project, a 40 m long, 800 mm diameter hot water pipeline was buried underground at a depth of 8 m. The inner diameter of the pilot pipe was 60 mm and the outer diameter was 80 mm. The schematic construction site is shown in Figure 15.

Figure 15. Schematic construction site.

The jacking and reception pits were excavated on both sides of the road in advance. The hot water pipeline was thrust from the jacking pit to the reception pit. Workers set a theodolite-mounted zoom CCD camera behind the power head of the auger boring machine to monitor the LED target.


Images of the LED target were transmitted to the personal computer as they were shot, every 6 m. Then, they were processed and calculated simultaneously. Figure 16 shows the interface of the guidance system with the results obtained from the image at 36 m, including drilling length, horizontal and vertical alignments, and LED target angle. An operator steered the drill bit to change the drilling direction, correcting the drilling trajectory according to these parameters.

Figure 16. Interface of guidance system with results at 36 m.

The CCD camera captured a total of six images of the LED target during the drilling process. Table 2 displays the horizontal and vertical alignments of the drill bit. More intuitively, Figure 17 illustrates the deflection in a line graph along the drilling length. The largest deflection emerged at 12 m, still within the acceptable range. After each image was processed, an operator checked the parameters and steered the drill bit to correct the drilling track, especially when the deflection was relatively large. During the drilling, the deflection gradually reduced. Using the enhanced automated guidance system, the operator carried a portable computer and mastered the drilling situation via the given parameters. The proposed guidance system thoroughly simplified the whole operation. Additionally, operators avoided subjectively judging the deflection.

Table 2. Deflection values after processing images of the LED target.

Drilling length             6 m      12 m     18 m     24 m     30 m     36 m
Horizontal alignment (mm)   −4.299   −9.157    3.207    1.243   −1.223   −0.407
Vertical alignment (mm)     −2.759   −4.022    1.122    1.985    0.312   −2.602
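As a sanity check on Table 2, the total deflection at each station follows from expression (12); this short Python sketch (the system itself runs on MATLAB) confirms the maximum lies at 12 m:

```python
import math

# (drilling length in m, horizontal, vertical) alignments from Table 2
stations = [(6, -4.299, -2.759), (12, -9.157, -4.022), (18, 3.207, 1.122),
            (24, 1.243, 1.985), (30, -1.223, 0.312), (36, -0.407, -2.602)]

# total deflection per station via expression (12)
L = [(s, math.hypot(dx, dy)) for s, dx, dy in stations]
worst = max(L, key=lambda t: t[1])
print(worst[0])  # 12: the largest deflection (about 10 mm) occurs at 12 m
```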

Figure 17. Line graph of deflections every 6 m. (The green line is the reference line, meaning no deviation.)


Guided by the enhanced system, the pilot borehole was successfully drilled to the preset target area. The error between the excavated site and the destination was about 1.5 mm. The enhanced automated guidance system successfully completed the construction of a pilot bore hole.

6. Conclusions

An enhanced automated guidance system based on image processing for HAB, used in the installation of gravity or pressure pipelines, was proposed in this paper. The system consists of a lighted target, a theodolite, a camera with a stand, and a personal computer; it acquires images of the lighted target via the camera and automatically calculates the deflection using image processing technology. To improve guidance quality, the research focused on the optimization of the LED light target and the enhancement of the algorithms for deflection computing and correction.

To ensure the target images were clear enough, multiple experiments were conducted to optimize the light color, filter plate color, luminous intensity, and LED target layout. LED targets were obtained with optimized key parameters, including a red LED color, a blue filter plate color, 11 mm spacing between LEDs, 160 Ω resistance, and two 1.5 V dry batteries. To analyze the images, preprocessing methods including image enhancement, image sharpening, and image segmentation were implemented. Additionally, four algorithms (i.e., feature extraction, angle measurement, deflection detection, and auto-focus) were designed for accurate deflection calculation and automatic camera focusing.

The system was designed to guide hot water pipeline installations. The accuracy was controlled within 2 mm over 48 m, meeting the installation requirement. The feasibility and reliability of this guidance system were also verified in a field trial. The guidance system is low-cost and requires only a small workspace.

The image processing technology and auto-focus method applied in this system promoted guidance automation, freeing machine operators from constantly monitoring the screen to estimate the deflection and focus the target. The indoor experiments showed that the proposed system can acquire clear images from about 100 m, meeting the requirements of general engineering practice. The system was then used to guide a hot water pipeline installation with a length of 40 m and a diameter of 800 mm, buried 8 m beneath the surface. The error at the location of emergence was about 1.5 mm, meeting the installation requirement. This strongly verifies the feasibility and reliability of the proposed guidance system in practice.

Acknowledgments: This research program was funded by the grant of the National Natural Science Foundation of China (41672155, 61733016) and Drillto Trenchless Technology & Equipment Co., Ltd.

Author Contributions: Guojun Wen and Yudan Wang conceived and designed the experiments; Lei Huang performed the experiments; Lingling Wu and Jiang Zhou analyzed the data; Drillto Trenchless Technology & Equipment Co., Ltd. contributed drill rods and a place for the experiments; Lingling Wu wrote the paper.

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Najafi, M. Trenchless Technology: Planning, Equipment, and Methods; McGraw-Hill Companies Inc.: New York, NY, USA, 2013; pp. 241–262.
2. Wu, X.M.; Hu, Y.L.; Li, L.G. Guided Drilling and Trenchless Pipe Laying Technology; China University of Geosciences Press: Wuhan, China, 2007; pp. 112–125, ISBN 9787562519584.
3. Gerasimova, V. Underground Engineering and Trenchless Technologies at the Defense of Environment. Procedia Eng. 2016, 165, 1395–1401. [CrossRef]
4. Ma, B.S. The Science of Trenchless Engineering; China Communications Press: Beijing, China, 2008; pp. 112–132.
5. Consens, P.; Jandu, C. Stress Analysis of a Gas Pipeline Installed Using HDD and Auger Bore Techniques. In Proceedings of the 7th International Pipeline Conference, Calgary, AB, Canada, 29 September–3 October 2008; pp. 133–140.
Sensors 2018, 18, 595

6. Liu, T.; Wang, B.X.; Cui, Y.Y.; Zhang, J. Direction and Position Measurement in HDD Using Two Magnetic Fields. Sens. Actuators A Phys. 2012, 185, 168–172. [CrossRef]
7. Tian, Q.Z.; Zhang, Y.; Ban, Y. The Guide Hole Docking Technology of Horizontally Directional Drilling. Pipeline Technol. Equip. 2014, 3, 31–33. [CrossRef]
8. Sinha, S.K.; Fieguth, P.W.; Polak, M.A. Computer Vision Techniques for Automatic Structural Assessment of Underground Pipes. Comput. Aided Civ. Infrastruct. Eng. 2003, 18, 95–112. [CrossRef]
9. Xu, T.; Luo, W.S.; Lu, H.B.; Lu, Q. Design of Underground Sonde of a Directional Drilling Locator System. Sens. Actuators A Phys. 2005, 119, 427–432. [CrossRef]
10. Yan, Z.G.; Li, M.; Xu, K.F. High-speed Robot Auto-sorting System Based on Machine Vision. Packag. Food Mach. 2014, 32, 28–31. [CrossRef]
11. Gonzalez, R.C.; Woods, R.E. Digital Image Processing, 2nd ed.; Pearson Education North Asia Limited: Hong Kong; Publishing House of Electronics Industry: Beijing, China, 2002.
12. Gong, A.P. Study of the Information Acquisition and Processing Technology Based on Embedded Computer Vision. Master's Thesis, Zhejiang University, Hangzhou, China, 2013.
13. Yang, B. A Research and Implement of the Computer Vision Real-Time Measurement Technology. Master's Thesis, Shenyang University of Technology, Shenyang, China, 2004.
14. Iyer, S.; Sinha, S.K. Segmentation of Pipe Images for Crack Detection in Buried Sewers. Comput. Aided Civ. Infrastruct. Eng. 2006, 21, 395–410. [CrossRef]
15. Castleman, K.R. Digital Image Processing; Prentice Hall, Inc.: Englewood Cliffs, NJ, USA, 1996.
16. Nishikawa, T.; Yoshida, J.; Sugiyama, T.; Fujino, Y. Concrete Crack Detection by Multiple Sequential Image Filtering. Comput. Aided Civ. Infrastruct. Eng. 2012, 27, 29–47. [CrossRef]
17. Ma, Y.H.; Geng, R.F.; Zhang, G. The Study of Robot Vision Image Identify Method Based on Blob Arithmetic. Electron. Instrum. Cust. 2008, 4, 1–2. [CrossRef]
18. Ding, Y.L.; Tian, H.Y.; Wang, J.Q. Design on the Focusing Mechanism of Space Remote-Sensing Camera. Opt. Precis. Eng. 2010, 9, 35–38. [CrossRef]
19. Liu, W.G. MATLAB Program Design and Application, 2nd ed.; Higher Education Press: Beijing, China, 2008; pp. 89–123.
20. Hu, Q.W. Curve Fitting by Curve Fitting Toolbox of MATLAB. Comput. Knowl. Technol. 2010, 6, 5822–5823. [CrossRef]
21. Tang, J.D. Nonlinear Curve Fitting Based on MATLAB. Comput. Mod. 2008, 6, 15–19. [CrossRef]
22. Chen, L.F.; Yang, J.Y.; Cui, S.; Pan, Q.C.; Li, L. MATLAB Simulation of Curve Fitting Based on Least-Square. J. Shenyang Norm. Univ. Nat. Sci. Ed. 2014, 32, 75–79. [CrossRef]

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
