MILITARY TECHNICAL ACADEMY
Vol. XXVI, No. 1
Military Technical Academy Publishing House Bucharest, Romania, March 2016
Editorial Board:
Col. Prof. Eng. CONSTANTIN-IULIAN VIZITIU, Ph.D. – The Military Technical Academy of Bucharest, Romania
Col. Prof. Eng. IOAN NICOLAESCU, Ph.D. – The Military Technical Academy of Bucharest, Romania
Col. Assoc. Prof. Eng. CONSTANTIN ENACHE, Ph.D. – The Military Technical Academy of Bucharest, Romania
Col. Assoc. Prof. Eng. TUDOR-VIOREL ŢIGĂNESCU, Ph.D. – The Military Technical Academy of Bucharest, Romania
Col. Prof. Eng. PAMFIL ȘOMOIAG, Ph.D. – The Military Technical Academy of Bucharest, Romania
Prof. Eng. VICTOR-VALERIU PATRICIU, Ph.D. – The Military Technical Academy of Bucharest, Romania
Comdr. Assoc. Prof. Eng. RĂZVAN NECHIFOR, Ph.D. – The Military Technical Academy of Bucharest, Romania
Prof. Eng. JÉRÔME MARS, Ph.D. – Grenoble Institute of Technology, France
Prof. Eng. SRDJAN STANKOVIĆ, Ph.D. – The University of Montenegro, Podgorica, Montenegro
Prof. Eng. VLADIMÍR HORÁK, Ph.D. – The University of Defence in Brno, the Czech Republic
Prof. Eng. EMANUEL RĂDOI, Ph.D. – The University of Western Brittany, Brest, France
Assoc. Prof. Eng. CORNEL IOANA, Ph.D. – Grenoble Institute of Technology, France
Col. Prof. Eng. ION BICA, Ph.D. – The Military Technical Academy of Bucharest, Romania
Prof. Eng. ALEXANDRU ŞERBĂNESCU, Ph.D. – The Military Technical Academy of Bucharest, Romania
Lect. RALUCA CONSTANTIN, Ph.D. – The Military Technical Academy of Bucharest, Romania
Lt. Col. Assoc. Prof. Eng. MARIN LUPOAE, Ph.D. – The Military Technical Academy of Bucharest, Romania
Assoc. Prof. Eng. FLORIN ENACHE, Ph.D. – The Military Technical Academy of Bucharest, Romania
Col. Assoc. Prof. Eng. DĂNUŢ GROSU, Ph.D. – The Military Technical Academy of Bucharest, Romania
CONTENTS

Considerations on the Safety and Performance Assessment Trials Conducted for the Energetic Materials Equipping Munitions – CONSTANTIN ENACHE, DANIEL ANTONIE, DORU-ADRIAN GOGA ... 5
Study of the Effect of 7.62 mm Caliber Ammunition on Concrete. Comparative Analysis between Eastern and Western Types of Ammunition – ALIN-CONSTANTIN SAVA, CRISTIAN-EMIL MOLDOVEANU, PAMFIL ŞOMOIAG, DIANA NISTORAN ... 13
Vulnerabilities at the Sensor Level in Biometric Systems: A Review – STELIAN SPÎNU ... 19
Least Squares Estimation of Round Convex Mirror Image Distortion for Indoor Localization in Robot Swarms – ADRIANA MILĂŞAN, CRISTIAN MOLDER ... 27
Analysis of Impact between Piercing-Incendiary Bullets and Armoured Plate – ALIN-CONSTANTIN SAVA, CĂTĂLIN-EUGEN IONESCU, CONSTANTIN ENACHE, ALINA-DANIELA POPA, MARINEL CÎMPEANU ... 39
Developing a Measuring System Able to Determine the Mechanical Stress on a Rectangular Cross-Section Bar – LASZLO BAROTHI, DANIELA VOICU ... 47
Romanian National Security: A Critical Space Infrastructure Perspective – ALEXANDRU GEORGESCU, ULPIA-ELENA BOTEZATU, ALINA-DANIELA POPA, ŞTEFAN-CIPRIAN ARSENI, ALIN-CONSTANTIN SAVA ... 53
Constrained Path Planning for Mobile Robots in Indoor Environments – DAMIAN GORGOTEANU, CRISTIAN MOLDER ... 65
Study of the Unguided Rocket Effectiveness – CRISTIAN-EMIL MOLDOVEANU, SOPHIE BUONO, PAMFIL ŞOMOIAG ... 71
An Overview of the Space Debris Threat to Critical Space Infrastructures – ALEXANDRU GEORGESCU, ALINA-DANIELA POPA, ŞTEFAN-CIPRIAN ARSENI, ALIN-CONSTANTIN SAVA ... 81
MTA REVIEW • Vol. XXVI, No. 1, Mar. 2016
LEAST SQUARES ESTIMATION OF ROUND CONVEX MIRROR IMAGE DISTORTION FOR INDOOR LOCALIZATION IN ROBOT SWARMS

ADRIANA MILĂŞAN¹, CRISTIAN MOLDER¹

Abstract: This paper deals with the polynomial estimation of the deformations induced by convex mirrors in panoramic vision systems. In most situations, the deformations are either unknown or hard to compute on an embedded system mounted on a robot platform. With a polynomial estimation, the position (azimuth and elevation) of any robot or artifact in a robot swarm arena can be computed using pixel distances as inputs. The polynomial estimation is obtained using a calibration template designed for this purpose.

Keywords: robotics, robot swarm, robot vision, localization.
1. Introduction

Position estimation in a robot swarm is important for each individual (agent or robot) of the system during coordination and control. Indoor location can be obtained locally or from a supervisor; localization techniques can therefore be classified into two main classes: relative and global. Relative localization determines the positions of all objects (robots or artifacts) surrounding the current agent using sensors mounted on the agent itself. Such localization systems can use IR, laser, or ultrasonic distance sensors [1-8], wireless networks [9, 10], or cameras [11-14]. These techniques are distributed, so the swarm continues to operate even if individual agents fail; in exchange, the computing requirements of each agent increase. Infrared-based systems consist of a number of equiradial sensors which can also be used for communication between nearby agents using a modulated carrier [3, 4]. Global localization uses overhead vision systems mounted above the robot arena and maintains permanent communication with each individual [15, 16]. These techniques remove part of the computing load from the agents, but they occupy significant communication bandwidth, can introduce delays, and make the swarm dependent on a single point of failure.
1 Faculty of Military Electronic and Information Systems, Military Technical Academy, 39-49 George Coşbuc Ave., Sector 5, 050141, Bucharest, Romania, e-mails: [email protected], [email protected]
In order to obtain a fully autonomous robot swarm, each robot must have its own relative localization system. In this paper, an omnidirectional vision system using a camera and a convex mirror is considered. Convex mirrors distort the image in such a way that distant objects appear closer together in the acquired image. The construction parameters of round convex mirrors are usually unknown; therefore, the image distortions must be estimated in other ways. Even with the mirror parameters known, computing the mirror deformation exactly would be too complex for onboard embedded hardware. This paper deals with the estimation of the deformations by a least squares polynomial, which is then used to apply corrections to acquired images. Existing robot swarms use similar vision systems based on a convex mirror mounted above a camera using a transparent plastic cylinder (Figure 1).
Figure 1. a) Swarm-Bot and b) Swarmanoid
Each robot can monitor the surrounding environment and then detect and locate close neighbors or obstacles. Such systems are the Swarmanoid [16] and the Swarm-Bot [17].
2. Experiment Description

The experiment uses a set of round convex mirrors radially mounted on top of a CCD color camera with a lens system (Figure 2). The distance between camera and mirror is adjusted so that the resulting image covers the entire surface of the mirror. The mirrors are based on LED lenses with a thin reflective coating. Geometrical specifications of each of the four mirrors are given in Table 1. A printed pattern bonded on a polystyrene plate is mounted on top of the table using threaded rods. The camera can be positioned in Cartesian coordinates using two orthogonal rods. Each mirror is centered on the table using a printed positioning template with 1 mm steps and concentric circles printed every 5 mm, up to 50 mm (Figure 3).
Figure 2. Experimental platform with convex mirrors
The camera is placed above a mirror at a given height, and the lens is adjusted until the field of view covers the entire mirror surface and the image is in focus. In order to determine each lens' deformations, a printed calibration template is created and placed behind the camera to simulate the environment around the robot. The calibration template consists of five equidistant points, with a 5 cm step, placed every 30 degrees (Figure 4). For ease of manufacture, the template is split and printed on four A4 transparent foils, the entire template covering an A2 format. Therefore, there are fewer points on a radius along the short edge than along the long edge. As the template is symmetrical, it can easily be rotated by 90° to change position. Before each image acquisition, the camera is centered using the positioning template, and then the mirror is positioned using its scale.
Figure 3. Positioning template a) design and b) camera view sample (ROI shown as a rectangle in design)
Table 1. Convex mirror specifications

Lens    Outer diameter [mm]   Inner diameter [mm]   Height [mm]   Focal length [mm]
No. 1   23.0                  21.1                   7.6          18
No. 2   40.0                  36.5                  16.8          19
No. 3   50.0                  46.2                  18.0          24
No. 4   66.2                  61.2                  27.5          42
For each mirror, a template image is acquired once the camera is positioned and focused. In order to determine the image deformation induced by the mirror, the pixel distances from the image center to each calibration dot are measured offline. The calibration template images acquired for each of the four mirrors are shown in Figure 5. It can easily be seen that the camera mounting screw and rod cover the calibration dots at indices 1 and 2. For better environmental lighting, two fluorescent lamps were positioned on opposite sides.
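The offline measurement step amounts to converting each detected dot's pixel coordinates into a radial distance and an azimuth relative to the image center. A minimal sketch (the helper name and the example coordinates are illustrative, not from the paper; the azimuth is measured from the +y axis, matching the sin/cos convention of the polar equations in Section 3):

```python
import math

def dot_polar(dot_xy, center_xy):
    """Radial pixel distance and azimuth (degrees, from the +y axis)
    of a calibration dot relative to the image center."""
    dx = dot_xy[0] - center_xy[0]
    dy = dot_xy[1] - center_xy[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dx, dy)) % 360.0

# e.g., a dot 20 px to the right of a (640, 480) image center
r, az = dot_polar((660, 480), (640, 480))  # r = 20.0, az = 90.0
```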
Figure 4. Mirror calibration printed template
For each distance index 5k, the mean pixel distance P̄5k is computed as

P̄5k = (1/12) · Σ_{i=1..12} P5k(i),   k = 1, …, 5.   (1)

To compensate for dot measurement errors, all 12 dot distances P5k(i) from the center are first measured and then replaced by their mean value P̄5k. As the dots are positioned every 30°, the total number of dots at a given distance 5k is 360°/30° = 12. The measured calibration dot distances (in pixels) relative to the camera image center for each of the four mirrors are shown in Tables 2 to 5. The mean distance, computed from the measured distances, is shown in the last column of each table.
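In practice some dots are hidden by the mounting hardware (the dashed cells in Tables 2 to 5), so the mean of Equation (1) is taken over the visible dots only. A sketch using the P20 row of Table 2; the small difference from the printed 52.34 suggests the paper's means were computed from sub-pixel measurements before the table values were rounded:

```python
# Mean pixel distance per calibration ring (Eq. (1)); None marks a dot
# hidden by the mounting hardware, as the dashes do in Tables 2-5.
p20_mirror1 = [52, 52, None, None, 53, 53, 53, 52, 51, None, 52, 52]

def ring_mean(distances):
    visible = [d for d in distances if d is not None]
    return sum(visible) / len(visible)

mean_p20 = ring_mean(p20_mirror1)  # about 52.22 over the 9 visible dots
```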
Figure 5. Calibration template with mirror no. 1 (a), no. 2 (b), no. 3 (c) and no. 4 (d) (H_C = 8 mm and H_T = 50 cm)
Empty table cells correspond to missing template dots or dots that are blocked by the vision system mechanical support and lens. In the case of mirror no. 4, the camera lens covers a larger region of the calibration template projection.
Calibration dots closer than 15 cm (i.e., the 5 cm and 10 cm rings) are never visible in the acquired images, as they are blocked by the camera lens. This also represents the system's minimum range.

Table 2. Measured dot distances for mirror no. 1

(i)       1    2    3    4    5    6    7    8    9   10   11   12   P̄5k
P15(i)    –    –   41   40   40   41   41   41   40   40   40   40   40.44
P20(i)   52   52    –    –   53   53   53   52   51    –   52   52   52.34
P25(i)   63   63    –    –    –   63   63   63    –    –    –   64   63.07
Table 3. Measured dot distances for mirror no. 2

(i)       1    2    3    4    5    6    7    8    9   10   11   12   P̄5k
P15(i)    –    –   62   61   63   65   63   63   62   62   62   63   62.62
P20(i)   82   81    –    –   81   83   81   80   81    –   81   82   81.41
P25(i)   97   96    –    –    –   97   97   95    –    –    –   98   96.83
Table 4. Measured dot distances for mirror no. 3

(i)        1    2    3    4    5    6    7    8    9   10   11   12   P̄5k
P15(i)     –    –   76   74   75   77   77   77   76   75   75   77   75.79
P20(i)   100   98    –    –   98   98   99   98   98    –   98   98   98.50
P25(i)   119  118    –    –    –  117  119  118    –    –    –  119  118.31
Table 5. Measured dot distances for mirror no. 4

(i)        1    2    3    4    5    6    7    8    9   10   11   12   P̄5k
P15(i)     –    –    –    –    –    –    –    –    –    –    –    –      –
P20(i)     –  135    –    –  134  137  137  138  136    –  137  139  136.60
P25(i)   164  161    –    –    –  164  164  163    –    –    –  165  163.47
3. Least Squares Estimation of Deformation

The image deformation caused by the mirror can be estimated with the help of a polynomial regression, using the set of dots from the calibration template. A second-order polynomial is used (Equation (2))
y(x) = a2·x² + a1·x + a0.   (2)
The output of the y(x) polynomial is the distance in the image, measured in pixels, while the input x is the real distance, measured in cm (the same as for P̄5k). For each mirror, a polynomial is determined by least squares regression using the measured distances from each dot to the center of the acquired image. Examples of such images are shown in Figure 5. The height H_C between camera lens and mirror base and the height H_T between calibration template and mirror base are the same for all four mirrors.
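As an illustration of the fitting step, the second-order polynomial for mirror no. 1 can be obtained with an ordinary least-squares fit. Only the three visible rings from Table 2 are used here, so the parabola is exactly determined by the three points; the paper's coefficients (Table 6) were presumably fitted on the underlying sub-pixel measurements, so they differ slightly:

```python
import numpy as np

# Mean pixel distances for mirror no. 1 (Table 2) vs. real distance [cm]
x = np.array([15.0, 20.0, 25.0])     # real distances of the rings
y = np.array([40.44, 52.34, 63.07])  # mean pixel distances (last column)

# LSE fit of y(x) = a2*x^2 + a1*x + a0 (Eq. (2))
a2, a1, a0 = np.polyfit(x, y, 2)
```

As in Table 6, the leading coefficient comes out negative: the mirror compresses distant objects, so pixel distance grows sub-linearly with real distance.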
Figure 6. Deformation LSE polynomial graphs for mirrors no. 1 (a), no. 2 (b), no. 3 (c) and no. 4 (d)
Table 6. LSE polynomial coefficients and ranges

Lens    a2        a1       a0        Range min   Range max
No. 1   –0.0182   2.9694   –0.0977   11.32 cm    151.05 cm
No. 2   –0.0313   4.6682    0.0206   11.44 cm    138.09 cm
No. 3   –0.0273   5.4223    0.2062   12.82 cm     81.72 cm
As one can see from the images, there is always a tradeoff between the field of view of the vision system and its resolution. A small mirror such as no. 1 covers a large area at low resolution, while a large mirror such as no. 4 offers much higher resolution but a smaller field of view. Depending on the mirror size, the vision system can observe other robots or obstacles beyond a minimum range; below that limit, the image contains the camera region that covers the observed area around the robot. The minimum range of each vision system configuration is given in Table 6. Based on the measured calibration dot distances, a second-degree polynomial is estimated by LSE for each mirror. The resulting coefficients are shown in Table 6, and the corresponding graphs in Figure 6. Due to measurement errors, the least significant coefficient a0 is close to, but not equal to, zero. The LSE estimation can further be used to apply corrections to distorted images by applying the inverse polynomial to the acquired images. The inverse transform is applied by associating corresponding pixels between the distorted and corrected images; the corrected image always has the same size as the distorted image. Let I_r(x_r, y_r) be the current pixel of the corrected (real) image and I_d(x_d, y_d) the equivalent pixel of the distorted image:
I_r(x_r, y_r) = I_d(x_d, y_d),   (3)

where

x_r = r_r·sin θ + x_c,   y_r = r_r·cos θ + y_c   (4)

and

x_d = r_d·sin θ + x_c,   y_d = r_d·cos θ + y_c,   (5)

where r_r is the real distance in pixels, θ is the angle of the image pixel in polar coordinates, and (x_c, y_c) is the image center pixel. The real distance r_r varies from 0 to half the image width, W/2, considering that the corrected image retains a square shape. By convention, the image distance limit x_max always corresponds to a maximum real distance of 55 cm. The distorted distance r_d is computed from the real distance r_r using the normalized LSE polynomial transformation

r_d = y(r_r) · x_max / y(x_max).   (6)
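Equations (3)-(6) amount to an inverse-mapping loop: for every pixel of the corrected image, compute its polar radius, push it through the normalized polynomial, and sample the distorted image there. A sketch under stated assumptions: the function name is illustrative, the polynomial is evaluated directly on the corrected pixel radius (one reading of the normalization in Equation (6)), and nearest-neighbour sampling replaces interpolation for brevity:

```python
import numpy as np

def correct_image(img_d, coeffs, x_max):
    """Inverse mapping (Eqs. (3)-(6)): for every pixel of the corrected
    image, sample the distorted image at the radius r_d given by the
    normalized LSE polynomial y(x) with coefficients (a2, a1, a0)."""
    h, w = img_d.shape[:2]
    xc, yc = w // 2, h // 2
    y = lambda v: coeffs[0] * v**2 + coeffs[1] * v + coeffs[2]
    scale = x_max / y(x_max)                 # normalization of Eq. (6)
    out = np.zeros_like(img_d)
    for yr in range(h):
        for xr in range(w):
            rr = np.hypot(xr - xc, yr - yc)
            theta = np.arctan2(xr - xc, yr - yc)
            rd = y(rr) * scale                        # Eq. (6)
            xd = int(round(rd * np.sin(theta) + xc))  # Eq. (5)
            yd = int(round(rd * np.cos(theta) + yc))
            if 0 <= xd < w and 0 <= yd < h:
                out[yr, xr] = img_d[yd, xd]           # Eq. (3)
    return out
```

With the identity polynomial y(x) = x the mapping degenerates to a copy, which is a convenient sanity check; a real deployment would vectorize the loop or precompute the (xd, yd) lookup table once per mirror.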
A validation template containing five solid dots is used to confirm the correction polynomial of each mirror (Figure 7). As one can see, the dots in the distorted images appear oval, while the dots in the corrected images are circular.
Figure 7. Validation template with solid dots
Each of the five black solid dots has an exact position and size in the validation template. Each solid disc has a diameter of 9 cm, the same size as the robots of the swarm system that the vision system is intended for. Figure 8 shows the resulting image corrections. The image content is valid only within the region used to define the correction polynomial (i.e., up to 55 cm); therefore, any object situated outside this range remains distorted after correction.
Figure 8. Validation template shown in images without corrections (left) and with corrections applied (right) for mirrors no. 1 (a), no. 2 (b), no. 3 (c) and no. 4 (d)
The five dot positions are validated by converting the color corrected image to a binary image and then applying morphological operations in order to remove noise and artifacts. The estimated dot positions for each of the four mirrors are shown in Table 7.

Table 7. Validation dots estimated positions [cm]

Lens    R1      R2      R3      R4      R5      MSE
No. 1   55.50   28.93   23.63   23.46   36.80   0.98
No. 2   55.05   28.64   23.46   23.28   36.53   0.59
No. 3   56.60   28.60   24.07   21.58   38.69   3.02
No. 4   52.54   27.08   22.38   21.96   34.58   1.72
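The binarize-then-clean validation step can be sketched as follows. The function name, threshold, and minimum-area values are illustrative assumptions, not from the paper; a morphological opening stands in for the unspecified noise-removal operations:

```python
import numpy as np
from scipy import ndimage

def find_dots(gray, thresh=64, min_area=20):
    """Locate dark solid dots in a corrected grayscale image:
    binarize, remove speckle with a morphological opening, label the
    connected components, and return their centroids as (x, y)."""
    binary = gray < thresh                     # dark dots -> True
    binary = ndimage.binary_opening(binary)    # remove noise/artifacts
    labels, n = ndimage.label(binary)          # connected components
    centroids = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if xs.size >= min_area:                # drop residual specks
            centroids.append((xs.mean(), ys.mean()))
    return centroids
```

The returned pixel centroids would then be converted to distances from the image center and compared against the known template positions to obtain the R1-R5 estimates and MSE of Table 7.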
4. Conclusion and Perspectives

The current study has served as a basis for the design of an omnidirectional vision system that can be mounted on each robot of a robot swarm system to detect surrounding objects and neighbors. Four sizes of round convex mirrors have been used to project images onto a fixed video camera. Three templates have been designed: (1) a positioning template for mirror positioning, (2) a calibration template for mirror deformation estimation, and (3) a validation template for confirmation of distance measurements.
For each of the four mirrors, a second-order polynomial was estimated by LSE to model the mirror deformations. The polynomials were then validated by estimating the distances to known solid dots. The results confirm the suitability of the proposed vision system. A smaller-diameter mirror has a larger field of view, a lower MSE, and is less sensitive to lens positioning; a larger-diameter mirror has a smaller field of view, a higher MSE, and is prone to lens positioning errors. The reflective layer of the mirror, however, greatly influences the acquired image quality: the smaller the mirror, the higher the quality of its reflective layer. Small-diameter mirrors require wider calibration templates in order to correctly estimate the deformation polynomial. The valid region for position estimation is defined by the distance of the farthest calibration point. The precision of the estimated positions of surrounding objects is highly dependent on the positioning of the vision system (camera and lens). The corrections can further serve robot and object shape detection and localization using pattern recognition algorithms. The use of a second-order polynomial for image correction greatly reduces the computational requirements of robotic embedded systems.
References

[1] J. PUGH, X. RAEMY, C. FAVRE, R. FALCONI, A. MARTINOLI – A Fast On-Board Relative Positioning Module for Multi-Robot Systems, IEEE/ASME Transactions on Mechatronics, Vol. 14, No. 2, pp. 151-162, Apr. 2009
[2] J.F. ROBERTS, T.S. STIRLING, J.-C. ZUFFEREY, D. FLOREANO – 2.5D Infrared Range and Bearing System for Collective Robotics, Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3659-3664, St. Louis, MO, Oct. 10-15, 2009
[3] A. GUTIERREZ, A. CAMPO, M. DORIGO, D. AMOR, L. MAGDALENA, F. MONASTERIO-HUELIN – An Open Localization and Local Communication Embodied Sensor, Sensors, Vol. 8, No. 11, pp. 7545-7563, Nov. 2008
[4] I.D. KELLY, A. MARTINOLI – A Scalable, On-Board Localisation and Communication System for Indoor Multi-Robot Experiments, Sensor Review, Vol. 24, No. 2, pp. 167-180, 2004
[5] G. BENET, F. BLANES, J.E. SIMÓ, P. PÉREZ – Using Infrared Sensors for Distance Measurement in Mobile Robots, Robotics and Autonomous Systems, Vol. 40, No. 4, pp. 255-266, Sep. 2002
[6] Y.A. ŞEKERCIOĞLU, J. VIOLI, L. PRIESTNALL, J. ARMSTRONG – Accurate Node Localization with Directional Pulsed Infrared Light for Indoor Ad Hoc Network Applications, Proc. of the 22nd International Conference on Telecommunications - ICT, Sydney, Australia, Apr. 27-29, 2015
[7] F. RIVARD, J. BISSON, F. MICHAUD, D. LÉTOURNEAU – Ultrasonic Relative Positioning for Multi-Robot Systems, Proc. of the IEEE International Conference on Robotics and Automation - ICRA, pp. 323-328, Pasadena, CA, May 19-23, 2008
[8] R. GUTIERREZ-OSUNA, J.A. JANET, R.C. LUO – Modeling of Ultrasonic Range Sensors for Localization of Autonomous Mobile Robots, IEEE Transactions on Industrial Electronics, Vol. 45, No. 4, pp. 654-662, Aug. 1998
[9] G. PESSIN, F.S. OSORIO, J.R. SOUZA, F.G. COSTA, J. UEYAMA, D.F. WOLF, T. BRAUN, P.A. VARGAS – Evolving an Indoor Robotic Localization System Based on Wireless Networks, in C. Jayne et al. (Eds.), "Engineering Applications of Neural Networks", Communications in Computer and Information Science, Vol. 311, pp. 61-70, Springer, Sep. 2012
[10] A. BOUKERCHE, H.A.B.F. OLIVEIRA, E.F. NAKAMURA, A.A.F. LOUREIRO – Vehicular Ad Hoc Networks: A New Challenge for Localization-Based Systems, Computer Communications, Vol. 31, No. 12, pp. 2838-2849, Jul. 2008
[11] H. ISHIGURO – Development of Low-Cost Compact Omnidirectional Vision Sensors and their Applications, in R. BENOSMAN et al. (Eds.), "Panoramic Vision: Sensors, Theory, and Applications", pp. 23-38, Springer, New York, NY, 2001
[12] G.K. FRICKE, D.P. GARG – Discrimination and Tracking of Individual Agents in a Swarm of Robots, Proc. of the American Control Conference, pp. 2742-2747, Baltimore, MD, Jun. 30 - Jul. 02, 2010
[13] J.A. da C.P. GASPAR – Omnidirectional Vision for Mobile Robot Navigation, Ph.D. Thesis, Technical University of Lisbon, Portugal, Dec. 2002
[14] T. KRAJNIK, M. NITSCHE, J. FAIGL, P. VANEK, M. SASKA, L. PREUCIL, T. DUCKETT, M. MEJAIL – A Practical Multirobot Localization System, Journal of Intelligent & Robotic Systems, Vol. 76, No. 3, pp. 539-562, Dec. 2014
[15] H. KOYUNCU, S.H. YANG – A Survey of Indoor Positioning and Object Locating Systems, International Journal of Computer Science and Network Security, Vol. 10, No. 5, pp. 121-128, May 2010
[16] M. DORIGO, D. FLOREANO, L.M. GAMBARDELLA, F. MONDADA et al. – Swarmanoid: A Novel Concept for the Study of Heterogeneous Robotic Swarms, IEEE Robotics & Automation Magazine, Vol. 20, No. 4, pp. 60-71, Dec. 2013
[17] F. MONDADA, G.C. PETTINARO, A. GUIGNARD, I.W. KWEE, D. FLOREANO, J.-L. DENEUBOURG, S. NOLFI, L.M. GAMBARDELLA, M. DORIGO – Swarm-Bot: A New Distributed Robotic Concept, Autonomous Robots, Vol. 17, No. 2, pp. 193-221, Sep. 2004
Editorial Office: "MTA Review", 39-49 George Cosbuc Ave., Sector 5, 050141 Bucharest, ROMANIA
Tel.: +4021 335 46 60 / 112, Fax: +4021 335 57 63, e-mail: [email protected]
Website: www.journal.mta.ro
ISSN 1843-3391
Published: The Military Technical Academy Coordinating Editor: Col. Assoc. Prof. Eng. CONSTANTIN ENACHE, Ph.D. Text Editing: Eng. Magdalena Corina MAZILU Printing: Lt. Răzvan CHIRIŢĂ, Viorica TOMA, Adrian STĂNICĂ, Alina ANDREESCU
Printed in The Military Technical Academy, 98 pages