Universitat de Girona Computer Vision and Robotics Group

An Introduction to Applied Underwater Robotics

Presented by: Dr. Pere Ridao

VICOROB Research Team

Image analysis

3D perception

Underwater robotics

Underwater vision

Real-time hardware

CIRS: Research Center in Underwater Robotics

Introduction Applications ICTINEUAUV, a research testbed Navigation & Mapping Conclusion Future Work


Introduction

Ocean Exploration
•  71% of the Earth's surface is covered by water
•  37% of the population lives less than 100 km from the coast
•  Oceans are a source of food and resources
•  Oceans play an important role in the climate

Technology

Manned Submersibles

ROVs

AUVs


Introduction

Depth vs. Technology

Figure: operating depths of each technology — 155 m, 308 m, 600 m, 6,000 m, 10,911 m


Introduction: Marine Robots

ASC

Glider

Hybrid ROV/AUV

IAUV

ROV

Survey AUV

Hovering AUV

Introduction: UdG Robot Prototypes

CIRS-UdG Robots: 1995, 2001, 2005, 2006, 2010

Applications

Industrial

Scientific

Introduction Applications ICTINEUAUV, a research testbed Navigation & Mapping Conclusion Future Work




Applications: Dam Inspection [P. Ridao et al., JFR10]

Pasteral dam. Objective: inspect a dam wall to search for cracks or other damage to the concrete.



Applications: Dam Inspection [P. Ridao et al., JFR10]

Pasteral dam




Applications: Habitat Mapping [P. Ridao et al., WPDC10]

Mequinenza dam. Objective: provide visual validation for a sonar-based system developed to detect zebra mussel colonies.



Applications: Habitat Mapping [P. Ridao et al., WPDC10]

Mequinenza dam




Applications: Seafloor Mapping

Dive 1 Dive 2

Dive 4

Dive 3

AZORES Workshop

In cooperation with the FREESUBNET RTN network



Applications: Seafloor Mapping

Figure: survey area with 20 m and 5 m scale marks; Dive 3 and Dive 4.

AZORES Workshop, in cooperation with the FREESUBNET RTN network



Applications: Multimodal Mapping [Escartin et al., GGG08]

Multimodal Maps

Very Large Maps

Image Mosaic & Bathymetry registration

20,000-image mosaic (6 days of ROV survey)

Eiffel Tower hydrothermal vent. Data from IFREMER

Lucky Strike Hydrothermal Vent site Data from WHOI




Applications: Micro-Bathymetry & 3D Mosaicing [Nicosevici et al. OCEANS08]

3D Mosaics


Introduction Applications ICTINEUAUV, a research testbed Navigation & Mapping Conclusion Future Work


ICTINEUAUV: A bit of history How did it start ... (2006)


ICTINEUAUV: A bit of history [D. Ribas et al., ICRA07]

How did it continue... (2006) 4 Phases

Pass the Gate

Score the Cross

Hit the target

Recover



ICTINEUAUV: A bit of history. There are other ICTINEUs ...

Narcís Monturiol (1819-1885)

ICTINEU II, model in Barcelona harbour

ICTINEUAUV, named to pay homage to Narcís Monturiol

ICTINEU 3, manned submersible under development

ICTINEUAUV: The Robot. The Ictineu AUV characteristics:
•  Open-frame design
•  Small form factor (74 × 46.5 × 52.4 cm)
•  Lightweight (52 kg)
•  Complete sensor suite
•  ROV/AUV operation

ICTINEUAUV: The Robot

Untethered

Buoy

Tethered

The Ictineu AUV

22

ICTINEUAUV: The Robot. The Ictineu AUV pressure vessels:
•  Power module (2 sealed 12 V, 12 Ah lead-acid batteries)
•  Computer module (PC104 and Mini-ITX computers)

ICTINEUAUV: The Robot. The Ictineu AUV thrusters:
•  2 vertical thrusters
•  4 horizontal thrusters
•  Motion controlled in 4 DoF (surge, sway, heave and yaw)

ICTINEUAUV: The Robot. The Ictineu AUV cameras and DVL:
•  Forward-looking colour camera
•  Downward-looking b&w camera
•  DVL (Doppler Velocity Log): 3D velocities (bottom/water tracking), pressure, range

ICTINEUAUV: The Robot. The Ictineu AUV attitude sensors:
•  AHRS: heading, pitch, roll and heave acceleration
•  Fibre-optic gyro: heading with low drift rate

ICTINEUAUV: The Robot. The Ictineu AUV MSIS (Mechanically Scanned Imaging Sonar):
•  Generates acoustic images of the surroundings
•  360º scans around the vehicle
•  Maximum range of 100 m

ICTINEUAUV: The Robot. The Ictineu AUV USBL transponder:
•  Vehicle positioning
•  Acoustic modem

ICTINEUAUV: The Software Architecture [Palomeras et al. MCMC09]

Robot interface: software objects that interface with the hardware.

Two types: Sensor objects Actuator objects
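For illustration only, a minimal sketch of these two kinds of robot-interface objects; the class and method names (SensorObject, ActuatorObject, read, command) are hypothetical and not the actual ICTINEU software API.

```python
# Hypothetical sketch of the two kinds of robot-interface objects described above.
from abc import ABC, abstractmethod


class SensorObject(ABC):
    """Interfaces with a hardware sensor and publishes its measurements."""

    @abstractmethod
    def read(self) -> dict:
        """Return the latest measurement as a dictionary of named values."""


class ActuatorObject(ABC):
    """Interfaces with a hardware actuator and accepts setpoints."""

    @abstractmethod
    def command(self, setpoint: float) -> None:
        """Send a setpoint (e.g. a thruster force) to the hardware."""


class DVLSensor(SensorObject):
    def read(self) -> dict:
        # In the real system this would query the DVL driver.
        return {"vx": 0.0, "vy": 0.0, "vz": 0.0, "altitude": 10.0}


class Thruster(ActuatorObject):
    def command(self, setpoint: float) -> None:
        # In the real system this would write to the thruster driver.
        print(f"thruster setpoint: {setpoint:+.2f}")
```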


ICTINEUAUV: The Software Architecture [Palomeras et al. MCMC09]

Perception module:
•  Navigator object: estimates the position and velocity of the robot (EKF)
•  Obstacle Detector: determines the position of obstacles (wall, bottom, ...)

ICTINEUAUV: The Software Architecture [Palomeras et al. MCMC09]

Control module: receives sensor inputs and sends command outputs to the actuators. Behaviours: GoTo, WallInspection, Distance, Heading, Start/Stop Camera, Check Water, Check Temperature and Pressure.

ICTINEUAUV: The Software Architecture [Palomeras et al. MCMC09]

Mission control: defines the task execution flow required to fulfil a mission.

Introduction Applications ICTINEUAUV, a research testbed Navigation & Mapping Conclusion Future Work


Fundamental Problems in Underwater Robotics...

Where am I? → Navigation
Where are the amphorae? → Mapping
What path should I follow? → Path Planning
How should I steer to follow the desired path? → Guidance
What force should I apply to achieve the desired speed? → Control


Navigation & Mapping. SLAM: Simultaneous Localization And Mapping.

Diagram: localization algorithms estimate the robot pose given a map of the environment; mapping algorithms build the map of the environment given the robot pose; SLAM estimates both at once.

Navigation & Mapping: 

The Navigation Problem. Navigation: estimate the position, orientation and velocity of a vehicle.

Reference frames (from Gade, 2008):
•  {E}: origin at the centre of the Earth; Earth-fixed.
•  {N}: origin at P = [longitude, latitude] on the Earth's surface; XY plane tangent to the Earth's surface; axes pointing North-East-Down.
•  {L}: same origin as {N}, rotated about z_N by a certain angle to avoid the singularity at the poles.
•  {B}: vehicle-attached (body) frame, with axes x_b, y_b, z_b.

Navigation & Mapping: 

Inertial Navigation Systems Navigation: Estimate the position, orientation and velocity of a vehicle Inertial Navigation Systems

•  Inertial sensors are used for navigation.
•  3 accelerometers are used for linear motion estimation.
•  3 gyroscopes are used for angular motion estimation.
•  The sensors are expensive and require accurate calibration.
•  The position estimate drifts over time.

Navigation & Mapping: 

Inertial Navigation Systems. Navigation: estimate the position, orientation and velocity of a vehicle. Inertial Navigation Systems measure acceleration & angular velocity and compute linear velocity, position and attitude.

Strapdown systems avoid moving parts using virtual gyro-stabilization techniques

Early INS were based on gyro-stabilized gimbaled platforms 

Navigation & Mapping: 

Inertial Navigation Systems (Strapdown). Navigation: estimate the position, orientation and velocity of a vehicle.

Specific force measured by the accelerometer triad:
$f^B_{IB} = \dfrac{F}{m} = a_{IB} - g_B$ (acceleration minus gravitation).

The angular rate $\omega^B_{IB}$ is measured by the gyro triad. Sagnac effect (1925): due to the rotation, the light path is longer clockwise than anticlockwise; the resulting phase delay $\Delta\phi \propto \Delta L / c$ is proportional to the rotation rate $\Omega$.

Navigation & Mapping: 

IMU vs. AHRS vs. INS Navigation: Estimate the position, orientation and velocity of a vehicle Inertial Navigation Systems (Strapdown)

Navigation & Mapping: 

Inertial Navigation Systems [Gade, 2004]

Navigation: Estimate the position, orientation and velocity of a vehicle Inertial Navigation Systems (Strapdown)

Block diagram: the IMU contains gyros, measuring the angular velocity $\omega^B_{IB}$, and accelerometers, measuring the specific force $f^B_{IB}$. The navigation equations integrate these measurements to obtain the attitude ($R_{LB}$, or roll/pitch/yaw), the velocity $v^L_{EB}$, the horizontal position ($R_{EL}$, or longitude/latitude) and the depth $z$. IMU + navigation equations = INS.

Navigation & Mapping: 

Inertial Navigation Equations [Gade, 2004]

Gyros measure $\omega^B_{IB}$; accelerometers measure $f^B_{IB}$.

Attitude:
$\dot{R}_{LB} = R_{LB}\,S(\omega^B_{IB}) - S(\omega^L_{IE} + \omega^L_{EL})\,R_{LB}$, with $R_{LB} = \int \dot{R}_{LB}\,dt$ and $\omega^L_{IE} = R_{LE}\,\omega^E_{IE}$.

Velocity:
$\dot{v}^L_{EB} = R_{LB}\,f^B_{IB} + g^L - (2\omega^L_{IE} + \omega^L_{EL}) \times v^L_{EB}$, with $v^L_{EB} = \int \dot{v}^L_{EB}\,dt$.
Assuming: spherical Earth, wander azimuth. Not included: vertical direction, gravity calculation.

Position (transport rate):
$\omega^L_{EL} = \dfrac{1}{r_{EB}}\, v^L_{EB} \times \hat{z}^L$ (horizontal components), and $\dot{R}_{EL} = R_{EL}\,S(\omega^L_{EL})$, with $R_{EL} = \int \dot{R}_{EL}\,dt$.

Kenneth Gade, FFI
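To make the structure of these equations concrete, here is a deliberately simplified strapdown integration sketch: it works in a local tangent frame treated as inertial, so Earth rotation, transport rate and the gravity model are neglected (unlike the Gade formulation above). All sample values are illustrative.

```python
# Much-simplified strapdown integration (local tangent frame, Earth rotation neglected).
import numpy as np


def skew(w):
    """Skew-symmetric matrix S(w) such that S(w) @ v == np.cross(w, v)."""
    wx, wy, wz = w
    return np.array([[0.0, -wz, wy],
                     [wz, 0.0, -wx],
                     [-wy, wx, 0.0]])


def strapdown_step(R, v, p, gyro, accel, dt, g=np.array([0.0, 0.0, 9.81])):
    """One step: gyro [rad/s] and accel (specific force, m/s^2) in the body frame."""
    R = R @ (np.eye(3) + skew(gyro) * dt)          # attitude:  R_dot = R S(w)
    u, _, vt = np.linalg.svd(R)                    # re-orthonormalise against drift
    R = u @ vt
    a = R @ accel + g                              # specific force -> acceleration (NED, g down)
    v = v + a * dt                                 # velocity integration
    p = p + v * dt                                 # position integration
    return R, v, p


# usage: start level and at rest, integrate a short stream of IMU samples
R, v, p = np.eye(3), np.zeros(3), np.zeros(3)
for _ in range(100):
    R, v, p = strapdown_step(R, v, p,
                             gyro=np.array([0.0, 0.0, 0.01]),
                             accel=np.array([0.1, 0.0, -9.81]), dt=0.01)
print(p)
```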

Navigation & Mapping: 

Doppler Based Navigation Navigation: Estimate the position, orientation and velocity of a vehicle Inertial Navigation Systems  DVL Based Navigation

•  A Doppler Velocity Log (DVL) is used to measure the velocity.
•  An AHRS is used to estimate the vehicle attitude.
•  Position is computed through dead reckoning.
•  The position estimate drifts over time, but it is more accurate than a pure INS (only one integration).

Navigation & Mapping: 

Doppler Based Navigation. Doppler effect: the change in frequency of a wave for an observer moving relative to the source of the wave. The received frequency is higher than the emitted frequency during the approach, identical at the instant of passing by, and lower during the recession.

Navigation & Mapping: 

Doppler Based Navigation. Doppler effect for a moving source.

Definitions: $v$: speed of the sound source; $c$: sound speed; $T$: period; $f_t$: source (transmitted) frequency; $f_r$: received frequency; $\lambda_r$: received wavelength.

Between the emission of the 1st and the 2nd wavefront:
$c\,t = v\,T + c\,(t - T) + \lambda_r \;\Rightarrow\; 0 = v\,T - c\,T + \lambda_r \;\Rightarrow\; \lambda_r = (c - v)\,T$

With $T = 1/f_t$ and $\lambda_r = c/f_r$:
$f_r = \left(\dfrac{c}{c - v}\right) f_t$

Navigation & Mapping: 

How the DVL works [Brokloff, OCEANS94]

1. Moving sound source & static detector: the acoustic projector emits and the seafloor acts as a hydrophone, receiving $f^{bott} = \dfrac{c}{c - v}\,f_t^{trans}$.

2. Moving detector & static sound source: the seafloor echoes $f^{bott}$ back and the moving vehicle receives $f_r^{trans} = \dfrac{c + v}{c}\,f^{bott}$.

Combining both, the two-way DVL Doppler shift is
$f_r^{trans} = \dfrac{c + v}{c - v}\,f_t^{trans} \;\Rightarrow\; \Delta f = f_r - f_t \approx \dfrac{2 v}{c}\,f_t$

Navigation & Mapping: 

How the DVL works [Brokloff, OCEANS94]

Robot velocity in the body frame. Each DVL beam $i$ has a known unit direction $b_i$ in the body frame, and the Doppler shift measured on beam $i$ gives the along-beam velocity $v_i = b_i^T v$. Stacking the beams gives an over-determined linear system $z = B\,v$, solved by least squares:
$\hat{v} = (B^T B)^{-1} B^T z$
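A minimal sketch of this least-squares step, assuming a hypothetical 4-beam Janus configuration tilted 30° from the vertical and illustrative Doppler-shift values; the along-beam velocities come from the two-way shift relation above.

```python
# Recover the body-frame velocity from per-beam Doppler shifts by least squares.
import numpy as np

c = 1500.0          # sound speed [m/s]
f_t = 600e3         # transmit frequency [Hz]
tilt = np.deg2rad(30.0)

# Unit beam directions in the body frame (forward, starboard, down), one row per beam.
B = np.array([[ np.sin(tilt), 0.0,           np.cos(tilt)],   # beam 1: forward
              [-np.sin(tilt), 0.0,           np.cos(tilt)],   # beam 2: aft
              [0.0,            np.sin(tilt), np.cos(tilt)],   # beam 3: starboard
              [0.0,           -np.sin(tilt), np.cos(tilt)]])  # beam 4: port

# Per-beam two-way Doppler shifts [Hz] (illustrative measurements).
df = np.array([320.0, -310.0, 8.0, -5.0])

# Along-beam velocity from the two-way Doppler shift: v_i ~= c * df_i / (2 f_t).
v_beam = c * df / (2.0 * f_t)

# Least-squares solution of B v = v_beam  ->  v = (B^T B)^{-1} B^T v_beam
v_body, *_ = np.linalg.lstsq(B, v_beam, rcond=None)
print("body velocity [m/s]:", v_body)
```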

Navigation & Mapping: 

Absolute Acoustic Positioning Navigation: Estimate the position, orientation and velocity of a vehicle Inertial Navigation Systems  DVL based Navigation Absolute Acoustic Positioning

Navigation & Mapping: 

LBL vs. GIB: Long Base Line vs. GPS Intelligent Buoys.

Figures: [Alcocer, 2009]

LBL:
•  4 beacons are needed for 3D navigation (3 if depth is known); transponders are fixed at the bottom.
•  Transponder positions need calibration.
•  Navigation is solved on the AUV; proximity to the vehicle allows for high update rate & high accuracy.

GIB:
•  Transponders are drifting buoys; their positions are obtained with GPS.
•  Navigation is solved at the surface.
•  The buoys are RF-connected, and the AUV sends its depth to the buoys.

Navigation & Mapping: 

A Simple Trilateration algorithm [Alcocer, 2009]

Range-Only Localization (Problem 2.2.1). Let $p \in \mathbb{R}^n$ be the position of a vehicle and $p_i \in \mathbb{R}^n$, $i \in \{1, \ldots, m\}$, the positions of a set of landmarks. Let $r_i = \|p - p_i\|$ denote the distance between the vehicle and landmark $i$, and define $r = [r_1, \ldots, r_m]^T$. Compute an estimate $\hat{p} \in \mathbb{R}^n$ of $p$ based on a set of range measurements $\bar{r} = r + w \in \mathbb{R}^m$, where $w \in \mathbb{R}^m$ is a zero-mean Gaussian disturbance vector with covariance $R \in \mathbb{R}^{m \times m}$.

•  The vehicle is assumed to be static.
•  Does not require knowledge about the vehicle dynamics.

Figure: [Alcocer, 2009] (landmarks $p_1$, $p_2$, $p_3$ and ranges $r_1$, $r_2$, $r_3$ in the inertial frame {I})

Navigation & Mapping: 

A Simple Trilateration algorithm [Alcocer, 2009]

Range-Only Localization: Unconstrained Least Squares (LS-U).

Landmark positions: $P = [\,p_1 \;\cdots\; p_m\,] \in \mathbb{R}^{n \times m}$.

Squared distance to landmark $i$:
$d_i = r_i^2 = \|p - p_i\|^2 = (p - p_i)^T (p - p_i) = p^T p - 2 p_i^T p + p_i^T p_i$

Stacking the $m$ equations:
$d = \begin{bmatrix} d_1 \\ \vdots \\ d_m \end{bmatrix} = \|p\|^2 \mathbf{1}_m - 2 P^T p + \delta(P^T P)$

where $\delta(\cdot)$ extracts the diagonal as a vector. With measured squared ranges $\bar{d} = d + \xi$:
$2 P^T p - \|p\|^2 \mathbf{1}_m = \delta(P^T P) - \bar{d} + \xi$

Now we should solve for $p$.

Navigation & Mapping: 

A Simple Trilateration algorithm [Alcocer, 2009]

Range-Only Localization: Unconstrained Least Squares (LS-U).

$2 P^T p - \|p\|^2 \mathbf{1}_m = \delta(P^T P) - \bar{d} + \xi$

We reorganise the equation as $A\theta = b + \xi$, with
$A = [\,2P^T \;\; -\mathbf{1}_m\,], \qquad \theta = \begin{bmatrix} p \\ \|p\|^2 \end{bmatrix}, \qquad b = \delta(P^T P) - \bar{d}$

and solve, neglecting the constraint between $p$ and $\|p\|^2$:
$\theta^* = \arg\min_{\theta \in \mathbb{R}^{n+1}} \|A\theta - b\|^2 = (A^T A)^{-1} A^T b$
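A short numerical sketch of the LS-U solution above, with illustrative (non-coplanar) landmark positions and noisy ranges:

```python
# Unconstrained least-squares (LS-U) trilateration sketch.
import numpy as np

# Landmark positions as columns of P (n x m); values are illustrative.
P = np.array([[  0.0, 100.0,   0.0, 100.0,  50.0],
              [  0.0,   0.0, 100.0, 100.0,  50.0],
              [  0.0, -10.0,  -5.0, -40.0, -20.0]])
p_true = np.array([30.0, 60.0, -20.0])

# Noisy range measurements r_bar = ||p - p_i|| + w.
rng = np.random.default_rng(0)
r_bar = np.linalg.norm(P - p_true[:, None], axis=0) + rng.normal(0.0, 0.1, P.shape[1])
d_bar = r_bar ** 2

# Build A theta = b with theta = [p; ||p||^2].
m = P.shape[1]
A = np.hstack([2.0 * P.T, -np.ones((m, 1))])
b = np.einsum('ij,ij->j', P, P) - d_bar          # delta(P^T P) - d_bar

theta, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", theta[:3], " true:", p_true)
```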

Navigation & Mapping: 

SBL vs USBL Short Base Line vs Ultra Short Base Line 

Figures: [Alcocer, 2009]

Common to both: navigation is solved on the boat; calibration is done at the factory; a shorter baseline means less accuracy, so accuracy degrades in very deep water.

SBL: baseline reduced to a few metres; fixed infrastructure on the ship, which does not allow for ships of opportunity.

USBL: baseline reduced to a few centimetres; may be mounted easily on ships of opportunity.

Navigation & Mapping: 

Acoustic Wave Transmission [P. Milne, 1983]

Spherical wave vs. planar wave approximation: the planar wave approximation holds when the range is much larger than the baseline (range >> baseline).

Navigation & Mapping: 

Transponders vs. Beacons (pingers) [P. Milne, 1983]

Projector: converts an electrical signal into an acoustic wave. Hydrophone: converts an acoustic wave into an electrical signal. Beacon: acoustic device that uses a projector to generate a periodic acoustic pulse. Transponder: acoustic device that uses a projector to generate a periodic acoustic pulse and then a hydrophone to detect the corresponding echo.

Beacon Mode

Transponder Mode

Navigation & Mapping: 

How SBL works [P. Milne, 1983]

Beacon-based Short Base Line 

With hydrophones separated by baselines $b_x$ and $b_y$, the time differences of arrival give the direction to the beacon:
$\Delta t_{3-1} = t_3 - t_1, \qquad \Delta R_{3-1} = R_3 - R_1 = c\,\Delta t_{3-1}, \qquad c\,\Delta t_{3-1} = \sin\theta_x \cdot b_x$

Given that $Z_a = D\cos\theta_x$ and $X_a = D\sin\theta_x$ (so $\frac{Z_a}{\cos\theta_x} = \frac{X_a}{\sin\theta_x}$):
$X_a = Z_a \tan\theta_x$, and likewise $Y_a = Z_a \tan\theta_y$.

When the beacon is near the vertical of the vessel ($\theta \approx 0$):
$X_a = \dfrac{Z_a\, c\, \Delta t_{3-1}}{b_x}, \qquad Y_a = \dfrac{Z_a\, c\, \Delta t_{2-1}}{b_y}$

Navigation & Mapping: 

How SBL works [P. Milne, 1983]

Transponder-based Short Base Line. Four hydrophones H1..H4 are mounted at $(\pm a, \pm b)$ on the vessel; $R_1 \ldots R_4$ are the measured slant ranges to the transponder at $(X_a, Y_a, Z_a)$:

$R_1^2 = (X_a - a)^2 + (Y_a + b)^2 + Z_a^2$
$R_2^2 = (X_a - a)^2 + (Y_a - b)^2 + Z_a^2$
$R_3^2 = (X_a + a)^2 + (Y_a + b)^2 + Z_a^2$
$R_4^2 = (X_a + a)^2 + (Y_a - b)^2 + Z_a^2$

Subtracting pairs of equations:
$R_3^2 - R_1^2 = 4 a X_a$ and $R_4^2 - R_2^2 = 4 a X_a \;\Rightarrow\; X_a = \dfrac{R_3^2 - R_1^2 + R_4^2 - R_2^2}{8a}$
$R_1^2 - R_2^2 = 4 b Y_a$ and $R_3^2 - R_4^2 = 4 b Y_a \;\Rightarrow\; Y_a = \dfrac{R_1^2 - R_2^2 + R_3^2 - R_4^2}{8b}$

Finally, $Z_a$ is obtained by averaging the four consistent estimates:
$Z_a = \Big[ \sqrt{R_1^2 - (X_a - a)^2 - (Y_a + b)^2} + \sqrt{R_2^2 - (X_a - a)^2 - (Y_a - b)^2} + \sqrt{R_3^2 - (X_a + a)^2 - (Y_a + b)^2} + \sqrt{R_4^2 - (X_a + a)^2 - (Y_a - b)^2} \Big] / 4$
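A small worked sketch of these SBL equations with illustrative baselines and ranges (the range values were generated from a transponder at roughly X = 10 m, Y = 5 m, Z = 50 m):

```python
# Transponder-based SBL position from four slant ranges (illustrative values).
import math

a, b = 2.0, 1.5                              # half-baselines [m]
R1, R2, R3, R4 = 51.05, 50.76, 51.83, 51.54  # measured ranges [m]

Xa = (R3**2 - R1**2 + R4**2 - R2**2) / (8.0 * a)
Ya = (R1**2 - R2**2 + R3**2 - R4**2) / (8.0 * b)

# Depth: average the four consistent estimates of Za.
Za = (math.sqrt(R1**2 - (Xa - a)**2 - (Ya + b)**2) +
      math.sqrt(R2**2 - (Xa - a)**2 - (Ya - b)**2) +
      math.sqrt(R3**2 - (Xa + a)**2 - (Ya + b)**2) +
      math.sqrt(R4**2 - (Xa + a)**2 - (Ya - b)**2)) / 4.0

print(f"Xa={Xa:.2f} m  Ya={Ya:.2f} m  Za={Za:.2f} m")
```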

Navigation & Mapping: 

How USBL works [P. Milne, 1983]

Transponder-Based Ultra Short Base Line. Two hydrophones separated by a baseline $b$ receive the same wavefront with a time delay $\Delta t$ (equivalently, a phase delay $\Delta\phi$), which depends on the incidence angle $\theta$:

$\Delta t = \dfrac{b\cos\theta}{c}, \qquad \omega = \dfrac{\Delta\phi}{\Delta t} \;\Rightarrow\; \Delta t = \dfrac{\Delta\phi}{\omega}$

$\Delta\phi = 2\pi f \left(\dfrac{b}{c}\right) \cos\theta \;\Rightarrow\; \theta = \cos^{-1}\!\left(\dfrac{\Delta\phi\, c}{2\pi f\, b}\right)$

where $b$: baseline, $f$: frequency, $c$: sound speed.

Navigation & Mapping: 

How USBL works [P. Milne, 1983]

Ultra Short Base Line (boat transceiver + vehicle transponder). The two-way travel time $t$ gives the range, and the incidence angles give the direction cosines:

$R = \dfrac{c\,t}{2}, \qquad X_a = R\cos\theta_x, \qquad Y_a = R\cos\theta_y$

$R^2 = X_a^2 + Y_a^2 + Z_a^2 \;\Rightarrow\; Z_a = R\sqrt{1 - \cos^2\theta_x - \cos^2\theta_y}$
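A small sketch of the USBL computation with illustrative phase delays, baseline and travel time:

```python
# USBL position from phase delays on two orthogonal hydrophone pairs (illustrative values).
import math

c = 1500.0        # sound speed [m/s]
f = 20e3          # carrier frequency [Hz]
b = 0.10          # hydrophone baseline [m]
t = 0.40          # two-way travel time [s]
dphi_x = 1.2      # phase delay on the x-axis pair [rad]
dphi_y = 2.0      # phase delay on the y-axis pair [rad]

theta_x = math.acos(dphi_x * c / (2.0 * math.pi * f * b))
theta_y = math.acos(dphi_y * c / (2.0 * math.pi * f * b))

R = c * t / 2.0
Xa = R * math.cos(theta_x)
Ya = R * math.cos(theta_y)
Za = R * math.sqrt(1.0 - math.cos(theta_x)**2 - math.cos(theta_y)**2)
print(f"R={R:.1f} m  Xa={Xa:.1f}  Ya={Ya:.1f}  Za={Za:.1f}")
```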

Navigation & Mapping: 

Map-based Navigation Navigation: Estimate the position, orientation and velocity of a vehicle Inertial Navigation Systems  DVL based Navigation Absolute acoustic Positioning Map–based navigation

•  Ground-fixed localization given an a priori map of the environment.
•  Bathymetric, gravitational anomaly and magnetic field maps.
•  An up-to-date map of sufficient resolution is not always available.
•  The map may, or may not, be known a priori.

Mapping & Navigation

Non Structured

Structured

Environment

Mapping & Navigation

Sensing The World: Acoustic Imaging The operation of a MSIS 

Mapping & Navigation

Sensing The World: Acoustic Imaging The operation of a MSIS 

Beam Bin

PhD Thesis: Underwater SLAM for Structured Environments Using an Imaging Sonar

Understanding MSIS. Particularities of the MSIS. Indeterminacy in the vertical position of the target:

Acoustic beam reflected by a wall:

Mapping & Navigation

Sensing The World: Acoustic Imaging. Particularities of the MSIS. Indeterminacy in the vertical position of the target:

Reflections of the acoustic beam:

Mapping & Navigation

Multibeam vs. Mechanical Scanning Motion Distortion Multibeam scanner

Full scan in time step k

Mechanical Scanned Imaging Sonar

Full scan in m time steps {k → k+m}

Mapping & Navigation

MSIS: Compensating for motion-distortions Motion Distortion

Distorted Data

Corrected Data
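A toy sketch of the correction idea: each beam is re-projected into a fixed frame using the dead-reckoning pose at the instant it was acquired. Poses, bearings and ranges are illustrative; the real pipeline interpolates navigation data per beam.

```python
# Re-project each MSIS beam into the world frame using the per-beam vehicle pose.
import numpy as np


def beam_to_world(pose, bearing, rng):
    """Project one (bearing, range) sonar return, taken from pose = (x, y, yaw), into the world frame."""
    x, y, yaw = pose
    ang = yaw + bearing
    return np.array([x + rng * np.cos(ang), y + rng * np.sin(ang)])


# One beam per time step; the vehicle moves while the sonar head rotates.
poses    = [(0.05 * k, 0.0, 0.0) for k in range(200)]      # dead-reckoning poses
bearings = [np.deg2rad(1.8 * k) for k in range(200)]       # 1.8 deg per step -> full 360-deg scan
ranges   = [10.0 for _ in range(200)]                      # a wall-like return at 10 m

corrected = np.array([beam_to_world(p, b, r)
                      for p, b, r in zip(poses, bearings, ranges)])
print(corrected[:3])
```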

Mapping & Navigation

Sensing The World: Acoustic Imaging The operation of a Side Scan Sonar 

•  The most widely used underwater imaging method.
•  Uses beams of sound transmitted from a towfish.
•  The beams are narrow (1-2°) in the along-track direction and wide (40-50°) in the vertical direction.
•  The towfish carries port and starboard transducers.
•  Sound is reflected back from objects to the towfish; hard objects reflect more energy than soft ones.
•  Projected shadows are observed behind objects.
•  The image is built up one line of data at a time.

Mapping & Navigation

Sensing The World: Range & Bearing The operation of a Multibeam Sonar Profiler 

•  Range & bearing sensor; multiple beams are grabbed simultaneously.
•  May be operated from a ship or from an AUV.
•  The higher the altitude, the lower the resolution.
•  The higher the frequency, the higher the resolution but the lower the range.

Mapping & Navigation

Terrain Based Navigation Where am I?

Mapping & Navigation

Terrain Based Navigation. I could be anywhere!

Mapping & Navigation

Terrain Based Navigation

What do I see?

Mapping & Navigation

Terrain Based Navigation

Do I have a Map?

Mapping & Navigation

Terrain Based Navigation. Let me check where I am!

Mapping & Navigation

Terrain Based Navigation

This is what I expect to see.

This is what I have seen! I’M NOT HERE

Mapping & Navigation

Terrain Based Navigation

This is what I expect to see.

This is what I have seen! BOTH AGREE!

Mapping & Navigation

Terrain Based Navigation

SO I’M HERE

Mapping & Navigation

Simultaneous Localization And Mapping

Localization through SLAM (step k → k+1): internal sensors drive the PREDICTION step; external sensors provide the MEASUREMENT; DATA ASSOCIATION matches the measurements against the map; and the UPDATE step corrects the estimated pose and map.



Experimental Results

Roadmap:
1. Abandoned Marina Data-Set
2. Dead Reckoning
3. Scan Matching Improved Dead Reckoning
4. USBL Navigation
5. Map-based Navigation
6. Underwater SLAM

Experimental Results

Abandoned Marina Data-set: Fluvià Nàutic, St Pere Pescador (Spain)

http://cres.usc.edu/radishrepository/view-one.php?name=abandoned_marina

600 [m] trajectory

DVL, MRU, Depth

DGPS Ground Truth

MSIS data



Experimental Results

Dead Reckoning:
•  Constant-velocity kinematic model (EKF prediction)
•  DVL, MEMS-AHRS and depth updates (EKF correction)
•  Drifts over time

Block diagram: the DVL, MRU and pressure sensor feed the correction step of the vehicle EKF.
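A minimal sketch of such a dead-reckoning EKF (constant-velocity prediction, DVL velocity and depth corrections). The state layout, noise values and measurement stream are illustrative, not the actual ICTINEU filter.

```python
# Constant-velocity EKF dead reckoning with DVL and depth updates.
import numpy as np

dt = 0.1
# State: [x, y, z, vx, vy, vz]
F = np.eye(6)
F[0, 3] = F[1, 4] = F[2, 5] = dt                      # constant-velocity model
Q = np.diag([1e-4] * 3 + [1e-2] * 3)                  # process noise

H_dvl   = np.hstack([np.zeros((3, 3)), np.eye(3)])    # DVL measures [vx, vy, vz]
R_dvl   = np.eye(3) * 0.01
H_depth = np.array([[0, 0, 1, 0, 0, 0.]])             # pressure sensor measures z
R_depth = np.array([[0.05]])

x, P = np.zeros(6), np.eye(6)


def predict(x, P):
    return F @ x, F @ P @ F.T + Q


def update(x, P, z, H, R):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P


for _ in range(100):                                   # illustrative measurement stream
    x, P = predict(x, P)
    x, P = update(x, P, np.array([0.5, 0.0, 0.0]), H_dvl, R_dvl)
    x, P = update(x, P, np.array([2.0]), H_depth, R_depth)
print(x[:3])                                           # position drifts without absolute fixes
```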



Experimental Results

Scan Matching Improved Dead Reckoning [E. Hernandez et al. IROS09]

Compute the relative displacement of the vehicle between two consecutive configurations by maximizing the overlap between range & bearing scans. Minimization: least squares over the Mahalanobis distance.

Scan matching (pIC) loop, starting from the dead-reckoning estimate q̂_k and returning the matched estimate q̂_pIC:
do {
  •  Association
  •  Minimization
} while (!convergence)
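For intuition, a plain point-to-point ICP sketch of the association/minimisation loop in 2D; the pIC algorithm of [E. Hernandez et al. IROS09] additionally propagates scan and pose uncertainties and weights associations by their Mahalanobis distance, which is omitted here. The scans are synthetic.

```python
# Point-to-point ICP sketch of the association/minimisation loop (2D).
import numpy as np


def icp_2d(ref, new, iters=30):
    """Estimate the rigid transform (R, t) aligning `new` onto `ref`; both are (N, 2) arrays."""
    R, t = np.eye(2), np.zeros(2)
    for _ in range(iters):
        moved = new @ R.T + t
        # association: brute-force nearest neighbours in the reference scan
        d2 = ((moved[:, None, :] - ref[None, :, :]) ** 2).sum(-1)
        matched = ref[d2.argmin(axis=1)]
        # minimisation: closed-form least-squares rigid transform (Procrustes)
        mu_m, mu_n = matched.mean(0), new.mean(0)
        H = (new - mu_n).T @ (matched - mu_m)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:            # enforce a proper rotation (no reflection)
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_m - R @ mu_n
    return R, t


# usage: an L-shaped scan rotated by 5 degrees and shifted
wall1 = np.column_stack([np.linspace(0, 10, 80), np.zeros(80)])
wall2 = np.column_stack([np.zeros(80), np.linspace(0, 6, 80)])
ref = np.vstack([wall1, wall2])
ang = np.deg2rad(5.0)
Rt = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
new = (ref - [0.3, 0.1]) @ Rt.T
R, t = icp_2d(ref, new)
print(np.rad2deg(np.arctan2(R[1, 0], R[0, 0])), t)
```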


Experimental Results

Scan Matching Improved Dead Reckoning [E. Hernandez et al. IROS09]

Probabilistic Scan Matching (Dead Reckoning + SM):
•  Mechanically Scanned Imaging Sonar, polar range & bearing scans
•  Deals with motion-induced distortion
•  Drifts less, but still drifts



Experimental Results

USBL Navigation

Block diagram: the vehicle EKF prediction is corrected with DVL, MRU and pressure measurements, plus USBL fixes; the USBL transceiver on the surface vessel is referenced with DGPS and an MRU.


Experimental Results

USBL Navigation with two filters. Block diagram: a second EKF (USBL system) fuses DGPS and MRU measurements to track the transceiver; its output, together with DVL, MRU and pressure, corrects the vehicle EKF.



Experimental Results

TBN: Voting Grid-based Localization [D. Ribas et al., ICRA07]

The voting procedure (polar image: bearing vs. distance ρ, North-referenced). The high-intensity bins are likely to correspond to the tank boundaries, and the vehicle can only exist within the boundaries of the tank.


Experimental Results

TBN: Voting Grid-based Localization [D. Ribas et al., ICRA07]

Figure: number of votes accumulated over the X-Y grid. The vehicle's X-Y position is determined by the cell with the largest number of votes; the Z position is obtained from the pressure sensor.
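A toy sketch of the voting idea on a synthetic rectangular map: every candidate X-Y cell is scored by how many of the (heading-compensated) range/bearing returns land on an occupied map cell, and the best-scoring cell is taken as the position, which is equivalent to letting each return vote for the positions consistent with it. Map, scan and resolution are illustrative.

```python
# Grid-voting localization sketch against an a priori boundary map.
import numpy as np

# A priori map: 1 = boundary (e.g. tank/marina wall), 0 = free. 1 m resolution.
grid = np.zeros((40, 40), dtype=int)
grid[0, :] = grid[-1, :] = grid[:, 0] = grid[:, -1] = 1

true_pos = np.array([12.0, 25.0])
bearings = np.linspace(0, 2 * np.pi, 72, endpoint=False)


def range_to_wall(pos, ang, grid, max_r=60.0, step=0.25):
    """Cast a ray until it hits an occupied cell (simple ray marching)."""
    for r in np.arange(step, max_r, step):
        x, y = pos + r * np.array([np.cos(ang), np.sin(ang)])
        if 0 <= int(x) < grid.shape[0] and 0 <= int(y) < grid.shape[1] and grid[int(x), int(y)]:
            return r
    return None


# Simulated high-intensity returns (range, bearing) with heading already compensated.
scan = [(range_to_wall(true_pos, b, grid), b) for b in bearings]
scan = [(r, b) for r, b in scan if r is not None]

# Voting: score every candidate cell inside the map.
votes = np.zeros_like(grid, dtype=int)
for i in range(1, grid.shape[0] - 1):
    for j in range(1, grid.shape[1] - 1):
        for r, b in scan:
            x = int(i + r * np.cos(b))
            y = int(j + r * np.sin(b))
            if 0 <= x < grid.shape[0] and 0 <= y < grid.shape[1] and grid[x, y]:
                votes[i, j] += 1

print("estimated cell:", np.unravel_index(votes.argmax(), votes.shape), "true:", true_pos)
```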


Experimental Results

TBN: Voting Grid-based Localization [D. Ribas et al., ICRA07]

Managing compass errors. Compass data may be perturbed when operating close to ferromagnetic structures. An erroneous estimate of the angle between the vehicle and the map disperses the votes and hence degrades the vehicle position estimate. Correct compass measurement:

Perturbed compass measurement


Experimental Results

TBN: Voting Grid-based Localization [D. Ribas et al., ICRA07]

Managing compass errors

Figures: vote distribution over the X-Y grid for a perturbed compass measurement vs. a correct compass measurement.


Experimental Results

TBN: Voting Grid-based Localization [D. Ribas et al., ICRA07]

Managing compass errors


Experimental Results

TBN: Voting Grid-based Localization [D. Ribas et al., ICRA07]

Voting-based localization: SAUC-E 2006 final run (X-Y trajectory).
•  Uses range & bearing scans
•  Deals with the magnetic disturbance of the compass
•  Does not drift
•  Requires an a priori map

Experimental Results

TBN: Merged Grid Localization and Dead Reckoning [De Marina et al. CAMS 2007]
•  Constant-velocity kinematic model
•  DVL, MRU and depth updates
•  Absolute X-Y position updates from the grid localization
•  Does not drift
•  Requires an a priori map

Block diagram: the vehicle EKF prediction is corrected with DVL, MRU, pressure and the grid-localization position fix.


Experimental Results

TBN: Monte Carlo Localization (PF) [F. Maurelli et al., MCMC09]

Monte Carlo Localization:
•  100 particles
•  Motion model: dead reckoning
•  Measurement model: ScanGrabbing + probabilistic average
•  Resampling: SIR + a random set of particles
•  Does not drift
•  Structured/non-structured environments
•  Requires an a priori map
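A skeleton of such a Monte Carlo localization filter (motion update, measurement weighting, SIR resampling with a few random particles injected). The measurement model here simply compares predicted and measured range vectors; the predicted ranges would in practice come from ray-casting each particle in the a priori map, and all parameters are illustrative.

```python
# Monte Carlo localization skeleton: motion update, weighting, SIR resampling.
import numpy as np

rng = np.random.default_rng(0)
N = 100                                              # number of particles

# Particles: [x, y, yaw]; start spread uniformly over a 40 x 40 m map.
particles = np.column_stack([rng.uniform(0, 40, N),
                             rng.uniform(0, 40, N),
                             rng.uniform(-np.pi, np.pi, N)])
weights = np.full(N, 1.0 / N)


def motion_update(particles, v, w, dt):
    """Dead-reckoning motion model with additive noise."""
    x, y, yaw = particles.T
    yaw = yaw + w * dt + rng.normal(0, 0.01, len(particles))
    x = x + v * dt * np.cos(yaw) + rng.normal(0, 0.05, len(particles))
    y = y + v * dt * np.sin(yaw) + rng.normal(0, 0.05, len(particles))
    return np.column_stack([x, y, yaw])


def measurement_update(weights, predicted, measured, sigma=1.0):
    """Weight each particle by the likelihood of the measured ranges given its prediction."""
    err = np.linalg.norm(predicted - measured, axis=1)
    weights = weights * np.exp(-0.5 * (err / sigma) ** 2) + 1e-300
    return weights / weights.sum()


def resample(particles, weights, n_random=5):
    """SIR resampling plus a few random particles to recover from divergence."""
    idx = rng.choice(len(particles), size=len(particles) - n_random, p=weights)
    random_part = np.column_stack([rng.uniform(0, 40, n_random),
                                   rng.uniform(0, 40, n_random),
                                   rng.uniform(-np.pi, np.pi, n_random)])
    new = np.vstack([particles[idx], random_part])
    return new, np.full(len(new), 1.0 / len(new))


# one filter iteration (predicted_ranges from map ray-casting, measured_ranges from the sonar):
# particles = motion_update(particles, v=0.5, w=0.0, dt=0.1)
# weights = measurement_update(weights, predicted_ranges, measured_ranges)
# particles, weights = resample(particles, weights)
```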



Experimental Results

SLAM: Structured Environment [Ribas et al. JFR08]

Feature-based EKF SLAM:
•  EKF SLAM with line features
•  Line features extracted from MSIS data
•  Line uncertainty derived from the acoustic imprint
•  DVL+MRU dead reckoning
•  Statistically dependent local maps
•  Does not drift
•  Does not require an a priori map
•  Deals only with structured environments

99

Experimental Results

SLAM: Non structured Environment [A. Mallios et al. IROS10]

Pose-based SLAM:
•  ASEKF SLAM with scan matching
•  State = trajectory of scan poses
•  Non-contiguous scan matching for loop closing
•  Works as a network of constraints
•  Does not drift
•  Does not require an a priori map
•  Structured/non-structured environments

100

Experimental Results

Generation of photomosaics [Garcia et al. IROS01]

Registration of consecutive images


Experimental Results

Generation of photomosaics [Garcia et al. IROS01]

Registration of consecutive images

SURF method [Bay, 2006]

•  A Hessian detector identifies individual features.
•  Feature description (gradient information at particular orientations and spatial frequencies).
•  Matching (Euclidean distance between descriptors).
•  Outlier rejection (RANSAC [Fischler, 1981]).
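A sketch of this pairwise registration step using OpenCV. ORB features are used here as a freely available stand-in for SURF (which is non-free in recent OpenCV builds); the matching and RANSAC outlier-rejection structure is the same. The file names in the usage comment are hypothetical.

```python
# Pairwise image registration: local features + matching + RANSAC homography.
import cv2
import numpy as np


def register_pair(img1, img2, min_matches=10):
    """Estimate the homography mapping img2 onto img1 from matched local features."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img1, None)           # detection + description
    k2, d2 = orb.detectAndCompute(img2, None)
    if d1 is None or d2 is None:
        return None

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None

    pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches])

    # Outlier rejection and motion estimation with RANSAC.
    H, inliers = cv2.findHomography(pts2, pts1, cv2.RANSAC, 3.0)
    return H


# usage (hypothetical file names):
# img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
# img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
# print(register_pair(img1, img2))
```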

Experimental Results

Generation of photomosaics [Garcia et al. IROS01]

Registration of consecutive images

Registration of nonconsecutive images

Detection of nonconsecutive overlapping images

Motion estimation from consecutive images. Vehicle navigation data (when available).

Experimental Results

Generation of photomosaics [Garcia et al. ICRA02]

Registration of consecutive images

Registration of nonconsecutive images

Detection of nonconsecutive overlapping images

Motion estimation from consecutive images. Vehicle navigation data (when available).

Experimental Results

Generation of photomosaics [Ferrer et al. OCEANS 07]

Registration of consecutive images

Registration of nonconsecutive images

Global alignment

Small errors that occur during image registration cause misalignment. The image pairings are used as an input for the global alignment. Nonlinear optimization (bundle adjustment) that minimizes a cost function [Ferrer et al., 2007].

Experimental Results

Generation of photomosaics [Ferrer et al. OCEANS 07]

Registration of consecutive images

Registration of nonconsecutive images

Global alignment

Small errors that occur during image registration cause misalignment. The image pairings are used as an input for the global alignment. Nonlinear optimization (bundle adjustment) that minimizes a cost function [Ferrer et al., 2007].

Experimental Results

Generation of photomosaics [Ferrer et al. OCEANS 07]

Registration of consecutive images

Registration of nonconsecutive images

Global alignment

Crossover Detection & Optimization

The mosaic alignment is improved through several iterations of crossover detection and optimization. Iterations are repeated until no new crossovers are detected.


Experimental Results

Generation of photomosaics [Ferrer et al. OCEANS 07]

Registration of consecutive images

Registration of nonconsecutive images

Global alignment

Crossover Detection & Optimization

Blending


Experimental Results

Generation of photomosaics [Ferrer et al. OCEANS 07]

Registration of consecutive images

Registration of nonconsecutive images

Global alignment

Crossover Detection & Optimization

Blending


Experimental Results

Generation of photomosaics Experimental Setup


Introduction Applications ICTINEUAUV, a research testbed Navigation & Mapping Conclusion Future Work


Experimental Results

Conclusions
1. ICTINEUAUV has become a research platform for navigation & mapping.
2. Probabilistic map-based navigation techniques have been tested (GL, MCL, EKF, feature-based SLAM, SM pose-based SLAM, photo-mosaicing).
3. Applications: dam inspection, marine science surveys, archaeology.

Introduction Applications ICTINEUAUV, a research testbed Navigation & Mapping Conclusion Future Work




The team




References

[Palomeras et al. MCMC09] — N. Palomeras, P. Ridao, M. Carreras and C. Silvestre. Mission Control System for an Autonomous Vehicle: Application Study of a Dam Inspection using an AUV. 8th IFAC International Conference on Manoeuvring and Control of Marine Craft, Guarujá, Brazil, September 2009.

[E. Hernandez et al. IROS09] — E. Hernandez, P. Ridao, D. Ribas and A. Mallios. Probabilistic Sonar Scan Matching for an AUV. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2009.

[D. Ribas et al., ICRA07] — D. Ribas, N. Palomeras, P. Ridao, M. Carreras and E. Hernàndez. Ictineu AUV wins the first SAUC-E competition. IEEE International Conference on Robotics and Automation (ICRA), Rome, Italy, April 2007.

[De Marina et al. CAMS 2007] — G. García de Marina, D. Ribas and P. Ridao. A global localization system for structured environments using an imaging sonar. IFAC Conference on Control in Marine Systems, Bol, Croatia, September 2007.

[F. Maurelli et al., MCMC09] — F. Maurelli, S. Krupinski, A. Mallios, Y. Petillot and P. Ridao. Sonar-based AUV localization using an improved particle filter algorithm. OCEANS 2009 IEEE, Bremen, Germany.

[Ribas et al. JFR08] — D. Ribas, P. Ridao, J.D. Tardós and J. Neira. Underwater SLAM in Man Made Structured Environments. Journal of Field Robotics, 25(11-12):898–921, November-December 2008.

[A. Mallios et al. OCEANS 2009] — A. Mallios, P. Ridao, E. Hernàndez, D. Ribas, F. Maurelli and Y. Petillot. Pose-based SLAM with probabilistic scan matching algorithm using a mechanical scanned imaging sonar. OCEANS 2009 IEEE, Bremen, Germany, May 2009.

[Garcia et al. IROS01] — R. Garcia, X. Cufi and M. Carreras. Estimating the motion of an underwater robot from a monocular image sequence. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), vol. 3, pp. 1682-1687, Maui, Hawaii, 2001.

[Garcia et al. ICRA02] — R. Garcia, J. Puig, P. Ridao and X. Cufi. Augmented State Kalman Filtering for AUV Navigation. IEEE International Conference on Robotics and Automation (ICRA), pp. 4010-4015, Washington, 2002.

[Ferrer et al. OCEANS 07] — J. Ferrer, A. Elibol, O. Delaunoy, N. Gracias and R. Garcia. Large-Area Photo-Mosaics Using Global Alignment and Navigation Data. MTS/IEEE OCEANS, Vancouver, Canada, November 2007.

[P. Ridao et al., JFR10] — P. Ridao, M. Carreras, D. Ribas and R. Garcia. Visual inspection of hydroelectric dams using an autonomous underwater vehicle. Journal of Field Robotics, Special Issue: State of the Art in Maritime Autonomous Surface and Underwater Vehicles, Part 1, 27(6):759–778, November/December 2010.

[Escartin et al., GGG08] — J. Escartin, R. Garcia, O. Delaunoy, N. Gracias, A. Elibol, X. Cufí, L. Neumann, D.J. Fornari, S.E. Humphris and J. Renard. Globally aligned photomosaic of the Lucky Strike hydrothermal vent field (Mid-Atlantic Ridge, 37º18.5'N): release of georeferenced data, mosaic construction, and viewing software. Geochemistry, Geophysics, Geosystems, 9(12), 2008.

[Nicosevici et al. OCEANS08] — T. Nicosevici and R. Garcia. On-line robust 3D mapping using structure from motion cues. MTS/IEEE Techno-Ocean Conference (OCEANS'08), Kobe, April 2008. ISBN 978-1-4244-2125-1. DOI: 10.1109/OCEANSKOBE.2008.4531022.

[P. H. Milne 1983] — P.H. Milne. Underwater Acoustic Positioning Systems. 1983. ISBN 0-87201-012-0.

[Ridao et al. WPDC 2010] — P. Ridao, M. Carreras, R. García, D. Ribas and J. Batlle. Advancing underwater robotics. Water Power & Dam Construction, 62(5):40–43, May 2010.

[Alcocer, 2009] — A. Alcocer Peñas. Positioning and Navigation Systems for Robotic Underwater Vehicles. PhD thesis, 2009.
