
TECHNICAL DESIGN NOTE

A low-cost automated focusing system for time-lapse microscopy

E F Wright1, D M Wells1,2, A P French2, C Howells2 and N M Everitt1,2

University of Nottingham, UK

1 School of Mechanical, Materials and Manufacturing Engineering
2 Centre for Plant Integrative Biology, School of Biosciences

E-mail: [email protected]

Abstract

We present a flexible, low-cost system for maintaining image focus during time-lapse and video microscopy, where focus drift over time can be problematic. The system comprises a stepper motor controlled by software which maintains focus in a closed-loop feedback arrangement, using an image analysis approach which quantifies the amount of detail in the image. The focusing attachment is not microscope-specific and comprises components totalling less than 200 GBP.


Keywords


Time-lapse microscopy, autofocus, Laplacian filters


This article features online multimedia enhancements


1. Introduction


Microscopic analyses of dynamic biological processes such as seedling germination and plant root growth require long-term observations, often over several days. This is often achieved using time-lapse image capture to record digital images at fixed intervals. Such image sequences can be confounded by loss of focus due to microscope stage drift and sample movement, requiring constant monitoring and adjustment by the user. Commercial solutions are available but are expensive and usually bespoke, limiting their utility. This paper presents a method to aid the imaging of time-lapse and video microscopy sequences by automatically maintaining the image in sharp focus. We have developed a flexible, low-cost system that allows automated maintenance of focus across a range of laboratory microscopes, as an intrinsic part of the image capture process required for time-lapse data acquisition.


2. Determination of Focus


Determining the clarity of focus in images is a much-researched area [e.g. 1, 2, 3]. The ability to capture a sharp, in-focus image is a common requirement, and techniques developed for focusing camera images can be used to quantify the clarity of focus of microscope images. This value can then be used in a closed-loop feedback arrangement to bring the image into optimal focus by controlling the microscope focus mechanism.

The approach described is not specific to a particular model of camera, microscope or focusing motor, and the method for quantifying the level of focus will work with any suitable digital image. The method is based on calculating a Laplace operator across the image [4], a measure of its second spatial derivative. It is fast to compute and sufficiently accurate for our needs; however, any other focus metric may be used instead if desired. High outputs from the filter indicate areas of large intensity change, normally indicating an edge within the image. For any scene, the highest-frequency components of the image, i.e. the edges of a subject, should be most prominent when that subject is optimally focused. As the subject drifts out of focus, the edges blur, producing lower second derivatives across the image. Therefore, the Laplacian filter should produce its largest output when the image as a whole is in optimal focus. In practice, the function may be multi-modal, as artefacts such as dust or optical effects may cause peaks outside the target plane of focus. However, because the system is initialized on a focus peak and updates frequently, we assume that the focus never drifts far from the target focus plane, so the target peak is the nearest one for the search to find.

The image is grabbed from the camera using an IEEE 1394 FireWire connection and a software development kit supplied by the camera manufacturer (http://www.alliedvisiontec.com/avt-products/software.html). The Laplacian is calculated using functions from the OpenCV open source computer vision library (http://sourceforge.net/projects/opencvlibrary). The system accumulates the output of the Laplacian filter across the image, and the total accumulated result should be maximal when the image is in focus. The object of interest may drift in and out of the focal plane in different areas of the image; we can therefore select a region of interest over which to optimise focus, rather than using the whole image. Figure 1 shows the variation of the Laplacian filter-derived output (hereafter referred to as the "focus metric") derived from images at various levels of focus, achieved by moving the microscope stage fixed distances from the fully-focused position.
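As an illustration, the sketch below shows how such a focus metric could be computed using OpenCV's Python bindings. The function name and the choice of summing the absolute Laplacian response over an optional region of interest are our own assumptions for illustration rather than the exact implementation used in the system.

import cv2
import numpy as np

def focus_metric(image, roi=None):
    """Accumulate the absolute Laplacian response over an image (or ROI).

    A sketch of the focus measure described above: higher values indicate
    sharper edges and therefore better focus. `roi` is an optional
    (x, y, width, height) tuple selecting the region to evaluate.
    """
    # Work on a single grey channel; colour frames are converted first.
    if image.ndim == 3:
        image = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Restrict the measurement to a region of interest if one is given.
    if roi is not None:
        x, y, w, h = roi
        image = image[y:y + h, x:x + w]

    # Second spatial derivative of the image; a signed float type is used
    # so that negative responses are not clipped before accumulation.
    laplacian = cv2.Laplacian(image, cv2.CV_64F)

    # Accumulate the magnitude of the response across the region.
    return float(np.abs(laplacian).sum())

In use, the metric from each newly grabbed frame is compared with the previous value to decide which way to drive the focus motor (Section 4).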

[Figure 1 appears here: plot of the focus metric (×10^6) against stage shift (µm), with example images below.]

Figure 1. (Top) Variation of focus metric with image focus. (Bottom) Images at stage shift = 0 µm (in focus), 50 µm, and 150 µm respectively. Stage shifts are measured from the fully focused position. The subject is a growing root of the model plant Arabidopsis thaliana at 10X magnification.

81

Being a second-derivative measurement, the Laplacian of an image can be sensitive to the

82

noise introduced during image capture. However, it is assumed that this acquisition noise

83

remains constant for both in- and out-of-focus images, and therefore should not affect the

84

estimation of focus. Noise has not been an issue in practice, although excessive noise may

85

overshadow the edges in cases of low signal to noise ratios. In such a case, removing the

86

noise using an appropriate method (such as Gaussian smoothing to remove Gaussian noise

87

[5] or median filtering to remove so called salt and pepper noise [5]) may improve the result.



3. Hardware


Focus is controlled via a unipolar 2-phase stepper motor (Sanyo Denki, Tokyo, Japan) attached via a 10:1 reduction gearbox (Trident Engineering Ltd., Wokingham, UK) to the fine-focus control of the microscope, in the first instance a Zeiss Axiostar Plus light microscope (Carl Zeiss Ltd., Welwyn Garden City, UK) adapted to allow the vertical imaging of growing plant roots. The focus control connector can be exchanged for a spring-loaded version, allowing the motor assembly to be fitted to other microscopes, or replaced with custom fittings for specific models. The height of the motor assembly and the horizontal position of the connector are adjustable, allowing the unit to be easily fitted to a wide range of standard laboratory microscopes. In half-step mode, the motor has a step size of 0.9°, which, through the 10:1 gearbox, gives the system a maximum resolution of 0.09° per step. This equates to a stage movement of 0.1 µm per step for the microscope model used in development. The motor is controlled by a controller board (Easy-Step™ 3000, Active Robots Ltd., Radstock, UK) housed in the base of the assembly. The board is programmed and controlled via software running on a computer connected via an RS232 serial communications cable. The motor can be driven by user-controlled stand-alone software (allowing remote control of focus), by a potentiometer knob for manual use when not connected to a computer, or by the automatic focusing and image acquisition software. The software controls the motor mode (full or half-step), the initial and maximum step rates, the ramp factor, and the number of steps moved in any sequence. The camera used was an Oscar F-810C (Allied Vision Technologies, Stadtroda, Germany), which was already being used for time-lapse image capture. Full schematics of the motor and control assembly are available online, and the program source code will be available for download.
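For readers assembling a similar rig, the sketch below shows the general pattern of driving such a controller board from Python over RS232 using the pyserial library. The port name, baud rate and command string are placeholders for illustration only; the actual Easy-Step 3000 command set should be taken from the manufacturer's documentation.

import serial

# Placeholder settings: substitute the port and baud rate used by the
# controller board in your own installation.
PORT = "COM1"
BAUD_RATE = 9600

def move_focus(steps, direction):
    """Send a hypothetical step command to the motor controller.

    `steps` is the number of half-steps to move and `direction` is
    either "up" or "down". The command format below is invented for
    illustration and does not reflect the board's real protocol.
    """
    command = "MOVE {} {}\r".format(direction.upper(), steps)
    with serial.Serial(PORT, BAUD_RATE, timeout=1) as port:
        port.write(command.encode("ascii"))
        # Many controller boards echo a status line; read and return it.
        return port.readline().decode("ascii", errors="replace").strip()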


4. Operation


The process is initialized by the user, who focuses the target manually and can use the onscreen focus metric value as an aid to focusing. This positions the stage within the target plane of focus, and consequently on the focus metric peak of interest (Figure 1). The focusing feedback loop is driven by this measure of focus, which peaks at the point of optimal focus and falls off as the subject drifts out of focus. The autofocusing cycle begins with the motor moving a fixed number of steps in an arbitrary direction. The focus metric is then computed from a new image. If the movement has increased the focus metric, the motor moves again in the same direction; if the metric has decreased, the motor switches direction. Every time the direction is switched, the number of steps that the motor moves is decreased. This allows the search to be refined near the top of the peak of the focus curve. When the step size falls below a threshold value, the image is considered in focus, and the focusing cycle halts for a fixed period of time, whereupon the process is repeated. This approach is based on Cornsweet's staircase search method [3]. As a precaution, the system logs the number of steps taken in any particular direction, and a software stop is set to prevent the system moving the stage beyond its physical limits.
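In outline, the search loop can be expressed as follows. This is a sketch of the staircase strategy described above, not the program's actual source: `capture_frame` and `move_focus` stand in for the camera grab and motor command routines, halving the step size is one possible way of decreasing it, and the step counts and safety limit are illustrative values.

def autofocus_cycle(capture_frame, move_focus, focus_metric,
                    initial_steps=40, min_steps=2, max_total_steps=2000):
    """One staircase-style autofocus cycle (illustrative values only).

    Moves in a fixed direction while the focus metric improves, reverses
    and halves the step size when it worsens, and stops once the step
    size falls below `min_steps`. The running total of steps acts as a
    software stop against driving the stage past its physical limits.
    """
    direction = +1                      # arbitrary starting direction
    steps = initial_steps
    best = focus_metric(capture_frame())
    total = 0

    while steps >= min_steps and abs(total) < max_total_steps:
        move_focus(steps * direction)
        total += steps * direction
        metric = focus_metric(capture_frame())
        if metric > best:
            best = metric               # improving: keep the same direction
        else:
            direction = -direction      # overshot the peak: reverse...
            steps = max(steps // 2, 1)  # ...and refine the step size
    return best

# The cycle would then be repeated at a fixed interval, e.g.:
#   while experiment_running:
#       autofocus_cycle(...)
#       time.sleep(capture_interval)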


Figure 2 shows the system in use during a time-lapse growth experiment, recording root growth of the model plant Arabidopsis thaliana.


Figure 2. A visualisation of the system in use, showing external views of the focusing motor rig (top left), a screen capture from the camera attached to the microscope (top right) and a plot of the focus metric over time (bottom). The image was manually moved out of focus twice; the plot shows the software returning the image to focus after each disturbance. The data visualisation was produced using Digital Replay System [6] and a corresponding video is available online.

5. Conclusion


The system presented here provides an inexpensive and flexible solution to maintaining microscope focus during time-lapse recording of dynamic biological processes. The system will work with a range of laboratory microscopes, requires no hardware modification, and creates minimal disruption to the work area, making it suitable for a wide range of applications. One cycle of the focusing procedure takes of the order of seconds, allowing frequent capture of time-lapse images. An added advantage of the system is that it provides remote access to the experiment: users can access the recording computer via a network connection to monitor and adjust focus without having to be present in the laboratory for the duration of the experiment. Future applications could include combining the system with an automated stage translation component, allowing multiple regions of the sample to be brought into view, focused, and imaged in turn.


Acknowledgements
CPIB is a centre for Integrated Systems Biology supported by BBSRC and EPSRC. The authors wish to thank Tony Pridmore for comments on the manuscript.

References
[1] Subbarao M, Choi T and Nikzad A 1993 Focusing techniques Optical Engineering 32 2824-2836


[2] Geusebroek J-M, Cornelissen F, Smeulders A W M and Geerts H 2000 Robust autofocusing in microscopy Cytometry 39 1-9


[3] LeSage A J and Kron S J 2002 Design and implementation of algorithms for focus automation in digital imaging time-lapse microscopy Cytometry 49 159-169


[4] Groen F C A, Young I T and Ligthart G A 1985 Comparison of different focus functions for use in autofocus algorithms Cytometry 6 81-91


[5] Gonzalez R C and Woods R E 2008 Digital Image Processing (Prentice Hall) ISBN 9780131687288


[6] Crabtree A, French A, Greenhalgh C, Benford S, Cheverst K, Fitton D, Rouncefield M and Graham C 2006 Developing digital records: early experiences of record and replay Computer Supported Cooperative Work: The Journal of Collaborative Computing 15 281-319
