REVIEW OF SCIENTIFIC INSTRUMENTS 81, 014302 (2010)

Machine vision for digital microfluidics

Yong-Jun Shin and Jeong-Bong Lee
Department of Electrical Engineering, The University of Texas at Dallas, 800 W. Campbell Rd., Richardson, Texas 75080, USA

(Received 1 September 2009; accepted 23 November 2009; published online 22 January 2010)

Machine vision is widely used in an industrial environment today. It can perform various tasks, such as inspecting and controlling production processes, that may require humanlike intelligence. The importance of imaging technology for biological research or medical diagnosis is greater than ever. For example, fluorescent reporter imaging enables scientists to study the dynamics of gene networks with high spatial and temporal resolution. Such high-throughput imaging increasingly demands the use of machine vision for real-time analysis and control. Digital microfluidics is a relatively new technology with expectations of becoming a true lab-on-a-chip platform. Utilizing digital microfluidics, only small amounts of biological samples are required and the experimental procedures can be automatically controlled. There is a strong need for the development of a digital microfluidics system integrated with machine vision for innovative biological research today. In this paper, we show how machine vision can be applied to digital microfluidics by demonstrating two applications: machine vision-based measurement of the kinetics of biomolecular interactions and machine vision-based droplet motion control. It is expected that a digital microfluidics-based machine vision system will add intelligence and automation to high-throughput biological imaging in the future. © 2010 American Institute of Physics. [doi:10.1063/1.3274673]

I. INTRODUCTION

Machine vision can be defined as the use of devices for optical, noncontact sensing to automatically receive and interpret an image of a real scene in order to obtain information and/or control machines or processes.1 The terms machine vision and computer vision are often used interchangeably. In general, however, machine vision is concerned with solving practical engineering problems and is widely used in industrial environments.2 The design of machine vision systems requires interdisciplinary efforts.3 These may come from electrical/software engineering (hardware and software), physics (optics and lighting), mechanical engineering (actuators and robots), systems engineering, and specific application domains. Machine vision can perform various tasks such as measuring, identifying, grading, sorting, counting, locating/manipulating parts, and monitoring/controlling production processes. It has been successfully applied to a variety of areas, including manufacturing (semiconductors), aerial/satellite image analysis, forensic science (fingerprint recognition), security/surveillance, road traffic control, and scientific research (physics, biology, astronomy, materials engineering, etc.).3 Machine vision has also been applied to live cell experiments, such as noninvasive assessment of cell viability4 and stochastic analysis of cancer cell mitochondrial dysfunction.5 The machine vision community has developed optical-flow algorithms that can characterize microchannel flows,6 and applications such as dynamic contact angle measurement in microchannel flows have been reported.7 Building programmable, reconfigurable, and reusable microfluidic devices that can handle most laboratory protocols is the ultimate goal of lab-on-a-chip technology.

However, this capability can only be realized with a complete set of elemental fluidic components that support all of the required fluidic operations.8 Droplet-based or digital microfluidics (DMF) is a relatively new technology that involves the manipulation of discrete and independently controllable droplets. Various mechanisms, including electrowetting,9 have been utilized for realizing DMF. A liquid droplet sitting on a solid surface spreads until it reaches a minimum in free energy determined by cohesion forces in the liquid and adhesion forces between the liquid and the surface.10 When an electrical potential is applied, a change in the electric charge distribution at the liquid/solid interface modifies the free energy, inducing electrowetting, or spreading, of the droplet. If this electrowetting behavior takes place on only one side of the droplet, the droplet is displaced toward that side. The suitability of electrowetting-based DMF as a true lab-on-a-chip platform has also been discussed.8,11 Basic operations such as creating, transporting, cutting, and merging of droplets have been demonstrated,12 and there have been a number of biological applications, including on-chip enzyme assays,13–17 protein sample preparation for matrix-assisted laser desorption/ionization mass spectrometry,18–20 polymerase chain reaction,21 recombinant DNA synthesis,15 and cell-based assays.22 Machine vision can be especially useful for fluid control in DMF systems. Because droplets have shapes that can be recognized by machine vision, their position and direction of motion can be detected and controlled, as shown in this paper. One of the promising application areas of DMF integrated with machine vision is systems biology.23
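As supplementary background for the electrowetting mechanism described above (not stated explicitly in this paper), the voltage dependence of the contact angle is commonly summarized by the Lippmann–Young relation, where θ(V) is the contact angle under an applied voltage V, θ0 the voltage-free contact angle, ε0εr and d the permittivity and thickness of the dielectric layer, and γlv the liquid–vapor interfacial tension:

\[ \cos\theta(V) = \cos\theta_0 + \frac{\varepsilon_0\,\varepsilon_r}{2\,d\,\gamma_{lv}}\,V^2 \]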



FIG. 1. (Color online) Schematic illustration of a DMF-based machine vision system in a double-plate configuration.

For example, fluorescent reporter-based gene network studies require real-time imaging and various perturbations by different signal molecules.24–26 One of the benefits of using DMF for this type of experiment is that precise control of solute concentration is feasible because each droplet has a fixed, measurable volume. Recently, it was also reported that DMF is especially promising for synthetic biology, an emerging discipline that aims at designing and building novel biological systems.27 For these reasons, there is a strong need today for the development of an intelligent, automated DMF system with machine vision for innovative biological research.28 It is expected that future systems or synthetic biology projects will require interdisciplinary efforts from microelectromechanical systems, robotics, optics, cell biology, image processing, statistics, and computational biology. For such a complex DMF system, integrating many different components can be a daunting problem. Therefore, a common platform that enables them to communicate with one another easily is recommended. LABVIEW (National Instruments, USA), a visual programming language platform that supports diverse hardware (cameras, switches, etc.), can be used for this purpose. Furthermore, LABVIEW provides powerful and easy-to-use machine vision tools, VISION BUILDER FOR AUTOMATED INSPECTION (VBAI) and VISION ASSISTANT, which enable a designer to focus on implementation rather than algorithm development. Noise filtering, color plane extraction, pattern matching, object detection, and intensity measurement are some of the built-in functions. For these reasons, LABVIEW is used in our work, and the aim of this paper is not to investigate a specific machine vision algorithm (such as circular edge detection) but to demonstrate a general device/mechanism that shows how machine vision can be applied to DMF. We demonstrate two examples in this paper. The first example emphasizes real-time measurement of the amount of a biological molecule, and the second is focused on automatic control of droplet motion in a DMF system.

II. EXPERIMENTAL

Figure 1 shows a schematic illustration of a DMF-based machine vision system in a double-plate configuration. The five major components are the DMF chip, a microscope camera, a computer, a switch, and a voltage source. The DMF chip has top and bottom plates separated by a spacer. The bottom plate consists of a substrate (glass), multiple electrodes embedded in a dielectric layer, and an outermost hydrophobic layer. The top plate has a transparent (for inspection by machine vision) electrode layer covered with a hydrophobic layer, which is also transparent. The droplet is sandwiched between the two plates, and the rest of the space between the plates is filled with silicone oil. Silicone oil is known to reduce the contact angle hysteresis of the hydrophobic layer, decrease the minimum voltage required for inducing droplet movement, and prevent droplet evaporation.9,29 Real-time images are acquired through a stereo microscope (SZ6145TR, Olympus, Japan) and camera (DCR-HC96, Sony, Japan). The LABVIEW NI Developer Suite (National Instruments, USA), which includes VBAI and VISION ASSISTANT, is installed on the computer. The switch (NI SCXI-1127, National Instruments, USA) is wired to the DMF chip and the voltage source; the direction in which the droplet moves is determined by the switch. All the operations of the system (image acquisition/analysis, switch manipulation, etc.) are integrated into a single LABVIEW application. A single-plate configuration, in which there is no top plate, can also be used for the sake of simplicity in device fabrication. There are four key components to consider in designing DMF-based machine vision systems: image acquisition, image processing/analysis, decision making, and actuation.2 One important aspect of such systems is that all of these steps can be carried out in real time. Image analysis involves the automatic extraction of useful information (such as the changing fluorescence intensity of a fluorescent reporter) from an image. It may require algorithms such as object detection, pattern matching, color plane extraction, etc. As an image analysis example, we demonstrate machine vision-based measurement of the kinetics of biomolecular interactions. Real-time background subtraction and color plane extraction are the two main algorithms used in this example. The information acquired through image analysis can be used in the subsequent decision making process. Machine vision-based droplet motion control is our second example, which shows how decision making and actuation can be realized in a DMF-based machine vision system. The presence of a droplet was recognized by a circular edge detection algorithm, and droplet motion was automatically controlled based on this recognition. This intelligent control was made possible using a finite state machine (FSM), in which the detection process is integrated with the control process. Identifying droplet position using capacitive detection has been previously reported,30,31 and our approach shows how machine vision can achieve a similar goal without adding any physical sensor to the system.
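The four components above form a closed loop. The sketch below outlines that loop in Python purely for illustration; the actual system is implemented in LABVIEW, and the camera/switch handles, function names, and loop period used here are placeholders rather than any real driver API.

```python
import time

def run_vision_loop(camera, switch, analyze, decide, period_s=0.067):
    """Generic DMF machine-vision loop: acquire -> analyze -> decide -> actuate.

    camera and switch are placeholder hardware handles (assumed to expose
    grab() and set_electrode(index, on)); analyze(image) returns a
    measurement and decide(measurement) returns an (electrode, on) action
    or None.
    """
    while True:
        image = camera.grab()                    # 1. image acquisition
        measurement = analyze(image)             # 2. image processing/analysis
        action = decide(measurement)             # 3. decision making
        if action is not None:
            electrode, on = action
            switch.set_electrode(electrode, on)  # 4. actuation
        time.sleep(period_s)                     # pace the loop near the camera frame rate
```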


FIG. 3. (Color online) Two enzymatic reactions and molecular components in droplets.

FIG. 2. (Color online) Fabrication steps. (a) Aluminum deposition by sputtering on the bottom plate. (b) Aluminum patterned using dry etching. (c) PECVD Si3N4 deposition and patterning. (d) Teflon layer coating. (e) ITO-coated glass (the top plate). (f) Holes drilled for silicone oil injection. (g) Teflon layer coating. (h) The top plate turned upside down. (i) DI water placement and spacer fabrication. (j) Bonding of the top and bottom plates using a UV adhesive and silicone oil injection. (k) Completed device.

III. DMF CHIP FABRICATION

The fabrication process of a double-plate DMF chip is shown in Fig. 2. A 3 in. Pyrex wafer substrate (Corning Inc., USA) was thoroughly cleaned. A layer of aluminum (1000 Å) was deposited on top of the substrate using sputtering [Fig. 2(a)]. It was then coated with a layer of 5-µm-thick positive photoresist and patterned using standard photolithography and developing processes. The exposed portion of the aluminum layer was removed by chlorine-based plasma [Fig. 2(b)]. A dielectric layer (Si3N4, 1000 Å) was deposited using plasma enhanced chemical vapor deposition and then patterned by fluorine-based plasma [Fig. 2(c)]. A layer of 300-Å-thick Teflon AF 601 (DuPont Inc., USA) was spin-coated to make the surface hydrophobic [Fig. 2(d)]. The contact angle measured after the Teflon coating was near 120°. An indium tin oxide (ITO)-coated glass (Delta Technologies, Ltd., USA) was used for the top plate (ITO thickness: 1500 Å) [Fig. 2(e)]. Several holes were drilled through the plate for silicone oil (viscosity: 0.65 cSt, Dow Corning 200 FLUID, Dow Corning, USA) and sample injection [Fig. 2(f)], followed by a 300-Å-thick Teflon AF 601 coating [Fig. 2(g)]. A 300-µm-thick spacer was placed between the top and bottom plates [Fig. 2(i)]. The two plates were bonded together and completely sealed using a UV adhesive (Impruv 349, Henkel, Germany). The applied electrical voltage for droplet movement was 20 Vrms ac at 3 kHz.32

IV. MACHINE VISION-BASED MEASUREMENT OF THE KINETICS OF BIOMOLECULAR INTERACTIONS: COLORIMETRIC ENZYMATIC ASSAY FOR GLUCOSE

A colorimetric enzymatic reaction-based glucose assay kit (GAGO-20, Sigma, USA) is available for quantitative measurement of glucose. Figure 3 shows the two enzymatic reactions involved in the assay. In the first reaction, glucose is oxidized to gluconic acid and hydrogen peroxide (H2O2) by the enzyme glucose oxidase. This can be treated kinetically as a single-substrate enzymatic reaction, and the Michaelis–Menten equation can be used for kinetics modeling.33 In the second reaction, H2O2 reacts with reduced o-dianisidine to form oxidized o-dianisidine in the presence of another enzyme (peroxidase). It has been experimentally demonstrated that this reaction also obeys Michaelis–Menten kinetics.34 By measuring the final amount of oxidized o-dianisidine, we can estimate the initial amount of glucose. Two droplets of equal volume (2 µL) before merging and the merged droplet (4 µL) are shown in Fig. 3. The first droplet contains glucose. The second contains the two previously mentioned enzymes (glucose oxidase and peroxidase) and reduced o-dianisidine. When they are merged in a DMF device, brown-colored oxidized o-dianisidine is produced as a result of the two coupled reactions described above. For computer simulation using MATLAB SIMBIOLOGY (MathWorks, USA), the KM values were obtained from an online enzyme database as shown in Table I.35 It was assumed that the enzymatic reactions are irreversible for simplicity of calculation. The values in moles were converted to numbers of molecules in 4 µL, the volume of the merged droplet. The two droplets were merged in a DMF chip, and a brown color change was observed. Real-time image analysis using LABVIEW VISION ASSISTANT was carried out (Fig. 4).
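For reference, the Michaelis–Menten rate law invoked above for both reactions has the standard form, with v the reaction rate, [S] the substrate concentration, Vmax the maximum rate, and KM the Michaelis constant:

\[ v = \frac{V_{\max}\,[S]}{K_M + [S]} \]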


TABLE I. Initial and constant (KM, Vmax) values for the computer simulation.

First reaction
1. Initial amount of D-glucose: 1.0 mg/mL (kit) → 0.5 mg/mL (merged droplet, 2× dilution) = 2.778 mM [M = mol/L]
→ (2.778 × 10^-3 mol/L)(4 × 10^-6 L)(6.022 × 10^23 molecules/mol) = 6.691 × 10^15 molecules
2. KM1 = 31.8 mM (Reference 35)
(31.8 × 10^-3 mol/L)(4 × 10^-6 L)(6.022 × 10^23 molecules/mol) = 7.660 × 10^16 molecules
3. Glucose oxidase (Aspergillus niger): 500 U [U = µmol/min] (kit)
→ 500 U = 500 × 10^-6 mol/60 s = (8.333 × 10^-6 mol/s)(6.022 × 10^23 molecules/mol) = 5.018 × 10^18 molecules/s
40 mL : 4 µL (merged droplet volume) = 5.018 × 10^18 molecules/s : Vmax1
Vmax1 = (5.018 × 10^18 molecules/s)(4 × 10^-6 L)/(40 × 10^-3 L) = 5.018 × 10^14 molecules/s

Second reaction
1. Initial amount of H2O2: 0 molecules
Initial amount of reduced o-dianisidine: 5.0 mg/40 mL → 40 mL : 4 µL (merged droplet volume) = 5.0 mg : o-dianisidine
o-dianisidine = (4 × 10^-6 L)(5.0 × 10^-3 g)/(40 × 10^-3 L) = 5 × 10^-7 g
→ (5 × 10^-7 g)(1/244 mol/g)(6.022 × 10^23 molecules/mol) = 1.233 × 10^15 molecules
2. KM2 = 0.016 mM (Reference 35)
(0.016 × 10^-3 mol/L)(4 × 10^-6 L)(6.022 × 10^23 molecules/mol) = 3.854 × 10^13 molecules
3. Peroxidase (horseradish): 100 purpurogallin units = 100 × 1 mg/20 s = 5 mg/s (kit)
→ 40 mL : 4 µL (merged droplet volume) = 5 mg/s : Vmax2
Vmax2 = 5.0 × 10^-7 g/s = (5.0 × 10^-7 g/s)(6.022 × 10^23 molecules/mol)(1/244 mol/g) = 1.233 × 10^15 molecules/s
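The simulation in this work was performed with MATLAB SIMBIOLOGY; the following Python script is an illustrative re-implementation (not the authors' code) of the same irreversible, coupled Michaelis–Menten model using the molecule counts and rate constants of Table I.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Parameter values from Table I, expressed as molecule counts in the 4 uL merged droplet.
GLC0  = 6.691e15   # initial D-glucose (molecules)
KM1   = 7.660e16   # KM of glucose oxidase (molecules)
VMAX1 = 5.018e14   # Vmax of the first reaction (molecules/s)
DYE0  = 1.233e15   # initial reduced o-dianisidine (molecules)
KM2   = 3.854e13   # KM of peroxidase (molecules)
VMAX2 = 1.233e15   # Vmax of the second reaction (molecules/s)

def rates(t, y):
    """Irreversible, coupled Michaelis-Menten model.
    y = [glucose, H2O2, reduced o-dianisidine, oxidized o-dianisidine]."""
    glc, h2o2, dye_red, dye_ox = y
    v1 = VMAX1 * glc / (KM1 + glc)      # glucose -> gluconic acid + H2O2
    v2 = VMAX2 * h2o2 / (KM2 + h2o2)    # H2O2 + reduced dye -> oxidized dye
    if dye_red <= 0.0:                  # stop once the reduced dye is exhausted
        v2 = 0.0
    return [-v1, v1 - v2, -v2, v2]

sol = solve_ivp(rates, (0.0, 600.0), [GLC0, 0.0, DYE0, 0.0],
                t_eval=np.linspace(0.0, 600.0, 601), method="LSODA")
print("Oxidized o-dianisidine after 600 s: %.3e molecules" % sol.y[3, -1])
```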

One of the critical factors to consider in implementing a machine vision system is lighting.3 It is especially important when inspecting transparent and glinting objects such as water droplets. In our experiment, a background subtraction method was used to remove the reflected light. To visualize only the changing color, the background of each image was removed: the first image was saved in a buffer and then subtracted from each subsequent image. The red color plane was then extracted from the image, and the mean intensity in the region of interest (the droplet) was calculated and saved. All the experimental procedures were carried out on a vibration-free table to minimize noise due to vibration.
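The VISION ASSISTANT pipeline just described can be summarized by the following NumPy sketch. It is an illustrative stand-in for the LABVIEW implementation, and the RGB channel order, frame source, and region-of-interest coordinates are assumptions.

```python
import numpy as np

def mean_droplet_intensity(frames, roi):
    """Background-subtract each frame against the first one, extract the red
    plane, and return the mean intensity inside the droplet region of interest.

    frames : iterable of HxWx3 uint8 RGB images (first frame = background).
    roi    : (row_min, row_max, col_min, col_max) bounding the droplet.
    """
    frames = iter(frames)
    background = next(frames).astype(np.int16)   # first image kept as the background buffer
    r0, r1, c0, c1 = roi
    intensities = []
    for frame in frames:
        diff = frame.astype(np.int16) - background       # remove reflections/static background
        diff = np.clip(diff, 0, 255).astype(np.uint8)
        red = diff[:, :, 0]                              # red color plane (RGB order assumed)
        intensities.append(float(red[r0:r1, c0:c1].mean()))
    return intensities
```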

V. MACHINE VISION-BASED DROPLET MOTION CONTROL: SHUTTLING

Unsteady motion of a droplet can be caused by surface defects of hydrophobic layers.21 If droplet motion is unexpectedly slowed down, blind, sequential electrode activation may result in a loss of droplet motion control (Fig. 5).

To solve this problem, machine vision can be integrated with the DMF system to verify the presence of a droplet before activating an adjacent electrode. Displacing the droplet in the desired direction is possible once machine vision recognizes its presence. This method can be applied to any DMF operation, including transporting, cutting, merging, and shuttling. In our work, shuttling was selected as an example for demonstrating machine vision-based droplet motion control. Shuttling refers to moving a droplet back and forth repeatedly. After two droplets containing chemicals are merged, shuttling can be used to accelerate the chemical reaction, much like shaking a test tube to speed up the reaction rate. Figure 6 shows the VBAI application used for shuttling. In this application, a droplet travels back and forth repeatedly between two adjacent electrodes, 1 and 2 [Fig. 6(c)]. Figure 6(a) is an FSM that integrates droplet detection with electrode activation.

FIG. 4. (Color online) Real-time image analysis using LABVIEW VISION ASSISTANT. First, the first image, stored in a buffer, is subtracted from each subsequent image. Then the red color plane is extracted, followed by quantification of the mean intensity in the region of interest (droplet).


FIG. 5. (Color online) Loss of droplet motion control due to the surface defects of the hydrophobic layer. The electrodes are activated sequentially with a constant interval of 0.05 s. (a)–(c) The droplet moves along with the activated electrodes. (d) and (e) show the loss of control over the droplet.

There are three main states in the FSM: Inspect, Move to (electrode) 1, and Move to (electrode) 2. As shown in Fig. 6(b), machine vision executes four steps in the Inspect state: acquiring an image, extracting the HSI (hue, saturation, and intensity) intensity plane, and verifying droplet presence at electrode 1 and at electrode 2. The intensity value is equal to the average of the three primary colors (red, green, and blue).3 The verification step uses a circular edge detection algorithm provided by VBAI. It locates the intersection points between a set of search lines within a circular area, or annulus, and finds the best-fit circle.
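The circular edge detection used here is a proprietary VBAI function. As a rough, hypothetical stand-in, the sketch below checks for a droplet inside one electrode's region of interest using OpenCV's Hough circle transform instead; the radius bounds and thresholds are placeholders that would need the same kind of tuning discussed in the next paragraph.

```python
import cv2

def droplet_present(gray_frame, roi, min_radius=20, max_radius=80):
    """Return True if a roughly circular droplet is found inside the ROI.

    Uses OpenCV's Hough circle transform as a stand-in for VBAI's circular
    edge detection. gray_frame is a single-channel uint8 image and
    roi = (row_min, row_max, col_min, col_max) covers one electrode.
    Radius bounds and accumulator thresholds are illustrative only.
    """
    r0, r1, c0, c1 = roi
    patch = cv2.medianBlur(gray_frame[r0:r1, c0:c1], 5)   # suppress glints/noise
    circles = cv2.HoughCircles(patch, cv2.HOUGH_GRADIENT, dp=1.5,
                               minDist=patch.shape[0],     # expect at most one droplet
                               param1=100, param2=30,
                               minRadius=min_radius, maxRadius=max_radius)
    return circles is not None
```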


Various parameters (size of the annulus, edge strength, etc.) need to be optimized through trial and error. For example, water droplets are not completely circular in many cases, and it is important to have an optimized annulus to minimize false operations. If a droplet is detected at electrode 1, the current state transitions from Inspect to Move to 2. In the Move to 2 state, electrode 2 is automatically turned on and the droplet moves from electrode 1 to electrode 2. The current state then changes from Move to 2 back to Inspect, and the four steps mentioned above are repeated within that state. On the other hand, if a droplet is found at electrode 2, a state transition changes the current state from Inspect to Move to 1, and the droplet moves to electrode 1. Again, after the droplet movement, Inspect becomes the current state and the four steps are repeated. These procedures form loops so that droplet detection and electrode activation are executed continuously. A schematic illustration of one shuttling cycle is shown in Fig. 7.
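The state machine of Fig. 6(a) can also be expressed in a few lines of code. The sketch below is a hypothetical Python rendering of the Inspect/Move to 1/Move to 2 loop, not the VBAI implementation; camera, switch, and droplet_present are the placeholder handles and detection routine from the earlier sketches.

```python
import time

INSPECT, MOVE_TO_1, MOVE_TO_2 = "Inspect", "Move to 1", "Move to 2"

def shuttle(camera, switch, droplet_present, roi1, roi2, settle_s=0.2):
    """Finite state machine for droplet shuttling between electrodes 1 and 2.

    camera.grab() returns a grayscale frame; switch.activate(n) energizes
    electrode n (hypothetical handles). droplet_present(frame, roi) is the
    detection routine sketched above.
    """
    state = INSPECT
    while True:
        if state == INSPECT:
            frame = camera.grab()
            if droplet_present(frame, roi1):
                state = MOVE_TO_2          # droplet at 1 -> send it to 2
            elif droplet_present(frame, roi2):
                state = MOVE_TO_1          # droplet at 2 -> send it back to 1
            # otherwise stay in Inspect and keep looking
        elif state == MOVE_TO_2:
            switch.activate(2)
            time.sleep(settle_s)           # give the droplet time to move
            state = INSPECT
        elif state == MOVE_TO_1:
            switch.activate(1)
            time.sleep(settle_s)
            state = INSPECT
```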

VI. RESULTS AND DISCUSSION

A. Machine vision-based measurement of the kinetics of biomolecular interactions: Colorimetric enzymatic assay for glucose

Two droplets of interest were merged in a DMF chip and a brown color change was observed [Fig. 8(a)]. Real-time image analysis and measurement of the mean color intensity using VISION ASSISTANT are shown in Fig. 8(b). The original image is in the upper left corner. The images after background subtraction (upper right) and red color plane extraction (lower right) are also shown. The value of the mean color intensity was plotted on the graph (lower left) in real time. Figure 9 compares the experimental result with the simulation.

FIG. 6. (Color online) VBAI application for demonstrating shuttling. (a) State machine diagram for the integrated process of droplet recognition and electrode activation. (b) Four steps of the Inspect state: reading in an image, intensity extraction, and detecting circular shapes at regions 1 and 2. (c) The two electrodes of interest, numbered 1 and 2.


FIG. 7. (Color online) One cycle of droplet shuttling. (a) A droplet is present at electrode 1. (b) Machine vision inspects for a droplet at electrodes 1 and 2. (c) A droplet is detected at electrode 1. (d) The droplet moves from electrode 1 to 2. (e) The droplet is detected at electrode 2. (f) It is sent back to electrode 1.

The total number of images analyzed over the 600 s experiment was 18 190. The duration of the experiment (600 s) was determined by the simulation result, which showed it was long enough to reach the steady state (Fig. 9).


FIG. 9. (Color online) Comparison of the experimental and simulation results. The experimental result shows the normalized values of the mean color intensity in real time. The intensity reaches the steady state as predicted by the simulation result.

B. Machine vision-based droplet motion control: Shuttling

Figure 10 is a series of real-time images taken at 15 frames/s (or 67 ms per frame). Figure 10(a) shows a droplet at electrode 1 (time elapsed = 0 s). The droplet is detected by machine vision at t = 0.200 s [Fig. 10(b)] and moves from electrode 1 to electrode 2 [Fig. 10(c)].

Figures 10(d) and 10(e) show that the droplet stays at electrode 2 until it is detected by machine vision. The droplet then moves from electrode 2 back to 1 [Fig. 10(f)]. Figures 10(b) and 10(c) show that the time needed for the droplet to move from one electrode to the other is no more than 67 ms. Since the width of one electrode is 1400 µm, the average speed is at least 21.04 mm/s. The real-time video of this demonstration is available (Fig. 10). Our first example demonstrated real-time measurement of the amount of a biological molecule in a merged droplet, and the second focused on automatic control of droplet motion using machine vision. For practical applications, these measurement and control functions can be integrated into one. For example, in the case of a fluorescent reporter-based gene network study, measurement of changing fluorescence can be used to identify the type of a gene network motif (such as a feed-forward loop).24,36

FIG. 8. (Color online) (a) After the two droplets (2 µL each) were merged, brown color change was observed in the merged droplet (4 µL). (b) Real-time color intensity measurement using VISION ASSISTANT; the original image is shown in the upper left corner. The images after background subtraction (upper right) and red color plane extraction (lower right) are also shown. The real-time change in the mean color intensity was plotted on the graph (lower left).

FIG. 10. (Color online) A series of real-time images taken at 15 frames/s (0.067 s per frame). (a) A droplet at electrode 1. (b) The droplet is detected at t = 0.200 s. (c) The droplet moves from electrode 1 to electrode 2 with a minimum speed of 21.04 mm/s. (d) and (e) The droplet stays at electrode 2 until it is detected by machine vision. (f) The droplet moves from electrode 2 back to 1 (enhanced online). [URL: http://dx.doi.org/10.1063/1.3274673.1]



As the cell carrying the motif is labeled through this identification process, the droplet that contains the cell can be directed to different destinations for further study using machine vision-based motion control. One important issue that must be addressed in the analysis for identification is the noise present in all biological processes. Gene expression is fundamentally a stochastic process,37 and advanced algorithms such as adaptive filters can optimally estimate the state variables of gene network motifs from noisy experimental data.38–40 Integrating these sophisticated algorithms with real-time measurement and control is the challenge that must be overcome for true systems biology applications of DMF-based machine vision.
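The adaptive filtering of Refs. 38–40 is beyond the scope of this paper, but the flavor of such noise handling can be suggested with a minimal scalar Kalman filter. The sketch below assumes a simple random-walk signal model and is purely illustrative; it is not the algorithm of those references.

```python
def kalman_1d(measurements, process_var=1e-4, meas_var=1e-2, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter: estimate a slowly varying fluorescence
    level from noisy intensity measurements (illustrative only)."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += process_var           # predict: uncertainty grows between samples
        k = p / (p + meas_var)     # Kalman gain
        x += k * (z - x)           # update the estimate with the new measurement
        p *= (1.0 - k)             # shrink the estimate uncertainty
        estimates.append(x)
    return estimates
```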

VII. CONCLUSION

In this paper, we showed how machine vision can be applied to DMF by demonstrating two applications: machine vision-based colorimetric enzymatic glucose measurement and machine vision-based droplet motion control. To the best of our knowledge, this is the first study on the application of machine vision to DMF. A DMF-based machine vision system will add intelligence and automation to high-throughput biological imaging. Furthermore, there have been substantial efforts in developing micro-optical components, such as image sensors, filters, and light sources, for on-chip integration. It is expected that all the components of a DMF-based machine vision system, including advanced intelligent algorithms and micro-optical components, will eventually be integrated on a single chip for innovative biological research.

ACKNOWLEDGMENTS

This work was supported in part by the National Science Foundation (NSF) Nanoscale Interdisciplinary Research Team (NIRT) program under Grant No. BES-0608934.

1. Automated Vision Association (1985).
2. P. F. Whelan and D. Molloy, Machine Vision Algorithms in Java (Springer, New York, 2001).
3. B. Batchelor and F. Waltz, Intelligent Machine Vision (Springer, New York, 2001).
4. N. Wei, E. Flaschel, K. Friehs, and T. W. Nattkemper, BMC Bioinf. 9, 449 (2008).
5. A. D. Chacko, N. T. Crawford, P. G. Johnston, and D. A. Fennell, Apoptosis 13, 1386 (2008).
6. N. Nguyen and S. T. Wereley, Fundamentals and Applications of Microfluidics (Artech House, Norwood, 2002).
7. V. Heiskanen, K. Marjanen, and P. Kallio, J. Bionic Eng. 5, 282 (2008).


8. R. B. Fair, Microfluid. Nanofluid. 3, 245 (2007).
9. M. G. Pollack, R. B. Fair, and A. D. Shenderov, Appl. Phys. Lett. 77, 1725 (2000).
10. J. Berthier and P. Silberzan, Microfluidics for Biotechnology (Artech House, Norwood, 2006).
11. F. Su, K. Chakrabarty, and R. B. Fair, IEEE Trans. Comput.-Aided Des. 25, 211 (2006).
12. S. K. Cho, H. Moon, and C. Kim, J. Microelectromech. Syst. 12, 70 (2003).
13. V. Srinivasan, V. K. Pamula, and R. B. Fair, Anal. Chim. Acta 507, 145 (2004).
14. V. Srinivasan, V. K. Pamula, and R. B. Fair, Lab Chip 4, 310 (2004).
15. Y. Liu, J. Micromech. Microeng. 18, 045017 (2008).
16. D. Jary, A. Chollat-Namy, Y. Fouillet, J. Boutet, C. Chabrol, G. Castellan, D. Gasparutto, and C. Peponnet, NSTI Nanotech 2006 Technical Proceedings (NSTI 2006 Nanotechnology Conference and Trade Show) (Nano Science and Technology Institute), Vol. 2, pp. 554–557.
17. E. M. Miller and A. R. Wheeler, Anal. Chem. 80, 1614 (2008).
18. A. R. Wheeler, H. Moon, C. J. Kim, J. A. Loo, and R. L. Garrell, Anal. Chem. 76, 4833 (2004).
19. A. R. Wheeler, H. Moon, C. A. Bird, R. R. Loo, C. J. Kim, J. A. Loo, and R. L. Garrell, Anal. Chem. 77, 534 (2005).
20. H. Moon, A. R. Wheeler, R. L. Garrell, J. A. Loo, and C. J. Kim, Lab Chip 6, 1213 (2006).
21. Y. H. Chang, G. B. Lee, F. C. Huang, Y. Y. Chen, and J. L. Lin, Biomed. Microdevices 8, 215 (2006).
22. I. Barbulovic-Nad, H. Yang, P. S. Park, and A. R. Wheeler, Lab Chip 8, 519 (2008).
23. D. N. Breslauer, P. J. Lee, and L. P. Lee, Mol. Biosyst. 2, 97 (2006).
24. S. Kaplan, A. Bren, A. Zaslaver, E. Dekel, and U. Alon, Mol. Cell 29, 786 (2008).
25. M. J. Dunlop, R. S. Cox III, J. H. Levine, R. M. Murray, and M. B. Elowitz, Nat. Genet. 40, 1493 (2008).
26. J. Stricker, S. Cookson, M. R. Bennett, W. H. Mather, L. S. Tsimring, and J. Hasty, Nature (London) 456, 516 (2008).
27. S. Gulati, V. Rouilly, X. Niu, J. Chappell, R. I. Kitney, J. B. Edel, P. S. Freemont, and A. J. deMello, J. R. Soc., Interface 6, S493 (2009).
28. R. Pepperkok and J. Ellenberg, Nat. Rev. Mol. Cell Biol. 7, 690 (2006).
29. H. J. J. Verheijen and M. W. J. Prins, Langmuir 15, 6616 (1999).
30. J. Z. Chen, A. A. Darhuber, S. M. Troian, and S. Wagner, Lab Chip 4, 473 (2004).
31. X. Niu, M. Zhang, S. Peng, W. Wen, and P. Sheng, Biomicrofluidics 1, 044101 (2007).
32. T. B. Jones, J. D. Fowler, Y. S. Chang, and C. Kim, Langmuir 19, 7646 (2003).
33. V. Leskovac, Comprehensive Enzyme Kinetics (Kluwer Academic/Plenum, New York, 2003).
34. O. V. Lebedeva, N. N. Ugarova, and I. V. Berezin, Biokhimiia 42, 1372 (1977).
35. BRENDA (online enzyme information system), http://www.brenda-enzymes.info/.
36. S. Kaplan, A. Bren, E. Dekel, and U. Alon, Mol. Syst. Biol. 4, 203 (2008).
37. A. Raj and A. van Oudenaarden, Cell 135, 216 (2008).
38. R. F. Stengel, Optimal Control and Estimation (Dover, Mineola, 1994).
39. A. H. Sayed, Adaptive Filters (Wiley, Hoboken, 2008).
40. U. Alon, Nat. Rev. Genet. 8, 450 (2007).

