Development of a microrobot-based micromanipulation cell in a scanning electron microscope (SEM)

Ferdinand Schmoeckel*, Stephan Fahlbusch, Joerg Seyfried, Axel Buerkle and Sergej Fatikow
Institute for Process Control and Robotics, Universität Karlsruhe (TH), D-76128 Karlsruhe, Germany

ABSTRACT

In the scanning electron microscope (SEM), specially designed microrobots can act as a flexible assembly facility for prototype microsystems, as probing devices for in-situ tests in various applications, or simply as a helpful teleoperated tool for the SEM operator when examining samples. Several flexible microrobots of this kind have been developed and tested. Driven by piezoactuators, these mobile robots, only a few cubic centimeters in size, perform manipulations with a precision of up to 20 nm and transport gripped objects at speeds of up to 3 cm/s. This paper describes new microrobot prototypes employed in the SEM. The SEM's vacuum chamber has been equipped with various elements to enable the robots to operate. In order to use the SEM image for automatic real-time control of the robots, the SEM's electron beam is actively controlled by a PC, which submits the images to the robots' control computer system. To obtain three-dimensional information in real time, a triangulation method based on the luminescent spot of the SEM's electron beam is being investigated. Finally, micro force sensing strategies and the control methods required for two-robot handling techniques are discussed.

Keywords: microrobots, micromanipulation, scanning electron microscopy, piezoelectric actuators, micro triangulation, micro force sensing, control systems for microrobots
1. INTRODUCTION

Existing facilities for manipulation tasks in the micron range are rather large, very expensive, and usually tailored to a specific assignment. Flexible robot systems for various applications requiring precise handling of microscopic objects in a scanning electron microscope are completely missing. Directly driven robots only a few cm³ in size are likely to help solve this problem. At the University of Karlsruhe, various piezoelectric microrobots have been developed. These mobile robots can perform manipulations with a precision of up to 20 nm and transport gripped objects at speeds of up to 3 cm/s, employing a slip-stick principle1. They work within the "Flexible Microrobot-based Micromanipulation Station" (FMMS)2 on the stage of a light microscope or inside the vacuum chamber of an SEM. Automatically controlled with the help of visual sensors, these robots can carry out micromanipulation tasks and may free humans from the tedious manual manipulation of very small objects.

Figure 1 shows the set-up of the SEM part of the FMMS. The core of the station is a parallel computer system controlling the microrobots operating both under the light microscope and inside the SEM. The planning module and the user interface with a 6D-mouse, as commonly used in CAD systems, are implemented on an additional Linux PC.

The microrobots' motion control approach based on vision sensors was introduced by Munassypov et al.3, 1996, and Allegro and Jacot4, 1997. One can distinguish between coarse motion (i.e. navigation of the robot over a long distance) and fine motion (i.e. manipulation of parts under a microscope). Fine motion demands a high relative positioning accuracy, whereas the precision requirements for coarse motion are less strict. Reflecting this fact, the visual sensor system of the FMMS has been divided into two subsystems: a global sensor and a local sensor5. The global sensor system uses the image of a CCD camera supervising the microrobots' work space, i.e. the microscope's XY-stage (Figure 1). The real-time image processing software detects the global position and orientation of the robot, resulting in a positioning accuracy of the tool center point better than 0.5 mm. This is sufficient to navigate a robot into the field of view of the microscope. The actual manipulations of microobjects are monitored by the local sensor system, i.e. the SEM image, which is acquired by an additional PC. The SEM image permits an accuracy in the sub-micron range.
*Correspondence: Email: [email protected]; http://wwwipr.ira.uka.de/~schmoeck; Telephone: +49 721 608 7133
[Figure 1 components: additional miniaturized microscope, global sensor system (CCD camera), microrobot, parallel computer system, SEM image, local sensor system, server PC with user interface, SEM PC]
Figure 1: Set-up of the SEM-part of the flexible microrobot-based micromanipulation station (FMMS)
2. MICROROBOT NEEDS IN SCANNING ELECTRON MICROSCOPY

Considering the demands of micromanipulation and the abilities of the mobile microrobots with respect to the dimensions of the workspace (also in height), in connection with the attainable precision of 20 nm, the limits of light microscopy are obvious. The SEM is superior to the light microscope in resolution and, often more importantly, in depth of focus. The large working distance of an SEM – i.e. the distance between the final lens and the samples – offers much more space for robot systems. And, indeed, there are substantial demands for microrobots in scanning electron microscopy.

As electron beams can only be used in vacuum, the samples to be observed by scanning electron microscopy are always in a vacuum chamber. This means a considerable expenditure of time when samples have to be manipulated. Usually, after examining the sample in the SEM image and noticing that something must be changed, the operator has to vent the vacuum chamber, take the sample to the light microscope, manipulate it with tweezers and needles by hand, put it back into the vacuum chamber and evacuate again, which takes several minutes. Furthermore, the handling of microscopic objects by a human hand is often hardly possible (Figure 2). Manipulating and adjusting under the light microscope is therefore difficult and very coarse in comparison with the magnifications available in an SEM.
Figure 2: Manual preparation and an SEM image of a mite
Flexible, mobile microrobots are helpful tools in such situations. They may act e.g. as a pair of tweezers that the operator can control by teleoperation using the SEM image. When necessary, the robot’s endeffector can be moved automatically from the “waiting position” into the microscope’s field of view using the global sensor system. The flexibility of the microrobots is also advantageous in the assembly of hybrid microsystems, especially at development stages when single parts have to be handled and the final tolerances are not yet determined. As an example, Figure 3 shows the teleoperated microassembly of a micro gear with the help of a 6D-mouse and an additional lateral miniature camera.
Figure 3: Mounting a ∅500 µm wheel of a planetary micro gear by teleoperation with a 6D-mouse: lateral camera images and an SEM image (lower left)
Many experiments and studies currently under way in various research fields require the manipulation of objects smaller than 100 µm. These experiments often have to be performed in the SEM due to the tiny dimensions of the objects, which are usually also fragile and hard to handle. An example can be taken from the investigation of environmental pollution: Figure 4 (left) shows dried-down acid rain crystals about 10 µm in size. It would be very interesting for plant scientists to grasp these particles, lay them down on a suitable surface, and examine them by x-ray analysis. Further demands for micromanipulation within the SEM exist, for instance, in the research fields of fossils (Figure 4, right), radioactive particles and pest control.
Figure 4: Acid crystals dried down on a leaf (left) and a fossil micro creature (right). Source: 2nd intermediate report of the ESPRIT project No. 33915 "MINIMAN"
3. THE SEM AS VISUAL SENSOR SYSTEM OF THE ROBOTS

3.1. Real-time image processing of the SEM image

For fine motion control of a microrobot, the microscope image of its endeffector is used. Closed-loop control demands real-time image processing. Therefore, fast algorithms for the recognition and tracking of micro objects are presently being developed. Fatikow et al.2, 1999, have discussed some approaches to object recognition in light microscope images. In order to extend the FMMS by an SEM (Philips SEM 525 M), the latter had to be equipped with some additional elements. As the SEM image acts as the local sensor data for the automatic real-time robot control, the SEM's electron beam is actively controlled by a PC, using commercial hardware and software6. This PC submits the live images to the robots' control computer system via a TCP/IP socket interface. The control system of the microrobots operating under the light microscope in the FMMS can, for the most part, be taken over to the SEM.

Real-time image processing in the SEM must rely on a fast scanning mode. This involves a compromise between signal noise on the one hand and resolution or sharpness on the other. To obtain a sufficiently clear image, i.e. a good yield of electrons in fast scanning mode, a large spot size of the electron beam of up to 200–500 nm is required, which restricts the resolving power to approximately these values. Furthermore, a relatively large final aperture of 400 µm is used. The secondary electron image is used, since it is the strongest of all detectable signals. The digital image scanning system provides high resolutions of up to 4096×4096 pixels. This suggests picking out small "regions of interest" from a larger, once-scanned area. In these "windows", the robot's endeffector can be observed accurately and in real time. In order to track the robot over longer distances, it is possible to move the specimen table mechanically and thus constantly keep the manipulator in the field of view7.

3.2. Additional equipment of the SEM

Besides the computer-based control of the SEM, several components had to be added to the SEM's vacuum chamber, which has a standard size of about 380×366×310 mm³. The robots need a flat surface on which they can move. Therefore, a steel plate with a smoothness optimized for the driving principle was mounted on the motorized specimen table of the microscope. A special flange is needed to pass the cables between the control system and the robot. Additionally, a lead glass window with a CCD camera was designed for monitoring the chamber's interior, which is needed for the coarse positioning of the robot system. Finally, a miniaturized light microscope with a CCD chip was mounted inside the vacuum chamber, providing a lateral view of the micromanipulation scene with a magnification of ca. 100× (as can be seen in Figure 3). A photo of the vacuum chamber is shown in Figure 5.
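As an illustration of the region-of-interest tracking described in Section 3.1, the following minimal sketch locates an endeffector template inside a small search window of each new SEM frame. This is a hedged example: the OpenCV-based approach, the function names and the threshold are assumptions made for illustration, not the FMMS's actual implementation.

```python
import cv2
import numpy as np

def track_endeffector(frame, template, prev_pos, search_radius=40):
    """Locate `template` (a small image of the endeffector) inside a search
    window of `frame` centred on its previous position; return new (x, y).
    All parameter values are illustrative, not from the original system."""
    th, tw = template.shape[:2]
    x0 = max(prev_pos[0] - search_radius, 0)
    y0 = max(prev_pos[1] - search_radius, 0)
    window = frame[y0:y0 + 2 * search_radius + th,
                   x0:x0 + 2 * search_radius + tw]
    # Normalized cross-correlation tolerates the brightness fluctuations
    # typical of fast SEM scanning modes.
    result = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, max_loc = cv2.minMaxLoc(result)
    if score < 0.5:   # tracking lost; caller should rescan a larger region
        return None
    return (x0 + max_loc[0], y0 + max_loc[1])
```

Restricting the correlation to a small window is what makes such tracking feasible at the frame rates of a fast scanning mode.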
[Figure 5 labels: microrobot, global CCD camera, connection flanges]
Figure 5: Vacuum chamber with microrobot and flanges for cables and CCD camera (left) and the integrated miniaturized light microscope (right)
Usually, SEM samples are glued to the sample holder. However, this is not possible when they have to be manipulated by a microrobot. To prevent these microobjects from being sucked into the SEM's pump system when the vacuum chamber is evacuated, an additional throttle valve was installed.

3.3. Electron beam triangulation

The microscope image as local sensor data provides only two-dimensional information about an object's location. To perform a robot operation like gripping a micropart, it is not sufficient to align gripper and object horizontally: additional depth information is required to determine the height of the object and the gripper's vertical position. Mitsuishi et al.8, 1996, and Hatamura et al.9, 1997, described smart but expensive solutions employing a second electron gun in a stereo SEM for stereo pairs of images and a multi-directional SEM for the lateral view, respectively.

A common and fast method for gaining depth information in the macroscopic world is laser triangulation10. This principle is also applicable in the micro world. It is currently being investigated for use with the light microscope of the FMMS and, similarly, with the SEM11. The height of a point illuminated by a laterally mounted laser can easily be calculated from the point's microscope image and the geometry of the optical system and the laser beam. In the SEM, using the electron beam instead of a laser seems reasonable, because it can easily be moved across the field of view; the height calculation can even be done during the regular scanning process. However, highly luminescent materials are required, on which a bright spot of the electron beam can be observed by the miniaturized light microscope. Hence, to implement this principle, the relevant surfaces must be coated with a suitable luminescent material, and a coating process with a result comparable to sputtering must be found if a height profile of the whole scene is to be scanned. For the three-dimensional closed-loop control of the microrobot's endeffector, however, scintillator powder applied to the manipulator is sufficient. In between two normal scans for acquiring the SEM image, a line scan at the manipulator's approximate position can provide its height. Figure 6 shows the set-up for triangulation with the electron beam in the SEM and the image of the miniaturized microscope. The camera image shows the luminescent line of the electron beam – in line scan mode – on a micro gear wheel, which allows height calculation with an accuracy of 1-5 µm, limited only by the light microscope.
[Figure 6 labels: electron beam, miniature CCD camera, spot, microrobot]
Figure 6: Micro triangulation set-up in the SEM (left) and the camera image of the luminescent line of the electron beam (right)11
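Geometrically, the height calculation reduces to intersecting the camera's projection ray with the plane swept by the electron beam in line-scan mode. The following minimal numpy sketch illustrates this, under the assumption that both geometries are known from calibration (Section 3.4); all names are hypothetical.

```python
import numpy as np

def spot_position(cam_origin, pixel_ray, beam_point, beam_normal):
    """Intersect the camera's projection ray with the plane spanned by the
    electron beam in line-scan mode. The z component of the returned 3D
    point is the height of the luminescent spot. cam_origin/pixel_ray
    define the ray; beam_point/beam_normal define the beam plane."""
    cam_origin = np.asarray(cam_origin, dtype=float)
    pixel_ray = np.asarray(pixel_ray, dtype=float)
    beam_point = np.asarray(beam_point, dtype=float)
    beam_normal = np.asarray(beam_normal, dtype=float)
    denom = pixel_ray @ beam_normal
    if abs(denom) < 1e-9:
        raise ValueError("projection ray is parallel to the beam plane")
    t = (beam_point - cam_origin) @ beam_normal / denom
    return cam_origin + t * pixel_ray
```

Repeating this for every pixel of the luminescent line yields a height profile along the scan line, consistent with the 1-5 µm accuracy reported above.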
3.4. Calibration

To calculate depth information through triangulation, the exact position of the electron beam or – in line scan mode – the plane spanned by the electron beam has to be known. Furthermore, the CCD camera within the SEM has to be calibrated. Knowing the camera's mapping function enables us to calculate, for any pixel within the camera image, the corresponding projection ray containing the original point of the scene. As mentioned before, the electron beam is actively controlled by a PC, i.e. the destination point of the electron beam is specified by the user or by the control system. Since its virtual pivot point is also known, the electron beam's geometry is completely defined.

The miniature CCD camera is calibrated using Tsai's 11-parameter camera model12, which is also utilized to calibrate the global camera system of the FMMS. As for the global camera, a square grid net is employed as the reference pattern. As a special feature, this grid net can be produced by the SEM itself, by moving the electron beam in a grid pattern across a flat luminescent sample. An image processing routine automatically extracts the points of intersection, which serve as calibration data for Tsai's model. The calibration algorithms provide the 6 external parameters of the camera, which describe the camera's location and orientation, and 5 intrinsic parameters, which include the effective focal length of the corresponding pinhole camera, a distortion coefficient and a scale factor.
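The inverse mapping from a pixel to its projection ray, used in the triangulation sketch above, can be written down for an ideal pinhole model as follows. Tsai's distortion coefficient is omitted here for brevity, and the parameter names are assumptions.

```python
import numpy as np

def projection_ray(u, v, fx, fy, cx, cy, R, t):
    """Return (origin, direction) in world coordinates of the ray through
    pixel (u, v). fx, fy: effective focal lengths; cx, cy: principal point;
    R, t: extrinsics mapping world points into the camera frame. Lens
    distortion, part of Tsai's full model, is neglected in this sketch."""
    d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])  # camera frame
    origin = -R.T @ t              # camera centre expressed in world frame
    direction = R.T @ d_cam        # rotate the ray into the world frame
    return origin, direction / np.linalg.norm(direction)
```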
4. INTEGRATION OF MICROROBOTS INTO AN SEM

4.1. First results with an SEM-microrobot

To be employed inside the vacuum chamber of a conventional SEM, a special prototype, MINIMAN III, has been implemented (Figure 8). Like all mobile microrobots developed for the FMMS so far, it consists of a mobile positioning unit that can move across a smooth surface in three degrees of freedom (DOF) and a manipulation unit – a steel ball carrying the endeffector – that is mounted on the platform. Both the positioning unit and the manipulation unit are each driven by three tube-shaped piezoelectric "legs". By bending these legs in a coordinated manner, the robot and the manipulator, respectively, can be moved in any direction. To avoid image disturbances due to charging effects, the robot is metallic and grounded where possible. Although the high-voltage piezoactuators (±150 V) and connection wires are not encased, no image disturbances could be observed during operation.

To give an idea of the order of magnitude of the robots' precision, Figure 7 shows an edge of the robot endeffector moving to the left while the image was scanned from top to bottom; the progress of the movement is thus visible. It should be noted that at this magnification (100,000×), a tolerably sharp SEM image can take up to one minute to acquire. To make the single steps discernible in this image, the robot was moved with a step size of 60 nm.
[Figure 7 annotations: scan direction (time), robot motion]
Figure 7: SEM image of single 60 nm steps of the microrobot
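The stepwise motion visible in Figure 7 stems from the slip-stick actuation principle1 mentioned in Section 1. A minimal sketch of a corresponding sawtooth drive signal is given below; the sample counts and the function name are illustrative assumptions, not the actual drive electronics.

```python
import numpy as np

def sawtooth_drive(n_steps, samples_per_ramp=100, v_max=150.0):
    """Drive voltage sequence for n_steps slip-stick steps of one piezoleg.
    The slow ramp bends the leg and carries the platform (stick phase);
    the abrupt flyback makes the leg slip back under the robot, producing
    one step. v_max matches the +/-150 V actuation range mentioned in
    Section 4.1; everything else is illustrative."""
    ramp = np.linspace(-v_max, v_max, samples_per_ramp)  # stick phase
    step = np.append(ramp, -v_max)                       # slip phase
    return np.tile(step, n_steps)
```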
4.2. Coarse motion control of the micromanipulation unit

As mentioned above, the motion control of the robots is divided into coarse and fine positioning. At first, the coarse motion control of the microrobots was restricted to the positioning unit. As described by Fatikow et al.5, 1999, the implemented closed-loop control is based on the detection of infrared LEDs mounted on the robots (Figure 8), using the global CCD camera. A reliable method for the segmentation of the LEDs has been established: it uses the difference of two images taken while the LEDs are switched on and off, respectively. Knowing the 2D pixel coordinates of the LEDs, the position and orientation of the microrobot can quickly be calculated. Once the LEDs are detected in the image, they are tracked in real time while the robot is moving.
[Figure 8 labels: infrared LEDs of the manipulating unit; infrared LEDs of the positioning unit]
Figure 8: The infrared LEDs of MINIMAN III
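The on/off difference-image segmentation described above can be sketched in a few lines. The threshold value and the use of scipy's connected-component labelling are assumptions for illustration.

```python
import numpy as np
from scipy import ndimage

def segment_leds(img_on, img_off, thresh=40):
    """Return the centroids (row, col) of the infrared LED spots from two
    frames taken with the LEDs switched on and off, respectively.
    The threshold is illustrative and would be tuned on real images."""
    diff = img_on.astype(np.int16) - img_off.astype(np.int16)
    mask = diff > thresh                 # only the LEDs change brightness
    labels, n = ndimage.label(mask)      # one connected component per LED
    return ndimage.center_of_mass(mask, labels, range(1, n + 1))
```

From these centroids and the known LED arrangement on the platform, the robot's position and orientation follow directly.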
This control method has been extended to the control of the manipulating unit. As the 3D coordinates of the ball center are provided by the LEDs mounted on the positioning unit, two more LEDs on the manipulator are sufficient to calculate the orientation of the ball. They have been mounted on the balance weight of the MINIMAN III manipulator, at the back side of the steel ball (Figure 8). Thus, the robot control system can calculate the actual roll, pitch and yaw angles of the manipulator and compare them with the required values given by the planning system or by a user. For teleoperating the robot, a 6D-mouse is employed, which is a very intuitive and sensitive user interface. Besides the translation data, it directly provides the desired turning angles of the manipulator. To move the ball accordingly, the bend direction of each piezoleg must be calculated from these turning angles. This is shown exemplarily in Figure 9 (right) for the rear leg and the yaw axis (b).
[Figure 9 annotations: yaw axis b; vectors l, m; leg coordinate axes ex, ey, ez]
Figure 9: 6D-mouse13 (left) and sketch of the piezolegs and the manipulator ball (right)
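The calculation described in the next paragraph can be rendered, under simplifying assumptions, as a short kinematic sketch: the desired rotation of the ball is converted into a surface velocity at each leg's contact point, which is then expressed in the leg's own coordinate system. This contact-point formulation and all names are assumptions made for illustration, not the system's actual code.

```python
import numpy as np

def leg_bend_direction(omega, contact, R_leg):
    """One plausible formulation of the bend-direction calculation:
    omega   - desired angular velocity of the ball (roll, pitch, yaw)
              as a vector in world coordinates;
    contact - vector from the ball centre to this leg's contact point;
    R_leg   - rotation from world coordinates into the leg frame (ex, ey, ez).
    Returns the bending components of this leg along ex and ey."""
    v_contact = np.cross(omega, contact)  # surface velocity at the contact
    v_leg = R_leg @ v_contact             # expressed in the leg frame
    # The tube-shaped leg can only bend perpendicular to its axis ez, so
    # only the ex/ey components are commanded; the contributions of the
    # three rotation axes simply add, as the axes are mutually perpendicular.
    return v_leg[:2]
```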
The required bend direction m of each piezoleg is calculated by transferring the coordinates of all three ball axes into the coordinate system of the piezoleg (ex, ey, ez). The absolute values of these vectors are the desired turning velocities around these axes. The resulting bending vectors of the leg can be summed up, since the rotation axes are perpendicular to each other14.

4.3. Handling techniques and micro force sensing

When trying to manipulate smaller and smaller objects, the problems caused by the dominant surface forces, described e.g. by Fearing15, 1995, Shimoyama16, 1995, and Miyazaki et al.17, 1997, get worse and cannot be disregarded. Furthermore, the particles observed with the help of the electron beam can become electrically charged – unless a so-called "Environmental SEM" working at elevated pressure is used. This makes manipulations unpredictable if no suitable actions are taken to cope with these problems. The most frequent effect caused by these unfamiliar force ratios is that a grasped object remains stuck to one jaw of the microgripper when one tries to release it. As proposed by Miyazaki17, 1997, one possible approach to address this problem is to involve a second robot equipped with a "helping hand" – a simple needle-shaped gripper tip that brushes off the object, the small dimensions of the needle minimizing the contact surfaces. As an example, the release of a grain of pollen with this technique is shown in Figure 10.
Figure 10: Releasing a grain of pollen from a micro gripper with the help of a needle. Courtesy of Kammrath & Weiss GmbH, Dortmund, Germany
The function of a gripper is to guarantee a fixed position and orientation of the gripped part with respect to the last link of the robot. The gripper must grasp the part firmly enough to prevent it from shifting, while at the same time it must not deform or damage it. The inertia forces in microassembly are not as severe as in conventional assembly, because the weight of the parts is much smaller in relation to the surface available for gripping. Therefore, the main problem in micromanipulation is the danger that microobjects get damaged by gripping forces exceeding the allowed load of the parts.
However, not only grasping operations but also the brush-off process for parts sticking to the gripper has to be carried out under sensor surveillance. Needle-like endeffectors can easily produce high surface pressures, which may result in partial damage of highly sensitive objects like a grain of pollen. Microgrippers with integrated piezoresistive force sensors or attached strain gauges are limited in their ability to resolve the gripping force. A scanning probe microscope (SPM), in contrast, allows very precise displacement and force measurements in the sub-angstrom and sub-nanonewton ranges. Hence, self-sensing SPM cantilevers are currently being integrated into the gripper of one of the microrobots (Figure 11, left). These cantilevers operate by measuring stress-induced electrical resistance changes in a conductive channel implanted in the flexure legs of the cantilever. The real-time force feedback provided by these sensors offers information needed to better understand the prevailing nano forces and dynamics, which is indispensable for reliable micromanipulation strategies. When the gripper approaches a micropart, a downward peak in the force plot reveals an adhesive force that begins pulling the cantilever before impact actually occurs (Figure 11, right). This phenomenon has been observed and reported during micromanipulations by Nelson18, 1997.
Figure 11: Contact mode Piezolever™ (ThermoMicroscopes AutoProbe) (left) and interatomic forces (right)
Piezoresistive cantilevers have the considerable advantage that the deflection-sensing element is integrated into the cantilever, so unlike optical levers, they do not require external lasers and detectors. The resistance of the piezoresistive cantilever is measured with a Wheatstone bridge. This arrangement produces an output voltage Uw given by
Uw = (F / k) ⋅ (ΔR / R) ⋅ Ub
where F is the force exerted on the cantilever, Ub is the Wheatstone bridge bias voltage, k is the cantilever spring constant, and ΔR/R is the resistance change of the cantilever per unit deflection divided by the resistance of the cantilever. After surveying applicable force sensing techniques and commercially available force sensors that can be integrated into our microrobots, we are currently integrating suitable sensors and developing force control algorithms to extend our control system and thereby improve the microhandling abilities of our microrobots.

4.4. Further miniaturization of the robots

To be able to integrate two or – for more complex tasks – even more microrobots, their size must be reduced further. In fact, further miniaturization of the presented prototypes is not very difficult because of their simple construction. At the current stage, a crucial problem is the connection of the robots to the control system: since they do not carry any electronics themselves, they need up to 50 electrical connections. Therefore, long flexible printed circuit boards are presently being tested as replacements for bundles of single thin wires. However, the smallest available plugs still consume a lot of space on the robot platforms. With the latest prototype, MINIMAN IV, the microrobot size will be further decreased. It consists mainly of a small printed circuit board (∅50 mm) integrating all piezolegs, LEDs and the connectors to the changeable manipulating unit and the control system. Figure 12 shows the design of the robot with one of the planned manipulating units.
Figure 12: Design of MINIMAN IV
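Returning to the piezoresistive force measurement of Section 4.3: inverting the bridge equation yields the gripping force from the measured output voltage. The following minimal sketch uses made-up parameter values; none of them are taken from the actual sensors.

```python
def gripping_force(u_w, u_b, k, dr_per_r):
    """Invert Uw = (F / k) * (dR/R) * Ub for the force F.
    u_w      - measured Wheatstone bridge output voltage [V]
    u_b      - bridge bias voltage [V]
    k        - cantilever spring constant [N/m]
    dr_per_r - relative resistance change per unit deflection [1/m]
    All example values below are illustrative assumptions."""
    return u_w * k / (dr_per_r * u_b)

# Example with made-up numbers: a 1 mV bridge output
force = gripping_force(u_w=1e-3, u_b=5.0, k=1.0, dr_per_r=4e3)
print(f"estimated force: {force * 1e9:.1f} nN")   # -> 50.0 nN
```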
5. PLANNING AND CONTROL STRATEGY

In order to implement coordinated work of two or more robots for performing handling tasks like the one described above, adequate control methods are required. The control system of the station has been developed both for applications in an SEM and under an optical microscope. In the latter case, we are also investigating the possibilities of a completely automated assembly process*. Here, the user specifies the system to be assembled using a CAD system, including the necessary connections and connection types. Next, the assembly planning system computes the optimal assembly sequence, taking special problems of the micro world into account. It ensures the visibility of all operations so that they can take place under sensor surveillance, and optimizes the microassembly sequence by avoiding very small and lightweight subassemblies. If necessary, it also requests assembly aids from the user. This planning system, based on the work of Homem de Mello19, has already been presented20, 21, 22.

The optimal assembly sequence is decomposed for the given number of robots based on their operational capabilities and fed into the control system using a special Robot Control Language (RCL). The execution planning module executes this robot control code, supervising the robots' motions. In order to make parallel plan execution possible, the RCL interpreter rearranges the command sequence, performing look-ahead for variable and object dependencies; a simplified sketch of this scheduling is given below.

The station's components are controlled by an object-oriented, distributed control system. Depending on the system configuration (e.g. the number of robots, microscopes, etc.) – and therefore the computational load – the control system can run on a single machine or on several computers linked by a real-time capable communication medium. The control computer currently in use consists of several Intel Pentium modules linked by a dual-ported RAM backplane. The communication between the control system's objects is based on CORBA, which has been extended to make dynamic real-time execution of tasks possible. To keep the system easily scalable, the physical objects involved in the station are mapped one-to-one to C++ objects. Using the object-oriented paradigm makes it easy to adapt the control system from a light optical microscope to an SEM: both microscopes offer an XY-stage where the robots are positioned, and both are a means of magnification. Of course, special properties of the different station set-ups, e.g. the different microscope optics and imaging techniques, have to be taken into account in the vision algorithms, but all implementation details can be hidden inside the software objects, so that the higher-level objects have a unified view of the station, independent of the microscope or the robots used. The software objects are split into a communication part, performing higher-level planning tasks and CORBA negotiation, and a real-time kernel necessary for quick reactions to sensor input. The real-time parts of the distributed objects can establish a direct communication channel bypassing the CORBA protocol when necessary, depending on the station's global state and the states of all objects.
*Assembly within an SEM is also conceivable, but parts feeding, gluing, etc. are easier under an optical microscope. Also, to date there are no assembly tasks that would really demand an SEM in terms of magnification.
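The look-ahead scheduling mentioned above can be illustrated by a deliberately simplified sketch: commands are batched for parallel execution as long as they involve different robots and disjoint sets of objects and variables. The command representation is an assumption; the actual RCL interpreter is more elaborate.

```python
def parallel_schedule(commands):
    """commands: ordered list of (robot, deps) pairs, where deps is the set
    of objects and variables a command reads or writes. Returns batches of
    commands that may run concurrently; plan order is preserved by closing
    a batch at the first dependency or robot conflict."""
    batches, i = [], 0
    while i < len(commands):
        batch, used_deps, used_robots = [], set(), set()
        while i < len(commands):
            robot, deps = commands[i]
            if robot in used_robots or not used_deps.isdisjoint(deps):
                break                     # conflict: start the next batch
            batch.append(commands[i])
            used_deps |= set(deps)
            used_robots.add(robot)
            i += 1
        batches.append(batch)
    return batches

# Example: commands on different micro objects can run in parallel
plan = [("robot1", {"gear"}), ("robot2", {"pollen"}), ("robot1", {"gear"})]
print(parallel_schedule(plan))   # first two commands form one batch
```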
[Figure 13 components: user interface, world model, product design; assembly planning, Robot Control Language interpreter; supervisor with global execution planning; knowledge base; ORB; robot objects and camera objects, each split into a kernel and a real-time kernel; robots 1 and 2, assembly aids, sensors and micro objects acting in the microassembly process]
Figure 13: Architecture of the planning and control system
To keep track of the processes in the station and to be able to avoid deadlocks and resource collisions, all objects in the control system have an internal state, which can also be queried by a supervisor. The objects' states are set by the actions executed in the station, e.g. "grip object", "drop object", "goto position", etc. To guarantee a deadlock-free execution, all state transitions were modeled using finite state machines, and the possible object transitions were checked using Petri nets, as presented in previous work23.
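A minimal sketch of such a finite state machine for a robot object is given below. The states and the transition table are illustrative assumptions; the station's actual model is richer and is complemented by the Petri net checks mentioned above.

```python
from enum import Enum, auto

class RobotState(Enum):
    IDLE = auto()
    MOVING = auto()
    GRIPPING = auto()
    HOLDING = auto()

# Legal (state, action) -> next-state transitions; illustrative only
TRANSITIONS = {
    (RobotState.IDLE, "goto position"):  RobotState.MOVING,
    (RobotState.MOVING, "arrived"):      RobotState.IDLE,
    (RobotState.IDLE, "grip object"):    RobotState.GRIPPING,
    (RobotState.GRIPPING, "grasped"):    RobotState.HOLDING,
    (RobotState.HOLDING, "drop object"): RobotState.IDLE,
}

class RobotObject:
    """Control-system object whose internal state a supervisor can query."""
    def __init__(self):
        self.state = RobotState.IDLE

    def execute(self, action):
        key = (self.state, action)
        if key not in TRANSITIONS:
            # Rejecting illegal transitions up front is what lets the
            # supervisor detect resource conflicts and avoid deadlocks.
            raise RuntimeError(f"illegal action {action!r} in {self.state}")
        self.state = TRANSITIONS[key]
```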
6. CONCLUSION AND FUTURE WORK

In this paper, the integration of an SEM into a flexible microrobot-based micromanipulation station has been described. The motivation for letting mobile microrobots work inside the vacuum chamber of an SEM lies in the variety of possible applications that have been discussed. The SEM-suited microrobot prototype and its motion control have been explained, as well as the additional components required for the SEM to act as the local sensor system of the station. Furthermore, we discussed some force sensing issues that arise with certain handling techniques required in the micro world. In order to proceed from the already implemented micro-teleoperation to (semi-)automation of the microrobot, SEM images must be evaluated in real time. For this, the results achieved for processing the light microscope images in the FMMS may be taken over and adapted. The same applies to the object-oriented planning and control strategy presented in this paper. The electron beam triangulation method must be further investigated, with emphasis on the automated 3D control of the robot's manipulator. The main goal for the near future is to complete a semi-automated system with a powerful graphical user interface enabling the operator simply to click on objects in the microscope image, e.g. in order to grasp them.
ACKNOWLEDGEMENTS This research work has been performed at the Institute for Process Control and Robotics (Head - Prof. H. Wörn), Computer Science Department, Universität Karlsruhe (TH). The research work is being supported by the European Union (ESPRIT Project “MINIMAN”, Grant No. 33915).
REFERENCES

1. Breguet, J.-M.; Pernette, E.; Clavel, R.: "Stick and slip actuators and parallel architectures dedicated to microrobotics"; Proc. SPIE 2906, Boston, 1996, pp. 13-24
2. Fatikow, S.; Seyfried, J.; Fahlbusch, St.; Buerkle, A.; Schmoeckel, F.: "A Flexible Microrobot-Based Microassembly Station"; Journal of Intelligent & Robotic Systems 27, pp. 135-169, 2000
3. Munassypov, R.; Grossmann, B.; Magnussen, B.; Fatikow, S.: "Development and Control of Piezoelectric Actuators for the Mobile Micromanipulation System"; Proc. ACTUATOR, pp. 213-216, Bremen, 1996
4. Allegro, S.; Jacot, J.: "Automated Microassembly by Means of a Micromanipulator and External Sensors"; Proc. of the Int. Conf. on Microrobotics and Micromanipulation, SPIE Vol. 3202, pp. 108-116, Pittsburgh, USA, 1997
5. Fatikow, S.; Buerkle, A.; Seyfried, J.: "Automatic Control System of a Microrobot-Based Microassembly Station Using Computer Vision"; SPIE's International Symposium on Intelligent Systems & Advanced Manufacturing, Conference on Microrobotics and Microassembly, pp. 11-22, Boston, USA, 19-22 Sept. 1999
6. Digital Image Scanning System, Point Electronic GmbH, Halle; http://www.pointelectronic.de
7. Fatikow, S.; Seyfried, J.; Fahlbusch, St.; Buerkle, A.; Schmoeckel, F.: "Entwicklung flexibler Mikroroboter zur Handhabung von Mikroobjekten" [Development of flexible microrobots for the handling of microobjects]; 4. Chemnitzer Fachtagung Mikromechanik & Mikroelektronik, pp. 138-143, 11-12 Oct. 1999
8. Mitsuishi, M. et al.: "A tele-micro machining system with operational environment transmission under a stereo-SEM"; Proc. IEEE Int. Conf. on Robotics and Automation, Minneapolis, 1996, pp. 2194-2201
9. Hatamura, Y.; Nakao, M.; Sato, T.: "Construction of an integrated manufacturing system for 3D microstructure – concept, design and realization"; CIRP Annals 46(1), 1997, pp. 313-318
10. Ruocco, S. R.: Sensoren und Wandler für Roboter [Sensors and transducers for robots]; VCH, Weinheim, 1991
11. Buerkle, A.; Schmoeckel, F.: "Quantitative Measuring System of a Flexible Microrobot-based Microassembly Station"; 4th Seminar on Quantitative Microscopy, Semmering, Austria, 12-14 January 2000
12. Tsai, R. Y.: "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses"; IEEE Journal of Robotics and Automation, No. 4, 1987, pp. 323-344
13. LogiCad3D GmbH, Gilching, Germany: Driver CD, 1999
14. Schmoeckel, F.; Fatikow, S.: "Smart flexible microrobots for SEM applications"; Journal of Intelligent Material Systems and Structures, accepted
15. Fearing, R. S.: "Survey of Sticking Effects for Micro Parts Handling"; Proc. IEEE/RSJ Workshop on Intelligent Robots and Systems, Vol. 2, Pittsburgh, PA, 1995
16. Shimoyama, I.: "Scaling in microrobots"; Proc. IEEE Int. Conf. on Intelligent Robots and Systems (IROS), 1995
17. Miyazaki, H. et al.: "Adhesive forces acting on micro objects in manipulation under SEM"; Microrobotics and Microsystem Fabrication, Proc. SPIE 3202, Pittsburgh, 1997, pp. 197-208
18. Nelson, B.; Zhou, Y.; Vikramaditya, B.: "Integration of force and vision feedback for microassembly"; Proc. SPIE 3202, Boston, MA, 1997
19. Homem de Mello, L.; Sanderson, A.: "And/Or Graph Representation of Assembly Plans"; IEEE Transactions on Robotics and Automation, April 1990, pp. 188-199
20. Woern, H.; Seyfried, J.; Fatikow, S.; Faizullin, A.: "Assembly Planning in a Flexible Micro-assembly Station"; Proc. of the International Workshop on Intelligent Manufacturing Systems, 22-24 September 1999
21. Fatikow, S.; Faizullin, A.; Seyfried, J.: "Computer Aided Planning System of a Flexible Microrobot-based Microassembly Station"; Proc. of the 7th International Workshop on Computer Aided Design Theory and Technology, Wien, Austria, 1999
22. Fatikow, S.; Munassypov, S.; Rembold, U.: "Assembly Planning and Plan Decomposition in an Automated Microrobot-Based Microassembly Desktop Station"; Journal of Intelligent Manufacturing, Chapman & Hall, 1998
23. Seyfried, J.: "Control and Planning System of a Micro Robot-based Micro-assembly Station"; Proc. of the 30th ISR, Tokyo, Japan, 1999