Modeling and Simulation: Nanometrology Status and Future Needs Within Europe Publication Date: January 2011 Author(s): Ana Proykova1, Markus Baer2, Jorgen Garnaes3, Carl Frase4, Ludger Koenders4
1 University of Sofia, Faculty of Physics, 5 James Bourchier Blvd., Sofia-1164, Bulgaria
2 Physikalisch-Technische Bundesanstalt, Abbestr. 2-12, 10587 Berlin, Germany
3 Danish Fundamental Metrology, 307 Matematiktorvet, DK-2800 Kgs. Lyngby, Denmark
4 Physikalisch-Technische Bundesanstalt, Bundesallee 100, 38116 Braunschweig, Germany
1. Introduction
Nanometrology, defined as the science of measurement at the nanoscale, provides measurements that characterise processes and product performance and covers instrumentation and standards. Advances in nanometrology depend on our understanding of the properties of matter at the nanoscale, on the quality of measuring instruments, and on the requirements of the industry involved in the production of nanomaterials. The development of nanometrology is a result of the achievements of nanoscience and nanotechnology, widely disseminated after the birth of cluster science and the invention of the scanning tunnelling microscope (STM). Cluster physics showed that collective phenomena break down for very small object sizes. For example, small clusters of a ferromagnetic material are superparamagnetic rather than ferromagnetic. Paramagnetism is not a collective phenomenon, which means that the ferromagnetism of the macrostate is not conserved on going to the nanostate. Because most objects at the nanoscale exhibit sometimes unexpected properties, new trends in metrology are necessary to face these challenges.
Instrumentation and measurement techniques at the nanoscale play a crucial role not only in extending our knowledge of the properties of matter and of processes in nanotechnology, but also in addressing new measurement needs in process control and quality assurance in industry. Micro- and nanotechnologies are now facing a growing demand for quantitative measurements to support the reliability, safety and competitiveness of products and services. Quantitative measurements presuppose reliable and stable instruments and measurement procedures as well as suitable calibration artefacts to ensure the quality of measurements and traceability to standards.
Computer models assist in designing new modes of measurement by giving insight into the underlying physical processes. To properly simulate the behaviour of a system of interest, both the experimental set-up and the theoretical model should allow reasonable changes to meet convergence requirements when the experimental and theoretical results are compared (Fig. 1). On one side, theorists and modellers are expected to provide the missing understanding of the physical properties needed for a simulation technology, while on the other side nanometrologists contribute precise data. These dynamic connections help to improve both experimental methods and physical models.
Modelling and simulations for nanometrology help to:
• support experimentalists fabricating and measuring nanoscale devices, for which numerical simulations can optimise the output either by short-cutting device design or by analysing measurement results;
• calculate the physical properties of nanoobjects (clusters, polymers) for a bottom-up design of nanoobjects;
• calculate the range in which "the wanted" physical effect occurs, e.g. clusters of silicon atoms might become metallic in a small size range; metrology is then necessary to measure and to control such a small range;
• simulate measurement tasks performed by a device under variation of internal (probe, electronics) or external parameters (temperature, vibration, ...) to estimate measurement uncertainties (Virtual Instrument);
• support theorists developing analytical theory;
• support researchers interested in particular materials or device configurations.
Fig.1 Interaction and feedback between Modelling, Simulation and Experiment
Modelling efforts include large-scale finite element methods, multiscale Green's function methods, classical atomistic simulations (special-purpose Monte Carlo methods, Molecular Dynamics), ab initio quantum mechanical calculations, and spin- and/or time-dependent density functional theory, to mention the most frequently used techniques for predicting nanoparticle properties. In section 3 we list the advantages and limitations of these computational techniques from the point of view of some experimental techniques frequently used in nanometrology: electron microscopy (Scanning Electron Microscopy - SEM, Transmission Electron Microscopy - TEM, Low Energy Electron Microscopy - LEEM), X-ray and neutron scattering, optical measurements (IR, Raman, FT-IR), and near edge X-ray absorption fine-structure spectroscopy (NEXAFS). These techniques require specific computational methods to reconstruct the original image.
Atomic Force Microscopy (AFM) is one of the foremost tools for imaging, measuring and manipulating matter at the nanoscale. AFM has recently become an everyday characterisation technique in biology and medical physics. AFM provides a three-dimensional surface profile without requiring any special treatments (such as metal/carbon coatings) that would irreversibly change or damage the bio-sample. However, for most commercially available instruments AFM is less suitable than SEM because AFM can only image a maximum height on the order of micrometres and a maximum scanning area of around 150 by 150 micrometres. Despite its advantages, AFM images need a specific measurement strategy and data evaluation techniques. Figure 2 illustrates this for an image obtained with an AFM. Here two successive numerical manipulations have to be performed before the image becomes well visible: first, reorientation of the planes containing the spots and, second, Fourier filtration to clean the background.
Fig.2 A) Original image; B) Corrected image; C) Filtered image
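The two corrections mentioned above can be prototyped in a few lines. The following Python sketch is a minimal illustration, assuming NumPy and a square height map z; the function names and the cut-off value are illustrative assumptions, not taken from any particular AFM software. It first removes a least-squares best-fit plane and then suppresses the high-frequency background with a simple Fourier low-pass filter.

```python
import numpy as np

def level_plane(z):
    """Subtract the least-squares best-fit plane from an AFM height map."""
    ny, nx = z.shape
    x, y = np.meshgrid(np.arange(nx), np.arange(ny))
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(z.size)])
    coeff, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    plane = (A @ coeff).reshape(z.shape)
    return z - plane

def fourier_lowpass(z, cutoff=0.1):
    """Keep only spatial frequencies below `cutoff` (in cycles per pixel)."""
    Z = np.fft.fftshift(np.fft.fft2(z))
    ny, nx = z.shape
    fy, fx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(ny)),
                         np.fft.fftshift(np.fft.fftfreq(nx)), indexing="ij")
    mask = np.sqrt(fx**2 + fy**2) < cutoff
    return np.real(np.fft.ifft2(np.fft.ifftshift(Z * mask)))

# Example: level and filter a synthetic tilted, noisy surface (heights in nm)
z = np.random.rand(256, 256) + np.linspace(0, 50, 256)[None, :]
z_corrected = level_plane(z)
z_filtered = fourier_lowpass(z_corrected, cutoff=0.15)
```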
Scatterometry is the investigation of micro- or nanostructured surfaces regarding their geometry and dimensions by measurement and analysis of the light diffracted from these surfaces. An example of the experimental setup used in the PTB departments 4.2 and 7.1 for scatterometry is shown in Fig. 3. In contrast to imaging optical methods, non-imaging metrology methods like scatterometry are not diffraction limited. They give access to the geometrical parameters of periodic structures such as structure width (CD), pitch, side-wall angle or line height (cf. Fig. 3). However, scatterometry requires a priori information. Typically, the surface structure needs to be specified as a member of a certain class of gratings and is described by a finite number of parameters, which are confined to certain intervals. The inverse diffraction problem has to be solved to determine the structure parameters from a measured diffraction pattern. Scatterometry is interesting in the present context because proper scatterometric measurements require an intense effort in modelling, simulation and inverse methods, which are discussed in more detail below.
Fig. 3: Typical experimental setup for scatterometrical measurements.
2. Modelling and Simulation Requirements in the Field
The computations are performed under well-specified conditions. That is why experimental nanometrology is expected to provide reference specimens for the calibration of advanced scanning probe microscopy. The measuring apparatus can be calibrated with standards. In a few cases Atomic Force Microscopes are equipped with laser interferometers to trace the displacement measurements back to the SI unit of length. Although a resolution of about 0.1 nm is achievable using independent methods, including tuneable and stabilised lasers, Fabry-Pérot interferometry and laser interferometry, the measurement uncertainty is in the range of a few nanometres due to the non-linearity of the laser interferometer and the alignment of the displacement axis with the interferometer axis.
Efforts in fabricating and controlling processes and events at the nanoscale should include:
• the study of electrochemical and micro-fluidic methods for producing nanostructures;
• novel approaches to nanocalorimetry for the study of interfacial reactions;
• in situ observations of nanoparticle and nanotube dispersions and alignment.
Due to the uncertainty of experiments at the nanoscale, an incomplete picture of the structure is obtained. Moreover, some measurements might destroy the original object, preventing complementary and repeatable measurements. Electron microscopy can produce defects in the structure being measured; optical measurements might heat the smallest metal nanoparticles enough to melt them; the atomic force microscope can trigger structural reorganisation of soft materials. X-ray and neutron scattering provide information about the structure of cluster-containing materials but often cause excitation of the particles. Another obstacle comes from the high activity of the nanoparticles, which aggregate on the grid used for precise measurements of a structure. In situ measurements are either not possible or are of limited resolution.
All simulations need reference data for the initial conditions to start with. The experiments are expected to characterise nanoparticles in terms of:
• size and shape;
• structure;
• aspect ratio;
• volume versus surface (inner and outer structure);
• conductivity;
• magnetic properties;
• morphology and topography.
These characteristics are needed to create realistic models. However, all techniques possess inherent (systematic) sources of errors. For example, AFM sources of errors have been summarised in 1,2,3,4. One important requirement for measurements is temperature control. As temperature directly affects diffusion speeds and the agglomeration of nanoparticles, the measurements should be performed at a well-controlled temperature. For materials science, 20 °C is the reference temperature used in ISO standards.
1 Francesco Marinello, Atomic Force Microscopy in Nanometrology: Modeling and Enhancement of the Instrument, PhD thesis (2006)
2 Danzebrink et al., Advances in scanning force microscopy for dimensional metrology, Ann. CIRP, vol. 55, 2 (2006)
3 A. Yacoot et al., Aspects of scanning force microscopes and their effects on dimensional measurement, J. Phys. D: Appl. Phys. 41 (2008) 103001
4 http://www.nanoscience.com/education/software.html
However, materials might be used in technical devices over a rather large temperature range, 0 to 1000 K or higher. For bio-objects a suitable temperature range is between 25 °C and 37 °C.
Another important parameter for simulations is the electric field. An example is a model of deflection-voltage curves in atomic force microscopy and its use in DC electrostatic nanomanipulation experiments. Such a model predicts the deflection of the atomic force microscope probe as a function of the applied probe-substrate voltage, as well as the distance and voltage at which the tip collapses irreversibly onto the substrate due to electrostatic forces. The model is useful in the DC electrostatic manipulation of nanoparticles.
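A minimal sketch of such a deflection-voltage model is given below, assuming the cantilever acts as a linear spring of stiffness k and the tip-substrate gap as a parallel-plate capacitor of effective area A; all numerical values are illustrative assumptions, not values from the report. The force balance k z = eps0 A V^2 / (2 (d - z)^2) is solved for the static deflection z, and the bias at which no stable solution exists marks the snap-in (collapse) voltage.

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def deflection(V, k, d, A):
    """Static tip deflection z from k*z = eps0*A*V^2 / (2*(d - z)^2).

    Returns None when no stable equilibrium exists (snap-in onto the
    substrate). Solved by fixed-point iteration on the stable branch.
    """
    z = 0.0
    for _ in range(200):
        z_new = EPS0 * A * V**2 / (2.0 * k * (d - z)**2)
        if z_new > d / 3.0:      # beyond d/3 the parallel-plate model is unstable
            return None
        if abs(z_new - z) < 1e-15:
            break
        z = z_new
    return z

# Illustrative parameters (assumed, not from the report):
k = 0.5          # cantilever stiffness, N/m
d = 20e-9        # initial tip-substrate gap, m
A = (100e-9)**2  # effective capacitor area, m^2

for V in np.arange(0.0, 6.0, 0.5):
    z = deflection(V, k, d, A)
    if z is None:
        print(f"V = {V:3.1f} V : snap-in, tip collapses onto the substrate")
        break
    print(f"V = {V:3.1f} V : deflection = {z*1e9:6.3f} nm")
```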
3. Modelling and Simulation Techniques
In Finite-Element Modelling (FEM), the distributed physical system to be analysed is divided into a number (often large) of discrete elements. These elements are connected at points called nodes. In solid models, displacements in each element are directly related to the nodal displacements. The nodal displacements are then related to the strains and the stresses in the elements. The finite element method tries to choose the nodal displacements so that the stresses are (approximately) in equilibrium with the applied loads. The nodal displacements must also be consistent with any constraints on the motion of the structure. The division into elements may partly correspond to natural subdivisions of the structure. For example, the Atomic Force Microscope tip may be divided into groups of elements corresponding to different material properties. The finite element method converts the conditions of equilibrium into a set of linear algebraic equations for the nodal displacements. Once the equations are solved, one can find the actual strains and stresses in all the elements. By breaking the structure into a larger number of smaller elements, the stresses come closer to equilibrium with the applied loads. An important concept in the use of finite element methods is therefore that, in general, a finite element model approaches the true solution of the problem only as the element density is increased.
In spite of the significant advances that have been made in developing finite element packages, the results obtained must be carefully examined before they can be used. The most significant limitation of finite element methods is that the accuracy of the obtained solution is usually a function of the mesh resolution. Any regions of highly concentrated stress, such as around loading points and supports, must be carefully analysed with the use of a sufficiently refined mesh. In addition, there are some problems which are inherently singular (the stresses are theoretically infinite); special efforts must be made to analyse such problems.
Density functional theory (DFT) has transformed theoretical chemistry, surface science and materials physics and has created a new ability to describe the electronic structure and inter-atomic forces in molecules with hundreds and sometimes thousands of atoms. Monte Carlo methods for classical simulations have undergone a revolution, with the development of a range of techniques (e.g., parallel tempering, continuum configurational bias, and extended ensembles) that permit extraordinarily fast equilibration of systems with long relaxation times.
FEM can effectively capture the elastic behaviour of macroscopic structures but includes no accurate failure criteria, since failure depends upon atomic-scale behaviour. Classical atomistic simulations can handle enough atoms to model such events, but the interaction potentials become inaccurate for large strains and they cannot effectively handle chemistry. Quantum-mechanics-based simulations using DFT give a much better approximation to the exact solutions but
they are efficient only for a few hundred atoms. A combination of all three modelling techniques is required to accurately model device behaviour at the nanoscale5.
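As a concrete, deliberately simple illustration of the nodal-displacement formulation described above, the sketch below assembles and solves a one-dimensional elastic bar discretised into linear two-node elements; the material data and loading are hypothetical. Element strains and stresses are recovered from the nodal displacements, and refining the mesh drives the solution towards equilibrium with the applied load.

```python
import numpy as np

# Hypothetical 1D bar: length L, Young's modulus E, cross-section A,
# fixed at x = 0 and pulled with force F at x = L.
L, E, A, F = 1.0e-6, 70e9, 1.0e-14, 1.0e-6   # m, Pa, m^2, N
n_elem = 10
n_node = n_elem + 1
le = L / n_elem                                # element length

# Assemble the global stiffness matrix from identical 2-node elements.
k_e = (E * A / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])
K = np.zeros((n_node, n_node))
for e in range(n_elem):
    K[e:e + 2, e:e + 2] += k_e

# Load vector: point force at the last node.
f = np.zeros(n_node)
f[-1] = F

# Apply the constraint u[0] = 0 by solving on the free nodes only.
u = np.zeros(n_node)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

# Post-processing: element strains and stresses from nodal displacements.
strain = np.diff(u) / le
stress = E * strain
print("tip displacement [nm]:", u[-1] * 1e9)
print("element stress [MPa]:", stress * 1e-6)
```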
Fig.4 FEM model of a rigid 100 nm diameter sphere indenting an Al sample to a depth of 10nm6.
Fig.5 Diagram of a hybrid simulation for obtaining the vacancy formation energy at different positions relative to an edge dislocation. The large box represents the FEM cell in which the DFT region containing the vacancy was embedded.
At the macroscale, FEM simulates the elastic behaviour of a nanomechanical system. Figure 4 shows an Al sample indented by a rigid 100 nm diameter sphere. The indenter and three of the sample quadrants have been removed to highlight the resulting stress distribution after indenting 10 nm. The FEM mesh is fine enough for the predicted elastic displacement fields to be used to generate boundary conditions and initial atom positions for an atomistic simulation using classical potentials. The use of classical potentials in a large simulation cell allows the correct propagation of the long-range stresses to the critical regions where bond distortions are large or where chemistry effects need to be explored. In these regions a DFT simulation should be performed. The classical cell is relaxed with the help of a Monte Carlo algorithm. An application of the hybrid technique described here has been the determination of the vacancy formation energy in aluminium as a function of distance (at a fixed angle) away from an edge dislocation. The simulation geometry is shown in Figure 5. Connection to experimental measurements requires careful force calibration of the indenter and calibrated AFM measurements of the indenter tip. The AFM data are used to generate a FEM mesh for the simulations.
Inverse Methods are required if the desired quantity cannot be measured directly but is related to the outcome of the actual measurement by a well-specified mathematical model, as, e.g., in scatterometry (see above). There, the desired dimensional quantities of microelectronic and optical devices can be reconstructed by combining measurement data stemming from the scattering process of UV light at the sample with simulations of a mathematical model (Maxwell equations) of the process7,8.
5 Ana Proykova, Challenges of Computations at the Nanoscale, Journal of Computational and Theoretical Nanoscience, v. 7, pp. 1806-1813 (2010)
6 http://www.indiananotechnology.com/uploads/Nanometrology_nist.pdf
The inverse method is needed because the geometric quantities can be obtained only by finding the geometric parameters for which the simulations best fit the measured data. A further challenge is the determination of the uncertainties of quantities reconstructed by inverse methods, which requires a statistical treatment of the inverse problem. To illustrate the potential and the challenges of inverse problems for nanometrology, the case of scatterometry is discussed in more detail below.
The conversion of measurement data into the desired geometrical parameters, at the heart of scatterometry, depends crucially on a high-precision rigorous solution of Maxwell's equations, which can be reduced to the two-dimensional Helmholtz equation if the geometry and material properties are invariant in one direction. The typical transmission conditions of electromagnetic fields yield continuity and jump conditions for the transverse field components; the radiation conditions at infinity are well established. For the numerical solution, rigorous methods have been developed. Often the finite element method (FEM) is used, where the infinite domain of computation is reduced to a finite one by coupling it with boundary elements (cf. Fig. 4).
Fig. 6: CoG grating – Chromium on a glass mask used for forward calculations and reconstruction tests (d = 1120 nm, hCr = 50 nm, hCrO = 18 nm, hSiO2 = 6.35 mm).
7 H. Gross, R. Model, M. Bär, M. Wurm, B. Bodermann and A. Rathsfeld (2006). Mathematical modelling of indirect measurements in periodic diffractive optics and scatterometry. Measurement 39, 782-794.
8 H. Groß, A. Rathsfeld, F. Scholze and M. Bär (2009). Profile reconstruction in extreme ultraviolet (EUV) scatterometry: modelling and uncertainty estimates. Meas. Sci. Technol. 20, 105102-105112.
Apart from the forward computations of the Helmholtz equation, the solution of the inverse problem, i.e. the reconstruction of the grating profiles and interfaces from measured diffraction data, is the essential task in scatterometry. The problem is equivalent to the minimisation of an objective functional describing the difference between the calculated and the measured efficiency pattern as a function of the assumed model parameters. Fig. 7 shows the shape of the objective functional calculated by varying the heights of the Cr and CrO layers of chromium on a glass mask (cf. Fig. 6). For the selected admissible range of the two heights, the coordinates of the minimum of the objective functional are close to the expected values, e.g. 50 nm for hCr and 18 nm for hCrO. Because gradient-based optimisation methods are used, the admissible range of model parameters and their initial values can have a strong influence on the accuracy of the reconstruction result and need to be taken into account.
Fig. 7. Objective functions for test case CoG mask: hCr versus hCrO for a suitable subset of efficiencies.
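The structure of the reconstruction step can be illustrated with a toy example. The forward model below is not a rigorous Maxwell/FEM solver but an arbitrary smooth stand-in mapping the two layer heights to a vector of "efficiencies"; the sketch (Python with NumPy/SciPy assumed, all names and values our own) only shows how a gradient-based least-squares fit with an admissible parameter range and starting values would be set up.

```python
import numpy as np
from scipy.optimize import least_squares

def forward_model(params, channels):
    """Toy stand-in for the rigorous solver: efficiencies vs. (h_cr, h_cro).

    In real scatterometry this would be a FEM/RCWA solution of Maxwell's
    equations for the grating; here it is an arbitrary smooth function.
    """
    h_cr, h_cro = params
    return np.cos(0.05 * h_cr * channels) * np.exp(-0.01 * h_cro * channels)

channels = np.linspace(0.1, 1.0, 15)         # arbitrary measurement channels
true_params = np.array([50.0, 18.0])         # nm, cf. the CoG mask example
measured = forward_model(true_params, channels)
measured += np.random.normal(0.0, 1e-3, measured.size)   # measurement noise

def residuals(params):
    return forward_model(params, channels) - measured

# Gradient-based fit; the bounds play the role of the admissible range.
result = least_squares(residuals, x0=[40.0, 25.0],
                       bounds=([30.0, 5.0], [70.0, 30.0]))
print("reconstructed heights [nm]:", result.x)
```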
Furthermore, it is well known that the solution of the inverse problem might fail if it is based on insufficient or improper input data. Based on a sensitivity analysis, algorithms for finding optimal sets of efficiency data suitable for the inverse problem can be developed. Once the inverse problem is solved, one needs to employ Monte Carlo simulations or approximation methods to estimate the measurement uncertainty. Recent studies9 have also revealed strong influences of systematic errors that require advanced data evaluation as well as improved models in future research activities. A major challenge is the correct estimation of measurement uncertainty, which will require research activities that go far beyond existing guidelines such as the ISO Guide to the Expression of Uncertainty in Measurement (GUM).
9 J. Kaipio, E. Somersalo (2005). Statistical and computational inverse problems. Springer Verlag, New York.
Inverse problems also occur in a number of other applications of nanometrology, e.g. the localisation of magnetic nanoparticles in medicine, and are expected to play a more prominent role as the field of nanometrology expands further.
Molecular Dynamics (MD) with fast multipole methods for computing long-range interatomic forces has made accurate calculations possible on the dynamics of millions and sometimes billions of atoms. Combined with DFT calculations, it is used instead of the classical convolution approach to tip-sample artefacts, which is not valid for measurements of nano-specimens due to the quantum-mechanical nature of small objects. As interatomic forces act on the sample and on the tip of the microscope, the atoms of both relax in order to reach equilibrium positions. This leads to changes in the quantities that are finally interpreted as the AFM tip position and influences the resulting dimensional measurements. Sources of uncertainty connected with tip-surface relaxation at the atomic level are discussed in10, where results of both density functional theory modelling and classical molecular dynamics of AFM scans on typical systems used in nanometrology, e.g. fullerenes and carbon nanotubes on highly oriented pyrolytic graphite substrates, are presented, together with the effects of tip-surface relaxation on critical measurements of the dimensions of these objects.
The Car-Parrinello method for ab initio molecular dynamics, with simultaneous computation of electronic wavefunctions and interatomic forces, has opened the way for exploring the dynamics of molecules in condensed media as well as complex interfaces. New mesoscale methods (including Dissipative Particle Dynamics and Field Theoretic Polymer Simulation) have been developed for describing systems with long relaxation times and large spatial scales, and are proving useful for the rapid prototyping of nanostructures in multicomponent polymer blends. In the case of hybrid modelling techniques one should pay great attention to the correct interfacing between models operating at different length scales, to ensure that the models properly capture the physics of both the components and the total system11 (Fig. 8). However, the space-time calculations localised in a specific region cannot always be transferred into the neighbouring region because some of the properties do not scale. For instance, when the particle size becomes comparable with the Fermi wavelength of an electron, the optical, electronic, and chemical properties of metal clusters differ dramatically from those of large nanoparticles. In the smallest size regime metal clusters become molecular species, and discrete states with strong fluorescence can be observed. These molecule-like properties of highly polarisable and emissive few-atom metal clusters open new opportunities for biological labels, energy-transfer pairs, light-emitting sources in nanoscale optoelectronics, and test targets in nanometrology.
The tools of theory have advanced as much as the experimental tools in nanoscience over the past 15 years. It has been a true revolution, driven by increased computer power. The rise of fast workstations, cluster computing, and new generations of massively parallel computers completes the picture of the transformation in theory, modelling, and simulation over the last decade and a half.
Moreover, these hardware (and basic software) tools continue on the Moore's Law exponential trajectory of improvement (Fig. 9), doubling the computing power available on a single chip every 18 months. Computational Grids are emerging as the next logical extension of cluster and parallel computing.
10 Anna Campbellová, Petr Klapetek and Miroslav Valtr, Meas. Sci. Technol. 20, 084014 (2009)
11 Ana Proykova, Molecular Dynamics simulation of gas adsorption and absorption in Nanotubes, in Carbon Nanotubes: from Basic Research to Nanotechnology, eds. Valentin N. Popov and Philippe Lambin, Springer, Vol. 222, pp. 187-207 (2006)
Fig. 8 The space-time calculations localised in a specific region cannot always be transferred into the neighbouring region because some of the properties do not scale.
Fig. 9 Plot of CPU transistor counts against dates of introduction. The curve shows counts doubling every two years.
Because nanomaterials exhibit properties different from their bulk counterparts, due to the quantum nature of the materials and of the processes at the nanoscale, the techniques of measurement and modelling fall into two classes: those for particle characterisation and those for processes. The techniques should also respect the low dimensionality of most nanomaterials: quantum dots (0D), carbon nanotubes (1D), and thin-film multilayers (2D).
3.1. Techniques for particle characterisation at the nanoscale
3.1.1 Diffraction and pair distribution function
In statistical mechanics, the radial distribution function (RDF), g(r), describes how the atomic density varies as a function of the distance from one particular atom. More precisely, if there is an atom at the origin 0, and if n = N/V is the average number density, then the local density at a distance r from 0 is ng(r). Given a potential energy function, the radial distribution function can be found either via computer simulation methods like the Monte Carlo method, or via the Ornstein-Zernike equation, using approximate closure relations like the Percus-Yevick approximation or the hypernetted-chain theory. It is possible to measure g(r) experimentally with neutron scattering or X-ray diffraction data.
For small particles it is possible to estimate their size and shape by calculating the diffraction pattern using the Debye equation12. The accessible size is limited by the available computing power; current computers allow the properties of atomic clusters with diameters of about 50 nm to be computed in a reasonable time. The usual procedure starts from a model of the atomic structure, from which the pair distribution function (PDF) is calculated; finally the diffraction pattern is computed. The total powder diffraction pattern includes the Bragg and diffuse scattering contributions to the PDF; no periodicity is assumed for nanopowders. For particle sizes comparable with the crystallographic unit cell of the lattice, assuming the periodicity of the lattice is questionable. The computed patterns are compared with experimental data, and the procedure is repeated until model and experiment match.
The atomic pair distribution function (PDF) G(r) denotes the probability of finding an atom at a distance r from another atom. As the PDF is obtainable from diffraction patterns, important conclusions can be drawn about the nature of the interatomic bonds and the atomic structure of nano-objects. The traditional usage of the PDF requires a model for the interaction potential, which is implemented in either molecular dynamics or Monte Carlo calculations for structure optimisation. A comparison of a PDF computed for a methane cluster (1.6 nm in size) with the peak distribution for bulk methane (in two distinct phases - face-centred cubic (fcc) and icosahedral (ico)) demonstrates the finite-size effects observed in most measurements of nanosized materials: shift and broadening of the peaks and very low intensity of some peaks. The surface atoms (molecules) of the clusters, if they are embedded in a less dense medium, could organise themselves into a collective motion revealed in the vibrational spectrum. The role of the surface collective motion
12 W. Lojkowski, R. Turan, A. Proykova, and A. Daniszewska, eds., Nanometrology, Eighth Nanoforum Report, http://www.nanoforum.org/ (2006)
decreases as the cluster size increases. If the measurements are made on thin films, the surface (2D) effects could prevail over the 3D properties, giving rise, for instance, to a shift in the colour of the emitted light.
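The Debye equation referred to above sums over all atom pairs, I(q) = sum_i sum_j f_i f_j sin(q r_ij)/(q r_ij), and a direct implementation is short. The sketch below assumes identical atoms with unit scattering factor and an invented small test cluster; the N^2 cost of the double sum is what limits this approach to clusters of a few tens of nanometres.

```python
import numpy as np

def debye_pattern(positions, q, f=1.0):
    """Powder diffraction intensity I(q) from the Debye equation.

    positions : (N, 3) atomic coordinates (same length unit as 1/q)
    q         : array of scattering-vector magnitudes
    f         : atomic scattering factor (identical atoms assumed)
    """
    diff = positions[:, None, :] - positions[None, :, :]
    r_ij = np.linalg.norm(diff, axis=-1)              # (N, N) pair distances
    intensity = np.zeros_like(q)
    for k, qk in enumerate(q):
        x = qk * r_ij
        sinc = np.where(x > 1e-12, np.sin(x) / np.maximum(x, 1e-12), 1.0)
        intensity[k] = f * f * sinc.sum()             # includes i == j terms
    return intensity

# Example: a tiny fcc-like test cluster (coordinates in nm, illustrative only)
a = 0.4
cell = np.array([[0, 0, 0], [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5]]) * a
positions = np.concatenate([cell + a * np.array([i, j, k])
                            for i in range(3) for j in range(3) for k in range(3)])
q = np.linspace(1.0, 60.0, 300)                       # nm^-1
I = debye_pattern(positions, q)
```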
A couple of comments are due:
• Intensity of the X-rays should be high in order to register diffraction data within the diffraction patterns taken from the nano-objects; this is why the usage of synchrotron radiation is highly recommended. Several centres in Europe provide facilities necessary for this purpose. The European Crystallographic Association (ECA) has approved Special Interest Groups13 that should be contacted in order to find out the conditions for using their premises;
• Neutron diffraction is a complementary tool which is less harmful to nanomaterials than intense X-ray sources and provides valuable information about new structures with unknown properties. The Rutherford Appleton Laboratory (ISIS Facility) in the UK has a well developed infrastructure for specific measurements and standardization.
3.1.2. Monte Carlo modelling of SEM image formation is used to generate artificial SEM images or signal profiles for the development and testing of new CD evaluation algorithms. The generated images therefore have to be close enough to real SEM images to allow the results of a CD evaluation on a simulated image to be transferred to real SEM measurements. The electron diffusion in the sample and the excitation and emission of secondary electrons are simulated by Monte Carlo routines. The central part of the simulation procedure is the iterative (i.e. stepwise) calculation of electron trajectories in the solid (Fig. 4). Random numbers are used to determine the scattering angles from scattering cross sections and the distances between the scattering points from the total mean free path. Simulated electron trajectories in a two-dimensional silicon line structure are shown in Fig. 5 (top). The local SE yield as a function of the scan position forms a signal profile that is characteristic of the topography and material composition of the specimen (centre). A number of subsequent scan lines form a typical SEM grey-level image (bottom). Monte Carlo based SEM image modelling is a valuable and indispensable tool for traceable dimensional measurements with the scanning electron microscope14.
3.1.3. The Monte Carlo method also provides an effective approach to evaluating the measurement uncertainty of instruments, e.g. SFM or SEM, for given measurement tasks. The key to building a Virtual SFM is the construction of a proper model of the SFM and its software realisation. According to the generic measurement principles of SFMs, the main error sources are classified into several sub-blocks. The first is the instrument itself, for instance the geometric errors of the scanner, the cantilever detection sensor, and the tip-surface interaction. The second concerns the artefact, for instance its misalignment and inhomogeneity. The third comes from the environmental conditions, for instance acoustic/ground vibration, temperature, humidity and static electric charge. In addition, the measurement performance of an SPM may also be influenced by the operator, for instance through the selection of the scan mode, scanning parameters, and servo parameters. Besides the modelling of
13 http://www.ecanews.org/sig.htm
14 C G Frase, D Gnieser and H Bosse: "Model-based SEM for dimensional metrology tasks in semiconductor and mask industry", J. Phys. D: Appl. Phys. 42 (2009) 183001
SFMs and the software realisation, several typical measurement tasks are simulated in the Virtual SFM. After selecting the desired measurement task, virtual measurement procedures can be simulated hundreds of times while the influence factors vary randomly. Based on this procedure, the propagation of the stochastic and systematic error sources is simulated. The uncertainty budget can finally be determined from the distribution of the simulated measurement results. Until now, our Virtual SFM model/software has been able to simulate both commercial and metrological SFMs and to estimate the measurement uncertainty of simple measurement tasks, such as step height and 1D or 2D pitch measurements.
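The virtual-instrument idea can be sketched as follows: a nominal measurand (here a step height) is re-measured many times while assumed error contributions - scanner gain error, residual nonlinearity, noise, levelling residual - are drawn at random, and the spread of the simulated results yields the uncertainty contribution. The error magnitudes below are invented for illustration and are not values from the Virtual SFM software.

```python
import numpy as np

rng = np.random.default_rng(1)

def virtual_step_height(true_height_nm, n_runs=10000):
    """Monte Carlo sketch of a virtual SFM measuring a step height.

    Each run perturbs the result with randomly drawn error contributions
    (illustrative magnitudes, uniform or normal as appropriate).
    """
    results = np.empty(n_runs)
    for i in range(n_runs):
        z_gain_error = rng.normal(0.0, 2e-3)         # 0.2 % scanner calibration
        z_nonlinearity = rng.uniform(-0.1, 0.1)      # nm, residual nonlinearity
        noise = rng.normal(0.0, 0.05)                # nm, vertical noise
        tilt_residual = rng.uniform(-0.05, 0.05)     # nm, levelling residual
        results[i] = (true_height_nm * (1.0 + z_gain_error)
                      + z_nonlinearity + noise + tilt_residual)
    return results

h = virtual_step_height(100.0)          # nominal 100 nm step
print(f"mean = {h.mean():.3f} nm, standard uncertainty = {h.std(ddof=1):.3f} nm")
```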
3.2. Techniques for modelling processes
3.2.1. The time-dependent Monte Carlo method simulates processes occurring in a three-phase batch reactor working at isobaric and isothermal conditions. It calculates the dose in time-dependent geometry; the three-dimensional calculations are usually performed separately and then combined. This approach becomes cumbersome when high temporal resolution is required, if the geometry is complex, or if interplay effects between different, independently moving systems are to be studied. Standards in energy deposition can be established by implementing this technique. Quantum Monte Carlo methods now promise to provide nearly exact descriptions of the electronic structures of molecules.
3.2.2. Raman spectroscopy and related calculations
The Raman effect occurs when light in the visible, near-infrared, or near-ultraviolet range impinges upon a molecule and interacts with the electron cloud and the bonds of that molecule. The incident photon excites the molecule into a virtual state. A change in the molecular polarisation potential, i.e. the amount of deformation of the electron cloud, with respect to the vibrational coordinate is required for the molecule to exhibit the Raman effect. For the spontaneous Raman effect, the molecule is excited from the ground state to a virtual energy state and relaxes into a vibrationally excited state. The laser light interacts with phonons or other excitations in the system, resulting in the energy of the laser photons being shifted up or down. The shift in energy gives information about the phonon modes in the system. Two series of lines exist around this central vibrational transition; they correspond to the complementary rotational transitions. Anti-Stokes lines correspond to rotational relaxation, whereas Stokes lines correspond to rotational excitation. Raman spectroscopy (RS) measures vibrational, rotational, and other low-frequency modes in a system. The amount of the polarisability change determines the Raman scattering intensity, whereas the Raman shift equals the energy of the vibrational level involved. The modes can be computed from first principles, which makes RS very useful for matching experimental data with theoretical predictions. Resonance Raman Spectroscopy (RRS) is more sensitive than ordinary RS and can investigate details of the structure, including carbon nanotube chirality. RRS is a test of ab initio methods for the computation of molecular potential energy surfaces.
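Two small quantities that follow directly from the description above can be computed in a few lines: the wavelength of a Stokes line for a given laser wavelength and Raman shift, and the thermal-equilibrium anti-Stokes/Stokes intensity ratio governed by the Boltzmann population of the vibrational level (the conventional fourth-power frequency factor is included; resonance and instrumental effects are neglected). The example values are illustrative.

```python
import numpy as np

H = 6.62607015e-34      # Planck constant, J s
C = 2.99792458e10       # speed of light, cm/s
KB = 1.380649e-23       # Boltzmann constant, J/K

def stokes_wavelength_nm(laser_nm, shift_cm1):
    """Wavelength of the Stokes line for a given laser line and Raman shift."""
    laser_cm1 = 1e7 / laser_nm              # laser wavenumber in cm^-1
    return 1e7 / (laser_cm1 - shift_cm1)

def anti_stokes_to_stokes_ratio(shift_cm1, T=300.0, laser_nm=532.0):
    """Thermal-equilibrium anti-Stokes/Stokes intensity ratio.

    Boltzmann factor times the conventional 4th-power frequency term;
    resonance and instrument effects are neglected.
    """
    laser_cm1 = 1e7 / laser_nm
    freq_term = ((laser_cm1 + shift_cm1) / (laser_cm1 - shift_cm1)) ** 4
    boltzmann = np.exp(-H * C * shift_cm1 / (KB * T))
    return freq_term * boltzmann

# Example: the ~520 cm^-1 silicon phonon with 532 nm excitation
print("Stokes line at", round(stokes_wavelength_nm(532.0, 520.0), 1), "nm")
print("I(anti-Stokes)/I(Stokes) at 300 K:", anti_stokes_to_stokes_ratio(520.0))
```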
3.2.3. Photophysics in nanometrology
Photophysics here refers to molecular fluorescence in condensed media. Current research is focused on biomedical sensing, sol-gel nanoparticle structure and dynamics, nanotomography using fluorescence resonance energy transfer, single-molecule studies, and the development of ultrafast techniques such as multi-photon excitation. The fluorescence-lifetime photophysics of ensembles and of single molecules is combined with surface-enhanced resonance Raman studies to determine dynamical structure and distance, as paths towards making molecular nanomovies. This includes the characterisation and control of simulated natural environments using hydrated sol-gel nanopores
and nanoparticles of oxides and noble metals. Detecting medically important metabolites, such as glucose, proteins, metal ions and others, down to the single-molecule level under controlled conditions provides a better understanding of the fundamental building blocks of bio-molecular interaction that underpin the life sciences and medicine.
3.2.4 Molecular Transport and Molecular Nanometrology
WKB models, direct tunnelling (Simmons model) and field-emission tunnelling (Fowler-Nordheim tunnelling) could be used to model the conductivity of single molecular structures at low and elevated bias. Potentially, the Simmons model could extract two molecular barriers, one for electrons and one for holes, from conductivity spectra. Following this assumption, electrical and optical gap-probed molecular nanometrology (GMN) could be developed. The main GMN principle is the small difference between the values of the HOMO-LUMO energy gap detected by electrical and by optical measurements. A comparison of the experimentally derived electrically and optically probed gaps, together with the energy offsets between EF and the nearest molecular orbital, makes it possible to assess the applicability and feasibility of this approach15.
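As an illustration of the WKB approach mentioned above, the sketch below evaluates the transmission probability T = exp(-2 * integral of kappa(x) dx), with kappa = sqrt(2m(U(x) - E))/hbar, through a trapezoidal barrier, i.e. a rectangular barrier of height phi tilted by the applied bias - the simplest picture behind direct-tunnelling models. Barrier height, width and bias values are illustrative assumptions, not fitted molecular parameters.

```python
import numpy as np

HBAR = 1.054571817e-34   # J s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # J per eV

def wkb_transmission(phi_eV, width_nm, bias_V, energy_eV=0.0, n=2000):
    """WKB transmission through a trapezoidal barrier.

    The local barrier is U(x) = phi - bias * x / d relative to the electron
    energy; only the classically forbidden region contributes.
    """
    d = width_nm * 1e-9
    x = np.linspace(0.0, d, n)
    u = (phi_eV - bias_V * x / d - energy_eV) * EV       # local barrier, J
    kappa = np.sqrt(2.0 * M_E * np.clip(u, 0.0, None)) / HBAR
    integral = np.trapz(kappa, x)
    return np.exp(-2.0 * integral)

# Illustrative molecular-junction numbers: 2 eV barrier, 1 nm gap
for v in (0.1, 0.5, 1.0, 1.5):
    print(f"bias {v:3.1f} V : T = {wkb_transmission(2.0, 1.0, v):.3e}")
```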
4. Future Needs and Challenges
By identifying the challenges in both computations and precise measurements, the groups working in the field can be invited to meet and work out ways to overcome obstacles related to both measurement techniques and computations. The near-future needs include:
• Modelling and simulation of nanoscale materials, structures, and processes that takes into account fabrication and integration processes, production equipment and characterisation instrumentation, and that supports and drives the real experiments towards new research directions.
• Modelling of manufacturing processes - solidification, 3D injection with or without reinforcement, fibres, anisotropy - and systems (virtual and accurate products, conduction positions). These include combinatorial chemistry and genetic techniques that have opened the door to the synthesis of new bio-molecular materials and the creation of nanointerfaces and nanointerconnects between hard and soft matter. Nanoscience arose from the entire ensemble of these and other new experimental techniques, which have created the building blocks of nanotechnology.
• Processes important for the self-organisation of nanomaterials differ by several orders of magnitude when the temperature changes within relatively narrow margins. That is why only devices based on a preliminary understanding of the physical processes at the nanoscale can provide precise measurements.
• Understanding specific restrictions and limitations relevant to nanometrology.
Fundamental challenges in simulations are related to multiple scales. Multiscale modelling predicts material properties or system behaviour on one level using information or models from other levels. On each level particular approaches are used, and the following sequence of levels is usually distinguished: a level of quantum mechanical models (information about electrons is included), a level of molecular
15 V. Burtman and A. V. Pakoulev, WKB modeling of single molecular transport and Molecular Nanometrology, APS March Meeting, March 16-20, 2009, abstract #H38.007
dynamics models (information about individual atoms is included), a mesoscale or nano level (information about groups of atoms and molecules is included), a level of continuum models, and a level of device models. Each level addresses a phenomenon over a specific window of length and time. Multi-scale modelling is particularly important in integrated computational materials engineering since it allows material properties or system behaviour to be predicted from knowledge of the atomistic structure and of the properties of elementary processes.
A key challenge in nanoscience is the range of length and time scales that need to be bridged. It seems likely that fundamentally new mathematics will be needed:
• to bridge electronic through macroscopic length and time scales
• to determine the essential science of transport mechanisms at the nanoscale
• to devise theoretical and simulation approaches to study nanointerfaces, which dominate nanoscale systems and are necessarily highly complex and heterogeneous
• to simulate with reasonable accuracy the optical properties of nanoscale structures and to model nanoscale opto-electronic devices
• to simulate complex nanostructures involving "soft" biologically or organically based structures and "hard" inorganic ones, as well as nano-interfaces between hard and soft matter
• to simulate self-assembly and directed assembly
• to devise theoretical and simulation approaches to quantum coherence, de-coherence, and spintronics
• to develop self-validating and benchmarking methods
In summary: there must be robust tools for the quantitative understanding of structure and dynamics at the nanoscale and a strong link to the real world and to measurement values.
4.1 Steps to be followed to meet the needs and challenges
To coordinate the exchange of information about the needs of SMEs with the alliances of software developers, it is necessary:
• to identify 'a home' (this could be NPL, UK, because of the utilities and facilities available for testing new ideas in measurements) where regular meetings will take place;
• to organize meetings with the representatives of European Technology Platforms - NESSI, Photonics21, EuMat;
• to establish alliances and teams of experimentalists, theorists, applied mathematicians, and computer and computational scientists to meet the challenges of nanometrology.
Several formations have already been established in Europe. Examples are:
• Modelling for Nanotechnology (M4nano) is a web-based initiative taken by four Spanish institutions. The aim is to maintain a systematic flow of information among research groups and to prevent research efforts in nanomodelling from remaining fragmented. A total of 139 Spanish research groups are registered16.
16 http://www.m4nano.com/m4nanoc_m4/index.php
• At the University of Leeds, a Centre for Nano-Device Modelling coordinates the work of nanotechnology device modelling within the University17.
Italy
The issue of AFM calibration is addressed in http://paduaresearch.cab.unipd.it/1295/. A new concept of calibration standards is introduced, based on optical fibre technology. It is shown how fibre micro-cylinders can be applied for accurate calibration of the horizontal and vertical AFM axes over the whole scan range. Crosstalk error evaluation and correction are also discussed. Finally, a method for modelling distortions due to tip wear in contact-mode AFM is proposed, based on lateral force monitoring.
Czech Republic
A department of nanometrology was established in 2007 in the Czech Republic to develop scanning probe techniques for metrology purposes: http://cmi.eu/index.php?dwn=1&par=4208&wdc=67&lang=2
The main research topics are as follows:
− development of metrology SPM;
− improvement of SPM methods used for obtaining local physical quantities, e.g. local optical, magnetic, thermal or mechanical properties;
− modelling of tip-sample interactions and their effects on measurement uncertainty in AFM measurements.
Currently, the following instruments are being used for both measurements and research:
− Microscope Accurex (Thermomicroscopes), with AFM, MFM and EFM capabilities within a range of 100x100x10 micrometres;
− Microscope Explorer (Veeco), with AFM, STM and SThM capabilities within a range of 100x100x10 micrometres;
− Near-field scanning optical microscope Aurora (Veeco).
Great Britain
The University of Strathclyde has won a £5M award to expand its ground-breaking research into nanometrology - the ability to measure and characterise molecules. The prestigious Science and Innovation Award announced today is made up of £2.8M from the Engineering and Physical Sciences Research Council (EPSRC), £1.5M from the Scottish and Higher Education Funding Councils and £0.5M institutional support.
17 http://www.amsta.leeds.ac.uk/cndm/
The project is led by Professor David Birch, Head of the Department of Physics, in collaboration with Professor John Pickup's team at King's College London School of Medicine. The Strathclyde team includes Professor Duncan Graham, Pure and Applied Chemistry, and Professor Martin Dawson, Strathclyde's Institute of Photonics. These awards aim to address the shortage of academics capable of leading future research in areas of strategic importance to the UK, and will cover the recruitment and support costs of at least three lecturers, six research fellows and six PhD students across the two institutions. The project will be focused around the new Centre for Molecular Nanometrology, set up at Strathclyde in 2005 with an investment of over £2 million. The Centre combines capabilities in physics and chemistry, based on novel molecular properties for emitting and scattering light, as a means of revealing molecular structure and dynamics on the nanometre scale (http://nano.strath.ac.uk/).
Non European
• At Rensselaer Polytechnic Institute, USA, the Computational Center for Nanotechnology Innovation (CCNI) has one of the most powerful computers dedicated to simulations in the fields of nanoelectronic devices and molecular systems18.
• Research and academic programs at the Department of Electrical Engineering and Computer Science, MIT, USA, seek to develop improved methods of nanoscale simulation and modelling19.
• nanoHUB.org was created by the NSF-funded Network for Computational Nanotechnology (NCN). NCN is a network of universities with a vision to pioneer the development of nanotechnology from science to manufacturing through innovative theory, exploratory simulation, and novel cyberinfrastructure20.
International research in Nanometrology
Bearing in mind that modelling depends on standards in measurements, we have looked at recent patents related to the field.
US Patent - Nanometrology device standards for scanning probe microscopes and processes for their fabrication and use, http://www.patentstorm.us/patents/7472576.html
Nanometrology device standards and methods for fabricating and using such devices in conjunction with scanning probe microscopes are described. The fabrication methods comprise: (1) epitaxial growth that produces nanometer-sized islands of known morphology, structural, morphological and chemical stability in typical nanometrology environments, and large height-to-width nano-island aspect ratios; and (2) marking suitable crystallographic directions on the device for alignment with a scanning direction.
4.2 Software developers
Open software packages
• Tinker; Molecular Modelling Toolkit (MMTK, http://dirac.cnrs-orleans.fr/MMTK/)
• LAMMPS, http://lammps.sandia.gov/
18 http://www.rpi.edu/research/ccni/
19 http://engineering.mit.edu/research/initiatives/cce.php
20 https://nanohub.org/
Commercial software packages
• Materials Studio
• QuantumWise - first-principles simulation software for nanoscience
4.3 Producers of nanotechnology products
As a result of a Project on Emerging Nanotechnologies, an Inventory of nanotechnology-based consumer products currently on the market has been created. From this inventory, the following European producers of nanotechnology products have been identified: Nanotechnology Concept (surface treatments), Aquanova (Micelle), Bonderite (ceramic coatings), Continental (tyres), Geohumus (water storing granulate), Schott/Minox (optical multicoating), Jack Wolfskin (fabric), Finy (surface sealing), Head (carbon fibre racquets), Akzo Nobel/BASF (paint), Adidas (carbon nanotube reinforced plate for running shoes), Kleinmann (surface coatings), Starnberger (surface coatings), Nanogate (surface coating), Percenta (sealing and cleaning foam), Tencel (nanofibrils), Nanit (toothpaste additive), SCF Technologies (glass coating and nano-structured materials), Sandvik (metal alloy), Bianchi (carbon fibre bicycles), NanoSphere (surface coating), SwissDent (toothpaste), Oxonica (fuel borne catalyst).
5. Future Directions
It is obvious that the electronics industry will not risk deploying billions of devices based on molecular electronics, even when they can be built, unless they are thoroughly understood and the manufacturing processes are made predictable and controllable21. The electronics industry must have new simulations and models for nanotechnology that are at least as powerful and predictive as the ones in use today for conventional integrated circuits before it can risk marketing molecular electronics devices for a myriad of applications. It can be argued that biological applications of nanotechnology will require the same level of quantitative understanding before they are widely applied. Currently available metrology tools are also beginning to reach the limits of resolution and accuracy and are not expected to meet future requirements for nanotechnology or nanomanufacturing.
The generic measurement tasks to be performed in micro- and nano-metrology are:
• Distance, as defined between two surfaces oriented in the same direction. Example: distance between two lines of a line grating or two planes in a microstructure.
• Width, as defined by the distance between two opposing surfaces. Example: width of a channel.
• Height, as defined by the distance between two surfaces of the same orientation but displaced in the vertical direction. Example: depth of a microfluidic channel.
• Geometry (or form), as defined by the distance between the surface of the object and a pre-defined reference. Example: flatness of a wafer.
• Texture and roughness, defined as geometries of surface structures whose dimensions are small compared to the object under investigation. This poses a particular challenge for micro- or nano-sized objects because the surface becomes dominant with respect to the object volume.
21 http://www.mel.nist.gov/programs/nnp.htm
• Thickness of layers.
• Aspect ratio, as defined by the depth of a structure divided by its width.
Typical measurement tasks are performed in the fields of semiconductors, microsystems and nanotechnology. Miniaturisation has been one of the driving forces of technology during the last 20 years. As predicted by Taniguchi in 1983, the technologies have by now moved into the nano-processing era, and even for precision machining processes sub-µm precision is achievable. Fig. 1 illustrates Taniguchi's prediction. This development has been made very clear in the semiconductor industry during the last 30 years, where the number of components on a chip has doubled approximately every 18 months. This phenomenon is usually referred to as Moore's law. Today the semiconductor industry is moving below 90 nm in pitch, and the need for proper process and quality control is evident.
New research directions:
• modelling that encompasses nanoscale materials and structures;
• simulation of integration processes;
• design of novel equipment for nanometrology;
• modelling of open quantum systems such as those encountered in nanometrology;
• integration of multi-scale functional systems;
• simulation of three-dimensional nanoscale metrology, production-hardened metrology, and other areas driven by industrial applications;
• advances in extensibility and portability of the software;
• validation and verification of the modelling codes.
Computational simulation is riding a hardware and software wave that has led to revolutionary advances in many fields represented by both continuous and discrete models. The TRACS and HPC-Europa22 initiatives of the EU have fostered capabilities that make simulations with millions of degrees of freedom on thousands of processors for tens of days possible.
Modelling also helps in the following trends in commercial developments:
• green electronics;
• organic and large-area electronics;
• bio-markers;
• bio-chips and drug design;
• drug delivery on the spot.
The diverse phenomena within nanoscience will lead to a plethora of models with widely varying characteristics. For this reason, it is difficult to anticipate all the areas of mathematics which are likely to contribute, and serendipity will undoubtedly play a role. As models and their supporting mathematics mature, new algorithms will be needed to allow for efficient utilisation and application of the models.
Feedback received after the distribution of the paper published in November 2009 (courtesy Dr. James Johnstone, Nanotechnology Knowledge Transfer Network):
1) The need for SMEs to use easy packages which might take you 80% of the way in a specific problem, after which the academic experts can be called in to help refine things later.
2) The need to embed modelling into industrial collaborative projects at national and European level, as funding authorities are unlikely to call for modelling topics in isolation except for certain areas, such as nanotoxicology, where the field is more academic.
3) The development of new data mining processes and procedures to cope with the sheer amount of data that can be generated these days.
4) Modelling is good for the sustainability agenda as it reduces the amount of expensive experimentation for industry and academia alike.
22 http://www.hpc-europa.eu