Chapter 11

The Realization of Nanotechnology

Abstract

The practical realization of nanotechnology requires the output of any nanofacturing process to be carefully specified. This leads naturally to the voluntary adherence to standards, a prerequisite for sustainable trade. Both standardization and specification require an effective metrology. Nanometrology poses particular challenges, but industry has risen to them and nowadays the provision of instruments capable of measuring a wide range of attributes with nanoscale resolution is an important industry in its own right.

Keywords

Specification; Standardization; Metrology; Instruments

One of the reasons why nanotechnology has been slow to move away from the research laboratory (with the exception of the semiconductor processing industry) is the lack of standardization. Another reason is the lack of scale-up capability. Since this is often due to lack of capital, it is covered in Chapter 13 (see also Section 12.11). It is connected with the lack of standardization because, if standardization were widespread, it would be possible to source nanomaterials from multiple suppliers with a low risk of unpredictable performance.

The archetypical scenario is that a company (“B”) wishes to enhance a finished product made from an organic polymer by incorporating nanoparticles into the polymer. It is assumed that the formulation problems have been solved (Section 10.5) and that the nanoparticle manufacturer (“A”) can deliver its product properly dispersed into the polymer at a high concentration (a masterbatch), in pelleted form.

Company B naturally wishes to guarantee security of supply, which is normally achieved by entering into supply contracts with several, typically three, manufacturers. This is, however, rarely possible with nanofacturers, whose ecosystem is dominated by many small companies each making a unique product. Furthermore, the small nanofacturer may be unable to meet a high volume of demand. These factors make it risky for B to proceed.

This unfavorable scenario would no longer apply if products for which there was evidently demand were nanofactured according to standard specifications [1].

11.1 Nanospecification

A true nanomaterial is appropriately specified by a list of the identities and Euclidean coördinates of its constituent atoms. As a list literally written out, this is impracticable. For any macroscopic artifact, the number of atoms will be of the order of Avogadro's number, 6 × 10²³. It may be that not only the element but also the isotope has to be specified—requiring about 10 bits. If each atom's position is specified with a resolution of 1 nm—in reality it would have to be somewhat better—about 90 bits would be required, assuming that the object has a volume of several dm³. Hence each list would require of the order of one yottabyte (YB) of storage—exceeding the present total world digital storage capacity of 10–100 ZB. An enormous amount of data compression could, however, be achieved by expressing most of the coördinates using a few short algorithms.
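
As a rough check of this estimate, the arithmetic can be written out as follows (a sketch only; the bit counts and the 10 dm³ volume are order-of-magnitude assumptions in the spirit of the text, not measured values):

```python
# Order-of-magnitude check of the atom-list storage estimate.
import math

N_ATOMS = 6.022e23       # ~ Avogadro's number of atoms in the object
BITS_IDENTITY = 10       # element plus isotope (~ covers all nuclides of interest)

# Position: take the object to occupy 10 dm^3 = 1e25 nm^3, i.e. ~1e25
# addressable 1-nm cells, needing log2(1e25) bits per atom.
bits_position = math.log2(1e25)              # ~ 83 bits; the text rounds to ~90

bits_per_atom = BITS_IDENTITY + bits_position
total_bytes = N_ATOMS * bits_per_atom / 8
print(f"{bits_per_atom:.0f} bits/atom, {total_bytes:.1e} bytes")
# -> about 93 bits/atom and ~7e24 bytes, i.e. of the order of a yottabyte
```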

A single nanoparticle made of, say, doped selenium (atomic weight 79 and density 4.8 g cm⁻³) and having a diameter of 20 nm has a volume of about 4 × 10⁻²⁴ m³, and contains about 1.5 × 10⁵ atoms. At a few tens of bits per atom (the coördinates now only need to span the particle itself), its list would occupy of the order of a megabyte. If every particle in a nanofactured batch was the same, this, and the total number of particles, would suffice as a specification.
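
The same kind of estimate for the single particle, using the values assumed above (the bits-per-atom figure, based on picometer resolution within the particle, is again an assumption):

```python
# Reproducing the single-nanoparticle estimate.
import math

M_R = 79.0        # atomic weight of Se, g/mol
RHO = 4.8         # density, g/cm^3
D = 20e-9         # particle diameter, m
N_A = 6.022e23    # Avogadro's number, 1/mol

volume_m3 = math.pi / 6 * D**3        # sphere volume, ~4.2e-24 m^3
mass_g = RHO * 1e6 * volume_m3        # g/cm^3 -> g/m^3, then times volume
n_atoms = mass_g / M_R * N_A          # ~1.5e5 atoms

# ~50 bits per atom: 10 for identity, ~14 per coordinate at 1-pm
# resolution inside a 20-nm bounding box (2e4 positions per axis).
bits_per_atom = 10 + 3 * math.log2(20e-9 / 1e-12)
print(f"V = {volume_m3:.1e} m^3, N = {n_atoms:.1e} atoms, "
      f"list ~ {n_atoms * bits_per_atom / 8:.1e} bytes")
# -> V = 4.2e-24 m^3, N = 1.5e+05 atoms, list ~ 1.0e+06 bytes
```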

Without such a list, one could instead specify shape, the lengths of all characteristic dimensions, crystal structure, and impurity concentrations, separated into average bulk concentrations and foreign atoms present at the surface. For crystalline particles made from compounds, the chemical nature of the crystal faces should also be specified. The methods used to determine all these parameters should also be given.

Even if a distribution is not desired, most nanofacturing procedures generate a range of sizes, which should then be specified, along with any distribution of shapes and facets.
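
To make the foregoing concrete, such a specification might be captured in a structured record along the following lines (a minimal sketch; the field names are hypothetical and not drawn from any standard):

```python
# A sketch of the parameter set suggested above as a practical
# substitute for a full atom-by-atom list.
from dataclasses import dataclass, field

@dataclass
class NanoparticleSpec:
    shape: str                        # e.g. "sphere", "rod", "platelet"
    characteristic_lengths_nm: dict   # name -> length, e.g. {"diameter": 20.0}
    crystal_structure: str            # e.g. "trigonal"
    bulk_impurities_ppm: dict         # element -> average bulk concentration
    surface_impurities_ppm: dict      # element -> surface concentration
    crystal_faces: list = field(default_factory=list)      # for compounds
    size_distribution: dict = field(default_factory=dict)  # e.g. mean, spread
    methods: dict = field(default_factory=dict)  # parameter -> method used

spec = NanoparticleSpec(
    shape="sphere",
    characteristic_lengths_nm={"diameter": 20.0},
    crystal_structure="trigonal",
    bulk_impurities_ppm={"Fe": 12.0},
    surface_impurities_ppm={"O": 800.0},
    size_distribution={"mean_nm": 20.0, "sd_nm": 3.0},
    methods={"diameter": "TEM", "surface_impurities": "XPS"},
)
```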

In reality, most nanoparticle manufacturers merely state the chemical formula (often assumed rather than determined experimentally), the overall purity, the average particle size (APS) and the specific surface area (SSA), the latter measured experimentally using a technique such as the BET method. Typically, the methods used to determine the parameters are not stated. Since different methods often yield different values and have different uncertainties, and since very often the end-user requires nano-objects of a certain size as measured by a particular method, this reticence raises another needless barrier to trade.

APS usually implicitly refers to primary particle size, with no information given regarding agglomeration or aggregation. The degree of agglomeration or aggregation can, however, be estimated from the APS and SSA: for a spherical particle the specific surface area calculated from the particle size should be SSA_APS = 6/(ρ · APS); hence a measure of the degree of agglomeration or aggregation is the ratio SSA_APS/SSA_BET, which would equal one if there were no agglomeration or aggregation. The difference between agglomeration and aggregation is of great practical importance, since agglomerates can be broken up using a low energy input, whereas aggregates usually cannot be without destroying the particles. It is usually left to the end-user to determine which is present when SSA_APS > SSA_BET.
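
Since both APS and the BET-measured SSA typically appear on a datasheet, this check is trivial to automate; a minimal sketch (the example numbers are invented for illustration):

```python
# Estimating the degree of agglomeration/aggregation from catalogue data.
# SSA_APS = 6/(rho * APS) holds for spherical primary particles.

def agglomeration_ratio(aps_nm: float, rho_g_cm3: float,
                        ssa_bet_m2_g: float) -> float:
    """Return SSA_APS / SSA_BET; ~1 means no agglomeration or aggregation."""
    aps_m = aps_nm * 1e-9
    rho_kg_m3 = rho_g_cm3 * 1000.0
    ssa_aps_m2_g = 6.0 / (rho_kg_m3 * aps_m) / 1000.0  # m^2/kg -> m^2/g
    return ssa_aps_m2_g / ssa_bet_m2_g

# E.g. 20-nm selenium particles (rho = 4.8 g/cm^3) with a measured
# BET area of 40 m^2/g:
print(agglomeration_ratio(20.0, 4.8, 40.0))  # ~1.56 -> some agglomeration
```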

Given the great variety of technologies used to produce nano-objects (chemical vapor deposition, chemical precipitation, hydrothermal methods, explosion, mechanical milling, physical vapor deposition, vapor condensation, and so forth) it would be helpful to know how a given product was produced, especially when the parameters characterizing the material are sparse, but again this is not usually forthcoming unless the nanofacturer is known to use a single, usually novel, nanofacturing technology.

11.2 Standardization of Nanotechnology

It has long been recognized that the adoption of standards has a strongly positive effect on commercial life, so much so that, historically, adopting them has often been a legally enforced obligation. A straightforward example is offered by railways. A company fixing a unique gauge for its tracks retains a complete monopoly over their exploitation, but the impossibility of running trains through to other networks creates a net disbenefit. Occasionally there were strategic reasons for preventing such through running (as in the case of Russia), especially important at a time when the military potential of railways had begun to be perceived, but nowadays the nuisance of having to change trains, or change the bogies, at the border is only mitigated by Russia being so vast a country that border traffic constitutes only a minute proportion of the whole.

Standards can be normative, specifying what shall be done (e.g., a specific test method), or simply informative. For the most part their adoption is nowadays voluntary. The fact that they have been drawn up through a careful process of achieving consensus, preferably international (the leading organization is the ISO), is an obvious encouragement to their adoption. Voluntary submission to an engineering constraint removes constraints on trade [2].

The first stage of standardization is to agree on the definitions of words. Thus, the first publications of the ISO Technical Committee (TC) 229—Nanotechnologies (constituted in 2005) have been vocabularies [3]. The work of other ISO technical committees also impinges on nanotechnology: for example TC 24 (particle characterization), TC 201 (surface chemical analysis), TC 202 (microbeam analysis), and TC 213 (the geometry of surfaces). The business plan of ISO TC 229 is very comprehensive: four categories of standards are being developed: terminology and nomenclature, in order to provide a common language; measurement and characterization; health, safety, and environmental aspects; and individual material specifications.

11.3 Nanometrology

The existence of technical specifications implies the capability of measuring the specified parameters. It might well be asserted that metrology invented nanotechnology. Apart from the nanoparticles long synthesized by chemists, the first practical development in nanotechnology was the scanning tunneling microscope, followed by the atomic force microscope and the still growing number of other scanning probe microscopies. These instruments are able to resolve height differences with subatomic resolution, and their lateral resolution is now also good enough to enable individual atoms to be imaged. Moving atoms around on a solid surface, which can be readily accomplished by an atomic force microscope, is a prototype of the process of atom-by-atom assembly that will, it is hoped, form the basis of productive nanosystems.

Metrology has long been associated with the obligatory use of standard weights and measures. England's Magna Carta (1215) makes reference to such an obligation, the history of which goes back to the earliest civilizations, such as Egypt. Clearly such obligations cannot be enforced unless there are reliable methods for quantifying masses and lengths, as well as the other basic quantities (electric current, temperature, luminous intensity, amount of substance, and time) that constitute the Système International d'Unités; all the other units can be derived from these.

Owing to the intrinsic difficulties of making measurements at the nanoscale, many nanotechnologists end up having to follow an empirical, trial-and-error approach to product development, instead of a systematic and controlled one. Although the former may require little additional capital expenditure, it is lengthy and, hence, expensive, apart from the enormous material waste that generally ensues. Presently, this appears to be very much the case in nanomedicine. The developers of diagnostic nanodots and biosensors lack the metrology tools that they would need to improve device performance in a rational and controlled manner. The lack of such tools not only impedes development, but is also an obstacle to formulating procedures for measurement, quality control, and reproducibility of manufacture. The deficiency in nanometrology tools is particularly glaring in technologies involving the life sciences, although there has been encouraging recent progress in developing tools based on interrogating the evanescent optical fields created at the surface of planar optical waveguides functioning as substrata for living cells and biomolecular thin films.

A similar deficiency can be seen in the development of nanodevices for energy conversion and storage. While purely structural nanometrology has become well developed, thanks above all to techniques like atomic force microscopy and electron microscopy, together with auxiliary techniques such as focused ion beam (FIB) milling that are extremely useful for sample preparation, device development can still be very slow because of the lack of functional microscopies. For example, what would be needed for the development of novel solar cells is a technique able to simultaneously characterize the three-dimensional domain structure and the optoelectronic properties under operating device conditions. Nevertheless, the seemingly endless versatility of the scanning probe microscope may come to the rescue, since the scanning tip can be made sensitive to photoconductivity. Another example is the scanning Kelvin probe microscope (SKPM) for the study of transparent conducting nanocomposites (to replace indium tin oxide); the SKPM can measure both surface topography and work function (i.e., the ease of extracting electronic charge from a material) [4]. Other extensions of the scanning probe concept include scanning electrochemical microscopy (SECM), which can be usefully applied to develop novel materials for fuel cells [5]; scanning ion conductance microscopy (SICM), which allows the topography of living cells to be mapped with less perturbation of the cell than with conventional atomic force microscopy; and electrostatic force microscopy (EFM) [6], which is very useful for the precise measurement of the thickness of oligolayer graphene.

Looking further into the future, the development of quantum cryptography based on encoding information into the quantum states of single photons will also require a metrology framework for independently assessing performance. Challenges of this magnitude are generally only achievable through multinational collaboration.

Whereas in the past metrology has been firmly embedded in the domain of physics, the introduction of nano-objects, which can quickly be dispersed into the environment in enormous numbers with unknown and possibly harmful effects, has prompted the development of nano-ecotoxicology as a branch of physics rather than remaining within the traditional domain of naturalists. A key development here is the creation of reference materials for calibrating instruments and optimizing protocols for sample preparation and testing. Such reference materials are no less important in this domain than they are for the length calibration of scanning probe microscopes.

The interaction between metrology and nanotechnology operates in both directions. Recently graphene, the archetypical nanomaterial, was used to effect the most precise measurements of the quantum Hall effect ever achieved [7], which will assist the work to redefine the fundamental units of mass and electric current.

Furthermore, the scanning tunneling and atomic force microscopes, which have played such a central rôle in the development of nanotechnology, are the preferred tools for the development of the assemblers envisioned by Richard Feynman for the fabrication of devices and materials to atomic specifications, and given more concrete form by the subsequent work of Eric Drexler. With these instruments single atoms can be picked up from, displaced along, and deposited on a substrate.

Despite all these developments, there is still no single instrument that can fulfill all the needs of nanotechnologists. Figure 11.1 summarizes the most important available techniques for topographic and chemical analysis, plotted according to their spatial and chemical resolution.

Figure 11.1 Techniques for surface and nano analysis. AES, Auger electron spectroscopy; AFM, atomic force microscopy; DESI, desorption electrospray ionization; dSIMS, dynamic SIMS (secondary ion mass spectrometry); EPMA, electron probe microanalysis; G-SIMS, gentle SIMS; μTA, micro thermal analysis; SNOM, scanning near-field optical microscopy; STM, scanning tunneling microscopy; TERS, tip-enhanced Raman spectroscopy; TEM, transmission electron microscopy; PEELS, parallel electron energy–loss spectrometry; XPS, X-ray photoelectron spectroscopy. Bulk analysis terms: EI-MS, electron ionization mass spectrometry; ICP, inductively-coupled plasma. Figure reproduced from C. Minelli and C.A. Clifford, The role of metrology and the UK National Physical Laboratory in nanotechnology. Nanotechnol. Perceptions 8 (2012) 59–75, with permission of Collegium Basilea.

11.4 The Nanometrology Instrument Industry

The primary products included in this category are the scanning probe microscopes that are indispensable for observing processes at the nanoscale, and which may even be used to assemble prototypes. The market is, however, still relatively small in value—estimated at around $250 million per annum for the whole world. This represents about a quarter of the total microscope market. Optical microscopes have a similar share (but are presently declining, whereas scanning probe microscopes are increasing), and electron microscopes have half the global market.

Given the growing importance of metrology for the rational development of nanoproducts (see Section 11.3), this sector must inevitably grow and become more diverse. While the first instruments of a new type are built in the laboratories of their inventors, once it becomes apparent that there is widespread demand for them, commercial instrument manufacturers quickly move to provide user-friendly versions (a very striking development with scanning tunneling and atomic force microscopes, both of which were typically constructed in the laboratories of the institutions wishing to use them in the years immediately following their invention).

At the other end of the size spectrum are telescopes, looking at very large and very distant objects. New generations of space telescopes and terrestrial telescopes used for astronomy require optical components (especially mirrors) finished to nanoscale precision. The current concept for very large terrestrial telescopes is to segment the mirrors into a large number of unique pieces (of the order of one square meter in area), the surface of each of which must be finished to nanoscale precision [8].

References

[1] The reader should bear in mind that even where there is a standard specification, changing the supplier (of a traditional material) is known often to cause problems in production; sometimes even a new batch from the same supplier will cause problems. At root, this is due to the material being underspecified. In complex technologies it might, however, take an unreasonable amount of work to determine an adequately comprehensive list of parameters that the material must satisfy.

[2] One can see a similar effect operating in Switzerland, and in some other countries, which have voluntarily decided to adopt certain European Union laws to facilitate trade between their country and the EU. The only negative aspect of standardization is that it limits diversity. In the engineering world new ideas are, however, constantly emerging from individuals and research institutes, ensuring the constant influx of diversity. There is, however, danger in standardization in the living world. For example, Turkey has voluntarily replaced almost all of its indigenous varieties of tomatoes with varieties approved by the European Union, to the detriment of taste and, very likely, nutritional quality. Although the displaced varieties can be preserved in seed banks, in effect they have become extinct and experience shows how difficult it is to recreate lost diversity in agriculture.

[3] The ISO/TS 80004 multipart series.

[4] A. Cuenat, et al., Quantitative nanoscale surface voltage measurement on organic semiconductor blends, Nanotechnology 2012;23:045703.

[5] R. Nicholson, et al., Electrocatalytic activity mapping of model fuel cell catalyst films using scanning electrochemical microscopy, Electrochim. Acta 2009;54:4525–4533.

[6] T. Burnett, et al., Mapping of local electrical properties in epitaxial graphene using electrostatic force microscopy, Nano Lett. 2011;11:2324–2328.

[7] A. Tzalenchuk, et al., Towards a quantum resistance standard based on epitaxial graphene, Nature Nanotechnol. 2010;5:186–189.

[8] P. Shore, Ultra precision surfaces, Proc. ASPE. Portland, Oregon. 2008:75–80.

Further Reading

[9] P. Hatto, Standardization for nanotechnology, Nanotechnol. Percept. 2007;3:123–130.

[10] P.W. Hawkes, From fluorescent patch to picoscopy, one strand in the history of the electron, Nanotechnol. Percept. 2011;7:3–20.

[11] R.K. Leach, Fundamental Principles of Engineering Nanometrology. Amsterdam: Elsevier; 2010.

[12] G.N. Peggs, Measurement in the nanoworld, Nanotechnol. Percept. 2005;1:18–23.

[13] J.J. Ramsden, R. Horvath, Optical biosensors for cell adhesion, J. Recept. Signal Transduct. Res. 2009;29:211–223. Describes techniques for analysing the evanescent optical fields created at the surface of planar optical waveguides functioning as substrata for living cells and biomolecular thin films, as an alternative to the traditional methods of optical microscopy for investigating biological objects.
