Chapter 1

What is Nanotechnology?

Abstract

Nanotechnology is defined as “the design, characterization, production, and application of materials, devices, and systems by controlling shape and size at the nanoscale.” In other words, it is both a process, called nanofacture or ultraprecision engineering, and a class of materials. The nanoscale consensually means the range 1–100 nm; nanomaterials are structured and/or sized in this range. Nanotechnology is sometimes also called atomically precise engineering, but techniques processing single atoms in the manner of additive manufacturing with powders (called “productive nanosystems”) are still inchoate and do not yet constitute a universal manufacturing technology. Some nano-objects are used directly in certain applications, such as the delivery of medicinal drugs to internal targets in the human body. In so far as the feature sizes of integrated circuits are now in the range of tens of nanometers, the applications of such circuits, for example in cellphones, rank as indirect nanotechnology. The creation of novelty is also usually appended to the definition of nanotechnology.

Keywords

Nanomaterial; Nano-object; Nanostructured material; Nanodevice; Nanofacture; Nanometrology; Nanoscience; Definitions

In the heady days of any new, emerging technology, definitions tend to abound and are first documented in reports and journal publications, then slowly get into books and are finally taken up by dictionaries, which do not, however, prescribe but merely record usage. Ultimately the technology will attract the attention of the International Organization for Standardization (ISO), which may in due course issue a technical specification (TS) prescribing in an unambiguous manner the terminology of the field, which is clearly an essential prerequisite for the formulation of manufacturing standards, the next step in the process.

In this regard, nanotechnology is no different, except that it seems to be arriving rather faster than technologies familiar from the past, such as steam engines, telephones, and digital computers, did. As a reflexion of the rapidity of this arrival, in 2005 the ISO set up a Technical Committee (TC 229) devoted to nanotechnologies. Thus, unprecedentedly in the history of the ISO, we have technical specifications in advance of the emergence of a significant industrial sector.

The work of TC 229 is not yet complete, however, hence we shall have to make our own attempt to find a consensus definition. As a start, let us look at the roots of the technology. They are widely attributed to Richard Feynman, who in a now famous lecture at Caltech in 1959 [1] advocated manufacturing things at the smallest possible scale, namely atom-by-atom—hence the prefix “nano”, atoms typically being a few tenths of a nanometer (10⁻⁹ m) in size. He was clearly envisaging a manufacturing technology, but from his lecture we also have glimpses of a novel viewpoint, namely that of looking at things at the atomic scale—not only artifacts fashioned by human ingenuity, but also the minute molecular machines grown inside living cells.

1.1 Nanotechnology as Process

We see nanotechnology as looking at things—measuring, describing, characterizing and quantifying them, and ultimately reaching a deeper assessment of their place in the universe. It is also making things. The manufacturing aspect was evidently very much in the mind of the actual inventor of the term “nanotechnology”, Norio Taniguchi from the University of Tokyo, who considered it as the inevitable and ultimate consequence of steadily (exponentially) improving engineering precision (Figure 1.1) [2]. Clearly, the surface finish of a workpiece achieved by grinding cannot be less rough than atomic roughness, hence nanotechnology must be the endpoint of ultraprecision engineering.

Figure 1.1 The evolution of machining accuracy (after Norio Taniguchi).
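To make the exponential character of the trend concrete, the short sketch below extrapolates an assumed Taniguchi-style improvement curve to estimate when ultraprecision machining reaches atomic roughness. The starting accuracy, rate of improvement, and dates are illustrative assumptions made for the sake of the arithmetic, not Taniguchi's actual data.

```python
# Illustrative only: an assumed exponential improvement in achievable machining
# accuracy, in the spirit of the trend shown in Figure 1.1. Every number below is
# an assumption chosen for illustration, not Taniguchi's measured data.
import math

accuracy_1940_nm = 100.0     # assumed ultraprecision machining accuracy in 1940 (nm)
years_per_tenfold = 30.0     # assumed: one order of magnitude gained every 30 years
atomic_roughness_nm = 0.3    # approximate atomic size, the natural endpoint

def accuracy(year: float) -> float:
    """Assumed exponential trend, anchored at 1940."""
    return accuracy_1940_nm * 10.0 ** (-(year - 1940.0) / years_per_tenfold)

print(f"Accuracy on this assumed trend in 2000: {accuracy(2000):.1f} nm")

years_needed = years_per_tenfold * math.log10(accuracy_1940_nm / atomic_roughness_nm)
print(f"Atomic roughness (~0.3 nm) reached around {1940 + years_needed:.0f}")
```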

At the same time, improvements in metrology had reached the point where individual atoms at the surface of a piece of material could be imaged, hence visualized on a screen. The possibility was of course already inherent in electron ultramicroscopy, which was invented in the 1930s [4], but numerous incremental technical improvements were needed before atomic resolution became attainable. Another development was the invention of the “Topografiner” by scientists at the US National Bureau of Standards [5], which produced a map of topography at the nanoscale by raster scanning a needle over the surface of the sample. Roughly a decade later, the concept was developed into the scanning tunneling microscope (STM), and in turn into the atomic force microscope (AFM), which is now seen as the epitome of nanometrology (collectively, these instruments are known as scanning probe microscopes, SPMs). Hence a little more than 10 years after Feynman's lecture, advances in instrumentation already allowed one to view the hitherto invisible world of the nanoscale in a very graphic fashion. There is a strong appeal in having a small, desktop instrument (such as the AFM) able to probe matter at the atomic scale, which contrasts strongly with the bulk of traditional high-resolution instruments such as the electron microscope, which needs at least a fair-sized room to house it and its attendant services. Every nanotechnologist should have an SPM in his or her study!
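The principle these instruments share is simple: sweep the tip line by line across the surface and record a height (or tip-sample signal) at every point of a regular grid. The sketch below is a minimal illustration of that raster-scanning idea; the surface function is a purely synthetic stand-in, not a model of any real instrument or sample.

```python
# A minimal sketch of the raster-scanning principle behind the topografiner and its
# successors: visit every point of a regular grid, line by line, and record a height.
# The surface function is a purely synthetic stand-in for the measured signal.
import numpy as np

def synthetic_surface(x_nm: float, y_nm: float) -> float:
    """Stand-in for the measured tip-sample signal; returns a height in nm."""
    return 0.2 * np.sin(x_nm / 2.0) + 0.1 * np.cos(y_nm / 3.0)

def raster_scan(size_nm: float = 100.0, pixels: int = 128) -> np.ndarray:
    """Build up a topography image by visiting every point of a regular grid."""
    coords = np.linspace(0.0, size_nm, pixels)
    image = np.zeros((pixels, pixels))
    for i, y in enumerate(coords):        # slow scan axis
        for j, x in enumerate(coords):    # fast scan axis
            image[i, j] = synthetic_surface(x, y)
    return image

topography = raster_scan()
print(topography.shape, float(topography.min()), float(topography.max()))
```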

In parallel, people were also thinking about how atom-by-atom assembly might be possible. Erstwhile Caltech colleagues recall Richard Feynman's dismay when William McLellan constructed a minute electric motor by hand-assembling the parts in the manner of a watchmaker, thereby winning the prize Feynman had offered for the first person to create an electric motor smaller than 1/64th of an inch. Although this is still how nanoscale artifacts are made (but perhaps not for much longer), Feynman's concept was of machines making progressively smaller machines, ultimately small enough to manipulate atoms and assemble things at that scale. An indefatigable subsequent champion of that concept has been Eric Drexler, who developed the concept of the assembler, a tiny machine programmed to build objects atom-by-atom. It was an obvious corollary of the minute size of an assembler that, in order to make anything of a size useful for humans, or in useful numbers, there would have to be a great many assemblers working in parallel. Hence, the first task of an assembler would be to build copies of itself, after which the copies would be set to perform more general assembly tasks. This conjuncture of very small size and very large numbers, which are the results, respectively, of nanification and vastification, will often be encountered in nanotechnology.
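A back-of-the-envelope calculation shows why self-replication makes vastification plausible: if every assembler first builds one copy of itself, the population doubles with each replication cycle, so astronomically large numbers are reached after a modest number of cycles. The target figure below (of the order of a mole of assemblers) is an arbitrary choice for illustration.

```python
# Back-of-the-envelope "vastification": if each assembler first builds a copy of
# itself, the population doubles every replication cycle. The target number (of the
# order of a mole of assemblers) is an arbitrary figure chosen for illustration.
import math

target_number = 6.0e23                      # roughly a mole of assemblers (illustrative)
doublings = math.ceil(math.log2(target_number))
print(f"{doublings} doubling cycles suffice to exceed {target_number:.1e} assemblers")
# About 79 cycles: exponential self-replication is what makes the scheme conceivable.
```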

This program received a significant boost when it was realized that the scanning probe microscope (SPM) could be used not only to determine nanoscale topography, but also as an assembler. IBM researchers iconically demonstrated this application of the SPM by creating the logo of the company in xenon atoms on a nickel surface at 4 K: the tip of the SPM was used to push 35 individual atoms into position [6]. Given that the assembly of the atoms in two dimensions took almost 24 h of laborious manual manipulation, few people associated the feat with steps on the road to molecular manufacturing. Indeed, since then further progress in realizing an assembler has been painstakingly slow [7], the next milestone being Oyabu et al.'s demonstration of picking up (abstracting) a silicon atom from a silicon surface and placing it somewhere else on the same surface, and then carrying out the reverse operation [8]. These demonstrations were sufficiently encouraging to stimulate the very necessary parallel work to automate the process of pick-and-place synthesis [9]. Without computer-controlled automation, atom-by-atom assembly could never evolve to become an industrially significant process.
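The scale of the challenge is easy to quantify. The rough estimate below compares the demonstrated rate (a few tens of atoms positioned in about a day by a single tip) with the number of atoms in even a microgram of material; the specific figures are order-of-magnitude assumptions for illustration only.

```python
# An order-of-magnitude estimate of why automation and massive parallelism are
# indispensable: at the demonstrated rate, how long would a single tip need to place
# one microgram of carbon atom-by-atom? All figures are rough assumptions.
atoms_placed = 35                      # atoms positioned in the 1990 demonstration
time_taken_s = 24 * 3600               # roughly one day
rate_atoms_per_s = atoms_placed / time_taken_s

avogadro = 6.022e23
atoms_per_microgram_carbon = 1e-6 / 12.0 * avogadro   # ~5e16 atoms in 1 microgram

seconds_needed = atoms_per_microgram_carbon / rate_atoms_per_s
years_needed = seconds_needed / (3600 * 24 * 365)
print(f"{rate_atoms_per_s:.1e} atoms/s per tip; ~{years_needed:.1e} years per microgram")
# Of the order of 10^12 years: serial, manual pick-and-place can never scale.
```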

Meanwhile, following on in the spirit of Taniguchi, semiconductor processing—the sequences of material deposition and etching through special masks used to create electronic components, i.e. integrated circuits [10]—was steadily reducing the feature sizes that could be achieved, well below the threshold of 100 nm that is usually considered to constitute the upper boundary of the nano realm (the lower boundary being about 0.1 nm, the size of atoms). At the same time, the desire to fill a wafer led to an enormous increase in the number of components, in accordance with Moore's law. Nevertheless, frustration at being unable to apply “top–down” processing methods to achieve feature sizes at the truly atomic scale (i.e., of the order of 1 nm), or even in the tens of nanometers range (although this has now been achieved by the semiconductor industry [11]), stimulated the development of “bottom–up” or self-assembly methods. These are inspired by the ability of randomly ordered structures, or mixtures of components, to form definite structures in biology. Well-known examples are proteins (merely upon cooling, a random polypeptide coil of a given sequence of amino acids will adopt a definite structure), the ribosome, and bacteriophage viruses: a stirred mixture of the constituent components will spontaneously assemble into a functional final structure [12].

At present, a plethora of ingeniously synthesized organic and organometallic compounds capable of spontaneously connecting themselves to form definite structures are available. Very often these follow the hierarchical sequence delineated by A.I. Kitaigorodskii as a guide to the crystallization of organic molecules (the Kitaigorodskii Aufbau Principle, KAP)—the individual molecules first form rods, the rods bundle to form plates, and the plates stack to form a three-dimensional space-filling object. Exemplars in nature include glucose polymerizing to form cellulose molecules, which are bundled to form fibrils, which in turn are stacked and glued with lignin to create wood. Incidentally, this field already had a life of its own, as supramolecular chemistry [13], before nanotechnology focused interest on self-assembly processes.

Molecular manufacturing, the sequences of pick-and-place operations carried out by assemblers, fits in somewhere between these two extremes. Insofar as a minute object is assembled from individual atoms, it might be called “bottom–up”. On the other hand, insofar as atoms are selected and positioned by a larger tool, it could well be called “top–down”. Hence it is sometimes called “bottom-to-bottom”. Figure 1.2 summarizes the different approaches to nanofacture (nanomanufacture).

Figure 1.2 Different modes of nanomanufacture (nanofacture). “Pick-and-place” assembly is also known as “bottom-to-bottom” or “mechanosynthesis” (the latter term can also mean synthesis facilitated by grinding).

1.2 Nanotechnology as Materials

The above illustrates an early preoccupation with nanotechnology as process—a way of making things. Before the semiconductor processing industry reduced the feature sizes of integrated circuit components to less than 100 nm [14], however, there was no real industrial example of nanotechnology at work. On the other hand, while process—top–down and bottom–up, and we include metrology here—is clearly one way of thinking about nanotechnology, there is already a sizable industry involved in making very fine particles, which, because their size is less than 100 nm, can legitimately be called nanoparticles. Generalizing, a nano-object is something with at least one spatial (Euclidean) dimension less than 100 nm; from this definition are derived those for nanoplates (one dimension less than 100 nm), nanofibers (two dimensions less than 100 nm), and nanoparticles (all three dimensions less than 100 nm); nanofibers are in turn subdivided into nanotubes (hollow fibers), nanorods (rigid fibers), and nanowires (conducting fibers).
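These definitions amount to a simple counting rule: how many of an object's three external dimensions lie below the nanoscale threshold. The sketch below encodes that rule; the 100 nm cutoff follows the consensus definition used here, while the helper function and example dimensions are purely illustrative.

```python
# A small helper encoding the counting rule described above: a nano-object has at
# least one external dimension below 100 nm, and the subclass depends on how many of
# its three dimensions fall below that threshold. Illustrative only; not the wording
# of any standard.
NANOSCALE_UPPER_NM = 100.0

def classify(dims_nm):
    """Classify an object by how many of its three dimensions are below 100 nm."""
    n_nano = sum(1 for d in dims_nm if d < NANOSCALE_UPPER_NM)
    return {
        0: "not a nano-object",
        1: "nanoplate (one nanoscale dimension)",
        2: "nanofiber (two nanoscale dimensions): nanotube, nanorod or nanowire",
        3: "nanoparticle (all three dimensions nanoscale)",
    }[n_nano]

print(classify((5.0, 2000.0, 2000.0)))   # a 5 nm thick plate
print(classify((20.0, 20.0, 1.0e6)))     # a long fiber 20 nm in diameter
print(classify((50.0, 50.0, 50.0)))      # a 50 nm particle
```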

Although nanoparticles of many different kinds of materials have been made for hundreds of years, one nanomaterial stands out as being rightfully so named, because it was discovered and nanoscopically characterized in the nanotechnology era: graphene and its compactified forms, namely carbon nanotubes (Figure 1.3) and fullerenes (nanoparticles).

Figure 1.3 Scanning electron micrographs of carbon nanotubes grown on the surface of a carbon fiber using thermal chemical vapor deposition. The right-hand image is an enlargement of the surface of the fiber, showing the nanotubes in more detail. Reprinted from B.O. Boscovic, Carbon nanotubes and nanofibres. Nanotechnol. Perceptions 3 (2007) 141–158, with permission from Collegium Basilea.

A very important application of nanofibers and nanoparticles is in nanocomposites: the nano-objects are added to and dispersed in a matrix, as described in more detail in Chapter 5.

1.3 Nanotechnology as Devices and Systems

One problem with associating nanotechnology exclusively with materials is that nanoparticles were deliberately made for various esthetic, technological and medical applications at least 500 years ago, and one would therefore be compelled to say that nanotechnology began then. To avoid that problem, materials are generally grouped with other entities along an axis of increasing complexity, encompassing successively devices and systems. These are not core terms in the ISO 80004 vocabulary, hence some ambiguity surrounds their use: is a nanodevice, or nanomachine, defined as an automaton (i.e., an information processor) whose overall size lies within the nanoscale (e.g., a responsive or “smart” nano-object), or does it merely contain nanosized components? Furthermore, a device might well be a system (of components) in a formal sense; it is not generally clear what usage is intended by specifying “nanosystem” as distinct from a device. Since devices are obviously made from materials, the latter might be considered the more basic category; on the other hand, the functional equivalent of a particular device could be realized in different ways, using different materials. Devices and systems belong to a common category, with their complexity as a feature differentiating between them [16]. These concrete concepts of processes, materials, devices, and systems can be organized into a formal concept system, or ontology, as illustrated in Figure 1.4.

Figure 1.4 A concept system (ontology) for nanotechnology. Most of the terms would normally be prefixed by “nano” (e.g., nanometrology, nanodevice). A dashed line signifies that if the superordinate concept contributes, then the prefix must indicate that (e.g., bionanodevice, bionanosystem). Biology may also have some input to nanomanufacture (nanofacture), inspiring, especially, self-assembly processes. Reproduced from J.J. Ramsden, Towards a concept system for nanotechnology. Nanotechnol. Perceptions 5 (2009) 187–189, with permission of Collegium Basilea.

1.4 Direct, Indirect, and Conceptual Nanotechnology

Another axis along which nanotechnology can be displayed, orthogonal to the materials, devices and systems axis, distinguishes direct, indirect and conceptual aspects. Direct nanotechnology refers to nanosized objects used directly in an application—a responsive nanoparticle used to deliver drugs to an internal target in the human body is an example. Indirect nanotechnology refers to a (probably miniature) device that contains nanoscale components, possibly along with other microscale or macroscale components and systems. An example is a cellphone. The internal nanodevice is the “chip”—the integrated electronic information processor circuits with feature sizes less than 100 nm. All the uses to which the cellphone might be put would then rank as indirect nanotechnology. Given the extent of contemporary society's dependence on information processing, nanotechnology is truly pervasive from this viewpoint alone. It is, of course, the very great processing power enabled by the vast number of components on a small chip, and the relatively low cost (arising for the same reason), both of which increasingly rely on nanotechnology for their realization, that make the “micro” processor ubiquitous.

The indirect role of nanotechnology emphasizes its nature as an enabling technology. The pervasive presence of computing is enabled by nanotechnology (were processors bigger, they would not be anything like as ubiquitous as they are).

Conceptual nanotechnology refers to considering materials and systems (or, even more generally, “phenomena”) from the “nano” viewpoint—such as trying to understand a structure, or the mechanism of a process, at the atomic scale. Thus, as an example, molecular medicine [17], which attempts to explain diseases by the actions of molecules, is classified as conceptual nanotechnology. Conceptual nanotechnology also refers to the creation of novelty, which is exemplified by:

  •  Objects with no bulk counterpart (e.g., graphene, carbon nanotubes, and fullerenes);
  •  Objects acquiring new functionality because of their nano-enabled ultrasmall size or cost (the latter implying the possibility of vast numbers), e.g., cellphones;
  •  Enhanced performance, including the achievement of novel combinations of properties by bringing nanoscale components together in intimate juxtaposition.

1.5 Nanobiotechnology and Bionanotechnology

These widely used terms are almost self-explanatory. Nanobiotechnology is the application of nanotechnology to biology. For example, the use of semiconductor quantum dots as biomarkers in cell biology research ranks as nanobiotechnology, which encompasses “nanomedicine”, defined as the application of nanotechnology to human health.

Bionanotechnology is the application of biology—which could be a living cell, or a biomolecule—to nanotechnology. An example is the use of the protein bacteriorhodopsin as an optically switched optical (nanophotonic) switch.

These terms also exemplify the convergence outlined in Section 4.5.

1.6 Nanotechnology—Toward a Definition

The current dictionary definition of nanotechnology is “the design, characterization, production and application of materials, devices and systems by controlling shape and size at the nanoscale” [18]. (The nanoscale itself is at present consensually considered to cover the range from about 1 to 100 nm—see Section 1.7 and [14].) A slightly different nuance is given by the same source as “the deliberate and controlled manipulation, precision placement, measurement, modeling, and production of matter at the nanoscale in order to create materials, devices, and systems with fundamentally new properties and functions”. The International Organization for Standardization (ISO) also gives two meanings: (1) understanding and control of matter and processes at the nanoscale, typically, but not exclusively, below 100 nm in one or more dimensions where the onset of size-dependent phenomena usually enables novel applications; and (2) utilizing the properties of nanoscale materials that differ from the properties of both individual atoms and molecules, and bulk matter, in order to create improved materials, devices, and systems that exploit these new properties.

Another formulation encountered in reports is “the design, synthesis, characterization and application of materials, devices, and systems that have a functional organization in at least one dimension on the nanometer scale”. The US Foresight Institute gives: “Nanotechnology is a group of emerging technologies in which the structure of matter is controlled at the nanometer scale to produce novel materials and devices that have useful and unique properties.” The emphasis on control is particularly important: it is this that distinguishes nanotechnology from chemistry, with which it is often compared; in the latter, motion is essentially uncontrolled and random, within the constraint that it takes place on the potential energy surface of the atoms and molecules under consideration. In order to achieve the desired control, a special, nonrandom, eutactic environment needs to be available or created. Reflecting the importance of control, a very succinct definition of nanotechnology is simply “engineering with atomic precision”; sometimes the phrase “atomically precise technologies” (APT) is used to denote nanotechnology. The “fundamentally new (or unique) properties” and “novel” aspects that many nanotechnologists insist upon imply the exclusion of ancient or existing artifacts that happen to be small.

A good example of novelty is the emergence of superparamagnetism in small particles of materials that are ferromagnetic in the bulk [15]. This phenomenon was discovered in the 1930s but at that time the size of the particles could not be precisely determined; the definitive demonstration dates from the late 1950s (around the time of Richard Feynman's lecture [1]).

The production of materials in the nanoscale (nanofacture) implies the ability to measure in that scale (nanometrology) [3].

In summary, nanotechnology has three aspects:

  1.  A universal fabrication procedure (which bears the same relation to classical engineering as “pointillisme” does to classical painting), with a precision of around 1 nm at present, rather than truly atomic precision;
  2.  A particular way of conceiving, designing, measuring, and modeling materials, devices, and systems, which have features in the nanoscale;
  3.  The creation of novelty: new features that appear uniquely in the nanoscale.

To be intelligible as a definition, these aspects require the auxiliary definition of the nanoscale.

1.7 The Nanoscale

Any definition of nanotechnology must also incorporate, or refer to, a definition of the nanoscale. As yet, there is no formal definition with a rational basis, merely a working proposal. If nanotechnology and nanoscience regard the atom (with size of the order of 1 Å, i.e. 0.1 nm) as the smallest indivisible entity, this forms a natural lower boundary to the nanoscale. The upper boundary is fixed more arbitrarily. By analogy with microtechnology, now a well-established field dealing with devices up to about 100 μm in size, one could provisionally fix the upper boundary of nanotechnology as 100 nm. There is no guarantee, however, that unique properties appear below that boundary [14].

The advent of nanotechnology raises an interesting question about the definition of the prefix “micro”. An optical microscope can resolve features of the order of 1 μm in size. It is really a misnomer to also refer to instruments such as the electron microscope and the scanning probe microscope as “microscopes”, because they can resolve features at the nanometer scale. It would be more logical to rename these key nanometrology instruments electron nanoscopes and scanning probe nanoscopes—although the word “microscope” is perhaps already too deeply entrenched for a change to be accepted. As a compromise, the term “ultramicroscope” could be used: it is already known within the community of electron microscopists; the pioneers in the field, especially in Germany, almost always referred to “ultramicroscopy” [4].

In summary, the nanoscale is defined in three ways:

  1.  The scale in which novel (with respect to bulk matter and isolated atoms) features appear in condensed matter;
  2.  The practical limit of ultraprecision engineering;
  3.  A consensus of 1–100 nm.

1.8 Nanoscience

This term is sometimes defined as “the science underlying nanotechnology”—but is this not biology, chemistry and physics—or the “molecular sciences”? It is the technology of designing and making functional objects at the nanoscale that is new; science has long been working at this scale and below. No one is arguing that fundamentally new physics, in the sense of new elementary forces, for example, appears at the nanoscale; rather it is new combinations of phenomena manifesting themselves at that scale that constitute the new technology. The term “nanoscience” therefore appears to be superfluous if it is used in the sense of “the science underlying nanotechnology”, although as a synonym of conceptual nanotechnology it might have a valid meaning as the science of mesoscale approximation.

The molecular sciences include the phenomena of life (biology), which do indeed emerge at the nanoscale (although, again, without requiring new elementary laws as far as we are presently aware).

References

[1] R.P. Feynman, There's plenty of room at the bottom, H.D. Gilbert, ed. Miniaturization. New York: Reinhold; 1961:282–296.

[2] N. Taniguchi, On the basic concept of nano-technology, Proc. Intl Conf. Prod. Engng Tokyo, Part II. Japan Society of Precision Engineering; 1974:18–23.

[3] G.N. Peggs, Measurement in the nanoworld, Nanotechnol. Percept. 2005;1:18–23.

[4] P.W. Hawkes, From fluorescent patch to picoscopy, one strand in the history of the electron, Nanotechnol. Percept. 2011;7:3–20.

[5] R. Young, et al., The topografiner: an instrument for measuring surface microtopography, Rev. Sci. Instrum. 1972;43:999–1011.

[6] E.K. Schweizer, D.M. Eigler, Positioning single atoms with a scanning tunnelling microscope, Nature (London) 1990;344:524–526.

[7] Apart from intensive activity in numerically simulating the steps of molecular manufacturing—e.g. B. Temelso, C.D. Sherrill, R.C. Merkle, R.A. Freitas Jr., Ab initio thermochemistry of the hydrogenation of hydrocarbon radicals using silicon-, germanium-, tin-, and lead-substituted methane and isobutene, J. Phys. Chem. A 2007;111:8677–8688.

[8] N. Oyabu, et al., Mechanical vertical manipulation of selected single atoms by soft nanoindentation using near contact atomic force microscopy, Phys. Rev. Lett. 2003;90, 176102.

[9] D.Q. Ly, et al., The matter compiler—towards atomically precise engineering and manufacture, Nanotechnol. Percept. 2011;7:199–217; R.A.J. Woolley, et al., Automated probe microscopy via evolutionary optimization at the atomic scale, Appl. Phys. Lett. 2011;98, 253104.

[10] A.G. Mamalis, A. Markopoulos, D.E. Manolakos, Micro and nanoprocessing techniques and applications, Nanotechnol. Percept. 2005;1:63–73.

[11] International Technology Roadmap for Semiconductors (ITRS). The current (2013) edition of the ITRS looks as far ahead as 2028.

[12] E. Kellenberger, Assembly in biological systems, Polymerization in Biological Systems, CIBA Foundation Symposium 7 (new series). Amsterdam: Elsevier; 1972.

[13] P.A. Gale, J.W. Steed, eds. Supramolecular Chemistry: From Molecules to Nanomaterials. Chichester: Wiley; 2012 (especially vol. 6: Supramolecular Materials Chemistry).

[14] This is a provisional upper limit of the nanoscale. More careful consideration suggests that the nanoscale is, in fact, property-dependent. See J.J. Ramsden, J. Freeman, The nanoscale, Nanotechnol. Percept. 2009;5:3–26.

[15] R. Kaiser, G. Miskolczy, Magnetic properties of stable dispersions of subdomain magnetite particles, J. Appl. Phys. 1970;41:1064–1072.

[16] For an interesting approach to quantifying the complexity, see J.W. Senders, On the complexity of medical devices and systems, Qual. Saf. Health Care 2006;15(Suppl I):i41–i43.

[17] This could also be called nanomedicine, but that term is usually used to signify the use of nano-objects to deliver drugs and the like (see Chapter 6).

[18] E. Abad, et al., NanoDictionary. Basel: Collegium Basilea; 2005.

Further Reading

[19] K.E. Drexler, Engines of Creation. New York: Anchor Books/Doubleday; 1986.

[20] J.J. Ramsden, What is nanotechnology? Nanotechnol. Percept. 2005;1:3–17.

[21] J.J. Ramsden, Nanotechnology: An Introduction. 2nd edition Amsterdam: Elsevier; 2016.

[22] C.W. Shong, S.C. Haur, A.T.S. Wee, Science at the Nanoscale. Singapore: Pan Stanford; 2010.
