Chapter 10

The Design of Nanotechnology Products

Abstract

The extreme smallness of nano-objects and nanodevices implies that a great many of them are needed for any practical application, which imposes great design challenges. In particular, it may be necessary to abandon the traditional engineering approach of explicitly designing each individual component in a system; instead, evolutionary design principles may need to be adopted. Clearly this is very successful in the living world (e.g., brain design). Since living cells are full of intricate nanostructures and nanodevices, they also serve as a direct inspiration for artificial materials and devices. Furthermore, like their living counterparts, nano-objects and nanodevices moving in liquid media are subjected to viscous rather than inertial forces. Another corollary of smallness is the immense surface:volume ratio compared with macroscopic systems, which means that interfacial physical chemistry becomes of vital importance when formulating nanocomposites. Materials selection, meanwhile, is confronted with a much greater range of potential choices, whose useful exploitation requires computer simulation together with high-throughput synthesis and measurement methods. Quality control, accustomed to traditional macroscopic approaches, requires a similar kind of rethinking.

Keywords

Biomimicry; Biomimetics; Evolutionary algorithms; Low Reynolds number

One of the difficulties faced by suppliers of any upstream technology is that they must ensure that its use is already envisaged in the design of the downstream products that will incorporate the technology. Apart from fulfilling technical specifications, esthetic design is one of the crucial factors determining the allure of almost any product (perhaps those destined for outer space are exceptions), but especially a consumer product. In this chapter we look at some peculiar features associated with the design of nanodevices, here defined as devices incorporating nanomaterials or of very small overall size.

10.1 The Challenge of Vastification

There is little point in making something very small if only a few of those things are required [1]. The interest in making a very large-scale integrated circuit with nanoscale components is rooted in the possibility of making vast numbers in parallel. Thus, the diameter of silicon wafers has grown from 4″ to 8″ to 12″, with 18″ wafers now envisaged.

Hence, although the most obvious consequence of nanotechnology is the creation of very small objects, an immediate corollary is that there must be a great many of these objects. If r is the relative device size and R the number of devices, then usefulness may require that rR ∼ 1, implying the need for making 10⁹ nanodevices at a stroke [2]. This corresponds to the number of components (with a minimum feature size of 45 nm) on a very large-scale integrated electronic processor or storage chip, for example. At present, all these components are explicitly designed and fabricated, albeit with the aid of computers in a subordinate rôle. But will this still be practicable if the number of components increases by a further two or more orders of magnitude?

10.2 Enhancing Traditional Design Routes

Regarding processor chips, which are presently the most vastified objects in the nano world, aspects requiring special attention are: power management, especially to control leakage; process variability, which may require a new conception of architectural features; and a systems-oriented approach, integrating functions and constraints, rather than considering the performance of individual transistors. Nevertheless, the basic framework remains the same.

Because we cannot give a clear affirmative answer to the question posed above, alternative routes to the design and fabrication of such vast numbers are being explored. The human brain serves as an inspiration. Its scale is far vaster than that of the integrated circuit: it has 10¹¹ neurons, and each neuron has hundreds or thousands of connections to other neurons. So vast is this complexity that there is insufficient information contained in our genes to specify all these interconnexions. We may therefore infer that our genes specify an algorithm for generating them [3].

In this spirit, evolutionary design principles may become essential for designing nanodevices. An example of an evolutionary design algorithm is shown in Figure 10.1. It might be initialized by a collection of existing designs, or guesses at possible new designs. Since new variety within the design population is generated randomly, the algorithm effectively expands the imagination of the human designer.

Figure 10.1 An evolutionary design algorithm. All relevant design features are encoded in the genome (in a very simple genome each gene is a single binary digit indicating absence (0) or presence (1) of a feature). The genomes are evaluated (“survivor selection strategy”)—this stage could include human (interactive) as well as automated evaluation—and only genomes fulfilling the evaluation criteria are retained. The diminished population is then expanded in numbers and in variety—typically the successful genomes are used as the basis for generating new ones via biologically-inspired processes such as recombination and mutation.

Although this strategy enables the design size (i.e., the number of individual features that must be explicitly specified) to be expanded practically without limit, one typically sacrifices knowledge of the exact internal workings of the device, introducing a level of unpredictability into device performance that may require a new engineering paradigm to be made acceptable.

Genetic algorithms [4] use bit strings to encode the target object. The structure of the genome is fixed in advance; only the combinations of presence and absence of individual features can be varied. In other words, the form of the solution is predetermined. For example, if the solution can be expressed as an equation, the coefficients evolve but not the form of the equation. More advanced algorithms relax these conditions; that is, the genome length can vary and additions and deletions are possible. These schemata are rather far from natural selection, and might best be described as artificial selection.
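As a concrete (if deliberately toy) illustration of the scheme of Figure 10.1, the following Python sketch evolves a fixed-length bit-string genome. The fitness function, population size and mutation rate are arbitrary placeholders introduced here for illustration only; in a real design problem the evaluation stage would be a simulation, a measurement, or an interactive human judgment.

```python
import random

# Minimal sketch of a fixed-length bit-string genetic algorithm.
# Each gene marks absence (0) or presence (1) of a design feature.
GENOME_LENGTH = 16
POPULATION_SIZE = 30
GENERATIONS = 50
MUTATION_RATE = 0.02

# Hypothetical target pattern standing in for "the design that evaluates best".
TARGET = [random.randint(0, 1) for _ in range(GENOME_LENGTH)]

def fitness(genome):
    """Placeholder fitness: number of genes matching the target pattern."""
    return sum(g == t for g, t in zip(genome, TARGET))

def crossover(parent_a, parent_b):
    """Single-point recombination."""
    point = random.randrange(1, GENOME_LENGTH)
    return parent_a[:point] + parent_b[point:]

def mutate(genome):
    """Flip each gene with a small probability."""
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def evolve():
    # Initialize with random guesses (or existing designs, if available).
    population = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)]
                  for _ in range(POPULATION_SIZE)]
    for _ in range(GENERATIONS):
        # Survivor selection: keep the better half of the population.
        population.sort(key=fitness, reverse=True)
        survivors = population[:POPULATION_SIZE // 2]
        # Re-expand the population in numbers and variety by recombination and mutation.
        offspring = [mutate(crossover(random.choice(survivors),
                                      random.choice(survivors)))
                     for _ in range(POPULATION_SIZE - len(survivors))]
        population = survivors + offspring
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best genome:", best, "fitness:", fitness(best))
```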

Genetic programming [5] works at a higher level, in which the algorithm itself evolves. In other words, the form of the solution can evolve. Typically the solution is defined by trees of Lisp-like expressions, and changes can be made to any node of the tree. Genetic programming is closer to natural selection.
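A minimal sketch of the idea, written in Python rather than Lisp, is given below; the expression trees are represented as nested tuples, the mutation operator replaces a randomly chosen node by a freshly grown subtree, and the target function being regressed is purely illustrative.

```python
import random, operator

# Toy tree-based genetic programming: the *form* of the solution
# (an arithmetic expression tree) evolves, not just its coefficients.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}
TERMINALS = ["x", 1.0, 2.0]

def random_tree(depth=3):
    """Grow a random expression: (op, left, right) tuples or terminals."""
    if depth <= 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def mutate(tree, depth=3):
    """Replace a randomly chosen node by a freshly grown subtree."""
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(depth)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left, depth - 1), right)
    return (op, left, mutate(right, depth - 1))

def target(x):          # assumed target function, purely illustrative
    return x * x + x

def fitness(tree):
    """Negative squared error over sample points (higher is better)."""
    return -sum((evaluate(tree, x) - target(x)) ** 2 for x in range(-5, 6))

population = [random_tree() for _ in range(60)]
for generation in range(40):
    population.sort(key=fitness, reverse=True)
    survivors = population[:20]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

best = max(population, key=fitness)
print("best expression:", best, "error:", -fitness(best))
```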

Human knowledge can be captured not only in the design of the algorithms, but also by incorporating an interactive stage in the fitness evaluation [6]. Capturing such knowledge, typically contained only in the minds of long-serving practitioners, may be even more significant than capturing the knowledge that is written down. The invention of writing was, of course, a great advance for civilization, allowing a great deal of knowledge to be passed efficiently from one generation to another, which did not even need to directly follow the one that had written the knowledge down. As the historian Buckle has pointed out, however, writing also directly encourages the publication of falsehoods, and it is often only the expert practitioner who can clearly separate reliable from unreliable knowledge. This is currently a great problem for automated data mining. The sheer volume of written information now available in many fields has made some kind of automated retrieval essential. The efficiency of retrieval can be gauged by the length of the interval that must elapse before it becomes more cost-effective for an organization to repeat work than to retrieve the records of previous attempts.

10.3 Crowdsourcing

The stupendous increase in global connectivity between individuals has spawned a completely new paradigm. It is most obviously applicable to goods that can be transferred by the Internet, such as design and software, but since almost everything can be adequately encoded as a binary sequence of symbols (including the instructions to manufacture a physical object) it does not have any real limitation. “Outsourcing” is the distribution of defined tasks to people working in locations remote from that of the originator, but crowdsourcing goes well beyond that, because typically only the final goal is specified; the specifier hopes to receive a vast variety of ideas that will solve, or help to solve, the problem.

In principle anyone could solicit contributions; several organizations have been created to facilitate the process, giving more publicity to the demand and constructing a mechanism for rewarding contributors whose ideas are taken up. Ultimately that might be seen as superfluous. We have the examples of open source software (e.g., Linux) and the open source encyclopedia Wikipedia, in which contributions are provided gratis—just contrast the rise of the latter with the formerly mighty Encyclopaedia Britannica (which was, admittedly, already in decline since its great 11th edition published in 1910–11). This model does, however, engender other problems, not intrinsic to an open source project but certainly present in Wikipedia [7,8], which increasingly limit its usefulness. Productive Nanosystems (Section 18.1) will doubtless see a similar growth of open source instruction sets for nanofabricating artifacts.

A related development is “unsourcing”: reliance on contributions from Internet volunteers for technical support of proprietary products. The proprietor may have to provide some initial impetus by setting up an online community. Some companies have sought to incentivize participation by turning it into a game (“gamification”).

Clearly these developments have vast implications for the patent system for protecting intellectual property (cf. Section 12.9). It will, in fact, become quite redundant. Although it might be hard to envisage the dismantling of the huge and elaborate apparatus that has been built up around patents during the past few hundred years, society probably has to get used to the idea that it is going to fade away. As Benjamin Disraeli is said to have remarked, there is no copyright for ideas; the digital age (of which nanotechnology is the material embodiment) converts everything into ideas.

10.4 Materials Selection

Ashby has systematized materials selection through his property charts [9]. For example, Young's modulus E is plotted against density ρ for all known materials, ranging from weak light polymer foams to strong dense engineering alloys. The huge interest in nanomaterials is derived from the hope that it may be possible to populate empty regions on such charts, such as strong and light (currently natural woods are as close as we can get to this) or weak and dense (no known materials exist).
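By way of illustration, the short Python sketch below plots a handful of materials on such a chart and tabulates their specific stiffness E/ρ, the quantity that picks out the "strong and light" corner that nanomaterials are hoped to populate. The density and modulus figures are rough order-of-magnitude values quoted only to show the principle; a real selection exercise would draw on a materials database.

```python
import matplotlib.pyplot as plt

# Illustrative, order-of-magnitude property values (not authoritative data):
# density in kg/m^3, Young's modulus E in GPa.
materials = {
    "polymer foam":       (50,   0.01),
    "wood (along grain)": (600,  10),
    "engineering polymer": (1200, 3),
    "aluminium alloy":    (2700, 70),
    "steel":              (7850, 210),
}

fig, ax = plt.subplots()
for name, (rho, modulus) in materials.items():
    ax.loglog(rho, modulus, "o")
    ax.annotate(name, (rho, modulus), textcoords="offset points", xytext=(5, 5))
ax.set_xlabel("density ρ (kg m$^{-3}$)")
ax.set_ylabel("Young's modulus E (GPa)")
ax.set_title("Sketch of an Ashby property chart")
plt.show()

# Specific stiffness E/ρ highlights the sparsely populated "strong and light" region.
for name, (rho, modulus) in materials.items():
    print(f"{name:20s} E/ρ = {1e9 * modulus / rho:.2e} Pa·m³/kg")
```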

Material properties are only the first step. Shapability is also important, in ways that cannot be easily quantified. For example, rubber can readily be manufactured as a sealed tube in which form it can serve as a pneumatic tyre, but it is at risk from punctures, and a novel solid material may be useful, and more robust, for the same function. Finally, availability (including necessary human expertise) and cost—linked by the “laws” of supply and demand—must be taken into consideration. Nanotechnology, by allowing rapid material prototyping, should greatly enhance the real availability of novelty. An assembler should in principle allow any combination of atoms to be put together to create new materials.

The Materials Genome Initiative seeks to combine computational simulation, modeling, high-throughput combinatorial materials synthesis and rapid measurement techniques to vastly increase the speed of introduction of new materials onto the market [10].

10.5 Formulation

A significant practical difficulty that arises with nanified mixtures is dispersion. Nano-objects are highly prone to agglomeration or aggregation, and unless the primary objects are individually dispersed the benefits of nanification are lost.

The extremely high surface:volume ratio of nano-objects (mostly nanoparticles rather than nanorods or nanoplatelets) means that their surface energetic properties play an extremely important rôle. If the material can be characterized by its three single-substance surface tension components, γ^LW, γ^⊖ and γ^⊕, corresponding to the Lifshitz–van der Waals (LW) potential, the electron donor potential (⊖) and the electron acceptor potential (⊕), then the interfacial tension (adhesive potential) γ₁₂, where subscript 1 denotes the matrix and subscript 2 denotes the nano-object, is the sum of the Lifshitz–van der Waals and electron donor–acceptor (da) contributions [11]:

γ₁₂ = γ₁₂^(LW) + γ₁₂^(da).   (10.1)

The LW interfacial tension is:

γ₁₂^(LW) = (√γ₁^(LW) − √γ₂^(LW))²   (10.2)

and the da interfacial tension is:

γ₁₂^(da) = 2(√(γ₁^⊕ γ₁^⊖) + √(γ₂^⊕ γ₂^⊖) − √(γ₁^⊕ γ₂^⊖) − √(γ₁^⊖ γ₂^⊕)).   (10.3)

The single-substance surface tension components can be determined from measurements of the advancing contact angles of drops of three different liquids (water, dimethylsulfoxide and α-bromonaphthalene are quite suitable) on a flat surface of the pure material. This normally presents no problem for the matrix, but for the nano-objects, one can either prepare a film of them on a planar substrate or use bulk material and afterwards correct for the curvature [12]. The experiments are easy to carry out in the sense that no special equipment is required; one simply needs to know how to do them in the right way.
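In practice the three measured contact angles are converted into the three solid-surface components by solving a small linear system. The Python sketch below assumes the van Oss–Chaudhury–Good combining rule, (1 + cos θ)γ_L = 2(√(γ_S^LW γ_L^LW) + √(γ_S^⊕ γ_L^⊖) + √(γ_S^⊖ γ_L^⊕)), which is linear in the square roots of the unknown components; the probe-liquid parameters must be supplied from the literature and are deliberately left as placeholders rather than asserted here.

```python
import numpy as np

def solid_components(liquids, thetas_deg):
    """Solve for the solid's surface tension components from three contact angles.

    liquids: list of three dicts with keys 'gamma' (total surface tension),
             'LW', 'plus' (⊕) and 'minus' (⊖), all in mJ/m^2, taken from the literature.
    thetas_deg: the three measured advancing contact angles in degrees.
    Returns (γ_S^LW, γ_S^⊕, γ_S^⊖) in mJ/m^2.
    """
    A, b = [], []
    for liq, theta in zip(liquids, thetas_deg):
        # Unknown vector is (√γ_S^LW, √γ_S^⊕, √γ_S^⊖); each liquid gives one row.
        A.append([np.sqrt(liq["LW"]), np.sqrt(liq["minus"]), np.sqrt(liq["plus"])])
        b.append(0.5 * (1 + np.cos(np.radians(theta))) * liq["gamma"])
    roots = np.linalg.solve(np.array(A), np.array(b))
    return tuple(r ** 2 for r in roots)

# Example shape of a call (parameter values to be taken from the literature):
# liquids = [
#     {"gamma": ..., "LW": ..., "plus": ..., "minus": ...},  # water
#     {"gamma": ..., "LW": ..., "plus": ..., "minus": ...},  # dimethylsulfoxide
#     {"gamma": ..., "LW": ..., "plus": ..., "minus": ...},  # α-bromonaphthalene
# ]
# gamma_LW, gamma_plus, gamma_minus = solid_components(liquids, [theta1, theta2, theta3])
```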

The interaction energy ν of the dispersion is given by the well-known Bragg–Williams expression

ν = ν₁₁ + ν₂₂ − 2ν₁₂   (10.4)

where the three terms on the right-hand side are the interaction energies (enthalpies) for the matrix molecules (often an organic polymer) with themselves, the nano-objects with themselves, and the matrix with the nano-objects. ν₁₁ and ν₂₂ are the cohesive energies, related to the single-substance surface tensions by the Dupré law

ν₁₁ = 2γ₁A   (10.5)

where A is the interfacial area, and ν₁₂ is the adhesive energy, related to the single-substance surface tensions by

ν₁₂ = (γ₁ + γ₂ − γ₁₂)A.   (10.6)

The mixture is miscible (good dispersion) if ν < 0, and poorly dispersed (immiscible) otherwise. If the interaction energies all happen to be zero, there is still an entropic drive towards mixing.
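A minimal Python sketch of this prediction, assuming the relation γ = γ^LW + 2√(γ^⊕γ^⊖) usually employed in this formalism for the total single-substance surface tension, might look as follows; the inputs are the measured components of matrix (1) and nano-object (2), and only the sign of ν matters, so the interfacial area is set to unity.

```python
import math

def interfacial_tension(m1, m2):
    """Interfacial tension γ₁₂ from eqs. (10.1)-(10.3).
    m1, m2: dicts with keys 'LW', 'plus', 'minus' (mJ/m^2)."""
    lw = (math.sqrt(m1["LW"]) - math.sqrt(m2["LW"])) ** 2            # eq. (10.2)
    da = 2 * (math.sqrt(m1["plus"] * m1["minus"])
              + math.sqrt(m2["plus"] * m2["minus"])
              - math.sqrt(m1["plus"] * m2["minus"])
              - math.sqrt(m1["minus"] * m2["plus"]))                 # eq. (10.3)
    return lw + da                                                   # eq. (10.1)

def mixing_energy(m1, m2, area=1.0):
    """Bragg-Williams interaction energy ν per eqs. (10.4)-(10.6)."""
    # Total surface tensions, assuming γ = γ^LW + 2√(γ⊕γ⊖).
    g1 = m1["LW"] + 2 * math.sqrt(m1["plus"] * m1["minus"])
    g2 = m2["LW"] + 2 * math.sqrt(m2["plus"] * m2["minus"])
    g12 = interfacial_tension(m1, m2)
    nu11 = 2 * g1 * area                                             # eq. (10.5)
    nu22 = 2 * g2 * area
    nu12 = (g1 + g2 - g12) * area                                    # eq. (10.6)
    return nu11 + nu22 - 2 * nu12                                    # eq. (10.4)

def disperses(matrix, nano_object):
    """True if the nano-objects should disperse well in the matrix (ν < 0)."""
    return mixing_energy(matrix, nano_object) < 0
```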

If the result of this calculation shows that the proposed mixture will be poorly dispersed, but the matrix cannot be changed, then the nano-objects will have to be functionalized (Figure 6.1 shows an example).

Obtaining a satisfactory dispersion often requires laborious experiments. It may happen that even if ν in equation (10.4) is less than zero, the nano-objects are not dispersed. The problem may then lie in deficiencies in the mixing process (which aims to distribute the objects throughout the matrix) [13].

At present there is no clear demarcation between the manufacture of nano-objects and their end-use. For the downstream manufacturer of a finished product, for example one molded from a polymer, who wishes to incrementally improve the product by incorporating nano-objects into the polymer (e.g., to confer scratch resistance), it is likely to be convenient to obtain a masterbatch of the nano-objects properly dispersed at a high concentration, in pelleted form. The pellets then merely need to be well mixed at an appropriate concentration with pellets of the pure polymer prior to entering the shaping process. This requires the downstream manufacturer to release some proprietary information about the polymer to the nano-object manufacturer, who can then determine how to disperse the nano-objects properly in it. Alternatively, the downstream manufacturer can undertake this optimization itself, in which case the nanofacturer merely needs to deliver the nano-objects as a raw material in a safe and usable form, which may be as weakly bound agglomerates or even as a slurry in an easy-to-remove liquid; it is then up to the downstream manufacturer to undertake all the optimization work. A third alternative is to engage an independent laboratory to optimize the formulation. Since the problems of dispersal outlined in this section may lie within the domains of expertise of neither the nanofacturer nor the manufacturer, the last alternative may be the most satisfactory.

10.6 Quality Control

It is perhaps a reflexion of the immaturity of the nanotechnology industry that very little attention has hitherto been given to the control of the quality of nanofactured materials and devices. Since most materials (most in terms of different kinds, rather than most in quantity) are not made according to any particular standard, quality control might in any case be somewhat nugatory. Many nanoparticles offered for sale are merely characterized by “average particle size” (APS) and “specific surface area” (SSA), usually without specifying the methods. The problems are even greater when nano products are applied to surfaces in their final locations (e.g., a hospital operating theater). Clearly the challenge is partly one of nanometrology. It is well known that the many different methods for measuring APS yield different results when applied to a given sample—and the differences are usually significant with respect to the measurement uncertainty. One hesitates to assign the differences to systematic errors; the problem is rather that the same quantity is not being measured, and the mistake is to call all the quantities “APS”. It is not usually possible to measure each nano-object individually, hence careful consideration needs to be given to the statistics of any quality control measurements. It may well be that a functional quality control test is more useful than a test of the usually more directly accessible physical and chemical parameters. In branches of the chemical industry using complex raw materials, it is well known that different batches of nominally the same material give different functional results even though their physical and chemical parameters are the same at the customary level of resolution.
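The point that different "APS" figures are not the same quantity can be made with a few lines of Python: from a single (simulated, purely illustrative) particle size distribution, the number-weighted mean and the volume-weighted mean already differ by far more than the statistical uncertainty of either, before any measurement error is even considered.

```python
import numpy as np

# Illustrative lognormal diameter distribution (nm); the parameters are
# arbitrary and stand in for a real measured batch.
rng = np.random.default_rng(0)
diameters = rng.lognormal(mean=np.log(50), sigma=0.5, size=10_000)

# Number-weighted mean, as a counting method (e.g., electron microscopy) might report.
number_mean = diameters.mean()

# Volume-weighted mean D[4,3], closer to what a scattering-based method emphasizes.
volume_mean = np.sum(diameters ** 4) / np.sum(diameters ** 3)

# A confidence interval on the number mean, against which batch-to-batch
# differences could be judged.
sem = diameters.std(ddof=1) / np.sqrt(diameters.size)

print(f"number-weighted mean: {number_mean:.1f} nm ± {1.96 * sem:.1f} nm (95% CI)")
print(f"volume-weighted mean: {volume_mean:.1f} nm")
```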

10.7 Biomimicry

Much closer to nanotechnology is the mimicry, by artificial means, of natural materials, devices and systems structured in the nanoscale. Ever since Drexler presented biology as a “living proof of principle” of nanotechnology [14], there has been a close relationship between biology and nanotechnology. The earliest protagonists of nanotechnology were inspired by their knowledge of ultraminiature machinery in the living world, and as molecular biology has revealed more and more of this machinery, confidence in such inspiration as a source of practicable technology has generally been vindicated. At the present level of fabrication capability, the intricately nanostructured surfaces capable of endowing their bearers with special abilities have received the most interest. Examples include the superhydrophobicity of the leaves of the lotus and many other plants, the hydrodynamic drag reduction of sharkskin (along with reduced microbial colonization; one or other of these functions may be an exaptation), and the dry adhesion of the foot of the gecko and other animals. Their structures have been studied with ultramicroscopy and replicated artificially with fair success. Wood, seashells, and bone are natural nanocomposites whose structure has been studied and is now understood at the nanoscale. In the case of bone, however, at least some of its remarkable mechanical and other properties depend on the fact that it remains alive in service. Another material with remarkable mechanical properties is spider silk, which is still being extensively studied by chemists as well as nanotechnologists. These examples represent just a tiny snapshot of the incredible variety of structures and mechanisms in the living world. The molecular machinery within living cells, such as the rotary motors involved in various kinds of energy transduction (the enzyme ATPase and bacterial flagellar motors, for example), is also well understood, but it only serves as an example of principle at present and cannot be artificially mimicked by available technologies.

The ultimate criterion of commercial feasibility is whether a proposed technology is cost-competitive; these natural wonders are always competing with any attempts to produce artificial mimics. Photovoltaic arrays compete with fields of plants; although large-scale arrays undoubtedly now exist, the checkered commercial history of their manufacturers and the heavy public subsidies they have enjoyed do not augur well for their sustainability, quite apart from the fact that, according to some estimates, their whole-life energy output does not exceed the energy used in their manufacture, installation and maintenance.

The advantages of reducing aerodynamic and hydrodynamic drag are easy to quantify in terms of energy savings, and biomimicry has offered a shortcut to designs that might conceivably have been achieved using (interactive) evolutionary computation (Section 10.2). What limits their deployment is the impossibility of creating these nanostructured materials at a scale large enough to cover the hull of a ship or the fuselage of an airplane, at a cost providing a reasonable return, through energy savings, on the investment. Furthermore, the very intricacy of some of the structures means that they are fragile. For example, an artificial gecko foot can be fabricated, but after one or a few adhesion–release cycles is already so badly damaged that it no longer has any useful attributes. In the living world, these structures are constantly being repaired and replaced. So far at least, these problems have prevented biomimicry from becoming more than a niche activity in the commercial sphere.

10.8 Nanodevices Moving in Viscous Media

Just as the very small size of devices on VLSI circuit chips brings to the fore aspects such as thermal management, which may be less critical at the micrometer scale and above, so the design of very small devices (e.g., drug delivery vehicles) intended to migrate in liquid media such as the blood requires cognizance of a different régime from that governing larger objects. Whereas with the latter inertial (Newtonian) forces dominate the motion, viscous forces dominate with the former. In other words, the Reynolds number is typically very low. For example, a bacterium-sized object moving with a velocity of ∼1 mm/s has a Reynolds number of around 10⁻³, whereas a human swimmer has Re ∼ 10⁶, and the applied force is proportional to, respectively, velocity and acceleration.
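These orders of magnitude are easily checked from Re = ρvL/μ. The sketch below assumes water-like values (ρ ≈ 1000 kg/m³, μ ≈ 10⁻³ Pa·s), a 1 µm bacterium and a 2 m swimmer; the characteristic lengths are chosen only for illustration.

```python
def reynolds(velocity_m_s, length_m, density=1000.0, viscosity=1e-3):
    """Reynolds number Re = ρ·v·L/μ for water-like media."""
    return density * velocity_m_s * length_m / viscosity

# Bacterium-sized object: ~1 µm, ~1 mm/s  ->  Re ~ 1e-3 (viscous régime).
print(f"bacterium:     Re ≈ {reynolds(1e-3, 1e-6):.0e}")

# Human swimmer: ~2 m, ~1 m/s  ->  Re ~ 1e6 (inertial régime).
print(f"human swimmer: Re ≈ {reynolds(1.0, 2.0):.0e}")
```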

References

[1] Devices for which accessibility is the principal consideration might still be worth making very small even if only few are required; e.g., for a mission to outer space.

[2] This is why vastification—the proliferation of numbers—almost always accompanies nanification.

[3] P. Érdi, Gy. Barna, Self-organizing mechanism for the formation of ordered neural mappings, Biol. Cybern. 1984;51:93–101.

[4] J.H. Holland, Adaptation in Natural and Artificial Systems. Ann Arbor: University of Michigan Press; 1975.

[5] J.R. Koza, Genetic Programming. Cambridge, MA: MIT Press; 1992.

[6] E.g. A.M. Brintrup, H. Takagi, A. Tiwari, J.J. Ramsden, Evaluation of sequential, multi-objective, and parallel interactive genetic algorithms for multi-objective optimization problems, J. Biol. Phys. Chem. 2006;6:137–146.

[7] J.J. Ramsden, The future of Wikipedia, Nanotechnol. Percept. 2015;11:131–135.

[8] D. Cross, Whither Wikipedia? Nanotechnol. Percept. 2016;12:50–52.

[9] M.F. Ashby, Materials Selection in Mechanical Design. Oxford: Pergamon; 1992.

[10] M.L. Green, Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies, Appl. Phys. Rev. 2017;4:011105.

[11] M.G. Cacace, E.M. Landau, J.J. Ramsden, The Hofmeister series: salt and solvent effects on interfacial phenomena, Q. Rev. Biophys. 1997;30:241–278.

[12] O. Sinanoǧlu, Microscopic surface tension down to molecular dimensions and micro thermodynamic surface areas of molecules or clusters, J. Chem. Phys. 1981;75:463–468.

[13] E.L. Paul, V.A. Atiemo-Obeng, S.M. Kresta, eds. Handbook of Industrial Mixing: Science and Practice. Wiley; 2004.

[14] K.E. Drexler, Molecular engineering: an approach to the development of general capabilities for molecular manipulation, Proc. Natl. Acad. Sci. USA 1981;78:5275–5278.

Further Reading

[15] W. Banzhaf, et al., From artificial evolution to computational evolution: a research agenda, Nature Rev. Genet. 2006;7:729–735.

[16] B. Bhushan, Biomimetics: lessons from nature—an overview, Philos. Trans. R. Soc. A 2009;367:1445–1486.

[17] J-C. Lu, S-L. Jeng, K. Wang, A review of statistical methods for quality improvement and control in nanotechnology, J. Qual. Technol. 2009;41:148–164.
