The dictionary definition of “innovation” is simply “the bringing in of novelties.” This chapter examines the typically exponential growth of technologies, and how new technologies succeed mature ones in processes of disruption and creative destruction. The balance of technology push and market pull is considered, along with other aspects of social value.
Disruption; Creative destruction; Technology push; Market pull; Exponential growth; Social value; Conviviality
Although the dictionary definition of “innovation” is simply “the bringing in of novelties”, it has in recent years become a more narrowly defined concept much beloved especially by government ministries and their agencies charged with animating economic activity in their countries. Indeed, in 2007 the UK government, which has been in the van of this animating process, created a new Department for Innovation, Universities and Skills, revealingly linking innovation with universities; in 2009 this was merged with the Department for Business, Enterprise and Regulatory Reform to create the Department for Business, Innovation and Skills, which was in 2016 in turn merged with the Department of Energy and Climate Change to create the Department for Business, Energy and Industrial Strategy (BEIS) [1]. Furthermore, in 2017 the academic UK research councils will be placed together with Innovate UK in a new umbrella organization called UK Research and Innovation (UKRI). In this usage, innovation has come to mean specifically the process whereby new products are introduced into the commercial sphere: “The technical, designing, manufacturing, management and commercial activities involved in the marketing of a new (or improved) product or the first commercial use of a new (or improved) process or equipment” [2]. It implies not only the commercialization of a major advance in the technological state of the art, but also “includes the utilization of even small-scale changes in technological know-how” [3]. Thomas Alva Edison was not only a brilliant inventor but also a masterful innovator (who is reputed to have said “it's 1% inspiration and 99% perspiration”); however, the inventor is very often not the innovator. Suction sweepers are associated not with Spengler, their inventor, but with Hoover; similarly the sewing machine is associated with Isaac Merritt Singer, not with Elias Howe, and still less with Barthélemy Thimonnier or Thomas Saint [4].
The innovator in the narrow sense is crucial to the overall process of wealth creation. The concept of innovation can be naturally entrained in the “linear model” (Figure 2.1). If we define “high technology” or “advanced technology” as “technology emerging from science”, then nanotechnology is clearly a high technology, according to the “new model” (Figure 2.3), and the process of innovation, in its new restricted usage (of introducing novel products into the commercial sphere), is likely to be highly relevant. Figure 3.1 shows more explicitly how science can be transformed into wealth via innovation.
It is not hard to find reasons for the flurry of official interest in the topic. Governments have noticed that a great deal of the research financed from the public purse appears to be of very little strategic—i.e., military and national-economic—importance [5]. Even though the funding and execution of scientific research is not, in most countries, prominent in the public mind, nevertheless governments feel that they have to justify public spending on it, despite its negligible contribution to the total government budget [7]. Hence, governments have become obsessed with increasing the economic impact of their science spending [8]. The justification of funding scientific research therefore becomes its capacity to generate wealth through innovation, and the illusion of convergence of the “new model” with the “linear model” (they are not isomorphous) allows the old tradition of Baconian thinking to continue [9].
Innovation, in the sense of the implementation of discovery, or how research results are turned into products, is a theme at the heart of this book (cf. Chapter 12). Governments have become particularly wedded to the path shown in Figure 3.2. Given that the granting of a patent—in other words the right to monopolistically exploit an invention for a certain number of years—is a clear prerogative of governments, it is perhaps not surprising to find they have a vested interest in promoting patenting, regardless of the presence or absence of any overall economic benefit to the country (cf. Section 12.9); in the USA, the Bayh-Dole Act, which enshrined the right of universities to retain ownership of inventions emerging from federally funded research, was enacted as long ago as 1980.
Economists, especially J.A. Schumpeter, have noticed that established technologies sometimes die out, creating space for new ones. This phenomenon came to be called creative destruction. The man in the street expresses it through proverbs such as “you cannot make an omelette without breaking an egg”, and biologists are also familiar with the idea, a good example being the death of about half the neurons at a certain epoch in the development of the brain of the embryonic chicken (and doubtless of other embryonic animals). The scientist specializing in biomolecular conformation will be familiar with the fact that for ribonucleic acid (RNA) polymers to adopt their final stable structure, intramolecular bonds formed while the polymer is still being synthesized have subsequently to be broken [10]. At the time Schumpeter was putting forward the notion of creative destruction, it was widely believed that epochs of rapid multiplication of new species were preceded by mass destructions of existing ones [11]. Preceding destruction is not, however, a necessary condition for the occurrence of creative construction. Obviously, a literally empty potential habitat has space for colonization (by so-called r-selection—see Section 3.1)—although if it is truly devoid of life initial colonization might be quite difficult. On the other hand, an apparently crowded habitat may be very rich in potential niches for new species capable of imaginatively exploiting them (the so-called K-selection—see Section 3.1).
When innovation disrupts existing commercial structures it is, obviously enough, labeled “disruptive” [12]. Effort has been put into scrutinizing the minutiae of how it actually happens. Naturally it is difficult for a small innovator to break into a well-established market dominated by behemoths. Since many innovations provide a more cost-effective solution to an existing problem (cf. Chapter 4), disruptive innovators often begin by targeting the bottom end of the market. Margins are small but this is affordable for the small company because it has small overheads. This gives the innovator a foothold in the market, and also enables valuable experience to be gained. In due course the incumbents can be effectively challenged. Sometimes they respond by seeking to acquire the disruptive innovator.
Figure 3.1 omits details about the process whereby the new products are transformed into wealth. Evidently, in order for that to happen people must want to buy the products—in other words, there must be a market for them. For incremental technologies, demand for novelty typically comes from buyers of existing products. Directly or indirectly, manufacturers receive feedback from buyers (including the manufacturers' own employees), which can more or less straightforwardly be worked into a steadily improving product. This situation is referred to as “market pull”. Disruptive technologies, by definition, are qualitatively different from those in existence at the epoch of their emergence. Any user of an existing technology sufficiently farsighted to imagine a qualitatively different solution to his problem is likely himself to be the innovator. Therefore, market pull is inapplicable; one refers to “technology push”, or the technological imperative. The development of technology is considered to be autonomous, and the emergence of new technologies determines the desire for goods and services [13]. In reality the reasons for buying a product are influenced by other factors. Depending on the degree of étatism in an economy, compliance with regulation may be a powerful force, and in our era of social media the creation of “buzz” can also be very influential [14].
By analogy with biological growth, a good guess for the kinetics of innovation is the sigmoidal logistic equation

Q(t) = K/(1 + e^(−r(t − m)))     (3.1)

where Q is the quantity under observation (the degree of innovation, for example), K is the carrying capacity of the system (the value to which Q tends as time t → ∞), r is the growth rate coefficient, and m is the time at which Q = K/2 and the growth rate is maximal. The terms r-selection and K-selection can be explained by reference to this equation: the former operates when a niche is relatively empty and everything is growing as fast as it can; therefore the species with the biggest r will dominate; the latter operates when an ecosystem is crowded, and dominance must be achieved by increasing K. This is perhaps more easily seen by noting that equation (3.1) is the solution to the differential equation

dQ/dt = rQ(1 − Q/K)     (3.2)
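As a minimal numerical sketch (with purely illustrative values of K, r and m), the logistic law Q(t) = K/(1 + e^(−r(t − m))) and its differential form can be checked directly:

```python
import math

def logistic(t, K=100.0, r=0.5, m=20.0):
    """Logistic curve Q(t) = K / (1 + exp(-r (t - m))); parameter values
    are illustrative, not taken from any data in the text."""
    return K / (1.0 + math.exp(-r * (t - m)))

def dQdt(Q, K=100.0, r=0.5):
    """Right-hand side of the logistic differential equation dQ/dt = rQ(1 - Q/K)."""
    return r * Q * (1.0 - Q / K)

K, r, m = 100.0, 0.5, 20.0
# At t = m the curve is at half its carrying capacity...
assert abs(logistic(m) - K / 2) < 1e-9
# ...and growth there is maximal: dQ/dt = rK/4.
assert abs(dQdt(K / 2) - r * K / 4) < 1e-9
# Early on (Q << K) growth is nearly exponential: the r-limited regime.
Q_small = logistic(0.0)
assert abs(dQdt(Q_small) - r * Q_small) / (r * Q_small) < 0.05
```

The last assertion illustrates why, in an empty niche, the species with the biggest r dominates: while Q is small the crowding term Q/K is negligible and growth is effectively exponential at rate r.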
The application of this equation to innovation implies, perhaps a little surprisingly, that innovation grows autonomously; that is, it does not need any adjunct (although, as written, it cannot start from zero). Perhaps, indeed, the lone innovator is still a leading figure. Hirooka has gathered some evidence for this time course, the most extensive being for the electronics industry [15]. He promulgates the view that innovation comprises three successive logistic curves: one each for technology, development, and diffusion. (“Development” is used by Hirooka in a sense different from that of Figure 3.1, in which research leads to science (i.e., the accumulation of scientific knowledge), and development of that science leads to technology, out of which innovation creates products such as the personal computer.) There seems to be no need to have separate “development” and “diffusion” trajectories: these taken together constitute innovation. In Hirooka's electronics example, the technology trajectory begins with the point-contact transistor invented in 1948, and m is reached in about 1960 with the metal oxide–semiconductor transistor and the silicon-based planar integrated circuit. This evidence is not, however, wholly satisfactory, not least because there seems to be a certain arbitrariness in assigning values of Q. Furthermore, why the trajectory should end with submicrometer lithography in 1973 is not clear. The continuation of Moore's law up to the present (and it is anticipated to continue for at least several more years) implies that we are still in the exponential phase of technological progress. 
The development trajectory is considered to begin with the UNIX operating system in 1969 and continues with microprocessors (quantified by the number of components on the processor chip, or the number of memory elements) and operating systems, with m reached in about 1985 with the Apple Macintosh computer; the diffusion trajectory is quantified by the demand for integrated circuits (chips).
Perhaps Hirooka's aim was only to quantify the temporal evolution; at any rate, he does not offer a real explanation of the law that he promulgates, but seems to be more interested in aligning his ideas with those of the empirical business cycles of Kondratiev and others [16]. For insight into what drives the temporal evolution of innovation, one should turn to consideration of the noise inherent in a system (whether socio-economic, biological, mechanical, etc.) [17]. Some of this noise (embodied in random microstates) is amplified up to macroscopic expression [18], providing a potent source of microdiversity. Equation (3.2) should therefore be replaced by
dQ/dt = rQ(1 − Q/K) + ξ     (3.3)

where ξ is a random noise term that can take both positive and negative values (a more complete discussion than we have space for here would examine correlations in the noise). This modification also overcomes the problem that equation (3.2) cannot do anything if Q is initially zero.
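A minimal Euler-type simulation of the noisy logistic growth just described (with purely illustrative parameter values, and Gaussian noise assumed for ξ) shows how the noise term allows growth to start from zero, which the deterministic equation cannot do:

```python
import random

def simulate_noisy_logistic(K=100.0, r=0.5, sigma=0.5, dt=0.01,
                            steps=10000, seed=1):
    """Crude Euler integration of dQ/dt = r Q (1 - Q/K) + xi, with xi drawn
    as Gaussian noise of standard deviation sigma. All parameter values are
    illustrative assumptions."""
    random.seed(seed)
    Q = 0.0  # without noise, Q = 0 is a fixed point and nothing ever happens
    for _ in range(steps):
        xi = random.gauss(0.0, sigma)
        Q += (r * Q * (1.0 - Q / K) + xi) * dt
        Q = max(Q, 0.0)  # a negative 'degree of innovation' is meaningless
    return Q

# With the noise switched off, Q(0) = 0 stays at zero forever:
assert simulate_noisy_logistic(sigma=0.0) == 0.0
# With noise, Q is kicked off zero and climbs toward the carrying capacity:
Q_final = simulate_noisy_logistic()
print(round(Q_final, 1))  # value depends on the noise realization
```

The contrast between the two runs is the point: noise (microdiversity) is what seeds the growth that the deterministic logistic law then amplifies.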
Amplification of the noise up to macroscopic expression is called by Allen “exploration and experiment”. Any system in which mechanisms of exploration and experiment are suppressed is doomed in any environment other than a fixed, unchanging one, although in the short term exploration and experiment are expensive (they could well be considered as the price of long-term survival).
Recognition of microdiversity as the primary generator of novelty does not in itself provide clues to its kinetics. It may, however, be sufficient to argue from analogy with living systems. By definition, a novelty enters an empty (with respect to the novelty) ecosystem; growth is only limited by the intrinsic growth rate coefficient (the r-limited régime in ecology). Inevitably as the ecosystem gets filled up, crowding constraints prevent exponential growth from continuing.
One may legitimately ask whether the first positive term in equation (3.3) should be proportional to Q. Usually innovation depends on other innovations occurring concurrently. Kurzweil comments that technology can sometimes grow superexponentially. Equation (3.1) should therefore only be taken as a provisional starting point. We need to consider that the rate of technological growth, dQ/dt, is proportional to Q^n, and carefully examine whether n is, in fact, greater than unity. For this we also need to work out how to place successive entities in a developing technology on a common scale of degree of development. How much more developed is the MOS transistor than the p–n junction transistor? The complexity of the object might provide a possible quantification, especially via the notion of thermodynamic depth [19]; compare also the work of Senders [20]. These matters remain to be investigated further.
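The qualitative difference that the exponent n makes can be sketched numerically (the rate constant, starting value and cap below are arbitrary assumptions): for n = 1 growth is exponential, whereas for n > 1 the continuum equation dQ/dt = cQ^n diverges in finite time, so any fixed threshold is reached far sooner.

```python
def grow(n, c=0.5, Q0=1.0, dt=0.001, t_max=50.0, cap=1e6):
    """Euler-integrate dQ/dt = c * Q**n and return the time at which Q first
    exceeds `cap` (or None if it never does within t_max). Illustrative only."""
    Q, t = Q0, 0.0
    while t < t_max:
        Q += c * Q**n * dt
        t += dt
        if Q > cap:
            return t
    return None

t_exp = grow(n=1.0)    # ordinary exponential growth
t_super = grow(n=1.2)  # superexponential: finite-time singularity
print(t_exp, t_super)
# The superexponential trajectory hits the cap much earlier:
assert t_super < t_exp
```

This is why establishing whether n exceeds unity matters: the two régimes have radically different long-term behavior, not merely different speeds.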
During the post-m stage we enter the K-limited régime: survival is ensured not through outgrowing the competition, but through ingenuity in exploiting the now highly ramified ecosystem. The filling will itself create some new niches, but eventually the system will become saturated. Even factors such as the fatigue of university professors training the researchers and developers through repeatedly having to expound the same material play a rôle.
The development of the electronics industry is perhaps atypically smooth. Successive technologies usually overlapped the preceding ones, and the industry was generally at pains to ensure compatibility of each technological advance with the preceding one. But innovation is often a much more turbulent affair; one may characterize it using words like discontinuity, disruption, or revolution [21].
An early example of disruptive innovation is the stirrup, which seems to have diffused westwards from China around the 8th century CE, and in Europe was taken up on a large scale by the Franks led by Charles Martel. At a stroke, it enabled the horse to be used far more effectively in warfare: with stirrups, a knight could hold a heavy lance, the momentum of the horse would be added to his own, and the lance would be virtually unstoppable by any defenses then current. It also enabled arrows to be fired from a longbow by a mounted rider. This is a good example of technology push—there is no actual evidence that a group of knights sat down and decided this was what they wanted to enable them to fight far more effectively in the saddle. It was a push that was to have far-reaching social consequences. Other armies adopted the innovation, and there was a concomitant increase in defensive technology, including armor and fortified castles. Warfare rapidly became significantly more expensive than hitherto. White has argued that this triggered a revolutionary social change [22]—to support the expense, land was seized by leaders like Martel and distributed to knights in exchange for military service, which they then fulfilled at their own expense. The knights in turn took control over the peasants who lived on the land, cultivating it and raising livestock for themselves and the knights, in exchange for protection from marauding invaders. In other words, the stirrup led to the introduction of feudalism, a far greater revolution (in the sense that it affected far more people) than that of the technology per se.
The classification of innovations as either technology push (typically associated with disruptive innovation: by definition, the market cannot demand something it does not know about) or market pull (for incremental innovations, whereby technology responds to customer feedback) does not seem to cover all cases, however. There is currently no real demand for new operating systems for personal computers, for example, yet Microsoft, the market leader (in terms of volume), is constantly launching new ones. The innovation is incremental, yet customers complain that each successive one is worse than its predecessors (e.g., “Windows 10”, “8”, “7” and “Vista” compared with “XP”). Simple economic theory suggests that such products should never be introduced; presumably only the quasimonopolistic situation of Microsoft allows it to happen. Those responsible for IT systems frequently express surprise that the much more secure Linux operating system is not more widely used. Linux is based on Unix, a mainframe operating system that evolved in a rather ad hoc fashion; Unix ultimately displaced VMS, which many considered to be rather better. Evidently the technically best solution may not be the one adopted. Familiarity with an existing system may outweigh the perceived benefits of a new one, and there is anyway a cost associated with change [23]. Another example is the display screen increasingly being introduced inside railway carriages, giving various items of information such as the distribution of passengers on the train and notice of any perturbation to other services. There is little indication that passengers desired these screens, the information on which is almost invariably too out-of-date to be practically useful. 
The Siemens Class 700 trains, recently introduced into the UK, are equipped with such screens and other advanced features, such as very fast acceleration and braking, but the feature in which passengers are above all interested, namely a pleasant and comfortable interior, has been ruthlessly sacrificed, to the extent that passengers now avoid traveling in them unless the journey is absolutely essential.
An extension to the basic push–pull concept is the idea of “latent demand”. It can be identified post hoc by unusually rapid takeup of a disruptive innovation. By definition, latent demand is impossible to identify in advance; its existence can only be verified by observation.
By analogy to supply and demand, push and pull may also (under certain circumstances, the special nature of which needs further inquiry) “equilibrate”, as illustrated in Figure 3.3.
As already mentioned near the beginning of this chapter, the term “creative destruction” was introduced by Joseph Schumpeter, but it is in itself incomplete and inadequate for understanding disruptive innovation. It would be more logical to begin with the “noise”, which at the level of the firm is represented by the continuous appearance of new companies—after all, nothing can be destroyed before it exists. Noise can, however, be both positive and negative. If the commercial raison d'être of the company disappears, then the company will presumably also disappear, along with others that depended on it. This process can be modeled very simply: if the companies are all characterized by a single parameter F, which we can call “fitness”, and time advances in discrete steps (as is usual in simulations), then at each time step the least fit company is eliminated along with a certain number of its “neighbors” (in the sense of being linked by some kind of commercial dependence) regardless of their fitnesses, and those eliminated are replaced by new companies with randomly assigned fitnesses [24]. This model was introduced as a description of the evolution of living species, and has a number of interesting properties such as a (self-organized?) critical fitness threshold, the height of which depends on the number of neighbors affected by an extinction. Furthermore, if the proper time of the model (successive time steps) is mapped onto real (universal or sidereal) time by supposing that the real waiting time for an extinction is proportional to e^(F/T) (that is, exponentially dependent on the fitness F of the eliminated company, T being a parameter playing the rôle of temperature), extinctions occur in well-delineated bursts (“avalanches”) in real time, and the sizes of the bursts follow a power law distribution. Palaeontologists call this kind of dynamics “punctuated equilibrium” (Figure 3.4) [25].
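The elimination rule just described can be sketched in a few lines (system size, step count and the choice of exactly two neighbors are illustrative assumptions; this is the one-dimensional Bak-Sneppen model on a ring):

```python
import random

def bak_sneppen(n_companies=64, steps=20000, seed=0):
    """Minimal Bak-Sneppen sketch: companies on a ring, each with a random
    fitness in [0, 1); at every step the least fit company and its two
    neighbors are replaced by newcomers with fresh random fitnesses.
    Returns the largest minimum-fitness value seen in the late part of the
    run, which settles just below the model's critical threshold."""
    rng = random.Random(seed)
    F = [rng.random() for _ in range(n_companies)]
    minima = []
    for _ in range(steps):
        i = min(range(n_companies), key=F.__getitem__)  # least fit company
        minima.append(F[i])
        for j in (i - 1, i, (i + 1) % n_companies):      # negative index wraps
            F[j] = rng.random()
    return max(minima[steps // 2:])

# In the 1D model the critical threshold is known to be about 0.667;
# a finite simulation should land in that vicinity.
print(round(bak_sneppen(), 2))
```

Even though each replacement is purely random, the self-organized threshold emerges: after the transient, no company with fitness well above it is ever the one eliminated.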
The Bak-Sneppen model emphasizes the interdependence of species. Companies do not exist in isolation, but form part of an ecosystem. Ramsden and Kiss-Haypál have argued that the “economic ecosystem” (i.e., the economy) optimizes itself such that human desires are supplied with the least effort—a generalization of Zipf's law, whence it follows that the distribution of company sizes obeys [26]:
s_k = P(k + ρ)^(−1/θ)     (3.4)

where s_k is the size of the kth company ranked according to size (k = 1 being the largest company), P is a normalizing coefficient, and ρ and θ are the independent parameters of the distribution, called, respectively, the competitive exclusion parameter and the cybernetic temperature. Competitive exclusion means that in any niche, ultimately one player will dominate (this is, in fact, a simple consequence of general systems theory). The Dixit–Stiglitz model of consumer demand is another application [27].
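A short sketch makes the rank-size law concrete, assuming the Zipf-Mandelbrot form s_k = P(k + ρ)^(−1/θ) with purely illustrative parameter values (P = 1, ρ = 2, θ = 0.8):

```python
import math

def company_sizes(n=1000, P=1.0, rho=2.0, theta=0.8):
    """Company sizes by rank under the assumed Zipf-Mandelbrot law
    s_k = P * (k + rho) ** (-1/theta); parameters are illustrative."""
    return [P * (k + rho) ** (-1.0 / theta) for k in range(1, n + 1)]

s = company_sizes()
# Sizes decrease with rank: the k = 1 company is the largest.
assert all(a > b for a, b in zip(s, s[1:]))
# For large k the curve is a pure power law; the log-log slope recovers -1/theta.
slope = (math.log(s[-1]) - math.log(s[499])) / (math.log(1000 + 2) - math.log(500 + 2))
print(round(slope, 2))  # -1.25, i.e. -1/0.8
```

The rôle of ρ is visible at small k, where it flattens the head of the distribution; θ alone sets the tail slope, which is why it behaves like a temperature controlling how evenly sizes are spread.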
According to the “theory” elaborated by Christensen and others [12], nanotechnology has not followed the recognized path of disruptive innovation, by conquering the lowest end of the market with products that are cheaper or better or both. On the contrary, the expense of most nano-enabled products outweighs any superiority in performance. Similarly, digital electronic photography based on CCD or CMOS sensors was initially both more expensive and inferior in performance (regarding the quality of the images—but immediately superior in terms of convenience, by not requiring making up solutions of chemicals and a darkroom), but ended up almost completely displacing photography based on silver halide emulsions. Nor is nanotechnology conquering new markets, the other facet of disruptive innovation according to the “theory”. The apotheosis of nanotechnology, namely the universal fabrication technology represented by the personal nanofactory [28] will of course be hugely disruptive—one might speak of “total disruption”—but at present there is no obvious path for getting there. But otherwise nanotechnology is best characterized as a technology of continuous, incremental improvement as typified by Figure 1.1 and the International Technology Roadmap for Semiconductors (ITRS)—now considering 16 nm features on 450 mm wafers. Failure to properly appreciate this point may account for the difficulties experienced in commercializing nanotechnology. Even in nanomedicine, in which some of the most far-reaching nano-enabled innovations are foreseen (see Chapter 6), nano-objects introduced into the human body will be subjected to the same environmental challenges, most prominently opsonization, as anything else, whether molecular, microscopic or macroscopic.
The key question is “What does nanotechnology enable me to do that I could not do before?” For example, ultrastrong carbon-based materials seem to offer the only hope of constructing the space elevator. Otherwise, nanotechnology's greatest value is in substituting for traditional materials and processes that have become too scarce and expensive. For example, a nanodevice with the functions of the kidney (that is, highly selective molecular separation), able to replace—disrupt—pyrometallurgy for the extraction of metals, could be of existential importance.
Is there any deeper underlying mechanism behind the dynamics represented by equation (3.3)? Since invention and innovation are carried out by human beings, one should look at their underlying motivations. An important principle would appear to be that everyone wants to do a good job—if they get the chance to do so [29]. Even more loftily, the instigators of the grand engineering achievements of the Victorian age “spent their whole energy on devising and superintending the removal of physical obstacles to society's welfare and development …the idea of development for its own sake had a place, and in the first instance …the thought of making man's dwelling place more commodious cast into insignificance anticipations of personal enrichment” [30]. It is only natural for technologists to respond to feedback with incremental innovation. Natural curiosity and the energy to follow it up is sufficient to provide a basis for ξ in equation (3.3). But the further growth of innovation, including the actual values of the parameters r and K, depends on other factors, including retarding ones such as inertia and friction. It depends on the fiscal environment [31], because much innovation requires the strong concentration of capital. It also depends on “intellectual capital”—knowledge and skills—which must in turn depend in some way on public education. Globalization means that, even more so than ever before, “no country is an island” and, in consequence, it becomes difficult to discern what elements of national policy favor innovation.
Given that the average lifetime of a firm is a mere 12 years [32], one might suppose that the directors of even the largest and best-managed companies are constantly afraid of the possibility of sudden extinction. The management literature abounds with exhortations for companies to “remain agile” in order to be able to adapt and survive. What can be offered in the way of specific advice? An excellent general principle is to balance exploration with incremental improvement (in other words, revolution with evolution, respectively) [33]. Merging materials research and manufacturing technologies has also been found to be fruitful in practice, especially within large companies that, formerly, initiated new products with design, then proceeded to materials selection, and finally went to manufacture; now, after merging, these processes take place essentially simultaneously. Sometimes the merging is achieved by the simple tactic of housing designers, engineers, scientists, and production staff within the same building, reversing an earlier trend that, flaunting globalism, often ended up having the units responsible for these different activities in different countries. That trend has morphed into crowdsourcing, in which innovative ideas are invited from the entire world Internet community (discussed in more detail in Chapter 10), with open source software and hardware as the endpoints of that trend.
There has been much emphasis in recent years, especially by governments seeking to justify public expenditure on universities, on academic research as the source of innovation. Nevertheless, careful empirical studies have shown that academic research makes a very minor contribution to innovation [34]. We return to this theme in Chapter 13. The rôle of clusters in promoting innovation is discussed in Section 12.5.
In fact, it was already well accepted during the Industrial Revolution that most inventions and innovations came from workmen on the shopfloor. Perhaps religion played a rôle in inspiring them to look at their work, much of which must have been rather repetitive and dull, in a fashion more akin to that of deliberate practice [35], which, as we know from the work of Ericsson et al. [36], is very necessary to achieve expertise; they reckon that about 10,000 h of deliberate practice is needed, which for a 40–50 h working week would be achieved after about five years.
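The arithmetic behind the “about five years” figure can be checked directly (the allowance of roughly 47 working weeks per year is an assumption for illustration):

```python
hours_needed = 10_000   # Ericsson et al.'s estimate of deliberate practice
hours_per_week = 45     # midpoint of a 40-50 h working week
weeks_per_year = 47     # allowing for holidays (an assumption)

years = hours_needed / (hours_per_week * weeks_per_year)
print(round(years, 1))  # 4.7, i.e. roughly five years
```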
More formally, a fruitful approach might be to start with an empirical examination of whether ξ in equation (3.3) can be correlated with factors such as the percentage of personal income that is saved (and, hence, available for concentrating in large capital enterprises); and the organization of public education in a country.
Such empirical examination can yield surprising results. For example, although the number of patents granted to a company does correlate with its spending on research and development, there is no simple relationship between the spending and corporate performance, from which one might infer that one cannot simply buy effective innovation [37].
New, truly disruptive technologies may require special attention paid to public acceptance. It is widely considered that the failure of companies developing genetically modified (GM) cultivated plant technology to foster open discussion with the public was directly responsible for the subsequent mistrust of foodstuffs derived from GM plants, mistrust that is especially marked in Europe, with huge commercial consequences. Problems associated with the lack of discussion were further exacerbated by the deplorable attitude of many supposedly independently thinking scientists, who often unthinkingly sided with the companies, unwarrantedly (given the paucity of evidence) assuming that ecosystems would not be harmed. Insofar as many scientists working in universities are nowadays dependent on companies for research funding, this attitude came close to venality and did nothing to enhance the reputation of scientists as bastions of disinterested, objective appraisal. Nanotechnologists are now being exhorted to pay heed to those mistakes and ensure that the issues surrounding the introduction of nanotechnology are openly and properly debated. There is, in fact, an unprecedented level of public dialog on nanotechnology and, perhaps as a direct consequence, a clear majority of the population seems to be well disposed towards it.
One of the greatest discouragements to the introduction of innovation is when the field to which it applies is already in a highly mature state. This corresponds to the logistic curve (3.1) asymptotically approaching the carrying capacity K. A good example is the market for prostheses (hip, femur, etc.). Although many surgeons implanting them in patients have innovative ideas about how to improve their design, and new materials are emerging all the time, especially nanocomposites and materials with nanostructured surfaces promoting better assimilation with the host tissue, the existing technology is already at such a high level that in general it is extraordinarily difficult to introduce novelty into clinical practice. Furthermore, unlike IT, the biomedical field is heavily regulated; in many countries, onerous animal trials must be undertaken before a product is permitted even to be tested on humans. The fact that after years of consolidation (cf. Figure 12.3) global supply is dominated by two very large companies, both in the USA, is also not conducive to innovation.
Another way of looking at this is to consider the market as a complex dynamical system with multiple basins of attraction. As proved by Ashby [38], such a system will inevitably end up stuck in one of its basins and exploration will cease. This is the phenomenon of habituation. The system can only be reset if it receives a severe external (exogenous) shock. This is what Schumpeter was presumably trying to convey with his notion of “creative destruction” [39].
Manufactured artifacts do not exist in isolation, but are embedded in society, and their adoption produces lifestyle changes within it. It is to be noted that products with high functionality do not always become diffused and do not necessarily create value (welfare) for society [40]. This situation has been modeled by Ueda and others (Figure 3.5), who identify three classes of value creation:
It is difficult to find a solution for class III, into which it is expected that most nanoproducts will fall; generally a sophisticated approach such as multi-agent simulation must be used, in which the interactions between humans and artifacts can be explicitly modeled.
The term “conviviality” was introduced by Illich in order to describe tools that serve their human masters rather than the opposite [41]. A tool remains convivial to the extent that anyone can use it, without undue difficulty, as often as he or she desires. The optimal tool fulfills three criteria: it increases the productivity, or efficiency, of work without depreciating the personal autonomy of the user; it creates neither masters nor slaves; and it enlarges the radius of personal action. Illich called a convivial society one in which modern tools were at the disposal of anyone integrated into the society, and not restricted to a core of specialists. It is one in which control of tools is firmly in the hands of man.
It can be inferred that hand tools—for example, a plane for woodworking—are more likely to be convivial than machine tools—such as a planing machine. This implies that the ultimate nanomachines—assemblers—are unlikely to be convivial, because probably some kind of evolutionary computing will have to be used to drive them (see Chapter 10), making them more-or-less autonomous.
Clearly a slide rule is more convivial than a calculator, which in turn is more convivial than a computer. A computer that one programs oneself is more convivial than one running software made by a third party. There is no explicit influence of nano, except that integrated circuits with nanoscale features have a higher spatial density of components and hence are more powerful, hence in turn more likely to run with a greater degree of autonomy than a less powerful processor.
A word processor, especially one that automatically makes spelling and grammar suggestions as one writes, is less convivial than pen and paper. These ostensive examples should suffice to convey the essence of conviviality. If it is found that nanotechnology generally strengthens the power and efficacy of machines, making them more autonomous, then it is opposed to conviviality. This generalization may, however, be misleading: nanotechnology should enable mass customization, which appears to signify more rather than less conviviality.