Chapter 22
The Robotization of Financial Activities: A Cybernetic Perspective

Hubert Rodarie

Deputy CEO Groupe SMA (insurance), France

The purpose1 of this chapter is to provide a practitioner's perspective on the ongoing developments in the organization of the financial world since the crisis of 2007.

Based on the commonly held observation of mounting complexity, we analyze the current situation in terms of the collective attempt to mechanize financial operations. This mechanization systemizes and generalizes the trend that began in the mid-1980s in asset management and banking portfolios, and in automated trading functions known as program trading.

Such mechanization, or robotization, is, in our opinion, indicative of what may be called the embedding of modern-day risk modeling within professional organizations. As such modeling produces a measure of risk considered to be accurate, this logically implies that future uncertainties, other than those calculated, are now attributed to human error. This point has been put forward in previous studies (Walter, 2010; Rodarie, 2012) as the main diagnosis of the 2007/2008 financial crisis.

Consequently, organizations are focusing on preventing human error by drawing on the efforts employed in other economic sectors to define and prevent it. We will thus show that organizations logically make use of the organizational frameworks used for industrial output, known as “quality assurance.” Initially developed to control quality in the aeronautic and, then, nuclear sectors, these frameworks have gradually been extended to all industrial and service sectors, and are known as quality management systems. These measures have been subject to various and successive standardizations (ISO), and since the mid-1990s have been used to prevent the risks managed by, and inherent to, the banking and financial system.

To assess this process, we will employ the ideas put forward by Norbert Wiener, one of the founding fathers of cybernetics. We will identify the requirements for successfully implementing the process with respect to its ultimate goal of risk control. We will thus highlight the consequences of the choices currently employed in the financial system, which potentially reduce the number of limited-impact incidents, but which, on the other hand, create conditions favorable for the occurrence of extreme events.

22.1 An Increasingly Complex System

What is the most striking feature of financial activities today? Their growing complexity. It is the main defining feature of the evolution of the financial system since the 2007–2008 North American banking crisis and its subsequent developments in Europe. Nothing is simple for anyone, be they professionals, customers, or regulators.2 Everything has become complicated, and no sector has escaped: banking, market, asset management, and insurance activities have all been affected.

This complexity appears to have two main sources:

  • Creativity: New calculation methods have facilitated the development of new products and new ways of entering into and trading commitments. From securitization to derivatives, by way of high-frequency trading and alternative management, these new capabilities have led to the creation of new organizations that even today's finance professionals struggle to understand, and above all to master, both globally and in detail, both immediately and over the long term.
  • Regulation: Since the 2007–2008 crisis in particular, the desire to regulate activities has produced an impressive body of laws, rules, and specifications from various sources; the 2300-page Dodd–Frank Act in the United States is one such example. European output has not lagged behind, and the same “risk control” trend is seen in the construction of an integrated European system. To add further complexity, the European administrative, political, and monetary framework, still under construction, is particularly multifaceted, imprecise, and slow, with the notable exception of the euro zone and its own institutions, in particular the ECB, which is naturally playing an ever greater role.

Overall, while the economic world is on the brink of recession and deflation, the regulatory world, on the other hand, is enjoying a period of enlargement and unfettered growth.

  1. Is this complexity inevitable?
  2. Has it arisen solely from the aforementioned creativity, or does it derive from other sources?

We are not going to criticize creativity, because it is clear that all systems tend to expand. The multiplication of activities and services proposed by an economic sector is not, in itself, an undesirable symptom.

On the other hand, examining the origin and structure of the complexity generated by regulation, or more accurately by the standardization of activities, whether controlled by public supervision or not, is worthwhile.

To summarize, it could be argued that this increasing complexity principally arises from the desire of all actors, professionals, and regulators to automate. We are witnessing an attempt to organize financial activities as if they were a machine, or rather a collection of machines that must collectively be identifiable, predictable, controllable and reproducible, valid across all continents, in every corner of the world, and, finally, adapted to globalization.

Evidently, confronted with these developments, a professional who is familiar with other business sectors and who has, for example, worked in a manufacturing industry, is bound to recognize organizational frameworks that are very similar or analogous to those in the automotive or electronics industries. Given the views of the Basel Committee, it appears that the hope is for the Rhine Valley to become the new Silicon Valley, where a new way of working and producing will be invented.

Thus, concretely, we are trying to classify or reduce financial commitment activities, that is, contracts performed over time, to the production of reproducible objects that, with the exception of their aging, are only minimally time-dependent.

On this topic, it could be noted that the finance sector, in contrast to all manufacturing industries, never talks about its products aging, and even less about technical obsolescence. A bond will remain “fresh” as long as the issuer pays the interest; the financial sector speaks only in terms of default.

Why does this imitation exist, given that the objects in question are so different? Why is there a desire to replace people currently working in diverse organizations with quasi-robotized participants identically carrying out standardized procedures?

Is it the fascination with industrial efficiency? For many years, this efficiency has generated improvements in productivity, so continuing along that path seems a natural course. Yet this motive does not appear decisive. While competition has driven industrial companies to continuously improve their techniques and products, in the financial domain the “risk” factor is evidently not a reproducible object that can be treated by means similar to those used in manufacturing industries. That would effectively require the quantification of risk to have been resolved.

Financial activities must therefore be consciously classified as problems that can be perfectly addressed by calculation, if not deterministically, then at least stochastically. Accordingly, as expressed by André Gorz:

Using quantitative measures as a substitute for rational value judgements provides moral security and supreme intellectual comfort; the Good becomes measurable and calculable.

Gorz (1988)

And we could add to this citation that risk could be controlled and reduced to admissible bounds by employing adapted “metrics,” to use the new terminology.

In such a context, there is an a priori refusal to consider that events may arise from fundamentally unexpected causes, or even from discontinuities that can compromise the organization's calculation model. All dysfunctions will thus be attributed to human error.

22.2 Human Error

We now need to define human error and determine whether the robotization of activities is likely to prevent or reduce such human error.

Various university studies have been undertaken to characterize human error, and, at the request of professionals, to recognize and prevent its sources.3 Drawing on the work of James Reason, we note that

The more predictable varieties of human fallibility are rooted in the essential and adaptive properties of human cognition. They are the penalties that must be paid for our remarkable ability to model the regularities of the world and then to use these stored representations to simplify complex information-handling tasks.

Reason identifies two types of error in his study – slips and mistakes – which apply to two cognitive fields:

  • The first is the stack or working memory, where the subject focuses his or her attention.
  • The second is the declarative knowledge base, comprised of all of the information the subject possesses and the action plans, or schemas, that he or she is familiar with and which are likely to be suitable.

He identifies three types of error, in accordance with the generic error modeling system (GEMS):

  • SB slips (skill-based slips), occurring in activities based on automated practices well integrated into familiar plans, and which do not require conscious control;
  • RB mistakes (rule-based mistakes), occurring in activities based on applying specific rules;
  • KB mistakes (knowledge-based mistakes) occurring in activities requiring confirmed knowledge.

Two modes of control are employed to prevent these human errors:

  • The proactive mode: Composed of a collection of control rules chosen a priori according to teams' goals and experience and according to well-known and detailed activity plans. The focus is on organization and discipline. The errors in this mode are at the SB or RB level.
  • The retroactive mode: Control is attentional in this mode; there are no longer any rules applicable a priori, the subject acts on the basis of his or her knowledge, reacting to setbacks as quickly as possible, step by step. In this case, mistakes are at the KB level, generated by insufficient knowledge or inappropriate choices of action plan. Expertise and responsibility are the main tools used to deal with these mistakes.

We will illustrate these situations using the following examples:

For the driver of a car, the “changing gear” schema in general requires little attention. SB-level slips are quickly identified and corrected. RB-level mistakes may arise during schemas such as “driving out of a parking space” or “changing lanes.” Controls are in place, taught, and often codified in law (use of indicator and rear-view mirror).

On the other hand, a KB-level mistake may arise if the driver needs to react to an unexpected event: for example, if another car goes through a stop sign, what should the driver do? Hit the brakes or swerve, etc.? We can easily find other examples related to driving or to operating other devices or installations.

This example also introduces, in a very simplified manner, an indication of the limits between rule-based and knowledge-based modes of behavior. This point will be addressed later in this chapter.

In fact, based on this analysis, it is clear that, in companies, the organization will ensure that its overall activities take place as far as possible in a “rule-based” domain where all actions and behaviors are described and known by all the relevant participants. The organization will also clearly implement a “proactive mode of control,” and will thus believe that it has reduced uncertainty regarding its results and increased confidence in its activities.

22.2.1 Quality Assurance

We thus find the origin of what is now known as “quality assurance” (giving rise in French to the Anglicism “Assurance Qualité”). The word “assurance” is used here to express certainty or confidence. The organization's goal is therefore to inspire confidence in the quality of the outputs generated by the production chain.

Thus, according to the definition extracted from the ISO 8402 standard, which has now been replaced, the Quality Assurance System is

All the pre-established and systematic activities implemented within the quality system framework, and proved when necessary, to give appropriate confidence that an entity will meet quality requirements.

If we replace the word “quality” with “risk control,” we find, almost word for word, the definition of risk management or risk control systems, the subject of various regulations and codes issued by European and U.S. legislators and regulators. To a certain extent, quality and risk have been likened to one another, which was tempting because both approaches strive to create confidence and certainty.

In the last 50 years, a number of practices and tools have therefore been developed, and are used to control quality: procedures manuals, identification, traceability, process diagrams, mapping, error modeling, recording of malfunctions, corrective and preventative actions, etc. All of these practices and tools have been standardized via the development of international standards (ISO standards). These standards are sometimes imposed by regulations, for example, in the case of dangerous activities, or health care activities. Finally, identical practices and controls already exist for asset management and risk control in the regulatory provisions currently in place for the financial sector.

We can therefore conclude that the organizational model currently in use has been designed and developed over several decades in order to prevent human error, and in particular the first two types of error (SB and RB).

However, the template for this organization is the machine. Only with machines can all procedures be perfectly identified and recognized, and proactively controlled in a predefined manner.

Clearly, this type of organization has been shown to respond well to preventing human error, but only limited forms of error. Moreover, it has not been developed to address all the sources of deficiency or uncertainty that can arise in financial companies.

22.3 Concretely, What Do We Need to Do to Transform a Company into a Machine?

First of all, in the interest of clarity, we set out the definition of a machine.

A machine is a device or a group of devices used to transform strictly defined inputs into planned outputs using controlled processes.

Techniques have been developed enabling an increasingly refined control of operations and their sequencing, limiting human intervention as far as possible, with the result that certain operations are referred to as robotic or automated.

This very same approach, applied to all financial company activities, is either in place or in the process of being implemented by all participants, professionals, and regulators. It can be described as totalizing for two reasons: it covers all actions, whether material services (often known as operational) or those relating to pricing, the counterpart to commitments, and the assessment of the risk they bring to the establishment; and its goal is to reduce risk to a single measure, for which the board of directors is responsible.

To do this, the large amounts of information that make up the inputs need to be gathered (the famous data). The focus will be on processes, data, internal control, and due diligence. Every effort needs to be made to make each activity as transparent as possible; no data can be missing. Each of these procedures is subject to laws and rules, and to the circulation of best practices with which, regardless of the theoretical legal force of the text, the professional needs to comply, either by conviction or out of inertia faced with the need to “comply or explain.”4

Next, we need to develop the machine itself. It needs to process the data and produce meaningful results. What are the challenges for such a machine once the calculations have been made in accordance with the planned processes?

How can we judge whether the outputs are satisfactory? How can we continually improve the machine? Because, naturally, we would imagine that an a priori imperfect system must necessarily evolve and adapt to new developments.

The designers of such a machine thus face two main concerns:

  • The first is detecting an error or a noncompliance compared to the expected results.
  • The second is the ability to continually improve the quality of the results.

The overall framework to be implemented may thus be schematized as follows: it should comprise two stages, one producing the outputs, the other analyzing them. If a noncompliant result is obtained, this should trigger a so-called feedback effect, whose goal is ideally not just to reject the output but also to improve the first-stage processes.

We first simply introduce what is known in the manufacturing industry as control. According to the definition given in Larousse

“A control system is a subservient mode of operation in which the variable to be controlled tends towards a reference value.”

It should be noted, somewhat ironically, that in a Francophone context the word for control, “régulation,” transposed from the English “regulation” (which according to the Oxford Dictionary simply means a “rule or directive made and maintained by an authority”), has replaced the previously used term of administrative control. By adopting this term for activities such as financial services, the Francophone legislator has perhaps involuntarily revealed its search for an automated, physically regulated mode of operation.

Control is the cornerstone of productivity and automation in the manufacturing sector. But the important technical word here is “subservience” (in French, asservissement, the standard term for closed-loop servo control), which describes the relationships between processes. We can therefore identify two types of systems in a complex machine: the governed system and the governing system.

For example, the centrifugal “flyball” governor used in steam engines to simply control the speed of the engine is the governing system, and the vast boiler and the connecting rods are the governed system.

A considerable part of engineering sciences involves creating subservient mechanisms to make the particular machine as stable as possible: in other words, according to the textbook definition, to ensure “that the free response of the system tends towards zero at infinity” (Papanicola, 2010).
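This stabilizing property can be sketched numerically. The following is a hypothetical proportional controller; the gain, set point, and starting value are illustrative assumptions, not drawn from the chapter:

```python
# Minimal sketch of "subservience" (closed-loop control): the governing
# system repeatedly nudges the governed variable toward a reference value.

def run_control_loop(x0, reference, gain=0.5, steps=50):
    """Simple proportional controller: at each step the governing system
    corrects a fraction of the gap between the variable and its set point."""
    x = x0
    trajectory = [x]
    for _ in range(steps):
        error = reference - x    # deviation observed by the governing system
        x = x + gain * error     # corrective action on the governed system
        trajectory.append(x)
    return trajectory

traj = run_control_loop(x0=100.0, reference=60.0)
# The "free response" (the gap to the reference) decays toward zero,
# as in the textbook definition of stability quoted above.
```

With a gain between 0 and 2 the gap shrinks at every step; outside that range the same loop diverges, which is why tuning the subservient mechanism is an engineering discipline in its own right.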

But this is not sufficient, because the aim, where possible, is to adapt the process itself so as to eliminate the source of the disparity, on the justified grounds that simple subservience will not always suffice. This property is known by cyberneticians as autoadaptivity.

22.3.1 Cybernetics

Our research now needs to turn to cybernetics. We begin with a reminder of the purpose of cybernetics.

A definition: Cybernetics is the science of control and communication processes and their regulation in animals, machines, and in sociological and economic systems. Its main purpose is to study the interactions between “governing systems” (or control devices) and “governed systems” (operational devices) controlled by feedback processes.5

This is exactly the intellectual tool that a regulator would need in order to control a priori the quality of the devices that he or she wanted to implement to organize companies, individually and collectively, into a social machine.

A founder: The American mathematician Norbert Wiener is one of the founding fathers of cybernetics. He studied these issues, and popularized them through three main works: “Cybernetics, or Control and Communication in the Animal and the Machine” (1948)6; “The Human Use of Human Beings” (1950)7; and “God & Golem, Inc.: A Comment on Certain Points where Cybernetics Impinges on Religion” (1964).8

The analogy between the financial machine put in place to comply with prudential regulations and the automated translation processes that he studied appears entirely appropriate.

Indeed, for Wiener,9 an automated translation machine is the textbook example of a machine in real need of autoadaptivity.

In other words, a machine that needs to integrate a self-learning mechanism, which in the course of its operation should enable the machine to continually improve the quality of its output.

The mechanism can be schematized as follows (Figure 22.1).


Figure 22.1 Self-learning mechanism.
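The two-stage mechanism can also be sketched in code. This is a hypothetical toy example: the production rule, the evaluation criterion, and all the numbers are illustrative assumptions:

```python
# Sketch of the self-learning mechanism: a first stage produces outputs
# from a tunable process, a second stage judges them against a criterion
# external to the production rules, and a noncompliant result feeds back
# into the first stage to improve it.

def make_output(data, scale):
    """First stage: the governed production process."""
    return [x * scale for x in data]

def criterion_ok(output, target_mean, tolerance):
    """Second stage: an evaluator applying a criterion independent of the
    production rules (here, closeness of the output mean to a target)."""
    mean = sum(output) / len(output)
    return abs(mean - target_mean) <= tolerance, mean

def autoadaptive_run(data, scale, target_mean, tolerance=0.01, max_rounds=100):
    """Feedback loop: each rejection adjusts the first-stage process itself."""
    for _ in range(max_rounds):
        output = make_output(data, scale)
        ok, mean = criterion_ok(output, target_mean, tolerance)
        if ok:
            return scale, output
        scale *= target_mean / mean   # feedback: improve the process
    return scale, output

scale, output = autoadaptive_run([1.0, 2.0, 3.0], scale=1.0, target_mean=10.0)
```

The essential feature, as Wiener's argument below makes clear, is that `criterion_ok` must be definable independently of `make_output`; if the judge and the producer share the same rules, the feedback loop can only confirm itself.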

One of the essential points is the design of the control device. However, according to Wiener,10 this control device can be defined in two, and only two, ways:

This will involve one of two things:

  • either a complete set of objectively applicable rules determining when a translation is good,
  • or some agency that is capable of applying a criterion of good performance apart from such rules.11

We examine three different devices below:

  1. An industrial machine: This falls into the first case. The mechanic-designer has a complete set of objective rules enabling him to manufacture his item. These rules are made up of technical specifications and plans. Result: a control device is feasible.
  2. A translation machine: This is the example proposed by Wiener. It falls into the second case, as there is no complete set of objective rules for translating a text. Linguistic sciences are incomplete. We can, however, make use of an expert (a translator) who will apply a criterion distinct from the implementation rules (process) used by the translation machine: “The normal criterion of good translation is intelligibility,”12 which means that the text must have the same meaning regardless of the reader. The expert can validate the translation, make corrections, and enhance the device. Result: we can construct an autoadaptive translation machine.
  3. The financial machine is constantly evolving. Clearly, all participants want to use the most efficient and mechanized “control device” possible, in particular to be able to run increasingly large series of operations. It is therefore clear that effort is primarily focused on defining a body of objective rules, to be implemented via regulations, in order to control the models as effectively as the results.

However, financial science, like linguistic science, is incapable of producing such a body of objective rules.

Thus, to comply with the second criterion, we need to introduce a competent agent possessing a criterion separate from the output delivery rules.

Currently, risk is the main criterion used and imposed by western regulations. The acceptability of its level is left to the judgment of the expert or the person taking responsibility for it.13

Is it distinct from financial sector output delivery (or execution) rules? Is it measurable? These are the fundamental questions asked by the cybernetician to assess the suitability of the control device.

We thus come to the main subject of our study and research – risk measurement.

Precisely, and this is a genuine matter for reflection, the overall device14 and its best execution criterion (risk and its measurement) are constructed and expressed from the same elements that have enabled the activities to develop and led to the current conglomeration of institutions: financial science and the powerful calculation capabilities provided by computer technology.

However, as highlighted in the quotation above, the criterion must be independent of the processes and the automatic execution rules.

Thus, intelligibility does not form part of the mechanics of translation; it merely allows the result to be judged.

In fact, financial science, which underpins the activities, also provides regulators with the means of constructing a regulatory framework considered as acceptable. It is developed from the same conceptual frameworks, paradigms, and models. The calculation tools used to create products (outputs) and fix prices are identical to those that calculate the risk and fix the limits for such products.

This tendency to reuse tools is even stronger given that the professionals themselves have promoted self-regulation, winning over extremely senior figures such as Alan Greenspan in his time, or the Basel Committee. They have therefore favored this route themselves. Going even further, participants are no longer even aware that these tools are one and the same.

22.3.2 Cybernetician's Diagnosis of Financial Regulation

The result of this analysis: the device applied to financial structures, intended by regulators to be autoadaptive, meets neither of the two criteria required for cybernetic tools.

At the very least, the cybernetician will be tempted at this stage to consider that the current regulation of the financial system suffers from an irremediable design error. Under such conditions, this regulation naturally has every chance of failing to achieve its objectives satisfactorily, and of generating potentially harmful, or even extreme, unexpected effects.

22.3.3 Observed Effects of the Dependency Between the Best Execution Criterion and Activity Process Design Rules

In fact, the dependency between the best execution criterion of the mechanized financial system and the design of the processes generates two principal distortions observable today, which would not have existed had there been independence:

  • The first distortion, caused by using identical concepts, prevents the regulator-official from taking a global or organic approach to the financial system.

    In fact, the professional does not think in terms of system, but in terms of activities, while the regulator-official should give priority to the system, independently of activities.

    By using the professional's conceptual framework, the official is only able to visualize the financial system as an aggregation or addition of entities, and will therefore have a mainly aggregative vision, rather than an organic vision. The security of the overall system is thus reduced to ensuring the individual security of each entity. This immediately leads to an overwhelming tendency to favor an oligopolistic mode of organization, which facilitates “moral hazard” situations, the painful and costly consequences of which have been in evidence since 2008.

  • The second distortion is that accepting devices for regulatory purposes enshrines paradigms, theories, and models for systemic uses in the social field, although they may not necessarily have been designed for this purpose.

    All models, including those relating to activity, thus become untouchable and beyond criticism; worse, those who do not apply them are considered unprofessional. This statement is not an exaggeration. Despite his colossal success, Warren Buffett is periodically criticized in this way. His method of financial management does not comply with financial theory standards, and his fund was even downgraded by S&P in May 2013.

22.3.4 What Thoughts Do We Have at This Stage?

In any case, is it not sensible to reliably automate certain tasks that we could, perhaps, consider as simple? This allows better control of the continuing growth and the market concentrations already achieved. Given that there is no point in seeking perfection, could we not accept, in cost-benefit analysis terms, the two distortions highlighted above? Could we not consider the subservience endured by teams as a necessary evil for the benefit of all?

Perhaps, or why not?

Nevertheless, it is important to not only provide an answer in principle, but above all to analyze the system, both in its current form and as it develops, in order to come to a conclusion as to its quality.

22.3.5 An Increasingly Rigid System

And yet, the combined, and already visible, effect of the two distortions highlighted is to make activities increasingly rigid.

In fact, the first distortion, an aggregative vision, leads organizations, businesses, and regulators to concentrate on the device rather than the substance. It focuses attention on collecting information rather than on the relevance of the information. Professionals become more sensitive to the gadget nature of the device, as previously shown by Norbert Wiener. The loss of risk control becomes all the more likely given that this change in the focus of attention will reduce professionals' vigilance and ability to deal with actual risks.

This mechanization, and the standardization that frames it, thus create a series of devices that facilitate the abrogation of responsibility, or even the de-skilling, of the various actors in an often-described context where complying with the form quickly takes precedence over issues of substance.

In addition, standardization, whether regulatory or professional, forces imitation, which is always dangerous in periods of market shocks. Indeed, instead of being absorbed, the effects of even limited variations are at times amplified by reactions that have become identical. In the case of shocks, a so-called resonance phenomenon may arise, as happened to a famous bridge over the River Maine in France, which on 16 April 1850 collapsed under the rhythmic marching of a regiment (Duban, 1896), although it could easily bear the weight of the men and their equipment. In the elegant parlance of the statistician, we would in this case refer to instabilities caused by the sudden appearance of unforeseen correlations.
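The statistician's point can be illustrated with elementary arithmetic. In this hypothetical sketch (the number of actors and the size of each reaction are assumptions), independent reactions partly cancel, so the aggregate swing grows only as the square root of the number of actors; identical reactions add up, so it grows linearly:

```python
# Illustration of "unforeseen correlations": the aggregate impact of many
# actors' reactions, independent versus perfectly synchronized.
import math

N = 100       # number of actors
sigma = 1.0   # standard deviation of each actor's individual reaction

# Independent reactions: variances add, so the aggregate standard
# deviation grows as sqrt(N).
aggregate_independent = sigma * math.sqrt(N)

# Identical (perfectly correlated) reactions: deviations add directly,
# so the aggregate standard deviation grows as N.
aggregate_synchronized = sigma * N

amplification = aggregate_synchronized / aggregate_independent
# With 100 actors, synchronization amplifies the aggregate swing tenfold.
```

A shock that a diverse population would absorb can thus resonate through a standardized one, exactly as the marching regiment turned an easily bearable load into a destructive one.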

Pragmatically, we thus have every right to ask whether the system creates situations where extreme events can occur. This financial-system-wide reflection can already be confirmed by the misadventures that have befallen automated order placement mechanisms such as flash trading or high-frequency trading.

In terms of banking regulation, observers of the 2008 crisis, such as the FSB,15 revealed the disastrous systemic effects of the synchronized generalization of practices judged to be appropriate at the individual level. The FSB thus showed that the mechanism for adjusting commitment guarantees according to variations in default risk and the market price of collateral (collateralization), undertaken by all players at the same time, dramatically amplified the effects of market movements.
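This amplification mechanism can be sketched as a simple feedback loop. The following is a hypothetical toy model, not the FSB's: the shock size, market-impact coefficient, and number of rounds are illustrative assumptions:

```python
# Sketch of the procyclical collateral spiral: a price fall triggers margin
# calls, forced sales deepen the fall, which triggers further calls.

def collateral_spiral(price, shock, impact_per_sale, rounds=10):
    """Each round, the collateral shortfall (proportional to the cumulative
    price drop) forces sales whose market impact lowers the price further.
    All players are assumed to adjust simultaneously."""
    initial = price
    price -= shock                    # initial market movement
    for _ in range(rounds):
        shortfall = initial - price   # collateral gap to be covered
        forced_sales = shortfall      # synchronized liquidation
        price -= impact_per_sale * forced_sales
    return price

# A modest 2-point shock, amplified by synchronized collateral adjustment:
final_with_feedback = collateral_spiral(price=100.0, shock=2.0, impact_per_sale=0.1)
final_without_feedback = 100.0 - 2.0
```

In this toy model the cumulative drop grows geometrically with each round of synchronized margin calls, which is the mechanical core of the FSB's observation: practices that are prudent for one player become destabilizing when every player applies them at once.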

Overall, this mechanization, and the standardization that frames it, thus create a series of devices that make the structures rigid. It does reduce the frequency of limited-scope incidents, but by removing responsibility from the various actors and subjecting them to regulatory synchronization, the risk of extreme events, or even ruptures, increases substantially. We could therefore say that this organization reduces daily volatility but increases the risk of serious disturbance.

The second distortion, the enshrining of rules, facilitates what is known in cybernetics as the homeostasis of the financial system; in other words, scientific knowledge is part of the group of rules or ideas that are unaffected by external events. Thus, irrespective of the events encountered and of the results of the system, the paradigms, models, and conceptual frameworks are no longer called into question. On this aspect, which depends more particularly on the scientific community, we need only recall the words of Norbert Wiener in God and Golem Inc.:

The growing state of the arts and sciences means that we cannot be content to assume the all-wisdom of any single epoch. … In a period of relative stability, if not in the philosophy of life, then in the actual circumstances that we have produced in the world about us we can safely ignore new dangers such as have arisen … Nevertheless, in the course of time we must reconsider our old optimization, and a new and revised one will need to take these phenomena into account.

Homeostasis, whether for the individual or the race, is something of which the very basis must sooner or later be reconsidered. This means … that although science is an important contribution to the homeostasis of the community, it is a contribution the basis of which must be assessed anew every generation or so. … Permanent homeostasis of society cannot be made on a rigid assumption of a complete permanence of Marxianism, nor can it be made on a similar assumption concerning a standardized concept of free enterprise and the profit motive.

And Wiener concludes:

It is not the form of rigidity that is particularly deadly so much as rigidity itself, whatever the form.

22.3.6 A Degenerating and Increasingly Fragile System

The consequences of this increasing rigidity are obvious for the cybernetician: the risk of deadlock, reduced adaptability and, in the end, a weakened device that becomes increasingly unable to endure shocks. Rigidity should therefore clearly be proscribed as far as possible.

But, worse, after a certain point, degeneration may occur. How can this be defined? It is the situation in which the system produces the required effects on an increasingly infrequent basis, and where undesirable phenomena appear. Participants are thus increasingly unwilling to adhere to the system, which as a result becomes fragile.

This degeneration occurs rapidly, and the route is simple. In fact, the overall device inevitably shows flaws, some of which may strengthen the demand for effective reforms. Where and how should attention be focused? Clearly not on the homeostasis validated by financial science; instead, attention turns to the data-collection devices and the technocratic framing of activities.

In fact, if the principles cannot be called into question, the error or the undesirable phenomenon can only result from a lack of data, an error in that data, or a calculation whose scope is considered too limited to measure risk. The authorities therefore seize the opportunity to impose new rules, interpretations, or additional monitoring.
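This logic of "the model is right, the inputs were too limited" can be made concrete with a small numerical sketch. The example below is illustrative and not the chapter's own calculation: it compares a parametric (Gaussian) Value-at-Risk with the empirical quantile of simulated heavy-tailed returns. The Gaussian figure looks precise, yet it understates the tail loss; within the homeostatic logic described above, the response would be to refine the data feeding the model rather than to question the Gaussian assumption itself.

```python
import random
from statistics import NormalDist, mean, stdev

random.seed(0)

def student_t(df):
    """Heavy-tailed draw via a Student-t variate: Gaussian / sqrt(chi2/df)."""
    z = random.gauss(0, 1)
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(df))
    return z / (chi2 / df) ** 0.5

# Simulated daily returns with fat tails (Student-t, 3 degrees of freedom)
returns = [0.01 * student_t(3) for _ in range(100_000)]

# Parametric 99% VaR: fit a normal distribution and read off its 1% quantile
mu, sigma = mean(returns), stdev(returns)
var_gauss = -NormalDist(mu, sigma).inv_cdf(0.01)

# Empirical 99% VaR: the actual 1st percentile of the simulated losses
var_emp = -sorted(returns)[int(0.01 * len(returns))]

# On heavy-tailed data, the Gaussian model understates the tail loss
print(f"Gaussian VaR: {var_gauss:.4f}, empirical VaR: {var_emp:.4f}")
```

The gap between the two numbers is not a data problem; it is a model problem, which is precisely what a homeostatic system cannot acknowledge.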

Overall, the successive accumulation of rules means that they lose their relevance. They increasingly separate practices from concrete realities. The devices move further and further away from the initial principles. They maintain their initial form, but without their initial effectiveness.

My analysis is that, for financial activities, we have already entered this degeneration phase.16 This analysis supports the identical diagnosis made by Niall Ferguson in his lecture series, initially presented for the BBC, and collected in the book, The Great Degeneration: How Institutions Decay and Economies Die (Ferguson, 2012) – quite a programme! He extends the same judgment to what he identifies as the pillars of the Western system, the free market and representative democracy.

22.3.7 So What Should be Done?

Faced with this situation, there is no question of giving in to such pessimism. On the contrary, it is a historic opportunity for researchers to make their mark by updating paradigms and models, and to combat the homeostasis of our system, the source of this dangerous degeneration. The crisis has weakened justifications, and facilitates questioning of the current situation.

This questioning evidently finds applications in extremely wide fields. It is likely to affect the three fields that form the basis of the financial system – business models, to begin with, but also prudential and accounting models.

  • Activity: Because theoretical studies have a concrete impact. They have helped to create markets in, and the trading of, previously inaccessible subjects or concepts. As a result, traders, using the Black–Scholes formula, sell and trade volatility; managers have risk indicators that assist them in their decision making, and so on.
  • Prudential standards: Because financial system activity forms part of general economic activity. It serves the wider economy as much as it makes use of it. It offers services and puts sums at stake, both of which are activities that need to be controlled at both the company level and socially, especially since companies in this sector are not subject to the shared indicator of difficulty common to all other businesses: the cash-flow crisis.
  • Accounting: Because the output of interventions always takes place in the context of a limited liability company. Although this legal form enables risk taking, its main constraint is that it is required to establish and respect rules for sharing cash, the concrete product of its success, between shareholders, employees, and the company itself as a legal entity.
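The first bullet's point that "traders, using the Black–Scholes formula, sell and trade volatility" can be made tangible with a minimal sketch of the formula itself. Because the call price rises monotonically with the volatility parameter, quoting an option price is equivalent to quoting a volatility, which is why practitioners speak of trading volatility. The numbers below are illustrative inputs, not market data.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """Black–Scholes price of a European call.

    S: spot price, K: strike, T: time to expiry in years,
    r: risk-free rate, sigma: annualized volatility.
    """
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Illustrative inputs: at-the-money call, one year to expiry
price = black_scholes_call(100, 100, 1.0, 0.05, 0.20)  # roughly 10.45
```

Since a higher sigma always yields a higher price, the quoted price can be inverted into an "implied volatility," the quantity that is actually bought and sold.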

The system will be all the more robust when these three devices are structured by sound, homogeneous, and mutually compatible concepts.

There is therefore no cause for concern, as the boldness required is proportionate to the new scientific tools that are continuously under development. Furthermore, Norbert Wiener came to the same conclusion when he stated:

No, the future offers little hope for those who expect that our new mechanical slaves will offer us a world in which we may rest from thinking. Help us they may, but at the cost of supreme demands upon our honesty and our intelligence. The world of the future will be an ever more demanding struggle against the limitations of our intelligence, not a comfortable hammock in which we can lie down to be waited upon by our robot slaves.

Wiener (1966)

References

  1. Duban, C. Souvenirs Militaires d'un Officier Français, 1848–1887. Second ed. 1896.
  2. Ferguson, N. The Great Degeneration: How Institutions Decay and Economies Die. Penguin Books; 2012.
  3. Gorz, A. Métamorphoses du travail. Galilée; 1988.
  4. Papanicola, R., editor. Sciences industrielles pour l'ingénieur. Ellipses; 2010.
  5. Reason, J. Human Error. Cambridge University Press; 1990.
  6. Rodarie, H. Dettes et monnaies de Singe. Second ed. Salvator; 2012.
  7. Rodarie, H. Réformes financières: Progrès ou dégénérescence? Policy Paper 286, Robert Schuman Foundation and Le Cercle des Echos; July 2013.
  8. Walter, C., editor. Nouvelles normes financières: s'organiser face à la crise. Proceedings of the First SMABTP Scientific Symposium. Springer; 2010.
  9. Wiener, N. God & Golem, Inc.: A comment on certain points where cybernetics impinges on religion. MIT Press; 1966.