4
Supply and Demand in Communications

Network operators provide the architecture of and access to next-generation networks (NGN). Different types of communication are available on the Internet, and financial services see this as an opportunity to diversify their applications. The range of applications offered by multiple players is supplemented by numerous competing offers.

4.1. Providers and customers

4.1.1. Service provisions

4.1.1.1. Overview

The communications services market cannot be considered a market that is freely open to the choices of users. Making every technology available everywhere at the lowest price runs up against natural obstacles.

Since the creation of national telecommunications networks in around 1890, technical adjustments to the required equipment have been gradually superseded by a pattern in which new generations of equipment are introduced every decade, defined and approved by international standardization. While there have been clear improvements in the provision and quality of service, the rationing imposed by the high cost prices of equipment and network management has fortunately been reduced (Appendix 10).

Faced with innovations, user demand is never expressed clearly and completely. Development centers, for their part, work on the varied concerns of users with relatively long-term goals. They often prioritize technical performance requirements to the detriment of ergonomics suited to the wide variety of customer profiles. The overview of the main communications services given below shows that, in the current state of technology, the meeting of supply and demand can create disagreements of varying significance.

4.1.1.2. Voice services

For over a century, technological developments have steadily improved the quality of voice signal transmission. The technology now seems to have reached a technical optimum that enjoys global consensus. International agreements have stabilized the definition of the transmitted voice band around two specifications: the 300–3,400 Hz range (instead of the 200–2,100 Hz used for American speech in the 1930s) and the 150 Hz to 7 kHz range for enhanced bandwidth (known as HD, high definition), which should become essential in the future.

In digital technology, several coding standards are in use in wired and radio-based networks (VoIP). The different versions of these coding standards and the conversions between them seem to satisfy customers overall, and whatever remains hidden from users belongs to the operators’ own domain. That said, the choice of conversion systems between voice standards is a problem for operators, given the real costs of installing equipment in all network centers. For example, in recent years, many operators have hesitated to set up the VoIP over LTE option (VoLTE) because it has been relatively expensive in comparison with simpler solutions, and demand has remained low (section 5.4.5.1) [BAT 14]. Indeed, LTE began carrying VoIP in packet mode on radio networks at a time when all operators were still using circuit switching for voice services.
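To make the orders of magnitude concrete, here is a back-of-envelope sketch (in TypeScript, purely illustrative) of how a voice band translates into a digital bitrate; the codec figures quoted in the comments (G.711 at 64 kbps, G.722 at 48–64 kbps) are standard values and are not taken from the text above.

```typescript
// Back-of-envelope PCM bitrate arithmetic (illustrative only).
// Narrowband telephony (300–3,400 Hz) is classically sampled at 8 kHz with
// 8 bits per sample (G.711); wideband "HD voice" (up to 7 kHz) needs a
// 16 kHz sampling rate, which G.722 then compresses to 48–64 kbps.

function pcmBitrateKbps(samplingRateHz: number, bitsPerSample: number): number {
  return (samplingRateHz * bitsPerSample) / 1000;
}

console.log(pcmBitrateKbps(8_000, 8));  // 64 kbps: the classic narrowband channel
console.log(pcmBitrateKbps(16_000, 8)); // 128 kbps raw; wideband codecs bring this back down
```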

For economic reasons, the logical development would be to only retain the wideband HD voice standard in networks and to reduce the number of options used in radio systems. In general, users are currently satisfied by the sound quality they receive and their requests largely focus on the prices to be paid (fees relating to the terminal, to subscriptions and to the rate of usage).

4.1.1.3. Written services

For many years, the transmission of written messages received particular treatment in networks, as demand was at once very low in volume and absolutely essential to certain clients: the State, the original owner of every aspect of the network, and the leaders of industry and trade.

From the outset, in 1880, a dedicated network was created for telegraphy. It developed in several stages into the automatic Telex network, reserved for State services and businesses (1946), and into a network open to the public, with the telegraph service delivering sealed messages (“blues”) to homes. Despite the existence of different national standards, various efforts were made by administrations between 1970 and 1980 to increase the speed of transmission (telegraphy service at 200 symbols/sec; Teletex service at 2.4 kbps). Due to the high prices of terminals and subscriptions, these commercial offers were never particularly successful: demand did not respond to the supply offered.

Nevertheless, in 1977, a study on the standardization of Telex in university libraries was carried out. It underlined the need to plan for “the extension of computer networks”. Telex’s half-duplex mode of transmission between the caller and the party called likely inspired the development of the Minitel terminal (1982), which was very warmly received by the public. Indeed, 9 million Minitel devices were in use in 2000, in comparison with the peak figure of 134,000 Telex terminals recorded in France in 1986.

The major technological innovation, which weakened the Telex network, was certainly fax. Created in 1848 as “automatic telegraphy” and successively modernized in 1920 and 1970 through the availability of new components, fax was a major development for the unilateral transfer and reproduction of hand-written or typed pages in black and white with a medium-quality definition. This service was offered to business users in various forms (Publifax, Viafax and HomeFax). The million and a half fax machines in use in France in 1993 have been gradually replaced by file transfer applications and computer messaging services on the Internet.

Some ill-fated offers were made to business users, such as Group 3 fax machines that were operationally incompatible with ISDN private branch exchanges (PBXs). Group 4 fax machines were largely unsuccessful, as was Teletex, designed to become a “super telex”. However, fax machines that provide verification of each of the pages transmitted are still in service for some dedicated applications (transmission of files between the judiciary and attorneys, for example).

4.1.1.4. Data services

Data services were long provided over leased lines directly connecting the two distant sites involved, with the aid of modems adapted to the characteristics of these connections. Businesses, particularly those in the written press, were affected by the high rental costs of these lines (around 100,000 lines with data rates between 9.6 and 48 kbps were rented in France in 1980). Lacking adequate network knowledge, many businesses were caught off guard by the digital connection offers made by operators, and “expert” consultants have not always known how to reconcile the features of leased lines or of the available network protocols (Frame Relay, ISDN, X.25, ATM) with the technical and economic priorities of businesses.

With technology assisting the development of offers, European organizations suggested constructing national data networks to which businesses could connect. Circuit-switched (Datex-L in Germany) and packet-switched (Datex-P in Germany and Transpac X.25 in France) data networks were then conceived and built, with the latter model proving the least expensive and most profitable, as it was adapted to the data rates of all terminals.

The Internet, the network of networks, lies in the historical path of the development of packet-mode data networks. It has now taken over the transport of all data services, along with the advantages and disadvantages of its technology and its own features. Conversely, although Transpac still served 90,000 businesses for their banking transactions in 2011, it closed in 2012 in the face of competition from the Internet.

4.1.1.5. Image services

Image transmission began in 1914 with the Belinograph, which, by making it possible to reproduce images on paper, served the press around the world very well until the development of thermal paper in 1970. The broadcasting of televisual images, which began in 1931, only took off commercially in 1950, and technical developments in this field have continued ever since. The most recent developments principally concern the variety of display technologies and image formats.

Industry competition has led to a diversification of solutions related to the quality of the displayed image (4K screens with around 4,000 pixels along the screen’s horizontal line), without a significant increase in cost price and with an improvement in terminal ergonomics. Several standards have been drafted on the subtitling of television images, but few broadcasters are able to offer this facility, due to the great diversity of options in the equipment used in the broadcasting chain.

Network operators are considering the need to broaden their service provisions in broadcasting television programs and retransmitting them to other networks, including the Internet, for example. The image quality depends on the availability of high bitrates at the time of the broadcast. It appears that the only successful examples of media/telecom integration are cases where the operator has been able to retain exclusivity of premium content for its subscribers, with the agreement of the regulatory authority. In France, ambitions to acquire television sporting rights have been completely restricted.

4.1.1.6. Services for transferring funds

Having been in charge of transferring funds since the Middle Ages, postal services were naturally placed to cooperate with telecommunications services for this purpose. Telegraphic money orders, established in 1868 and boosted in 1879 by the merger of these two administrations, used the telegraph and the telephone, followed by the telex and the fax machine. In postal terms, a money order (“mandat”) is issued so that a transfer of funds may be carried out, with the service paid for through a commission charged to the sender, covering the transaction insurance costs and, where applicable, the financial exchange rates (currency conversion).

The electronic transfer of funds with a memory card is the result of a long succession of patents filed between 1968 and 1978, which led to the distribution of the Télécarte in France (1983), intended to pay for telephone calls from public phone booths. Banking cards appeared in 1998 through Gemplus and Gemalto. In 2016, remote payments are possible through cards with EMV (Europay Mastercard Visa) microprocessors used for banking applications, with the transmission of a personal code in client/server communications systems. In this case, it is again the bank issuing the card or initiating the service that handles transaction security over the network, deducting fees and any exchange costs (see section 4.3.2.2).

Since 2013, with the aim of reducing network transit fees, various offices have been offering their cooperation in order to carry out financial transfers using virtual currencies (Bitcoin, for example) created through P2P (peer-to-peer) software downloaded onto the computers of both relevant parties (see section 4.4).

4.1.1.7. The balance sheet of supply and demand

Several common points can be seen in this brief overview of the development of the primary provisions of communications services:

  • – in a little under a century, the successive stages of technological development have allowed there to be a response to the requirements brought about by a demand that was undoubtedly greater, and therefore more profitable, than could have been foreseen;
  • – the availability of good quality communications services throughout the national territory facilitates economic development and the spread of culture, without any specific metrics being able to offer support in assessing the efficiency of the factors under consideration;
  • – the business market represents only a small part of national traffic (between 5 and 30% depending on the country), and yet it is this market that drives the network developments recently agreed upon, which are then extended to all users;
  • – the availability of new components has led to the development of new technologies;
  • – the need for operators to turn toward new network protocols remains completely impenetrable to users. This attitude is explained by the desire not to confuse the public with a complicated operating mode;
  • – the corresponding protocols and equipment set-up on a network are not always used to their full potential by customers;
  • – network costs, in both investment and operation/maintenance, remain all the higher as the networks become denser;
  • – the quality of service and the security of transactions require very high cost prices, the reasons for which are not understood by the public;
  • – customers always experience great difficulty in expressing precisely which service features they would like to have, even in the short term;
  • – the economic decisions of international institutions have led to the adoption of a digital agenda for Europe, which aims to provide access at 100 Mbps for 50% of the European population and very high speeds for all (higher than 30 Mbps) by 2020.

Between 1920 and 1980, businesses that wanted their own connections could lease fixed point-to-point lines complying with international standards. These provisions, charged per day or per month, were quite expensive and could represent the price of several hours of continuous connection per working day. The key interest of this offer of leased lines could be viewed very differently by user companies: permanent availability of the connection, capability to transmit music in HD or data at 9.6 kbps, guaranteed assistance in the event of a line breakage and so on. The creation of a data network allowed this system of several tens of thousands of connections, each difficult to manage and highly expensive, to be greatly reduced. Today, technological advances have allowed costs to be reduced while proposing new solutions that are likely better suited to demand.

All in all, in the past as in the present, customers use the service provisions that are available and standardized, with the differences between operators only affecting the pricing level according to frequency of use. Service provisions can sometimes exceed the level of demand, which gives the customer the opportunity to “explore” the range of offers. On a practical level, the communications services market currently finds itself quite far from the theoretical framework of “supply and demand”, with supply being overabundant, in Europe at least, and demand not always managing to express its own needs.

4.1.2. Satisfaction of needs in communications services

The concept of supply, economically speaking, indicates the available quantity of products and services that are ready to be sold. The concept cannot be separated from demand, which is the quantity of products or services that consumers are willing to purchase. The confrontation between these two concepts shows the state of the market for a given product or service, and leads to prices being set.

Under the international economic and political agreements that have been concluded, national regulatory bodies are tasked with guarding against any abuse of a dominant position between market players and with controlling the “fair and reasonable pricing level of communications services”. In these conditions, network operators are obliged to adjust their offers to the most obvious needs of their customers, which is not always consistent with high profitability of the available resources. Competition between operators leads them to a certain standardization of their offers, sometimes to the detriment of research and innovation.

As an example, one French scientific research body constructed its own computer services, adapted to its own activities and security constraints. This service provision is available as follows:

  • secure Cloud: virtual servers on demand and according to requirements, secure hosting of websites, bulk data storage and automatic saving of workstations;
  • integrated messaging: modern features, shared, secured and integrated into other daily tools for greater ease and performance;
  • portal on a secure collaborative platform: dedicated to researchers, managers and their external partners, as well as administrative and operational bodies;
  • sharing and nomadism: synchronized multistation access to personalized files and secure “Dropbox” sharing;
  • individual videoconferencing: from workstations or mobile devices, with interactive document sharing and display management.

In this way, the computer service of this center of activity has been “custom” made, for a specific purpose, with the assistance of software adapted to its own needs and in conjunction with the connections available on the Internet. Subject to proper coordination with the company’s Director of Computer Services, the task of the network operator is reduced to a minimum. The cost of accomplishing this is likely to be quite high, and the experience gained remains within the company. The relative levels of security obtained in such a custom arrangement and in a standard operator offer cannot readily be compared.

This example is intended to illustrate that the balancing of needs in communications services can only be achieved on a custom basis by the company’s own Director of Computer Services. The optimum communications system for a hospital, a technical research center or a large retail space can only be properly planned and achieved by that director’s services. The link between a company’s IT and its communications requires close collaboration between the company and the network access provider.

4.1.3. The long tail

The concept of the long tail was introduced in 2004 by Chris Anderson, an American economic journalist [AND 06]. It concerns the statistical distribution of sales of a product or service. The expression reflects the fact that sales can be considered in two categories: one concerning a highly responsive but small number of customers (the left part of the diagram), and the other, greater in number but moving more slowly, whose total amount of purchases is higher than that of the first group (the right part). Manufacturers must therefore decide, from the conception of their product or service, which part of the market they primarily wish to address. Customers faithful to a brand and technology enthusiasts are found in the first part of the curve, while the “followers” watch the market and make their purchases through different means over a longer period of time. These two types of sales require different approaches and distribution methods.

Figure 4.1. Probability of sales distribution

Chris Anderson believes that some low demand products can collectively form a larger part of the market than best sellers, if the suitable distribution channels are able to make them known. When the cost of storage and distribution is low, marketing expenditure is also reduced. If, on the other hand, logistical costs are high, marketing cannot last very long. This concept, which supports sales systems on the Internet where the connection, referencing and dynamism of search engines are key, has been taken over by “barbarians” and will be mentioned later (sections 4.5 and 5.3.3.2).
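As a purely illustrative sketch of this idea, the following TypeScript snippet generates a power-law (“Zipf-like”) distribution of sales across a catalog and compares the revenue of the few best sellers with the cumulative revenue of the long tail; the catalog size and exponent are arbitrary assumptions, not figures from Anderson.

```typescript
// Illustrative long-tail model: item k sells in proportion to 1 / k^s (Zipf-like).
// Catalog size and exponent are arbitrary; the point is that the many
// slow-selling items can together outweigh the handful of best sellers.

const catalogSize = 100_000;   // number of distinct products (assumption)
const exponent = 0.8;          // shape of the tail (assumption)
const headSize = 100;          // the "best sellers" at the left of the curve

const sales = Array.from({ length: catalogSize }, (_, i) => 1 / Math.pow(i + 1, exponent));
const total = sales.reduce((a, b) => a + b, 0);
const head = sales.slice(0, headSize).reduce((a, b) => a + b, 0);

console.log(`Best sellers' share: ${(100 * head / total).toFixed(1)}%`);
console.log(`Long tail's share:   ${(100 * (total - head) / total).toFixed(1)}%`);
```

With these assumed parameters, the 100 best sellers account for well under a quarter of total sales, while the rest of the catalog accounts for the remainder, which is precisely the situation the long-tail argument describes.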

4.1.4. Monopoly, duopoly or competition

According to current international rules, competition is essential to market dynamics and a vibrant commercial atmosphere. As of 2016, the European Union still has 155 network operators, and consolidation operations are underway in Germany, Ireland and even in the United Kingdom, in the hope of reducing this number to 140, which is still far too high in comparison with the United States or China. It is easier to invest when there are fewer competitors, yet it is impossible to reconcile investment or innovation with pricing levels that are too low. The European communications services market is too fragmented, being led by precarious players who tomorrow could find themselves dependent on non-European players. At the beginning of the 1980s, the European Commission wished to set up, at the European level, one or two large network operators with a European (or global) dimension. It is hoped that new positive developments could soon occur to allow the European Commission to achieve this objective.

4.1.5. Billing the data rate

Technically, it is possible to measure the data rates of a network’s incoming and outgoing connections, as well as the traffic’s main characteristics; billing and regulation of these parameters would establish the basic rules. Each application currently available on the Internet has its own typical traffic profile, and billing by data rate would mean penalizing some applications for the benefit of others, and the end of the concept of “Net neutrality” (see section 4.2.5).

“Over half of the traffic routed through Internet access networks in France is issued by just five autonomous systems, corresponding to service providers, hosts or technical intermediaries, out of the roughly 70,000 which make up the Internet. The partner at the origin of direct traffic can be identified, unless it acts under the name of a third party. The level of P2P traffic cannot be assessed” (Arcep).

Within the current framework, the measurement of real traffic cannot be arranged and a pricing system according to volume would risk penalizing the small players in the digital economy.

4.2. Obligations of network operators

4.2.1. Responsibilities of a country’s main operator

In every country, the State must be able to rely on an operator for the technical resources necessary to ensure the proper functioning of its communications services and, if need be, to support its economy. A country’s main operator is subject, in principle, to clear obligations, including:

  • – tariff obligations related to geographic averaging;
  • – the obligation to offer reduced tariffs to the disabled or people in need;
  • – the obligation to have payphones for public use;
  • – the obligation to provide an information service and a directory in both printed and electronic forms;
  • – the obligation to ensure universal public service;
  • – monitoring of international standardization;
  • – organizing vocational training in communications systems.

All of these obligations have a cost, even when they sometimes offer a slight competitive advantage. This feature must be taken into consideration in financial balance sheets.

4.2.2. Public service tasks and universal service

Communications regulations cover a set of measures whose aim is the protection of the consumer. These measures have evolved over time. In the 1950s, the public service tasks entrusted to the French telecommunications service included universal service (coverage of the entire territory), continuity of service (continuous availability of staff in the event of a serious occurrence) and access to emergency services. In 1996, these tasks were redefined and the operator Orange was asked to provide a limited universal service for fixed-line telephony only (mobile telephony was not included), which focused on the following points:

  • – connection and fixed telephone service at an “affordable” rate for voice, fax and reduced rate data usages;
  • – management of an information service (118 612 in France);
  • – providing a departmental directory;
  • – access to emergency telephone services (firefighters, police, etc.);
  • – providing dedicated services and facilities for disabled people;
  • – installation and maintenance of payphones for public use (presently limited in number due to the development of mobile networks).

As of 2016, in most countries all of these considerations have evolved further with technological development and changing uses. Payphones should have disappeared by the end of 2017. Two arguments have been put forward to justify this decision in France: maintenance costs, which are close to 8 million euros per year, and an average usage that by 2014 had fallen to under one minute per day per booth.

In France, public service activities in telecommunications, intended to serve the public interest, fall into the following two categories:

  • – state communications services, which are essential to the functioning of public institutions and administrations;
  • – community-controlled communications services (infrastructure, waterways, ports, airports, hospitals and public education).

The concept of public service is guided by the “subjective” approach of the Council of State’s jurisprudence. It covers the willingness to ensure, whatever the circumstances, uninterrupted service offered to all citizens without exception and to compensate for the shortcomings of market offers (without proposing competition). The French definition of universal service, for its part, has been considered broader than the European vision [NGU 00].

4.2.2.1. The concept of universal service

Universal service is based on three main principles:

  • – availability: the level of service must be identical for all users in all places and at all times;
  • – affordability: the price of the service must not be prohibitive;
  • – accessibility: subscribers must be treated equally in terms of service, price and quality of service, in all places and at all times.

A revision of the rules governing the aims of public service and universal service may be considered in order to take account of changes to the market and to technologies, or of needs identified for sustainable practice (social service). The French territory is vast and the population is concentrated around certain centers, leaving large areas that are expensive to equip. Moreover, using digital services requires a learning process and continuous support, to which elderly or isolated people do not always have access. “Digital illiteracy”, whether partial or total, reduces the scope of public investments in favor of new technologies. In rural areas, the gradual closure of post offices and the difficulty of setting up Internet cafes inspire regret at the loss of the former cooperation between postal and telecommunications services. Agreements on international trade, however, underline the need to take the specific characteristics of each country into account in applying these principles. As a result, the regulations on the concepts of universal service and public service in communications must be adapted to societal factors, in order to take into consideration the changes brought about by technology and customer behavior and the need to support investment and national innovation. After several decades of privatization, the balance sheets of networked companies that have taken up the liberal position are far from convincing [JEA 15, VAN 15].

In the United Kingdom, the British regulatory authority Ofcom has established a high-speed universal service obligation (USO) to enable citizens to have a 10 Mbps connection wherever they live. By 2015, three and a half million households and businesses in the United Kingdom already had access to data rates of 24 Mbps or more. The United Kingdom intends to provide 95% of households with very high speed access by the end of 2017, despite numerous technical difficulties caused by geographic and demographic circumstances.

At the European level, and to those who are not legal experts, the regulations that have been developed to “protect the interests and rights of end users” seem incomplete and quite uncoordinated from one country to another. The initially worthy intentions on this issue do not seem to have been reflected in clear and simple regulations. Certainly, technical developments occur rapidly and their spread is unforeseeable. Within Europe, in 2016, universal service is understood quite differently in each of the 28 countries, and it is likely that the gradual demotion of this aim is linked to the decline of the required funding [BOU 16].

In the United States, there is currently a move toward replacing some payphones with Wi-Fi hotspots offering high speeds of 1 Gbps. More than 7,500 hotspots with ranges of 45 m will be installed in New York, equipped with screens displaying advertising and city information, and allowing users to call the emergency or municipal services. These public booths, financed by advertising (some of which will come from Google), will allow portable devices and tablets to be charged; some will even be equipped with a tablet. This development, linking network operators to advertising agencies and OTT services, could be a means of developing a broader implementation of universal service within this framework.

Public service delegation is used for the construction and management of communications infrastructure, for example to connect an island to a continent, or to create a fiber-optic network in an urban or industrial area.

4.2.3. People with disabilities and under-age children

Public facilities must be accessible to people with disabilities. Communications services, in particular, must follow this arrangement. Special terminals have been designed for people who are blind, deaf or hard of hearing. Communications services designed for deaf people or those with speech impairments are also operational among some network operators.

In the United States, Web companies must respect the Children’s Online Privacy Protection Act (COPPA), which forbids them from collecting personal information on young people under 13 years of age without the permission of their parents. In theory, the majority of social networks do not allow users to be under 13 years old. In Europe, this rule is not reflected in national regulations. In France, the prior consent of parents is only required in cases where a website collects a photograph of a minor or uses his or her personal data for advertising purposes. At the end of 2015, the European Council suggested that this age limit be raised to 16, leaving it to each member state to choose the age of consent, between 13 and 16 years of age. A European text on data protection should be adopted by mid-2016, and then written into the different laws of member states.

ISO/IEC Guide 71:2014 provides guidance to standards developers on how to address requirements and recommendations relating to accessibility, which, directly or indirectly, concern systems used in computing and communications. The ITU and ETSI have also drafted standards on this topic. While some groups of people with disabilities may benefit from new facilities in communications practices, there is presently still much to be done to make these upgrades more widely available.

The International Bureau for Children’s Rights (IBCR), an international non-governmental organization (NGO) located in Canada responsible for protecting the rights of children, is calling for public authorities to exercise greater vigilance with regard to the Internet. It has mentioned that, “the fast expansion of the Internet has not only made it easier to buy and sell sexual and pornographic services, but it has also created an environment favorable to sexual interactions. The Internet has become a place where children explore their burgeoning sexuality. Furthermore, the Internet has broadened the scope for sexual exploitation of children by facilitating their direct and anonymous contact with adult predators”.

With regard to the content of public services available on the Internet, regulatory bodies (including the CNIL in France) are demanding that specialized websites protect the personal data of children aged between 6 and 12, as well as of teenagers. The websites accessed by young people include social networks, games, educational sites, academic support sites, TV channels and news sites. Often, the websites visited are not suitable for the age of the children. Moreover, the vast majority of websites collect personal data (IP address, location, email address, geographic coordinates), and only a minority of these sites provide the option of deleting personal addresses. The majority of these websites do not propose any monitoring measures for young people, nor do they ask parents for their consent. The French body CNIL has underlined the shortcomings of monitoring and parental control in this area.

The WHO wants use of the suffix “.health” in electronic addressing to be tied to the establishment of verifiable codes of conduct, backed by effective and enforceable accountability mechanisms under the relevant government authority. These codes of conduct should address all concerns related to healthcare on the Internet: the quality and reliability of sources of information on health; trust in products, services and practices; and combating illegal activities and deceptive, malicious and underhand practices.

The Swiss “Health On the Net” (HON) foundation provides annual certification of health-based websites, founded on respect for ethical principles that include the reliability and credibility of medical and health-based information. The French National Authority for Health (NAH – Haute Autorité de Santé) is planning to propose “methodological reference points to editors of health-based Internet sites and Internet users for access to reliable information”. However, the NAH has not negotiated any convention with the HON foundation since 2013. This project should therefore be constructed in tandem with healthcare professionals, public authorities, users and editors. Nonetheless, in 2014, some 882 French-language Internet sites (out of over 3,000) still benefited from “HON Code” certification (www.hon.ch/).

The lack of Internet governance by a responsible authority currently leaves Internet users without information on the seriousness of the sites they visit or on the nature of all the content that is directly accessible to them (see section 5.1.1.5).

4.2.4. Security of transactions

Since Edward Snowden’s revelations about the PRISM scandal in 2013 (section 4.6) [BAT 14], everyone knows that the Internet is a network designed for easy communication, but that it provides no security. Every year, the press reveals dozens of significant breaches of personal or banking databases, due to the permeability of the network and the lack of precautions taken by managers and Internet users. Currently, the Internet remains a network on which security is not guaranteed and it consequently falls to users to take responsibility, pending new technical measures or the availability of more effective individual protection.

The NSA maintains that its actions are entirely within the law with regard to the wiretapping and surveillance of the Internet and mobile networks. All countries are victims of computer espionage, and the majority of American businesses are victims of cyberattacks. Most of these crimes remain little known to the public. Big Data companies, which store and manage large volumes of data, are the most exposed to cyberattacks. Tomorrow, the Internet of Things could become the target of organized cybercriminals. It is up to every business to estimate the risk precisely and protect itself against this threat.

Protections against wiretapping are faultless if they are comprehensively implemented; in such cases, the content of a message can only be read by whoever possesses the data encryption key. If terrorists are preparing a guerilla attack, the authorities tasked with monitoring messages will not be able to decode them: all they can obtain is a scrambled message. Policy makers must continue to question cybersecurity experts to find a response to this situation.

In France, since the end of 2015, the legal branch of the national police has been receiving the assistance of a unit dedicated to combating all forms of Internet-related criminality. This unit is concerned with combating digital crime, meaning that it focuses on terrorism, Internet fraud, computer piracy and bank card piracy. This unit carries out research on the conventional Web or on the “Darkweb”. It also handles many issues linked to the Blockchain with regard to smartphone applications or messages hidden in online video games.

4.2.5. Internet neutrality

4.2.5.1. Definition of neutrality

Through the phrase “network neutrality”, regulatory authorities express a principle guaranteeing the equal treatment of all data flows from the moment they enter the network. This equal treatment must be understood both technically (no discrimination with regard to the source, destination or content of the information transmitted) and economically (network capacity must be accessible at the same price for all content providers). “Net neutrality” is an important concept, as it is linked to the quality of connections (VoIP, data, video), with good quality only guaranteed at a high price.

4.2.5.2. Technical aspects of “neutrality”

The desire for “non-discrimination”, which has been expressed since the beginning of the Internet, is in fact a theory and, for around 30 years, this debate has raged without examination of the technical realities. Naturally, the provider of IP communications channels would like to have the largest possible number of satisfied customers. In order to achieve this, their varied requirements must be taken into account. There is no question of the provider favoring one or other of its friends or partners; on the contrary, it must ensure that the massive flows toward a broadcaster of television programs do not obstruct a security service provider, whose emergency alarms must not be delayed or cancelled because of temporary congestion on the network. The network operator must therefore organize the traffic into separate channels to avoid any possible difficulties, which naturally leads to different cost prices and therefore to an infringement of the neutrality principle. In fact, everything could be confidently carried out if Net neutrality were understood in the sense of “non-discrimination in the event of equal traffic” (see also section 2.2.3.2).
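As a minimal sketch of what “organizing the traffic into separate channels” can mean in practice, the TypeScript fragment below implements a strict-priority scheduler with a few illustrative traffic classes (emergency alarms ahead of interactive voice, ahead of bulk video); the class names and ordering are assumptions chosen for the example, not a description of any operator’s actual configuration.

```typescript
// Strict-priority scheduling over separate per-class queues (illustrative).
// Lower 'priority' number = served first. Real networks use mechanisms such
// as DiffServ code points and weighted queues; this sketch only shows the principle.

type TrafficClass = "emergency-alarm" | "voice" | "video-broadcast" | "best-effort";

interface Packet {
  cls: TrafficClass;
  payload: string;
}

const priority: Record<TrafficClass, number> = {
  "emergency-alarm": 0,
  "voice": 1,
  "video-broadcast": 2,
  "best-effort": 3,
};

const queues = new Map<TrafficClass, Packet[]>(
  (Object.keys(priority) as TrafficClass[]).map((c) => [c, []] as [TrafficClass, Packet[]])
);

function enqueue(p: Packet): void {
  queues.get(p.cls)!.push(p);
}

// Serve the highest-priority non-empty queue first.
function dequeue(): Packet | undefined {
  const order = (Object.keys(priority) as TrafficClass[]).sort(
    (a, b) => priority[a] - priority[b]
  );
  for (const cls of order) {
    const q = queues.get(cls)!;
    if (q.length > 0) return q.shift();
  }
  return undefined;
}

enqueue({ cls: "video-broadcast", payload: "TV segment" });
enqueue({ cls: "emergency-alarm", payload: "fire alarm #42" });
console.log(dequeue()?.cls); // "emergency-alarm" is served before the bulk video
```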

4.2.5.3. Initial organization

After lengthy discussion of this topic, American regulations have come to accept specific treatment for the broadcasting of videos on the Internet by cable operators, in return for a financial contribution and thereby organizing a two-speed Internet. In Europe, the situation is different with regard to broadcasting video, probably due to the linguistic division of the market, and in October 2015, the European Parliament voted on a text protecting the quality of the high-speed transmission services planned for teleconferencing and telemedicine. This regulatory agreement, which was adopted by each of the 28 European Union member States, leaves operators the possibility of slowing down the data rate to avoid network congestion, provided that all of the management measures are “transparent, non-discriminatory and proportionate”, and do not last any longer than necessary. This compromise also safeguards the quality of services.

4.2.5.4. Points still to be clarified

By maintaining the “ambiguity” of the concept of “specialized services” likely to benefit from priority network access, network operators will have an argument for bypassing Net neutrality.

The impact of the new provisions taken in developing the connected Internet of Things is yet to be clarified. Cases of “vital” applications with low latency times in mobile telephony do not appear to have been taken into consideration. The BEREC, which brings together European telecom regulators (section 3.2.2), has been tasked with developing “general guidelines” on Net neutrality by the end of 2016 to avoid differing interpretations of the text from the European Parliament.

To this effect, one law, currently being prepared in the Parliament, is intended to clarify the French position on European regulations. Among the major principles proposed, this act should highlight some of the key elements, such as “the freedom of innovation”. This act must also clarify the French concept of the “principle of Net neutrality”, understood as the absence of operator discrimination regarding the content on its network and the “loyalty” of computing platforms connected to the Internet.

This indicates that common rules regarding the Internet should soon be available at the European level, contributing to a single market without fragmentation. It should be noted that, given the two aforementioned principles, it would be difficult for the network operator to correctly size its links and routers if the actors in charge of applications have not previously defined the features of their traffic, since the Internet does not carry out any dynamic data rate allocation on demand or by reservation.

4.2.5.5. Tactile Internet and “Net neutrality”

On the other hand, it appears that, in certain areas, the debate on Net neutrality is somewhat outdated, with several applications requiring use of an Internet network in almost real time (to the nearest millisecond), which currently only the “Tactile Internet” is able to provide at distances under 150 km, pending the arrival of a quantum Internet.

4.2.6. Respect for personal data

In each of their actions, citizens leave traces of their activity, and these traces can be stored by the organizations they deal with every day (post office, insurance companies, shops). By the same token, Internet users leave on the network traces of their addresses, personal data and visiting patterns for different sites. The permeability and automatic memory of the Internet mean that it is easy to profile each Internet user and use these profiles for marketing operations or market research.

The detailed information of Internet users can be harnessed for competitive purposes by marketing professionals, insurance agents or the managers of GAFA companies (Google, Apple, Facebook, Amazon) and social networks. The door is open to innovative and competitive action in these areas. As it happens, Internet users are no more the owners of their personal data than are the banks they trust or their mutual health insurance plan. The paradox lies in the fact that, conversely, Internet users nonetheless have the advantage of being able to use their personal data to enhance their community’s wealth of healthcare information. Their medical history is useful to others like themselves, just as their consumption habits assist in the organization of a coordinated regional supply system. While their shopping habits are known to providers, Internet users are in some way protected against the actions of impersonators who seek to use these details to steal from them. Through the information stored, banks can uncover deceptions and block any attempted fraud. In reality, the discussion lies between the potential misdeeds of an artificial “Big Brother” and the paternalism of a benevolent “Godfather”.

The priority status afforded to client-oriented services considered to be “profitable” is a constant feature in free trade. The novelty brought by the Internet is linked to the dynamism and power of the computer tools used, which have revolutionized habits and which have significant storage and information management capabilities. It is therefore necessary for Internet users to consider both the positive and negative aspects of their personal data being used by sites that they frequently visit.

The discussion has already continued into 2016 and all evidence points toward past abuses being corrected by more restrictive future legislation. The CNIL has recently given Facebook a 3-month deadline to stop storing the data of people who do not have an account on that social network. Several regulators in the European Union have begun inquiries on Facebook’s actions toward protecting the privacy of individuals.

Within the framework of Internet neutrality, the Indian communications regulatory authority (TRAI) has forbidden mobile networks from offering free differentiated access to the social network Facebook. There is no reason, according to the TRAI, to offer data services at different prices on the same network.

Moreover, Facebook, due to its paternalistic attitudes, has suffered setbacks in other countries. It is intermittently blocked in several countries in Asia and the Middle East because its content is deemed discriminatory with regard to religion. The security of accounts and personal data has been compromised. Facebook is also contentious because it can cause disturbances in the workplace, with people using it during working hours, and because of frequent instances of rudeness from people displaying irresponsible behavior. Finally, like most OTT players, Facebook carries out undisguised tax evasion practices.

4.3. Remote payments

Within communications applications, this sector of activity has suddenly grown to an enormous scale, not because of the sums of money involved, but due to the speed and security of money transfers that are now possible, even on the move or in developing countries.

4.3.1. Currency and remote payment

4.3.1.1. The triple role of currency

Since the days of Aristotle, currency has been defined by a triple function: it serves as a unit of account, a store of value and a medium of exchange, and it is guaranteed by each sovereign State. Currency allows the payment of goods and services by virtue of being legal tender, which is the result of a set of social, economic and political conventions based on mutual trust between partners. Economic theorists, including Adam Smith, Ricardo and J.-B. Say to name but a few, reduced the role of currency to that of a simple intermediary agent in economic exchanges. More recently, J. M. Keynes underlined its power in the economy. Currency is a store of value and a social commodity that, through its circulation, galvanizes the employment level and growth of a country.

Different forms of currency can be used at the same time in the same territory. Currency areas are jointly defined by government authorities. The International Monetary System (IMS) manages exchanges of national currencies between agreed parties, which today are no longer linked to a gold or monetary standard (Jamaica Agreements, 1976), but to bilateral floating values determined by the IMF.

In support of regulatory texts, national authorities can be authorized to create additional or replacement currencies to be used for a limited period (vouchers, meal tickets, treasury bills, future values, promissory notes, stamps and transfer forms from the post office or for stamp collectors, telephone tokens, plastic cards for phone calls of a fixed duration). A currency’s credibility relies on a general acceptance, the threshold for which is connected to the aims of the community that uses the currency. Banks, the post office, communications network operators and businesses create currencies. There is a strong trend to create a dematerialized currency that is able to avoid tax authorities, control and various taxation systems, which are always considered excessive.

To illustrate the importance of this market, PayPal’s online payment activities have connected 175 million consumers with 13 million merchants. Because of the development of mobile phone technology, this customer–retailer relationship can be enhanced further.

Monetary circulation is created by the credits that support borrowers’ activities. However, as far as currency creation is concerned, banks cannot pass a certain threshold, with risky assets not permitted to exceed a percentage determined by the bank’s equity [COU 14, PLI 13].

4.3.1.2. Special drawing rights

The IMF decided that from October 1, 2016, international exchanges, after having used the reference points of dominant countries (Pound Sterling, Deutschmark, Yen, Swiss Franc, CFA Franc), would be based on a “reference basket of currencies” defining the special drawing rights, composed as follows: 42% in US Dollars, 31% in Euros, 11% in Chinese Yuan (or Renminbi), 8% in Japanese Yen and 8% in Pound Sterling. All countries in the world use this reference for the international trade of goods and services, and it has the advantage of being a relatively stable temporary arrangement.
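A simplified worked example of why such a basket is relatively stable is sketched below in TypeScript: the percentage weights come from the text above, while the exchange rates are purely hypothetical placeholders (the IMF in fact fixes currency amounts rather than percentages, so this is only an order-of-magnitude illustration).

```typescript
// Simplified illustration of a currency basket: amounts are fixed from the
// percentage weights at initial (hypothetical) exchange rates, then the basket
// is revalued when one rate moves. Weights come from the text; rates are invented.

const weights: Record<string, number> = { USD: 0.42, EUR: 0.31, CNY: 0.11, JPY: 0.08, GBP: 0.08 };

// Hypothetical USD value of one unit of each currency at the reference date.
const initialUsdPerUnit: Record<string, number> = { USD: 1.0, EUR: 1.10, CNY: 0.15, JPY: 0.009, GBP: 1.30 };

// Fix the currency amounts so that the basket is initially worth 1 unit of account.
const amounts: Record<string, number> = {};
for (const ccy of Object.keys(weights)) {
  amounts[ccy] = weights[ccy] / initialUsdPerUnit[ccy];
}

function basketValue(usdPerUnit: Record<string, number>): number {
  return Object.keys(amounts).reduce((sum, ccy) => sum + amounts[ccy] * usdPerUnit[ccy], 0);
}

console.log(basketValue(initialUsdPerUnit).toFixed(4)); // 1.0000 by construction

// A 10% fall of the euro only moves the basket by roughly 0.31 * 10% ≈ 3%.
const shocked = { ...initialUsdPerUnit, EUR: 1.10 * 0.9 };
console.log(basketValue(shocked).toFixed(4)); // ≈ 0.9690
```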

4.3.2. Electronic trade

4.3.2.1. Definition of electronic trade

Electronic trade (or “e-commerce”) uses the medium of communications networks. Due to its specific challenges, it implies a new approach to customers. Different aspects of transactions, including collecting the taxes payable in the place of delivery, monitoring transactions to protect customers and respecting national regulations, must be clarified further by national regulatory bodies.

4.3.2.2. Securing online transactions

Several solutions have gradually been developed to secure online transactions. Among these, the “tokenization” technique in remote payment solutions opens the way to possibilities that can be easily implemented and are capable of offering customers a simple, fast and secure ergonomic purchasing procedure.

The tokenization process is carried out by the card reader which, from the bank card’s number and expiration date, determines three control values that are transmitted to the Internet server via the network: the three-figure security code on the back of the card (card verification value, CVV), the expiration date and a number known as the “token”. This “token” serves to make transactions secure and provides protection against fraudulent use of payment cards. As of 2016, new bank cards have a CVV that is randomly generated and renewed every 20 min; this new CVV is displayed in a small window located on the back of the card (dynamic cryptogram).
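A minimal sketch of the tokenization principle is given below in TypeScript: the real card number (PAN) is kept only in a secure vault, and a random, meaningless token circulates in its place. The vault structure, token format and function names are illustrative assumptions, not the EMVCo or PCI specification.

```typescript
import { randomBytes } from "crypto";

// Minimal tokenization sketch: the primary account number (PAN) never leaves
// the vault; merchants and intermediate systems only ever see the token.
// This illustrates the principle, not a PCI-compliant implementation.

const vault = new Map<string, { pan: string; expiry: string }>();

// Issue a random, single-purpose token for a card.
function tokenize(pan: string, expiry: string): string {
  const token = "tok_" + randomBytes(12).toString("hex");
  vault.set(token, { pan, expiry });
  return token;
}

// Only the payment processor holding the vault can map a token back to the card.
function detokenize(token: string): { pan: string; expiry: string } | undefined {
  return vault.get(token);
}

const token = tokenize("4970100000000000", "12/19"); // fictitious card data
console.log(token);                   // e.g. "tok_3f9c..." (safe to store or transmit)
console.log(detokenize(token)?.pan);  // resolvable only inside the vault
```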

Recently, technologies for dialog between users and Internet servers have been added to Web browsers. The Ajax (asynchronous JavaScript and XML) method allows rich Internet applications to be built, offering greater responsiveness and ease of use (so-called “Web 2.0” applications). Internet browsers have been made compatible thanks to the availability of JavaScript code, which can be embedded within Web pages and executed on the client’s terminal. JavaScript, used in the Ajax method, modifies the content of Web pages and helps anonymize the process within commercial databases. In this way, the transmission of the bank card number to the payment services provider (PSP), performed using Ajax, does not pass through the merchant’s server. As a result, the card’s security code, as well as the sensitive data linked to authorizing transactions, cannot be recorded by merchants.
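The following TypeScript sketch shows the kind of client-side (Ajax-style) flow described above, in which the browser sends the card data directly to the payment service provider and only a token ever reaches the merchant; the PSP URL, field names and response shape are hypothetical assumptions for illustration.

```typescript
// Illustrative client-side flow: the browser posts card data straight to the
// PSP and hands only the returned token to the merchant's server, so the
// merchant never sees or stores the card number or security code.
// The endpoint URLs and JSON fields below are hypothetical.

interface TokenResponse {
  token: string;
}

async function payWithoutExposingCard(pan: string, expiry: string, cvv: string, amount: number): Promise<void> {
  // 1. Card data goes only to the PSP (hypothetical endpoint).
  const pspReply = await fetch("https://psp.example.com/v1/tokens", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ pan, expiry, cvv }),
  });
  const { token } = (await pspReply.json()) as TokenResponse;

  // 2. The merchant's server receives only the token and the amount.
  await fetch("https://merchant.example.com/checkout", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ token, amount }),
  });
}

payWithoutExposingCard("4970100000000000", "12/19", "123", 49.9).catch(console.error);
```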

4.3.2.3. Criticism of the standards used

A binding regulation, administered by the Payment Card Industry (PCI) Security Standards Council, has been created in order to reduce the fraudulent use of payment instruments. The PCI DSS (Data Security Standard) was established by payment card providers, and version 3.1 was published in April 2015. It specifies 12 compliance requirements, divided into six groups called “control objectives”, which represent a substantial burden (there are more than 300 “good practices”, together with equipment auditing and implementation procedures). Setting up a collaborative network for monitoring fraud allows recurring fraudulent activity to be tracked and guarantees a good level of protection.

Tokens are stored within electronic wallets and smartphones. They are at the center of numerous integrated solutions in the purchasing process. Interbank networks could certainly standardize this token method, with or without smartphones.

4.3.2.4. The international framework

E-commerce is progressing slowly. It appears difficult to reconcile identity verification with user-friendliness now that security has become a major concern. Currency exchange rates and the interbank commission payable on each transaction are also slowing down adoption of this procedure.

The European Commission would like to quickly provide Europe with a digital single market (DSM), constructed to assist European businesses in preparation for their own growth and the growth of the labor market. In 2015, 7% of European SMEs sold abroad online and 15% of European consumers bought online from outside their own country. These low percentages are explained by delivery costs, existing regulations and the different VAT rates in force in the 28 member States. Digital exchanges of cultural goods are restricted by a complicated system for dealing with copyright that is related to territoriality. Lastly, according to European Commission research, online trade has been slowed down by the low density of wired and radio-based broadband connections in Europe and the fact that Europe has fallen behind in Cloud technology and in managing massive amounts of data. Drawn in by the supply and frequency of purchasing, the expenditure of French online shoppers continues to grow and should rise to 70 billion euros in 2016.

The practice of sharing responsibility for actions undertaken in this area is still poorly defined, and the European Commission’s DG Connect seems to have forgotten to appoint someone capable of bringing the disparate written regulations into line on the technical and legal fronts.

In February 2016, the ITU-T prepared a draft standard on MFS (mobile financial services) (see Appendix 11).

4.3.2.5. Electronic wallets and mobile payment

It was the banking networks that took the initiative of launching a digital wallet. However, mobile devices appeared perfectly designed to serve as focal points for the service known as “mobile banking”. In this way, the “Orange Money” service, launched in Africa and the Middle East in 2008, has acquired, in the 13 countries where it was established in 2015, some 14 million customers, a number that should rise to 30 million in 2018. “Orange Money” handles the transfer of 500 million euros every month and also helps secure the loyalty of its customers. In 2014, this service generated around 50 million euros in revenue. The operating method requires that a set of rules on setting up “tokens” be followed. Depending on the country, banks and network operators are either competing or collaborating in this service offer, with or without recourse to the Internet. In Kenya, in 2013, the activity of “Mobile Money” represented close to 43% of the country’s GDP. In India, the three biggest mobile network operators opened bank accounts for a total of 400 million mobile subscribers. In 2015, Spanish, Polish and Turkish banks optimized their customer pathways with a videoconferencing service. The number of users of mobile banking services could double between now and 2019, reaching 1.8 billion people around the world (KPMG) and, in 2017, mobile phone applications could produce a global sales revenue of 77 billion dollars, with each mobile user able to send personalized information to more than 100 services each day (Gartner Inc.).

4.3.2.6. The case of the stock market

Stock market transfer orders placed over the Internet are often associated with software such as Acorns, Digit.co, Level Money and Mint, which allow users to quickly obtain an overview of the transactions made at the end of a connection session. However, these stock market applications present a risk to customers due to the potential vulnerability of connections established without contractual security (particularly when on the move) and, because of their success with the public, they create great volatility in pricing, with the speed of execution and the simultaneity of orders liable to cause panic among market observers.

Accused of systematic manipulation for nefarious purposes, the stock market’s trading robots (high-speed trading machines, used for what is also called “high-frequency trading”) are seen as the “barbarians” of Wall Street and act shamelessly. This technology is accused of assisting in dubious market operations, price manipulation, “shadow markets” and sharp falls in Wall Street prices, which sparks plenty of controversy [LEW 14].

4.3.2.7. Virtualized transfers

Alternative financing arrangements, which are made using prepaid bank cards, and the practice of microfinancing, or “crowdfunding”, make it possible to evade tax and intelligence services. By contrast, the Nickel account, opened at tobacconists, goes through a triple verification process for the client’s identity, and the movements on each account are monitored in real time to track suspicious behavior.

4.3.3. GAFA and online sales

Online sales greatly interest the GAFA companies, innovative Internet players with considerable economic and political power, which have memorized the personal data of a large number of Internet users and acquired dominant positions in ICT markets and on the Internet. The GAFA companies help one another to promote their services or related goods created by their subsidiaries or associated companies.

Facebook (which alone is used by a billion users per day) and Google account for 25% of the time spent by Internet users on mobile applications. For its part, Facebook, which had a turnover of 12.5 billion dollars in 2014 (78% of which was mobile advertising revenue, from 2.5 million advertisers), is testing a new type of page that would allow products to be purchased. It had already tested a “buy” button in 2014, and it officially launched money transfers on its messaging application Messenger at the start of 2015.

Google has set up a feature allowing its users to purchase products directly from the results of searches made on mobile devices, without having to go to another merchant site. Google’s economic model relies on sponsored links paid for by businesses, on which commission is earned. This system, which works on desktop computers, mobile devices and tablets, remains highly successful.

4.3.4. Contactless payment

The system of remote payment with an NFC (near-field communication) payment card uses an NFC radio link between the buyer’s card and the merchant’s terminal. A simple application, freely downloadable from the Internet, allows the confidential information contained on a bank card equipped with NFC technology to be accessed via an NFC-compatible smartphone. By placing the mobile telephone on the card, the card number and expiration date, the two pieces of information required for payment, become readable. Indeed, the MAC (message authentication code), intended to prove that the data come from a trusted third party, is poorly implemented in IT terms.

Moreover, the merchant’s terminal sends its user name in the information exchanges with the body handling the payments. This connection is encrypted, but the name can also be read on each printed receipt. Unfortunately, the password for the encrypted link is generally the same for all of the merchants connected to the same payment management organization. A hacker could uncover it in a matter of seconds using specialized software.
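To make the weakness concrete, here is a minimal sketch, assuming HMAC-SHA-256 as the message authentication algorithm, that contrasts a single password shared by every merchant with keys derived per merchant; the key derivation, the identifiers and the message format are illustrative assumptions rather than a description of any deployed payment system.

```python
import hashlib
import hmac
import secrets

# Weak practice described above: one password shared by every merchant, so
# compromising a single terminal compromises them all.
SHARED_KEY = b"same-password-for-every-merchant"

# Safer practice: derive one key per merchant from a master secret, so that
# leaking one merchant's key does not expose the others.
MASTER_SECRET = secrets.token_bytes(32)

def merchant_key(merchant_id: str) -> bytes:
    """Derive a per-merchant key (illustrative, not a production KDF)."""
    return hmac.new(MASTER_SECRET, merchant_id.encode(), hashlib.sha256).digest()

def sign(message: bytes, key: bytes) -> bytes:
    """Compute the MAC that the payment organization can verify."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, mac: bytes, key: bytes) -> bool:
    return hmac.compare_digest(sign(message, key), mac)

receipt = b"merchant=4712;amount=23.50;currency=EUR"

# With the shared key, any other merchant's terminal can forge a valid MAC.
forged = sign(receipt, SHARED_KEY)
print(verify(receipt, forged, SHARED_KEY))           # True, for every merchant

# With per-merchant keys, a MAC only verifies under the right merchant's key.
mac = sign(receipt, merchant_key("4712"))
print(verify(receipt, mac, merchant_key("4712")))    # True
print(verify(receipt, mac, merchant_key("9999")))    # False
```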

Contactless NFC payment can only move forward by increasing the amount of reading equipment in businesses and by strengthening security systems.

4.3.5. FinTech

New players have recently appeared in the digital sector: “FinTech” organizations. These are companies under commercial law that combine their business with core expertise in cryptographic software, antifraud systems and bank card management. Their activities span four branches of regulation, relating to payments, banking law, asset management and insurance, considered at the national, European and international levels.

Because of the Basel III agreements, which came into force in December 2010, FinTech organizations have been able to develop in Europe, as they are not subject to the same cumbersome regulations as banks. In some respects, traditional banks, FinTech organizations, network operators and the GAFA companies find themselves in an increasingly international remote payment environment, either competing or obliged to accept partnerships. As both the competitors and allies of banks, FinTech organizations draw on the resources of banks’ client bases.

FinTech organizations are developing application programming interfaces (APIs) that only require a small volume of data and allow fast access to massive amounts of information through cloud technology. It will likely be banks and FinTech organizations that, together, after examining the available solutions, select the most suitable system for the biometric authentication required to secure bank transactions, or that at least determine the best use of the Blockchain.
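As a sketch of the kind of lightweight API involved, the following fragment queries a hypothetical account-aggregation service; the endpoint, the access token handling and the field names are all assumptions made for the illustration and do not correspond to any particular bank’s interface.

```python
import requests

# Hypothetical account-aggregation API: base URL, token and JSON fields are
# invented for illustration only.
API_BASE = "https://api.example-fintech.com/v1"
ACCESS_TOKEN = "token-granted-by-the-customer"  # obtained via the bank's consent flow

def fetch_balances(customer_id: str) -> list[dict]:
    """Collect the balances of one customer's accounts held in several banks."""
    response = requests.get(
        f"{API_BASE}/customers/{customer_id}/accounts",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    # Only a small volume of data is transferred: an identifier, a balance
    # and the name of the holding bank for each account.
    return [
        {"bank": acc["bank"], "iban": acc["iban"], "balance": acc["balance"]}
        for acc in response.json()["accounts"]
    ]

if __name__ == "__main__":
    for account in fetch_balances("customer-42"):
        print(account)
```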

The aim is to reduce costs and timeframes, while increasing security and reducing losses, in order to attract customers. A bank’s founding principle is to gather funds to finance its activities; however, the growth of new players in this sector is destabilizing customer deposits. The key concern of banks is currently to build customer loyalty in order to avoid losing those funds.

The European Parliament is aiming to encourage market access for new players, in order to create competition that will benefit consumers. The second version of the Payment Services Directive (PSD2), from October 2015, validated the existence of FinTech organizations, including the aggregation activity that consists of collecting and collating their customers’ banking data held in several financial establishments and using them in applications. In 2014, according to Venture Scanner, over a thousand FinTech organizations were active globally, four times more than in 2013.

4.4. “P2P” exchanges

The P2P communications system must not be confused with either the term “point-to-point connection” or the “point-to-point” network protocol (PPP). Here, P2P (peer-to-peer) refers to the direct exchange of files between computers running the same software.

4.4.1. P2P, Blockchain and Bitcoin

P2P communication is used to share files (BitTorrent, for music or video), for grid computing or for communication services (streaming media, Skype and TeamViewer, for example). Downloading a file on a P2P network does not allow either the file’s creator or the users connected to the network to be identified, which is a security threat. Shared P2P files are replicated on a large number of nodes, which spreads the load of requests and eases network traffic. This system makes censorship or hacker attacks more difficult. On the other hand, it is impossible to measure the volume of traffic (section 2.7.2.2).
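The replication principle can be sketched under the assumption of a BitTorrent-like design: the file is cut into pieces whose hashes are published, so that a downloader can accept any piece from any node and still verify its integrity. The piece size and the sample data below are invented for the example.

```python
import hashlib

# Minimal sketch of piece verification in P2P file sharing: each fixed-size
# piece of the file is hashed, and a piece received from any peer is accepted
# only if its hash matches the published list.
PIECE_SIZE = 256 * 1024  # 256 KiB, an assumed piece size

def piece_hashes(data: bytes) -> list[bytes]:
    """Hash every piece of the file; this list is published with the torrent."""
    return [
        hashlib.sha1(data[i:i + PIECE_SIZE]).digest()
        for i in range(0, len(data), PIECE_SIZE)
    ]

def verify_piece(index: int, piece: bytes, hashes: list[bytes]) -> bool:
    """Accept a piece coming from an arbitrary peer only if its hash matches."""
    return hashlib.sha1(piece).digest() == hashes[index]

original = b"some large media file" * 100_000
hashes = piece_hashes(original)
print(verify_piece(0, original[:PIECE_SIZE], hashes))                  # True
print(verify_piece(0, b"tampered" + original[8:PIECE_SIZE], hashes))   # False
```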

It is said that, in 2008, Satoshi Nakamoto (a pseudonym presented as that of a Japanese computer specialist) invented a piece of software (known as the Blockchain) that is able to make exchanging the virtual currency Bitcoin faster and less expensive without the assistance of central or commercial banks. The Blockchain software is a secure P2P program that uses cryptographic hash functions to link information, with the elements being dated and stored in the memory of several sites simultaneously, thereby guaranteeing the authenticity of the record. This universal database records transactions and grows in a linear fashion in chunks known as “blocks”, forming the “Blockchain”. The Blockchain can also operate with encryption technologies. This allows the full traceability of transactions and prevents counterfeiting. All transactions on the Blockchain are public, and users are not obliged to disclose their identity completely. The first application deals with transferring funds, in units called “Bitcoin”, between partners on the P2P network. Bitcoin has thus become at once a currency, a technology, a network and an open accounts book.
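The chaining principle can be illustrated with a short sketch; it is a simplified model rather than Bitcoin’s actual data format: each block is dated, carries its transactions and records the hash of the previous block, so that altering any past entry invalidates every later link.

```python
import hashlib
import json
import time

# Minimal hash-chain sketch: each block stores the hash of the previous block,
# so tampering with any past transaction breaks every later link.
def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list[dict], transactions: list[str]) -> None:
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({
        "timestamp": time.time(),      # blocks are dated
        "transactions": transactions,  # the public record of transfers
        "previous_hash": previous,     # the link that forms the chain
    })

def is_valid(chain: list[dict]) -> bool:
    return all(
        chain[i]["previous_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list[dict] = []
add_block(chain, ["Alice pays Bob 2 BTC"])
add_block(chain, ["Bob pays Carol 1 BTC"])
print(is_valid(chain))                                  # True
chain[0]["transactions"] = ["Alice pays Bob 200 BTC"]   # tampering attempt
print(is_valid(chain))                                  # False
```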

4.4.2. Alternative cryptocurrencies

In the past, thousands of instruments for monetary exchange have been developed. Since 1980, several projects aiming to transform the world of economics have supported the creation of local, social and complementary currencies, known by the acronym “LETS”, for “Local Exchange Trading System”. Some are based on the idea of time exchanged (the Canadian “accorderie”), others on solidarity and still others on promoting local development (of which “LETS” is an example). The roughly 200 LETS set up globally still cannot be converted and can only be used within a 50 km radius [SER 99, KEN 08].

There are still several hundred digital currencies that claim to be usable as methods of virtual exchange without benefiting from any State guarantee. “These virtual currencies circulate on P2P networks through cryptographic algorithms. Such currencies include Reddcoin, Litecoin, NXT, the Hayek (based on the value of a gram of gold) and Bitcoin. These trading tools can be classified according to their uses: socially oriented currencies, currencies intended for infrastructure, alternative currencies and currencies for targeted purchases” [FIE 14].

4.4.3. Other Blockchain applications

“Blockchains” allow financial transactions to be tracked through a decentralized IT system: powerful computers, known as miners, verify that transactions are genuine and that no impostor has slipped through. Transactions are confirmed in blocks and then added to the record, forming a chain of blocks, hence the name “Blockchains”. Every computer connected to the network hosts a copy of all of the exchanges made. Volunteer users run a computer program to update the record and detect anomalies. The anonymity of users is protected.
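The verification work performed by the miners can be sketched as a proof-of-work search, here with an artificially low difficulty chosen purely for illustration: a block is accepted only once a nonce makes its hash fall below a target, and any node can check the result by recomputing a single hash.

```python
import hashlib

# Toy proof-of-work sketch: find a nonce so that the block header's hash
# starts with a given number of zero hex digits. Real networks use a much
# higher difficulty; this value keeps the example fast.
DIFFICULTY = 4

def mine(block_header: str) -> tuple[int, str]:
    """Search for a nonce that satisfies the difficulty target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_header}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * DIFFICULTY):
            return nonce, digest
        nonce += 1

nonce, digest = mine("previous_hash=abc123;transactions=Alice->Bob:2")
print(nonce, digest)
# Verification is instantaneous: any node recomputes one hash to check the
# miner's work, which is what makes the shared record hard to forge.
```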

Since 2014, the development of the North American Ethereum project, similar to the Bitcoin project, seems to indicate that it is possible to simplify all kinds of slow and expensive authentication procedures, such as those of insurance companies, notaries, Wall Street, online film distribution companies such as Netflix and Hulu, gaming platforms, messaging services such as Twitter, currency exchanges and so on. The total value of Bitcoin transactions supported by the current Blockchain software is around 3.4 billion dollars.

Originally designed for Bitcoin, the Blockchain IT network appears destined for further development. The P2P principle can, according to the promoters of the collaborative economy, create value, namely common, tangible or intangible property, without being compelled to ask for permission [BAU 15]. It could transform global finance by removing the “trusted third parties” who currently verify transactions (notaries, clearing houses and so on). Several possibilities are currently being assessed for activities that could connect order execution to the safeguarding of securities1.

Several study groups (Agora, BitCongress and Swarm) are researching the possible use of the Blockchain for online voting. Blockchain technology can also be used to keep track of the successive owners of diamonds, boats, watches or works of art, or any object that must be registered for life in order to be insured. These projects are also an attempt to eradicate fraud and theft. Land registry management, patent administration, accounting, commercial “couponing”, music and journalism could all use “Blockchain” systems.

Presently, Bitcoin only allows seven transactions per second, and around 10 minutes must elapse before a transaction is confirmed in a new block. A new version, Bitcoin XT, differentiates itself from the original through a larger block size, enabling it to store more transaction records, manage a greater number of transactions and increase network speed. However, the average size of blocks has continued to increase and is currently approaching 700 kB. At the same time, the network is becoming unstable and endangering trade, even though network reliability was one of the main arguments in favor of Bitcoin. Without a significant agreement in 2016, this virtual currency risks disappearing in favor of other competing digital currencies (Ripple, Litecoin and so on).
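A back-of-envelope calculation makes these orders of magnitude concrete; the average transaction size of roughly 250 bytes is an assumption used only for the illustration.

```python
# Rough throughput estimate for the original Bitcoin protocol.
BLOCK_SIZE_BYTES = 1_000_000   # ~1 MB block limit
AVG_TX_SIZE_BYTES = 250        # assumed average transaction size
BLOCK_INTERVAL_S = 600         # one block roughly every 10 minutes

tx_per_block = BLOCK_SIZE_BYTES // AVG_TX_SIZE_BYTES
tx_per_second = tx_per_block / BLOCK_INTERVAL_S
print(tx_per_block, round(tx_per_second, 1))   # about 4,000 transactions, ~6.7 tx/s
```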

Regulatory bodies are trying to understand the technology and better grasp how it functions in order to guarantee the protection of citizens by creating the most suitable playing field. P2P technology is legal and increasingly used in various business models (European Commission, https://ec.europa.eu/digital-agenda/en/glossary). However, a warning on the usage of these currencies was issued by the French Ministry of Finance in 20142.

The main problem is that virtual currencies such as Bitcoin are not issued or guaranteed by a suitably powerful authority, or by a government, for example. The disadvantages of Bitcoin – its instability and the appearance of unexpected technical problems – explain why it has not achieved the level of acceptance that venture capital companies expected.

4.4.4. Banks and P2P

4.4.4.1. Features

Blockchain is quickly becoming not only the most controversial, but also the most promising, technology in the world of financial services, where it sparks as much confusion as curiosity. The interest of banking institutions and FinTech organizations in the Blockchain lies in its potential to reduce fees and transfer times, as well as the number of actors involved in financial operations.

4.4.4.2. Financial markets

Blockchain technology seems very promising for the financial markets (NASDAQ, for example), as it is more efficient, more transparent and more secure, which is essential. It is a distributed IT breakthrough based on hash functions. Blockchain technology is able to bypass financial intermediaries and avoid fiscal controls (including the potential financial transactions tax (FTT), or Tobin tax, currently being discussed by the G20).

The banks Santander and Barclays are looking for their own ways to improve their services through distributed servers accessible within trusted institutions. This money transfer technology could reduce infrastructure costs by a total of 20 billion dollars per year, if the sector accepts a thorough examination of its current technology.

IBM would like to establish a payment system based on Blockchain and on a more efficient and less expensive infrastructure. Partnerships could unite the Fed, IBM and Apple, leading to an open and less expensive payment system that could give a competitive advantage to American players in the field of payments.

Bank of America has filed a patent for a system and method for wire transfers using cryptocurrency on the basis of several systems (including OKCoin, BitStamp and BTCChina). The NASDAQ has implemented a Blockchain to facilitate transfers and sales of shares on its private market. Some large banking institutions, such as Barclays, Visa Europe and the Royal Bank of Canada (RBC), are also interested in Blockchain. In 2015, around 20 companies were guaranteeing the international transfer of funds in Bitcoin, particularly in the United States and Asia.

At the end of 2015, the Chinese Banking Regulatory Commission curtailed the activities of some of its online banking platforms. P2P sites must now provide specific information on the quality of lenders, the outstanding amounts of borrowers and the risks of investment projects. China does not wish to dry up a source of investment that is so useful to the economy. During 2015, almost 57 billion euros were lent to Chinese SMEs through P2P sites.

4.4.4.3. Hidden loans

The P2P platform financial service aims to democratize banking practices beyond conventional channels by simplifying procedures and sharing the risks through a network of partners, which would separate the needs of individuals from those of institutions. The sums of money involved are relatively low:

  – Lending Club is a company for participatory finance between individuals, registered with the Securities and Exchange Commission (SEC);
  – Funding Circle is a P2P loan platform that allows loans to be allocated directly to small and medium-sized businesses over 5 years.

Crowdfunding – due to the high level of banks’ management fees, alternative P2P loans seem to have a promising future around the world. Since 2005, market loan platforms have offered an alternative to banking systems and traditional payments. Around 20 French platforms were operational in 2015 for local projects. The average loan is close to the size of a microloan (4,500 euros on average, for a total of 140 million euros, compared with 4,400 billion euros in savings in France in 2015). The regulations, as well as the related taxation, should soon be clarified.

GAFA – the creation of financial platforms for shared financial operations (crowdfunding) in Blockchain is currently being researched by the GAFA companies. Facebook has obtained accreditation to offer P2P payment services in the United States by bank card through its own P2P messaging service.

4.4.4.4. Potential risks

The risk inherent in the process is connected to its originality: anyone can join the group, and an organization with sufficiently powerful computing capacity could modify its software. As there is no supervisory authority for this sort of activity, it is possible that Bitcoin will remain restricted to minor activities, with the risk of a proliferation of versions of the Blockchain software. Virtual currency transfers could be used by drug traffickers, by hackers of online purchases in foreign currencies, by those financing terrorism or by those seeking to evade the watchfulness of the authorities or the chain of intermediaries. Moreover, Blockchain technology is likely to infringe on the financial sovereignty of States.

Like the Internet, Bitcoin will develop further. However, mass adoption of Blockchain seems further than ever from being achieved due to the numerous psychological, cultural and technological obstacles that have been identified.

4.5. Remote computing

The click-through rate (CTR) of a website reflects the activity of Internet users. The histogram in Figure 4.2 shows the hourly visitation of a site located in France. It shows that over half of the visitors (55.6%) are located in traditionally French-speaking time zones, which is confirmed by the majority of the Internet users’ IP addresses. The Internet is indeed a global “village” where information is exchanged at all hours of the day and night. In this example, 44.4% of requests come from countries very far away from France. This site therefore has an audience that matches its purpose well. Conversely, when dealing with a purely local activity, it is worth ascertaining whether the installation of an Internet site is indeed justified or whether it would be preferable to turn toward a mobile application focused on that audience, which would be simpler to implement and more efficient.


Figure 4.2. Hourly click rate on a French-language website
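As a rough illustration of how such a histogram can be produced, the following sketch aggregates clicks by hour from a hypothetical server log and estimates the share coming from French-speaking time zones; the log format, the list of offsets and the use of a recorded UTC offset as a stand-in for IP geolocation are all assumptions made for the example.

```python
from collections import Counter
from datetime import datetime

# Assumed set of UTC offsets treated as "French-speaking" zones, purely for
# illustration (metropolitan France, plus parts of Africa and North America).
FRENCH_SPEAKING_OFFSETS = {"+0100", "+0200", "-0400", "-0500"}

def hourly_histogram(log_lines: list[str]) -> tuple[Counter, float]:
    """Count clicks per hour and estimate the share of French-speaking zones."""
    hours: Counter = Counter()
    french = 0
    for line in log_lines:
        # Assumed log line: "2016-03-14T21:05:09 +0100 GET /index.html"
        timestamp, offset, *_ = line.split()
        hours[datetime.fromisoformat(timestamp).hour] += 1
        if offset in FRENCH_SPEAKING_OFFSETS:
            french += 1
    return hours, 100 * french / max(len(log_lines), 1)

log = [
    "2016-03-14T21:05:09 +0100 GET /index.html",
    "2016-03-14T03:17:42 +0900 GET /index.html",
]
hours, share = hourly_histogram(log)
print(hours, f"{share:.1f}% from French-speaking zones")
```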

ICT development now concerns the production of all goods and the management of all services. It is modifying traditional approaches to innovation, organization and production. It is bringing about new activities that break with existing ones, thereby giving new life to Schumpeter’s concept of “creative destruction”. The use of IT in the business sector has been integral to the concentration of points of sale and the creation of large retail spaces. Investment dedicated to new technologies, and therefore to software, should not be forgotten, as it produces a significant reduction in working time and energy consumption, to the customer’s benefit, and spreads knowledge.

Software is eating the world; so said Marc Lowell Andreessen, founder of Netscape, in 2011. This assessment continues to be borne out as the world becomes more digitized. According to Afdel, in 2014 the 100 leading French software players saw their sales revenue jump 10% to 5.6 billion euros, as opposed to just 2% the previous year. Ten years after its launch, SaaS (software as a service) has undergone significant development. The banking and insurance sectors will finally benefit from applications that are more secure than in the past.

4.6. Features of the digital economy

4.6.1. Key features

Jean-Marc Vittori, columnist for economic daily Les Echos, characterizes digital technology according to the four major changes it has brought to the economy:

  – first, digital technology leads to a very wide choice offered by a multitude of web providers;
  – second, it reduces the number of intermediaries through disintermediation;
  – third, some links in the chain of participants become voluntary;
  – finally, while a digital service has a development cost, its dissemination is very inexpensive.

It is possible to add other considerations to these features of digital technology. Digital technology allows information to be distributed among a large number of players more quickly and widely. By the same token, it sets off a large number of reactions and subsequent positive actions. Innovations are increasing in all areas and at a faster pace than before; this acceleration is due to a set of digital means that are now accessible to all. This spread of information is carried out at a very low cost, and the barriers restricting access to information are also becoming fewer.

There is no longer any public report available on the transition of the French publishing sector to digital technology. Each media group undoubtedly carries out its own internal research on this development, as digital technology allows economies of scale, as well as savings in information storage and in production and labor costs. However, the transition from paper editions to entirely digital editions poses a large problem for daily and monthly newspapers with regard to the adaptation of their readers, whether they are connected through a fixed-line or mobile communication system. The 2016 website of the French press and media watchdog (l’Observatoire de la Presse et des Médias)3 notes the very low circulation rates of digital newspapers (under 2%). For now, in 2016, so-called “digital” editions are still largely considered a specific service reserved for subscribers who are willing to pay. Nonetheless, everyone can agree that progress in this area is slow but steady.

4.6.2. Preferred sectors

Nonetheless, the digital economy has only introduced its innovations in a liberal environment, for specific activities and for a limited clientele. While its actions have been enough to slightly destabilize markets that had been highly organized for decades, the margins in those markets were large, and some “trimming” there was only possible through external third parties. This is a natural development of sorts, but not a revolution spreading like an epidemic.

Digital technologies have allowed a wider distribution of communications tools (mobile telephones, photographic equipment, sound recorders, geolocation devices, sensors for health and well-being). The social progress linked to this power of wider communication is important, but it cannot be quantified. Current events illustrate this advancement: reports shot on location are distributed immediately, echoed by the official media and reach a large audience. While society is better informed today, it also knows that it is more vulnerable, as information spreads quickly and widely.

4.6.3. Company organization

In order for a company targeting the transition to digital technology to succeed, it appears that its management should be organized in a collegial manner around a few supervisors cooperating closely with the managers of the IT platform. The employment contract between the actors can be replaced by a tacit agreement between colleagues joining together for a quick and optimal profit, without concern for the time spent. The digital economy creates jobs that will not be allocated to existing employees as a matter of course. Union demands must be set aside, as the digital economy is itself temporary, playing a game of cat and mouse with the administration and regulatory bodies. Wherever the digital economy develops, the administration fails to enact legislation to stabilize the proposed approaches while respecting corporate and fiscal conventions.

A digital business is not managed in the same way as an ordinary business. It must be more watchful and faster in its decision-making, and because of this it must have indicators built into its management software. On this subject, Stéphane Schultz, founder of “15 marches”, a consultancy firm specializing in strategy and innovation4, addresses the management methods of young digital businesses and suggests strategic plans to them [OST 10].

The commercial and financial management plan, or business model canvas, is a reference document that presents the way in which a business, or any public or private organization, plans to establish and ensure its profitability: the way in which it earns money, wins market share or increases its traffic. This fundamental document is built around nine components (source: 15 marches).

  1) customer segments;
  2) value proposition;
  3) distribution channels;
  4) customer relations;
  5) revenue streams;
  6) key activities;
  7) key resources;
  8) key partnerships;
  9) cost structure.

According to the authors, this functional decomposition should allow each element to be analyzed and all activities to be synthesized as well as possible.
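Purely as a sketch, and assuming one wished to embed these nine components in the management software mentioned earlier, the canvas could be represented as a simple data structure to which indicators can later be attached; the field names follow the list above and the sample values are invented.

```python
from dataclasses import dataclass, field

# Illustrative data structure mirroring the nine components of the canvas.
@dataclass
class BusinessModelCanvas:
    customer_segments: list[str] = field(default_factory=list)
    value_proposition: str = ""
    distribution_channels: list[str] = field(default_factory=list)
    customer_relations: list[str] = field(default_factory=list)
    revenue_streams: list[str] = field(default_factory=list)
    key_activities: list[str] = field(default_factory=list)
    key_resources: list[str] = field(default_factory=list)
    key_partnerships: list[str] = field(default_factory=list)
    cost_structure: list[str] = field(default_factory=list)

# Invented example values, simply to show how indicators could be attached.
canvas = BusinessModelCanvas(
    customer_segments=["young urban mobile users"],
    value_proposition="door-to-door urban mobility",
    revenue_streams=["per-trip fee", "monthly subscription"],
)
print(canvas.revenue_streams)
```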

The pace of innovation is currently so fast that heads of industry and trade are barely able to foresee, or even keep up with, the evolution of digital applications. Their future is becoming hard to predict and must be taken into account as the situation develops. Admittedly, stabilizing volumes of use and achieving better control of applications could be obtained by increasing the cost of Internet usage.

4.6.4. Digital refusers

Who wants to profit from the digital economy? Alongside those who are enthusiastic about the innovations brought by the Internet, it must be noted that there are still fringe groups of people who are either very reluctant to accept these developments or unconvinced of their usefulness. The United Kingdom, which is currently making significant efforts toward the success of its “Digital Britain” program, noted bitterly in 2014 that an average of 12.6% of its citizens showed no interest in digital technology, a proportion reaching almost 25% in Northern Ireland [UK 15].

Senior citizens, in particular, are slow to familiarize themselves with computers, tablets and high-speed applications. SMS exchanges are a step that many seniors have not yet taken. There are fears (and not only in the United Kingdom) that digital connections between seniors and the administration will not be achieved. The absence of Internet cafes or centers for continuing education in home computing, particularly in rural areas, restricts Internet access among older people. Planners have forgotten the broad variety of user profiles.
