In this chapter, we'll talk about the technologies that are poised to impact the way that we work in the future—and perhaps even sooner than we think.
An emerging technology is one that exists but whose true potential we have yet to fully use or experience. Additional research, application, refinement, and even regulation are necessary before we can consider it an established technology: a technology that has been in existence for some time, is used by many, and is considered a standard by just about everyone.
Emerging technologies may also have the power to become disruptive, meaning they have the potential to replace an established technology or have the power to change how an entire society does things.
Let's take email as an example. Before email, documents, cards, and similar items had to be sent physically, and it took days for recipients to receive them. With email, your recipient receives your documents in seconds, reducing the time and costs involved.
Although email is now considered an established technology, it was very much an emerging technology when it was introduced into mainstream society in the mid-to-late 1990s. As email became more secure and robust enough to handle image and music files, people and businesses started using postal services and paper products like envelopes much less.
Many research organizations and studies have pointed out that the adoption and improvement of emerging technologies will cause structural unemployment, or people losing their jobs because their skills have become out of date. Artificial intelligence and automation have been widely cited as technologies that will likely cause the most disruption for jobs worldwide.
There's no question that certain roles will feel this effect more than others. Roles built around routine, low-value tasks are more likely to be phased out than roles involving highly complex work. But tech will not replace the need for humans at every job, at least not right now. Emerging technologies will help people be even better at their existing jobs and will create many new ones, as businesses will need people who understand and can use them.
I was a contract worker at McKinsey's Digital Capability Center (DCC) in Chicago a few years ago. The goal of the DCC was to show small and midsize original equipment manufacturers (OEMs) how they could utilize emerging technologies like the Internet of Things (IoT), augmented reality, and autonomous vehicles to make their manufacturing operations safer and more efficient.
Fun fact: it was here that I earned a Class III forklift license to operate an automated guided vehicle (AGV), a forklift that can carry supplies and maneuver around a manufacturing floor on its own using cameras and sensors. The vehicle needed little to no user guidance or intervention.
For clients, the center would simulate a fictional manufacturing floor, where workers were creating compressors by hand. Daily inventory and any production issues that came up were recorded with pencil and paper. If equipment broke down, operators would need to find paper manuals for answers. Any malfunctioning equipment would stop assembly, possibly for the entire day or several days, and require the operator to call for outside service.
The center would then quickly transform itself into a manufacturing floor outfitted with technology, where production data was automatically captured using a variety of sensors, cameras, and cloud-based applications. Using augmented reality–enabled glasses, operators could fix equipment quickly through the onscreen instructions. To ensure that workers spent more time on the assembly line versus getting materials, the AGV could be called through a smartphone or tablet application to bring the materials over, while always checking that the environment was safe to do so (for example, checking to see whether the AGV's path was clear of people).
As the center went from tech-less “current state” to tech-enabled “future state,” at no point were the human operators replaced. The technologies used helped to enhance worker productivity and the quality of their output. I thought this was important to highlight—people are still needed to implement and operate the technologies being used. Rather than replace employees, companies can provide learning opportunities for them so that they are able to move into these new jobs.
You can learn more about the McKinsey Digital Capability Center at www.mckinsey.com/business-functions/operations/how-we-help-clients/capability-center-network/overview.
In the next few sections, I'll provide a general overview of each emerging technology area, explain why it's worth becoming familiar with, and point you to places where you can learn more.
In the broadest sense of the term, artificial intelligence (AI) is the ability of computers or programs to think and learn like a human being would. A computer or program is considered to be employing AI if it is able to simulate how a human would think and behave in a given situation—applying logic, reason, and ethics and enacting the best way to address the situation.
While we may not have realized the full potential of AI, we are using AI now in places you probably didn't expect. Look at Table 4.1 for examples of how you've probably interacted with AI recently.
Table 4.1: Examples of AI in Common Use
Source: Everyday Examples of Artificial Intelligence and Machine Learning, Emerj.com
SERVICE | HOW AI IS BEING USED |
Ride sharing (Lyft, Uber) | Determine cost of ride, minimize wait time |
Email (Gmail) | Mail filtering, spam elimination |
Banking (most major US banks) | Credit decisions, fraud detection |
Shopping (Amazon) | Product recommendations, fraud detection |
Social media (Facebook, Twitter) | Determine which posts and articles appear in your feed |
People may use the terms AI, machine learning, and deep learning as synonyms for each other. But each term has a specific definition, and each is a subset of the one before it: deep learning is a subset of machine learning, which in turn is a subset of AI.
At the very highest level is artificial intelligence. To illustrate this point, think about chatbots, which are programs used to simulate conversations, either through voice or text, between a human and computer. You may have commonly interacted with chatbots on shopping websites like Staples or Starbucks or over the phone on customer service telephone lines.
Humans need to program chatbots to work; they need to know the different types of users who may use their chatbot, the type of information they might want to know or tasks they'd like to complete, and how to integrate the chatbot within existing systems. They then need to develop scripts and prompts that the chatbots will use based on this information.
When a user interacts with the chatbot, they are interacting with prebuilt scripts that human programmers have developed. If the user asks a question or makes a request that is unclear or one that hasn't already been programmed, the chatbot will not respond or will return an error message. The chatbot can't make this adjustment on its own, so the programming team must step in to adjust. In simple AI, everything must be programmed by humans; the computer or program does not have the ability to learn and adapt.
Machine learning helps to address this limitation. Using the chatbot example, when the chatbot encounters a request or question that it doesn't understand, the chatbot learns from the exchange (using data and statistical algorithms) when you tell it that's not what you meant or inform it that it wasn't being helpful. When it encounters the same request or question again, it will have a better, and ideally right, answer next time. No reprogramming is necessary for this to happen—it all happens on its own with machine learning.
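The difference can be sketched in a few lines of Python. This is a toy illustration, not a real chatbot framework: the bot answers only from prebuilt scripts, and its `learn` method stands in for the statistical updating a true machine learning system would perform.

```python
# Toy sketch: a scripted chatbot that "learns" a better reply when
# corrected, instead of requiring a programmer to update its scripts.

class LearningChatbot:
    def __init__(self):
        # Prebuilt scripts written by human programmers
        self.scripts = {
            "store hours": "We're open 9am-9pm, Monday through Saturday.",
            "order status": "Please enter your order number.",
        }

    def reply(self, question):
        # Unknown questions fall back to an error message,
        # just like a simple scripted chatbot
        return self.scripts.get(question.lower().strip(),
                                "Sorry, I don't understand that yet.")

    def learn(self, question, better_answer):
        # A machine learning bot would adjust statistical weights;
        # here we simply store the corrected answer
        self.scripts[question.lower().strip()] = better_answer

bot = LearningChatbot()
print(bot.reply("Return policy"))   # not yet programmed: error message
bot.learn("Return policy", "You can return items within 30 days.")
print(bot.reply("return policy"))   # now it answers correctly
```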
You can commonly see this type of AI in action in virtual assistants like Amazon's Alexa and Apple's Siri.
There are three different types of machine learning: supervised learning, unsupervised learning, and reinforcement learning.
Deep learning takes machine learning much further. Here, neural networks, or sophisticated algorithms that loosely mimic a human brain, are used to complete a task. Given the huge tasks that deep learning attempts to take on, it requires a lot of data and computing power. An example would be self-driving cars, like those from Waymo, Google's self-driving car project.
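To give a feel for the building block involved, here is a minimal sketch, in plain Python with no deep learning library, of a single artificial neuron (the unit that neural networks stack by the thousands) learning a simple rule through trial and error:

```python
import math, random

# A single artificial neuron learning to behave like an AND gate:
# output 1 only when both inputs are 1.
random.seed(0)
w1, w2, b = random.random(), random.random(), random.random()

def sigmoid(x):
    # Squashes any number into the range 0..1
    return 1 / (1 + math.exp(-x))

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

for _ in range(10000):                     # training loop
    for (x1, x2), target in data:
        out = sigmoid(w1 * x1 + w2 * x2 + b)
        error = out - target               # how wrong the neuron is
        # Nudge each weight slightly to reduce the error (gradient descent)
        grad = error * out * (1 - out)
        w1 -= 0.5 * grad * x1
        w2 -= 0.5 * grad * x2
        b -= 0.5 * grad

for (x1, x2), target in data:
    print((x1, x2), round(sigmoid(w1 * x1 + w2 * x2 + b)))
```

No one reprograms the neuron; it adjusts its own weights from the data, which is the core idea that deep learning scales up with many layers of such neurons.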
AI is becoming more ingrained in our everyday lives, and that will not stop anytime soon. As AI becomes more accessible for businesses in terms of cost, ease of use, and implementation, they are finding many ways it can add value. Companies can use AI to do the following:
Figure 4.1 and Figure 4.2 show examples of AI smart assistants.
You can learn more from these resources:
Augmented, virtual, and mixed reality aim to offer users interactive experiences (see Figure 4.3). The level of interactivity and immersion for each of these is different. Let's explore each one.
Although AR, VR, and MR have so far appeared mostly in video games and social media, they can be hugely beneficial to many industries.
In manufacturing, AR and MR technologies can allow operators to be more efficient and produce higher-quality output. When operators need to make repairs, rather than hunting for a manual, AR and MR can bring up the instructions in no time and guide them step by step through the repair.
VR is already actively used in flight and driving simulation. Simulations let those who are relatively new to a vehicle practice as much as they'd like before using the real thing. This helps new pilots and drivers feel more comfortable, while saving time and fuel costs.
In advertising and marketing, AR and VR give businesses an opportunity to create memorable experiences for audiences. For example, IKEA has an AR app that, using your phone's camera, lets you see what a piece of IKEA furniture would look like in your home. The app overlays a virtualized version of the furniture in your bedroom, living room, kitchen, and so on, so you can judge whether a piece fits your home well before driving to an IKEA store, buying it, and realizing it's all wrong.
These technologies can even be helpful in real estate. Instead of spending time and effort visiting multiple properties, you can use a real estate agency's app to tour an apartment or house virtually. This offers a fuller experience than looking at a bunch of photos and doesn't require you to leave the comfort of home.
You can learn more from the following resources:
Blockchain is a ledger or registry, shared by two or more parties, that permanently stores transaction information, usually a record of an asset changing hands from one party to another. Once the transaction information is entered and time-stamped in the ledger, it cannot be altered by anyone.
Blockchain solves several problems that can come up when people and businesses engage in transactions. It is most commonly associated with financial transactions, but it has a variety of use cases in different industries.
For example, if each party in a transaction is maintaining their own transaction records, whose ledger should be trusted if each one is showing different or missing information for the same transaction, especially if they're paper records? How do we know that someone didn't alter their records fraudulently or just simply forgot to enter information? Blockchain removes that uncertainty in the following ways:
Blockchain technology can help remove some of the anxiety of doing business with people you may not know very well. Because consensus handling is automatic and no single person or organization controls the network, you can have greater confidence that the records are trustworthy. And because the transaction information is available to all members at any time, there's far more transparency.
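The tamper-evident property can be illustrated with a toy hash chain in Python. This is a simplified sketch, not a production blockchain; real networks add consensus rules, cryptographic signatures, and peer-to-peer replication.

```python
import hashlib, json

# Each block stores the hash of the previous block, so altering any
# past record breaks the chain and is immediately detectable.

def block_hash(block):
    content = {k: block[k] for k in ("transaction", "timestamp", "previous_hash")}
    return hashlib.sha256(json.dumps(content, sort_keys=True).encode()).hexdigest()

def make_block(transaction, previous_hash):
    block = {
        "transaction": transaction,
        "timestamp": "2024-01-01T00:00:00",   # fixed here for reproducibility
        "previous_hash": previous_hash,
    }
    block["hash"] = block_hash(block)
    return block

def chain_is_valid(chain):
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False                      # block contents were altered
        if i > 0 and block["previous_hash"] != chain[i - 1]["hash"]:
            return False                      # the chain link is broken
    return True

genesis = make_block("Alice pays Bob $10", previous_hash="0")
second = make_block("Bob pays Carol $5", previous_hash=genesis["hash"])
ledger = [genesis, second]
print(chain_is_valid(ledger))    # True

genesis["transaction"] = "Alice pays Bob $1,000"   # fraudulent edit
print(chain_is_valid(ledger))    # False: the stored hash no longer matches
```

Because every member of the network holds a copy of the ledger, a cheater would have to rewrite every later block on a majority of copies at once, which is what makes tampering impractical.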
Many blockchain frameworks are currently available; some of them are open source, meaning they are freely available for individuals and organizations to use and alter for their purposes (see Figure 4.4). These include Hyperledger Fabric, Openchain, and Multichain. Major technology companies also offer blockchain technologies at a cost.
Blockchain should not be confused with bitcoin. Bitcoin is a popular cryptocurrency (a form of currency that is available only in digital form). Bitcoin and other popular cryptocurrencies use blockchain as the foundation of their respective platforms.
Blockchain can be helpful for people and businesses that need to protect high-value assets. When we say assets, this could be money or deeds/leases to physical property. Financial services companies have been very interested in blockchain technology, as it gives them a solid means to reduce fraud and their overall risks. This could lead to financial services being able to send payments faster and reduce the fees they charge to customers.
Assets can also be noncash items, like food or supplies. In the case of supply chain–based businesses, a blockchain will allow them to monitor from start to finish where their goods are and identify whether the goods they are receiving are genuine.
For all industries, especially those that have heavy regulatory requirements, having a blockchain-based system makes it easier to verify and audit records, and this reduces the overall time and costs involved with audit activities.
You can learn more from the following resources:
The term cloud computing can cause a lot of confusion. Many organizations and companies define cloud computing differently from one another, and the use of the word cloud may conjure images of some grand computer in the sky where services are accessed. That's misleading, as there are no actual clouds, or computers peeking out from behind clouds, involved.
Cloud computing isn't a technology per se. Rather, it's more a business concept that utilizes several existing Internet technologies (see Figure 4.5). What makes it emerging is that it has radically changed how people and businesses consume computing resources.
For clarity, I use the National Institute of Standards and Technology's definition of cloud computing. NIST is a US government agency that sets standards for measurement in several different areas, including technology, so its definition helps everyone compare apples to apples when talking about cloud computing.
The NIST definition is as follows:
Let's take a moment and drill down further into NIST's five key characteristics of cloud computing:
NIST takes the definition further by defining three distinct service models, or the types of services that cloud providers can offer to people and businesses: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).
A traditional IT department for a business organization is typically responsible for purchasing, maintaining, and servicing all software and hardware, as shown in the far-left column of Table 4.2.
Table 4.2: Comparison of Cloud Computing Service Models
BUSINESS | IAAS PROVIDER | PAAS PROVIDER | SAAS PROVIDER |
Applications (customer) | Applications (customer) | Applications (customer) | Applications (provider) |
Data (customer) | Data (customer) | Data (customer) | Data (provider) |
Operating systems (customer) | Operating systems (customer) | Operating systems (provider) | Operating systems (provider) |
Servers (customer) | Servers (provider) | Servers (provider) | Servers (provider) |
Storage (customer) | Storage (provider) | Storage (provider) | Storage (provider) |
Networking (customer) | Networking (provider) | Networking (provider) | Networking (provider) |
(customer): Managed by the customer
(provider): Managed by the cloud provider
When a company agrees to work with an IaaS provider, they are asking the provider to give them access to their servers, storage, and network capabilities. The provider is responsible for ensuring that these resources are available just about all the time, as well as maintaining the physical equipment necessary, upgrades, and, to a certain extent, security. The compromise is that the company has little to no control over how the provider gets this done.
Going further, when a company agrees to work with a PaaS provider, the provider is giving the company IaaS services, in addition to providing a platform to create new applications or run their existing applications. This allows the company to make applications that can scale to demand and to make applications available anytime, anywhere. Upgrades for applications, as well as operating software, can happen automatically. This can be good for businesses that have apps where their demand can vary—an example would be retail store apps that have to maintain normal usage patterns throughout the year, but their demand spikes during the holiday season.
Finally, when a company uses a SaaS provider, it is using the provider's complete stack. The software is available for use on the cloud and usually isn't downloadable onto the company's computers, and the company can't change the underlying code or other aspects of the software. All software upgrades are automatically performed by the provider and do not require intervention from the company. The company pays a recurring fee, usually referred to as a subscription fee, on a monthly or yearly basis. SaaS is sometimes called on-demand software and is the most widely used cloud service model.
Table 4.3 shows examples of IaaS, PaaS, and SaaS providers.
Table 4.3: Examples of IaaS, PaaS, and SaaS Providers
IAAS PROVIDERS | PAAS PROVIDERS | SAAS PROVIDERS |
Amazon Web Services | Amazon Web Services | Adobe (e.g., Adobe Photoshop) |
Google Cloud | Google Cloud | Google (e.g., G Suite) |
Microsoft Azure | Microsoft Azure | Microsoft (e.g., Office 365) |
Rackspace | Salesforce | Salesforce.com |
It should be noted that the number of cloud service offerings is increasing. For example, interest is growing in serverless computing, which lets users run application code on demand without having to manage the underlying servers. This allows them to use resources only when they are absolutely needed and to be charged accordingly.
Adding to the complexity, there are three main deployment models, or environments in which you can access cloud services:
Cloud computing has allowed people and organizations alike more control over the costs and investments that they make in computing resources. Businesses can access more computing resources than if they tried to purchase equipment and maintain that equipment on their own. This also decreases the overall amount of money they must spend on hardware, software, security, electricity, and physical space.
Businesses can create, or spin up, resources when there is a need for them and then either decrease resources or get rid of them depending on the current needs. This is the opposite of buying hardware and software outright; rarely can businesses return hardware and software once purchased. They may only be able to sell it at a price that is much lower than what they paid for it.
Businesses today can't afford not to be on the cloud, as the landscape has become far more competitive. Our society has become very reliant on apps for travel, food, banking, communication, and more. Because of this, we have been conditioned to expect what we want or need to be available when we want or need it. A business that wants to remain operational, let alone competitive, will need to understand how to leverage the cloud to meet the market's demands.
People also consume software applications much differently. In the past, users needed to purchase a software license—anywhere from a few dollars to several hundred dollars—and were typically allowed to install the software on only one device and pay a separate fee for any upgrades. You couldn't access that program and its data from any other computer but the one you installed it on. Now, people can pay a small monthly subscription fee for a program and can use it on any device they own that has Internet access. When they no longer need the application, they can cancel the subscription at any time.
You can learn more at the following resources:
The Internet of Things refers to devices, big and small, that share data with other objects or people. This data is being shared automatically and without any human or computer interaction or intervention.
A device can be any electronic device that can access the Internet. This can include the following:
According to the software company SAS, there will be more than 150 billion IoT-enabled devices used in the world by 2025; they will also produce 175 zettabytes of data by that time. To put that into perspective, 1 zettabyte is equal to 1,000,000,000,000,000,000,000 bytes.
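You can sanity-check that arithmetic yourself:

```python
# 1 zettabyte = 10**21 bytes; 175 ZB works out to 175 trillion gigabytes.
ZETTABYTE = 10 ** 21
total_bytes = 175 * ZETTABYTE
print(f"{total_bytes:,} bytes")

gigabytes = total_bytes // 10 ** 9     # a gigabyte is 10**9 bytes
print(f"{gigabytes:,} GB")             # 175,000,000,000,000 GB
```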
To illustrate how IoT works, we will use a fitness tracker, like a WiFi-enabled Fitbit, as an example. A user puts on the fitness tracker at the start of a workout. The tracker records certain aspects of the user's workout, as it has several sensors on it that can detect heart rate, body temperature, motion, etc.
As this data is being collected, the fitness tracker uses its built-in Internet connection to transmit it to Fitbit's servers. This happens instantly and automatically, without the user entering any commands on the tracker.
With this data, Fitbit can keep a running tally of your fitness metrics and habits and suggest how you can meet your fitness goals much faster. You can access these records and suggestions through Fitbit's app at any time. If the Fitbit notices irregular heart or health activity, it will immediately notify you that you should see a doctor right away.
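As a rough sketch, the kind of reading a tracker might transmit could look like the following. The device ID, field names, and thresholds here are invented for illustration; they are not Fitbit's actual data format.

```python
import json

# Hypothetical sensor reading, serialized as JSON for transmission
reading = {
    "device_id": "tracker-12345",
    "timestamp": "2024-06-01T07:30:00Z",
    "heart_rate_bpm": 128,
    "body_temp_c": 37.1,
    "steps": 4210,
}

payload = json.dumps(reading)          # what would travel over the network
print(payload)

# Server side: flag irregular heart activity, as described above
data = json.loads(payload)
if data["heart_rate_bpm"] > 180 or data["heart_rate_bpm"] < 40:
    print("Alert: irregular heart activity detected.")
else:
    print("Reading stored.")
```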
Rather than the commonly used HTTP (the protocol through which we typically access web pages), IoT devices often communicate using lightweight messaging protocols such as MQTT (Message Queuing Telemetry Transport) and CoAP (Constrained Application Protocol).
The major reason these protocols are used is that they send information faster than HTTP and in a manner optimized for devices that have limited battery life and that may not always be connected to the Internet.
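MQTT, for example, uses a publish/subscribe model: devices publish readings to named topics on a broker, and anyone subscribed to a topic receives them. The toy in-memory broker below illustrates the model only; real deployments use a networked broker such as Mosquitto and a client library such as paho-mqtt.

```python
# Toy in-memory illustration of MQTT-style publish/subscribe messaging

class Broker:
    def __init__(self):
        self.subscribers = {}   # topic -> list of callback functions

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # Deliver the message to every subscriber of this topic;
        # the sending device never needs to know who is listening
        for callback in self.subscribers.get(topic, []):
            callback(message)

broker = Broker()
received = []
broker.subscribe("home/thermostat", received.append)
broker.publish("home/thermostat", "21.5C")   # a sensor reports a reading
print(received)   # ['21.5C']
```

The sensor only pushes tiny messages when it has something to say, which is why this style of messaging suits battery-powered devices with intermittent connections.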
Many industries can reap the benefits of using IoT devices. When patients use wearable technology, like fitness trackers, medical professionals can better treat and diagnose patients because they have a more accurate record of a patient's vital signs. IoT can also be used to tag medical equipment and determine where it is physically located.
Media and entertainment companies can also offer more engaging content for audiences through IoT devices by analyzing what audiences are viewing or listening to through them.
In manufacturing and utilities, companies can use sensors to predict when machines or other equipment will fail and determine whether preventive maintenance (fixing an item before it is likely to break down) is necessary.
IoT offers organizations the ability to perform their tasks in a well-organized way that minimizes wasted time and increases the likelihood of customer satisfaction.
Because there are tons of these devices collecting and transmitting data, businesses will want to take every opportunity to analyze it. Ideally, that analysis will yield actionable insights. Also, as sensor technology improves, new types of data will become detectable.
For those interested in cybersecurity, IoT offers a wealth of opportunities too. Cybersecurity company Auth0 estimates that more than 20 billion IoT devices are at risk of a cyberattack, as many of these devices have vulnerabilities and don't have native security software in place. Securing these devices will become more and more important as time progresses.
You can learn more here:
3D printing, or additive manufacturing, is the ability to create a solid, three-dimensional object from a computer file. 3D printing can make just about anything—from jewelry to entire houses!
3D printing is referred to as additive because objects are built up in layers. This differs from typical manufacturing methods in that no materials are cut, drilled, or ground down, so very little material is wasted, and there is very little human interaction in the process. Once the computer file is sent to the printer, the printer works on its own.
At a high level, a user creates the object to be printed using computer-aided design (CAD) software. Once the design is complete, the user sets up the printer, making sure the printing materials have been carefully loaded. After sending the file to the printer and confirming that everything is the way they'd like, the user starts the print. Depending on the complexity of the object, printing can take anywhere from a few minutes to many hours. It's best to leave the printer undisturbed and away from places where printing can be interrupted; the last thing you want is all your hard work undone by someone accidentally bumping into the printer.
When 3D printers first came out, the printers and materials needed to print were very expensive and too big to fit into a normal room. Now, printers can be purchased on Amazon for less than $200 (see Figure 4.7). Additionally, the number of materials that can be purchased and used has also increased—3D printers can use ceramics, plastics, and metals, among other materials, to create objects.
Rather than purchase expensive materials or wait for suppliers to provide parts, people and organizations from just about any industry can print out what they need. By being able to print their own materials, construction and manufacturing companies can save time and money on purchasing materials.
For example, the homebuilding and construction company ICON, in partnership with the San Francisco nonprofit New Story, built what could be the first livable, 3D-printed home for $10,000 and in under 24 hours in 2019. This was done in the hopes of addressing the housing issues that are prevalent in San Francisco and other parts of the world. They did this at a fraction of the time and costs it would take to build a home. The site HomeAdvisor says that the average 2,000 sq. ft. home, like the one ICON built, costs $303,488 to build and takes roughly four to five months, depending on overall complexity.
3D printing can also reduce the cost of creating prototypes. A user can quickly create and build out a product prototype, without a huge investment in materials or time. Manufacturing-based businesses have been exploring how to integrate 3D printing more and more.
You can learn more at the following resources:
There is no shortage of technologies yet to be fully utilized, and that won't change. If you intend to work and stay in tech, staying on top of, or even ahead of, the technology curve is important. The last thing you want to do is invest too much time and money in a technology area that is becoming obsolete. How, then, can you stay informed without feeling completely overwhelmed?
The first thing to realize is that you don't have to be an expert in every single technology area. It's impossible to do so. The best approach is to have a general understanding of what is out there, and you can do a deeper dive into the areas that interest you or resonate with you the most.
You want to strive to be a T-shaped tech professional, having the benefit of both generalization and specialization (see Figure 4.8). This means you have a broad understanding of current and emerging technology areas and skills, while at the same time being a subject-matter expert in one or two key areas. Not only is this attractive to prospective employers, but it also gives you a variety of topics you can speak to beyond just one.
Here are some information sources:
The following are tools to use:
#artificialintelligence
#cloudcomputing
#blockchain
#future
#iot
#innovation
#technology
When you sign up for them, your newsfeed will start to have content on those subject areas. You can unsubscribe from them at any time. Also, on both platforms, you can follow tech leaders and companies.