CHAPTER 4
Emerging Technologies

In this chapter, we'll talk about the technologies that are poised to impact the way that we work in the future—and perhaps even sooner than we think.

What Is an Emerging Technology?

An emerging technology is one that exists but whose true potential we have yet to fully use or experience. Additional research, application, refinement, and even regulation are necessary before we can consider it an established technology—a technology that has been in existence for some time, is used by many, and is considered a standard by just about everyone.

Emerging technologies may also have the power to become disruptive, meaning they have the potential to replace an established technology or have the power to change how an entire society does things.

Let's take email as an example. Before email, documents, cards, and similar items had to be sent physically, and it took days for recipients to receive them. With email, your recipient receives your documents in seconds, reducing the time and costs involved.

Although email is now considered an established technology, it was very much an emerging technology when it entered mainstream society in the mid-to-late 1990s. As email became more secure and robust enough to handle image and music files, people and businesses used postal services and paper products like envelopes much less.

The Future of Work

Many research organizations and studies have pointed out that the adoption and improvement of emerging technologies will cause structural unemployment, or people losing their jobs because their skills have become out of date. Artificial intelligence and automation have been widely cited as technologies that will likely cause the most disruption for jobs worldwide.

There's no question that certain roles will feel this effect more than others. Roles built around routine, low-value tasks are more likely to be phased out than roles involving highly complex work. But tech will not replace the need for humans at every job—at least not right now. Emerging technologies will help people be even better at their existing jobs and will create many new ones, as businesses will need people who understand and can use them.

I was a contract worker at McKinsey's Digital Capability Center (DCC) in Chicago a few years ago. The goal of the DCC was to show small and midsize original equipment manufacturers (OEMs) how they could utilize emerging technologies like the Internet of Things (IoT), augmented reality, and autonomous vehicles to make their manufacturing operations safer and more efficient.

Fun fact—it was here that I earned a Class III forklift license to operate an automated vision–guided vehicle (AGV), which is a forklift that can carry supplies and maneuver around a manufacturing floor on its own using cameras and sensors. The vehicle needed little to no user guidance or intervention.

For clients, the center would simulate a fictional manufacturing floor, where workers were creating compressors by hand. Daily inventory and any production issues that came up were recorded with pencil and paper. If equipment broke down, operators would need to find paper manuals for answers. Any malfunctioning equipment would stop assembly, possibly for the entire day or several days, and require the operator to call for outside service.

The center would then quickly transform itself into a manufacturing floor outfitted with technology, where production data was automatically captured by a variety of sensors, cameras, and cloud-based applications. Using augmented reality–enabled glasses, operators could fix equipment quickly through onscreen instructions. To ensure that workers spent more time on the assembly line rather than fetching materials, the AGV could be summoned through a smartphone or tablet application to bring materials over, all while confirming it was safe to do so (for example, that its path was clear of people).

As the center went from tech-less “current state” to tech-enabled “future state,” at no point were the human operators replaced. The technologies used helped to enhance worker productivity and the quality of their output. I thought this was important to highlight—people are still needed to implement and operate the technologies being used. Rather than replace employees, companies can provide learning opportunities for them so that they are able to move into these new jobs.

You can learn more about the McKinsey Digital Capability Center at www.mckinsey.com/business-functions/operations/how-we-help-clients/capability-center-network/overview.

In the next few sections, I'll provide a general overview of each emerging technology area, explain why it's worth becoming familiar with, and point out places you can go to learn more.

Artificial Intelligence

In the broadest sense of the term, artificial intelligence (AI) is the ability of computers or programs to think and learn like a human being would. A computer or program is considered to be employing AI if it is able to simulate how a human would think and behave in a given situation—applying logic, reason, and ethics and enacting the best way to address the situation.

While we may not have realized the full potential of AI, we are already using it in places you might not expect. Look at Table 4.1 for examples of how you've probably interacted with AI recently.

Table 4.1: Examples of AI in Common Use

Source: Everyday Examples of Artificial Intelligence and Machine Learning, Emerj.com

SERVICE                            HOW AI IS BEING USED
Ride sharing (Lyft, Uber)          Determine cost of ride, minimize wait time
Email (Gmail)                      Mail filtering, spam elimination
Banking (most major US banks)      Credit decisions, fraud detection
Shopping (Amazon)                  Product recommendations, fraud detection
Social media (Facebook, Twitter)   Determine which posts and articles appear in your feed

The Difference Between Artificial Intelligence, Machine Learning, and Deep Learning

People may use the terms AI, machine learning, and deep learning as synonyms for one another. But each term has a specific meaning: machine learning is a subset of AI, and deep learning, in turn, is a subset of machine learning.

At the very highest level is artificial intelligence. To illustrate this point, think about chatbots, which are programs used to simulate conversations, either through voice or text, between a human and computer. You may have commonly interacted with chatbots on shopping websites like Staples or Starbucks or over the phone on customer service telephone lines.

Humans need to program chatbots to work; they need to know the different types of users who may use their chatbot, the type of information they might want to know or tasks they'd like to complete, and how to integrate the chatbot within existing systems. They then need to develop scripts and prompts that the chatbots will use based on this information.

When a user interacts with the chatbot, they are interacting with prebuilt scripts that human programmers have developed. If the user asks a question or makes a request that is unclear or one that hasn't already been programmed, the chatbot will not respond or will return an error message. The chatbot can't make this adjustment on its own, so the programming team must step in to adjust. In simple AI, everything must be programmed by humans; the computer or program does not have the ability to learn and adapt.
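
To make this concrete, here is a minimal sketch of a scripted chatbot in Python. The intents, responses, and fallback message are all hypothetical; the point is that a simple chatbot can return only what a human has already written for it.

    # A minimal scripted chatbot. Every response below was written in
    # advance by a human; the program cannot answer anything else.
    SCRIPTS = {
        "store hours": "We're open 9 a.m. to 9 p.m., Monday through Saturday.",
        "order status": "Please enter your order number and I'll look it up.",
        "returns": "You can return any item within 30 days with a receipt.",
    }

    def reply(user_message):
        text = user_message.lower()
        for keyword, scripted_answer in SCRIPTS.items():
            if keyword in text:
                return scripted_answer
        # Anything the programmers didn't anticipate hits this fallback.
        return "Sorry, I didn't understand that. Please try rephrasing."

    print(reply("What are your store hours?"))      # matches a script
    print(reply("Can I pay with cryptocurrency?"))  # falls back to the error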

Machine learning helps to address this limitation. Using the chatbot example, when the chatbot encounters a request or question that it doesn't understand, the chatbot learns from the exchange (using data and statistical algorithms) when you tell it that's not what you meant or inform it that it wasn't being helpful. When it encounters the same request or question again, it will have a better, and ideally right, answer next time. No reprogramming is necessary for this to happen—it all happens on its own with machine learning.
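
As a toy illustration of that feedback loop (real systems use data and statistical algorithms rather than simple memorization, so treat this as a sketch of the idea only):

    # A toy "learning" chatbot: when it fails and the user supplies the
    # answer they wanted, it remembers the correction for next time.
    learned_answers = {}

    def answer(message):
        return learned_answers.get(message.lower())

    def teach(message, correct_answer):
        learned_answers[message.lower()] = correct_answer  # no reprogramming

    print(answer("do you ship to canada?"))   # None; the bot doesn't know yet
    teach("do you ship to canada?", "Yes, we ship to Canada in 5 to 7 days.")
    print(answer("do you ship to canada?"))   # now it has a better answer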

You can commonly see this type of AI in action in virtual assistants like Amazon's Alexa and Apple's Siri.

There are three different types of machine learning.

  • Supervised learning: The user takes the lead in training the computer or program by introducing labeled training data. When the program is given new data, it uses the training data to make predictions (a short sketch follows this list).
  • Unsupervised learning: Here, the computer or program takes a more active role in its own learning. It's given data with no labels, and on its own it makes observations about the data and begins organizing it in a way that it thinks a human likely would.
  • Reinforcement learning: In this type of learning, the user rates how well (or how badly) the computer or program is doing at making predictions. The user introduces data; if the output is good, the user “rewards” the program, and if the output is bad, the user “punishes” it by taking the reward away. The program continues to learn through this process until the maximum possible reward is achieved.
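
Here is a minimal supervised-learning sketch in Python using the scikit-learn library. The four "training" messages and their spam labels are made up for illustration; a real spam filter would learn from many thousands of examples.

    # Supervised learning: we give the model labeled examples (1 = spam,
    # 0 = not spam), and it predicts labels for messages it hasn't seen.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    training_messages = [
        "win a free prize now", "claim your free money",     # spam
        "lunch meeting at noon", "project update attached",  # not spam
    ]
    labels = [1, 1, 0, 0]

    vectorizer = CountVectorizer()
    features = vectorizer.fit_transform(training_messages)

    model = MultinomialNB()
    model.fit(features, labels)  # the training step

    new_message = vectorizer.transform(["free prize waiting for you"])
    print(model.predict(new_message))  # [1] -> predicted spam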

Deep learning takes machine learning much further. Here, neural networks, or sophisticated algorithms that loosely mimic the human brain, are used to complete a task. Given the huge tasks that deep learning attempts to take on, it requires a lot of data and computing power. An example is the technology behind self-driving cars, such as those from Waymo, Google's self-driving car project.
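
For a rough sense of what a neural network does, here is a tiny sketch in Python with NumPy: layers of "neurons," each multiplying its inputs by a set of weights and applying an activation function. The weights here are random; real deep learning systems learn millions of weights from data.

    # A toy two-layer neural network forward pass.
    import numpy as np

    rng = np.random.default_rng(0)
    inputs = rng.random(4)           # e.g., 4 sensor readings
    weights1 = rng.random((4, 8))    # layer 1: 4 inputs -> 8 neurons
    weights2 = rng.random((8, 2))    # layer 2: 8 neurons -> 2 outputs

    hidden = np.maximum(0, inputs @ weights1)  # ReLU activation
    output = hidden @ weights2
    print(output)  # two scores, e.g., "steer left" vs. "steer right"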

Why Does It Matter?

AI is becoming more ingrained in our everyday lives, and that will not stop anytime soon. As it becomes more accessible to businesses in terms of cost, ease of use, and implementation, they are seeing the many ways it can add value. Companies can use AI to do the following:

  • Analyze and gain more insights from their data
  • Make more accurate predictions
  • Aid workers in performing tasks that require critical analysis
  • Minimize repetitive, low-value work that humans currently do

Figure 4.1 and Figure 4.2 show examples of AI smart assistants.

Figure 4.1: Smart speaker from LG

Figure 4.2: Japanese company Line's take on the smart speaker

Where Can I Learn More?

You can learn more from these resources:

  • AI Topics: AI Topics is a directory of the latest research, news, and events centered on artificial intelligence. The site is maintained by the Association for the Advancement of Artificial Intelligence (AAAI), a professional organization dedicated to the advancement, as well as the ethical use, of artificial intelligence.

    aitopics.org

  • AI Trends: AI Trends is a website devoted to applying artificial intelligence in the business world.

    aitrends.com

  • Google AI: Google hosts a free site where people at all skill levels can complete courses, tutorials, and interactive exercises on AI and machine learning. It also offers a guide to the ethical use of AI.

    ai.google/education

Augmented, Virtual, and Mixed Reality

Augmented, virtual, and mixed reality aim to offer users interactive experiences (see Figure 4.3). The level of interactivity and immersion for each of these is different. Let's explore each one.

Figure 4.3: Having fun in a virtual world

  • Augmented reality (AR) is where the actual environment you're in is enhanced with computer-generated objects. Your physical environment remains the same, but digital objects enhance it. Examples include the stickers you put on top of your selfies and videos on Snapchat and the Pokémon Go app, where users attempt to capture virtual versions of their favorite Pokémon monsters in their local environments.
  • Virtual reality (VR) allows you to be completely immersed in a computer-generated, artificial environment. This environment may be a digitized version of the one you're currently in, or it could be a completely different one. Facebook's Oculus VR headsets, as well as Sony's PlayStation VR (originally known as Project Morpheus), are major examples of VR.
  • Mixed reality (MR) is, as the name implies, a combination of both AR and VR. It is like AR in that your environment is enhanced with computer-generated objects. The difference, however, is that there is a level of interactivity possible with these objects. Think of it as an “enhanced environment” of sorts. An example would be Microsoft's HoloLens 2 and its apps HoloBlock and HoloBrush. In these applications, you can digitally paint or build simple block structures, which are overlaid on your physical environment.

Why Does It Matter?

Although AR, VR, and MR have mostly been relegated to video games and social media, they can be hugely beneficial to many industries.

In manufacturing, AR and MR technologies can help operators be more efficient and produce higher-quality output. When repairs are needed, rather than tracking down a manual, AR and MR can bring up the instructions in no time and guide an operator through each step of the repair.

VR is already actively used in flight and driving simulation. Simulations let those who are relatively new to operating a vehicle practice as much as they'd like before using the real thing. This helps make new pilots and drivers feel more comfortable while saving time and fuel costs.

In advertising and marketing, AR and VR give businesses an opportunity to create memorable experiences for audiences. For example, IKEA has an AR app that uses a phone's camera to show customers what a piece of IKEA furniture would look like in their home. The app overlays a virtual version of the furniture on the user's actual bedroom, living room, or kitchen, letting shoppers see whether a piece works in their space well before going all the way out to an IKEA store, buying it, and realizing that it's all wrong for their home.

AR can even be helpful in real estate. Instead of spending time and effort to look at multiple properties, you can use a real estate agency's app to view what an apartment or house looks like. This offers a fuller experience than looking at a bunch of photos and does not require you to leave the comfort of home.

Where Can I Learn More?

You can learn more from the following resources:

  • Google Cardboard: Using your smartphone and a cardboard viewer (which can be purchased on the Google Cardboard site for $9–$17), you can experience and experiment with AR and VR yourself in a low-cost way.

    arvr.google.com/cardboard

  • VR Scout: VR Scout is a website with the latest news and events in the AR/VR world. It offers a weekly newsletter called the Scouting Report, which is a roundup of the most popular VR and AR news.

    vrscout.com

  • Enter VR: Although many of Enter VR's articles and podcasts discuss using VR in gaming, the site also devotes content to VR's applications in other industries, like medicine.

    entervr.net

Blockchain

Blockchain is a ledger or registry, shared by two or more parties, that permanently stores transaction information—usually a record of an asset changing hands from one party to another. Once transaction information is entered and time-stamped in the ledger, it cannot be altered by anyone.

Blockchain solves several problems that can come up when people and businesses engage in transactions. It is most commonly associated with financial transactions, but blockchain has a variety of use cases across different industries.

For example, if each party in a transaction is maintaining their own transaction records, whose ledger should be trusted if each one is showing different or missing information for the same transaction, especially if they're paper records? How do we know that someone didn't alter their records fraudulently or just simply forgot to enter information? Blockchain removes that uncertainty in the following ways:

  • Decentralization: No one member of the blockchain owns or controls the network. All members of the network are considered peers.
  • Consensus: Using sophisticated algorithms built on cryptographic hashes, transactions are verified in the order they occurred, which ensures that the ledger posts the same information for all members.
  • Immutability: Once the transaction is confirmed and posted on the ledger, it cannot be deleted or altered by anyone in the network.
  • Digital signatures: These ensure that no one was posing as one of the network's members when the transaction posted.

Blockchain technology can help remove some of the anxiety of doing business with people you may not know very well. Because consensus handling is automatic and no one person or organization controls the network, members can have greater confidence that the records can be trusted completely. And because transaction information is available to all members, there's far more transparency, and anyone in the network can view the information at any time.
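
To see why tampering is so easy to detect, here is a minimal hash-chaining sketch in Python. The transactions are made up, and real blockchains layer consensus protocols and digital signatures on top of this idea, but the core mechanism is the same: each block stores the hash of the previous block, so altering any old record breaks every hash after it.

    # Each block's hash covers its own data plus the previous block's
    # hash, chaining the records together.
    import hashlib

    def block_hash(data, previous_hash):
        return hashlib.sha256((previous_hash + data).encode()).hexdigest()

    transactions = ["Alice pays Bob 5", "Bob pays Carol 2", "Carol pays Dan 1"]

    chain = []
    prev = "0" * 64  # placeholder hash for the first ("genesis") block
    for tx in transactions:
        h = block_hash(tx, prev)
        chain.append({"data": tx, "prev": prev, "hash": h})
        prev = h

    # Tamper with the first transaction...
    chain[0]["data"] = "Alice pays Bob 500"

    # ...and verification immediately fails.
    for i, block in enumerate(chain):
        if block_hash(block["data"], block["prev"]) != block["hash"]:
            print(f"Block {i} has been altered!")
            break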

Many blockchain frameworks are currently available; some of them are open source, meaning they are freely available for individuals and organizations to use and alter for their purposes (see Figure 4.4). These include Hyperledger Fabric, Openchain, and Multichain. Major technology companies also offer blockchain technologies at a cost.

Blockchain should not be confused with bitcoin. Bitcoin is a popular cryptocurrency (a form of currency that is available only in digital form). Bitcoin and other popular cryptocurrencies use blockchain as the foundation of their respective platforms.

Figure 4.4: Example of a blockchain network. The dots signify each participant's equal ownership over the blockchain.

Why Does It Matter?

Blockchain can be helpful for people and businesses that need to protect high-value assets, whether money or deeds and leases to physical property. Financial services companies have been very interested in blockchain technology, as it gives them a solid means of reducing fraud and their overall risks. This could allow them to send payments faster and reduce the fees they charge customers.

Assets can also be noncash items, like food or supplies. In the case of supply chain–based businesses, a blockchain will allow them to monitor from start to finish where their goods are and identify whether the goods they are receiving are genuine.

For all industries, especially those that have heavy regulatory requirements, having a blockchain-based system makes it easier to verify and audit records, and this reduces the overall time and costs involved with audit activities.

Where Can I Learn More?

You can learn more from the following resources:

  • Blockchain at Berkeley: Headquartered at the University of California at Berkeley, this blockchain-focused organization offers virtual classes on blockchain fundamentals and technology.

    blockchain.berkeley.edu

  • The Hyperledger Project: Started by the Linux Foundation in 2015, the Hyperledger Project is a collaborative project aimed at making blockchain technologies, like Hyperledger Fabric, widely adopted by business communities. Major technology companies continue to invest money and time into the project. The project's website offers free beginner online courses and tutorials on blockchain, as well as project news and developments.

    hyperledger.org

Cloud Computing

The term cloud computing can cause a lot of confusion. Many organizations and companies define cloud computing differently from one another, and the word cloud may conjure images of some grand computer in the sky where services are accessed. That's misleading, as there are no actual clouds, or computers peeking out from behind clouds, involved.

Cloud computing isn't a technology per se. Rather, it's more a business concept that utilizes several existing Internet technologies (see Figure 4.5). What makes it emerging is that it has radically changed how people and businesses consume computing resources.

For clarity, I use the National Institute of Standards and Technology's (NIST) definition of cloud computing. NIST is a US government agency that sets standards of measurement in several different areas, including technology, so its definition helps ensure we're comparing apples to apples when talking to others about cloud computing.

Figure 4.5: Devices accessing the cloud

The NIST definition is as follows: "Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction."

Let's take a moment and drill down further into NIST's five key characteristics of cloud computing:

  • On-demand self-service: When users want to use cloud services, they can provision, or create, them on their own, and the services are available automatically or near instantly. Users don't need to reach out to a third party to get the service going—they start it up when they need it (see the sketch after this list).
  • Broad network access: If you have access to the Internet and a device that can connect to the Internet (e.g., a computer, smartphone, or tablet), you will be able to access the cloud service.
  • Resource pooling: Because cloud providers have a considerable number of servers and processing resources, they are able to offer their products and services to many customers, or tenants, at the same time. Resources can be dynamically assigned according to the demand of their customers.
  • Rapid elasticity or expansion: Users can consume as much, or as little, of a service as they need or want. If you find that you need more resources, you can add them. In contrast, if you need to completely stop or take a break, you can reduce your utilization or discontinue service completely. You can, of course, always come back at a later date.
  • Measured service: Like a telephone or electricity bill, a user is charged only for the service that is actually used. All their usage is monitored, controlled, and reported. This provides visibility and transparency to both the user and the cloud service provider on costs.
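
As one hedged example of on-demand self-service and measured service, here is roughly what provisioning a server looks like in Python with boto3, Amazon Web Services' Python library. This is a sketch: it assumes an AWS account with credentials already configured, and the machine image ID is a placeholder, not a real one.

    # Provision a virtual server on demand; no phone call to a vendor.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder machine image ID
        InstanceType="t2.micro",          # a small, inexpensive instance
        MinCount=1,
        MaxCount=1,
    )
    instance_id = response["Instances"][0]["InstanceId"]
    print(f"Launched instance {instance_id}")

    # When you no longer need it, release it and stop paying for it.
    ec2.terminate_instances(InstanceIds=[instance_id])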

Service Models

NIST takes the definition further by defining three distinct service models, or the types of services that cloud providers can offer to people and businesses: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).

A traditional IT department for a business organization is typically responsible for purchasing, maintaining, and servicing all software and hardware, as shown in the BUSINESS column of Table 4.2.

Table 4.2: Comparison of Cloud Computing Service Models

LAYER               BUSINESS   IAAS PROVIDER   PAAS PROVIDER   SAAS PROVIDER
Applications        Customer   Customer        Customer        Provider
Data                Customer   Customer        Customer        Provider
Operating systems   Customer   Customer        Provider        Provider
Servers             Customer   Provider        Provider        Provider
Storage             Customer   Provider        Provider        Provider
Networking          Customer   Provider        Provider        Provider

Customer: Managed by the customer

Provider: Managed by the cloud provider

When a company agrees to work with an IaaS provider, it is asking the provider for access to the provider's servers, storage, and network capabilities. The provider is responsible for ensuring that these resources are available nearly all the time, as well as for maintaining the physical equipment, performing upgrades, and, to a certain extent, handling security. The trade-off is that the company has little to no control over how the provider gets this done.

Going further, when a company agrees to work with a PaaS provider, the provider supplies IaaS services plus a platform on which the company can create new applications or run its existing ones. This allows the company to make applications that scale to demand and are available anytime, anywhere. Upgrades to applications, as well as to operating software, can happen automatically. This can be good for businesses whose app demand varies—for example, retail store apps see normal usage patterns for most of the year, but demand spikes during the holiday season.

Finally, when a company uses a SaaS provider, it is using the provider's complete stack. The software is available for use on the cloud and usually isn't downloadable onto the company's computers, and the company can't change the underlying code or other aspects of the software. All software upgrades are automatically performed by the provider and do not require intervention from the company. The company pays a recurring fee, usually referred to as a subscription fee, on a monthly or yearly basis. SaaS is sometimes called on-demand software and is the most widely used cloud service model.

Table 4.3 shows examples of IaaS, PaaS, and SaaS providers.

Table 4.3: Examples of IaaS, PaaS, and SaaS Providers

IAAS PROVIDERS        PAAS PROVIDERS        SAAS PROVIDERS
Amazon Web Services   Amazon Web Services   Adobe (e.g., Adobe Photoshop)
Google Cloud          Google Cloud          Google (e.g., G Suite)
Microsoft Azure       Microsoft Azure       Microsoft (e.g., Office 365)
Rackspace             Salesforce            Salesforce.com

It should be noted that the number of cloud service offerings is increasing. For example, interest is growing in serverless computing, which lets users run application code on demand without having to manage servers themselves. This allows them to consume resources only when they are absolutely needed and to be charged accordingly.

Deployment Models

Adding to the complexity, there are three main deployment models, or environments in which you can access cloud services:

  • Public cloud: Resources in a public cloud can be accessed by anyone; they are not exclusive to one person or organization. Costs for services in public clouds tend to be lower, but quality may suffer, as you're sharing resources with many other customers.
  • Private cloud: Resources in these types of clouds can be accessed and used only by a single person or organization. Availability and security are of the highest priority, but the associated costs increase considerably.
  • Hybrid cloud: Using both public and private clouds, the resources from both are tightly integrated with one another.

Why Does It Matter?

Cloud computing has given people and organizations alike more control over the costs and investments that they make in computing resources. Businesses can access more computing resources than they could if they tried to purchase and maintain equipment on their own. This also decreases the overall amount of money they must spend on hardware, software, security, electricity, and physical space.

Businesses can create, or spin up, resources when there is a need for them and then either decrease resources or get rid of them depending on the current needs. This is the opposite of buying hardware and software outright; rarely can businesses return hardware and software once purchased. They may only be able to sell it at a price that is much lower than what they paid for it.

Businesses today can't afford not to be on the cloud, as the landscape has become far more competitive. Our society has become very reliant on apps for travel, food, banking, communication, and more. Because of this, we have been conditioned to expect whatever we want or need to be available the moment we want or need it. A business that wants to remain operational, let alone competitive, will need to understand how to leverage the cloud to meet the market's demands and needs.

People also consume software applications much differently. In the past, users needed to purchase a software license—anywhere from a few dollars to several hundred dollars—and were typically allowed to install the software on only one device and pay a separate fee for any upgrades. You couldn't access that program and its data from any other computer but the one you installed it on. Now, people can pay a small monthly subscription fee for a program and can use it on any device they own that has Internet access. When they no longer need the application, they can cancel the subscription at any time.
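
A quick back-of-envelope comparison, with made-up prices, shows why this model appeals to many users:

    # Hypothetical numbers: one-time license vs. monthly subscription.
    license_price = 400          # paid up front, tied to one device
    upgrade_fee = 150            # one major upgrade over two years
    subscription_per_month = 10  # cancel anytime, use on any device

    license_cost_2yr = license_price + upgrade_fee       # 550
    subscription_cost_2yr = subscription_per_month * 24  # 240

    print(license_cost_2yr, subscription_cost_2yr)

Of course, a subscription keeps costing money for as long as you use the software, so over many years the math can flip; the appeal is flexibility as much as price.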

Where Can I Learn More?

You can learn more at the following resources:

  • NIST publications: NIST offers perhaps the most comprehensive, vendor-neutral information on cloud computing, or information that does not show favor to any one cloud provider. NIST provides general education publications, as well as overall advice on how organizations can adopt cloud computing while minimizing risk.

    csrc.nist.gov/Projects/Cloud-Computing/publications

  • Major cloud service provider platforms: Each of the major cloud providers offers education via general cloud computing overviews, tutorials, and free credits or trial versions for experimenting with its services. For those attending school, additional free credits and extended training are available. Sites like A Cloud Guru offer instruction on how to access free trials, as well as online training on the major platforms.

    acloud.guru

  • The Cloudcast: The Cloudcast is perhaps one of the most comprehensive podcasts currently out there on cloud computing. The podcast interviews leaders in the cloud computing space, as well as providing information on new trends and career advice. It's produced weekly and is available on Apple iTunes, Google Play, Spotify, and other platforms.

    thecloudcast.net

Internet of Things

The Internet of Things (IoT) refers to devices, big and small, that share data with other objects or people. This data is shared automatically, without requiring human-to-human or human-to-computer interaction or intervention.

A device can be any electronic device that can access the Internet. This can include the following:

  • Fitness trackers
  • Smart watches (Figure 4.6)
  • Smart TVs and speakers
  • Home appliances
  • Sensors

Figure 4.6: Example of a smart watch

According to the software company SAS, there will be more than 150 billion IoT-enabled devices in use worldwide by 2025, and they will produce 175 zettabytes of data by that time. To put that into perspective, 1 zettabyte is equal to 1,000,000,000,000,000,000,000 bytes (a billion terabytes).

To illustrate how IoT works, we will use a fitness tracker, like a WiFi-enabled Fitbit, as an example. A user puts on the fitness tracker at the start of a workout. The tracker records certain aspects of the user's workout, as it has several sensors on it that can detect heart rate, body temperature, motion, etc.

As this data is being collected, the fitness tracker is using its internal Internet connection to transmit this data to Fitbit's servers. This connection happens instantly, without the user entering commands on the tracker to do so.

With this data, Fitbit can keep a running tally of the user's fitness metrics and habits and make suggestions on how to meet fitness goals faster. Users can access these records and suggestions through Fitbit's app at any time. And if the Fitbit notices irregular heart or other health activity, it will immediately notify the user to see a doctor.

Rather than HTTP, the protocol we typically use to access web pages, IoT devices can use one of the following protocols to communicate:

  • Message Queuing Telemetry Transport (MQTT)
  • Constrained Application Protocol (CoAP)
  • Data Distribution Service (DDS)
  • Advanced Message Queuing Protocol (AMQP)

The major reason these protocols are used is that they send information faster than HTTP, in a manner optimized for devices that have limited battery life and that may not always be connected to the Internet.
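
As a hedged sketch, here is what publishing a single sensor reading over MQTT can look like in Python with the paho-mqtt library (version 1.x; newer versions change the client constructor slightly). The broker shown is a public test broker, and the topic name is made up; a real device would use its vendor's broker over an authenticated connection.

    # Publish one heart-rate reading over MQTT. MQTT messages are tiny
    # compared to HTTP requests, which suits battery-powered devices.
    import paho.mqtt.client as mqtt

    client = mqtt.Client()
    client.connect("test.mosquitto.org", 1883)  # public test broker

    client.publish("fitness/tracker42/heart_rate", "72")  # topic, payload
    client.disconnect()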

Why Does It Matter?

Many industries can reap the benefits of using IoT devices. When patients use wearable technology, like fitness trackers, medical professionals can better diagnose and treat them because they have a more accurate record of the patient's vital signs. IoT can also be used to tag medical equipment and determine where it is physically located.

Media and entertainment companies can also offer more engaging content for audiences through IoT devices by analyzing what audiences are viewing or listening to through them.

In manufacturing and utilities, companies can use sensors to predict when machines or other equipment are likely to fail and determine whether preventive maintenance (fixing an item before it breaks down) is needed.
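
Here is a toy sketch of that idea in Python; the readings, baseline, and threshold are invented. The point is simply that streaming sensor data lets software flag a machine before it fails.

    # Toy predictive maintenance: flag a machine whose recent vibration
    # readings drift well above its historical baseline.
    baseline = 2.0                              # normal level (made up)
    readings = [2.1, 2.0, 2.3, 3.8, 4.1, 4.5]   # latest sensor data (made up)

    recent_average = sum(readings[-3:]) / 3
    if recent_average > baseline * 1.5:
        print("Vibration trending high; schedule preventive maintenance.")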

IoT offers organizations the ability to perform their tasks in a well-organized way that minimizes wasted time and increases the likelihood of customer satisfaction.

Because there are so many of these devices collecting and transmitting data, businesses will want to take every opportunity to analyze that data. Ideally, this analysis yields actionable insights. And as sensor technology improves, new types of data will become detectable.

For those interested in cybersecurity, IoT offers a wealth of opportunities too. Cybersecurity company Auth0 estimates that more than 20 billion IoT devices are at risk of a cyberattack, as many of these devices have vulnerabilities and don't have native security software in place. Securing these devices will become more and more important as time progresses.

Where Can I Learn More?

You can learn more here:

  • Eclipse IoT: Eclipse IoT is an open source project that allows users to build their own IoT solutions. The site allows users to experiment in sandbox environments.

    iot.eclipse.org

3D Printing

3D printing, or additive manufacturing, is the ability to create a solid, three-dimensional object from a computer file. 3D printing can make just about anything—from jewelry to entire houses!

3D printing is referred to as additive because objects are built up in layers. This differs from typical manufacturing methods in that no materials are being cut, drilled, or ground down, so very little material is wasted. There is also little human interaction in the process: once the computer file is sent to the printer, the printer works on its own.

At a high level, a user creates the object to be printed using computer-aided design (CAD) software. Once the design is complete, the user sets up the printer, which includes making sure the printing materials have been carefully loaded. After sending the file to the printer and confirming that everything is the way they'd like, the user starts the print. Depending on the complexity of the object, printing can take anywhere from a few minutes to many hours. It is best to leave the printer undisturbed and away from places where printing can be interrupted; the last thing you want is all your hard work undone by someone accidentally bumping into the printer.
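
To see why print times vary so widely, consider that the printer builds the object one thin layer at a time. A rough, entirely hypothetical estimate in Python:

    # Back-of-envelope print-time estimate (all numbers are made up).
    object_height_mm = 80
    layer_height_mm = 0.2    # thinner layers mean finer detail, more layers
    seconds_per_layer = 45   # depends on the object's cross-section

    layers = object_height_mm / layer_height_mm   # 400 layers
    hours = layers * seconds_per_layer / 3600     # 5.0 hours
    print(f"{int(layers)} layers, roughly {hours:.1f} hours")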

When 3D printers first came out, the printers and the materials needed to print were very expensive, and the printers were too big to fit into a normal room. Now, printers can be purchased on Amazon for less than $200 (see Figure 4.7). Additionally, the variety of materials that can be used has increased—3D printers can now print with ceramics, plastics, and metals, among other materials.

Figure 4.7: Examples of 3D printers

Why Does It Matter?

Rather than purchase expensive materials or wait for suppliers to provide parts, people and organizations from just about any industry can print out what they need. By being able to print their own materials, construction and manufacturing companies can save time and money on purchasing materials.

For example, in 2019 the homebuilding and construction company ICON, in partnership with the San Francisco nonprofit New Story, built what could be the first livable 3D-printed home for $10,000 and in under 24 hours, a fraction of the time and cost of conventional construction. This was done in the hope of addressing the housing issues that are prevalent in San Francisco and other parts of the world. For comparison, the site HomeAdvisor says that the average 2,000 sq. ft. home, like the one ICON built, costs $303,488 and takes roughly four to five months to build, depending on overall complexity.

3D printing can also reduce the cost of creating prototypes. A user can quickly create and build out a product prototype, without a huge investment in materials or time. Manufacturing-based businesses have been exploring how to integrate 3D printing more and more.

Where Can I Learn More?

You can learn more at the following resources:

  • Makerbot: Makerbot sells 3D printers to businesses and educational institutions. It also hosts the 3D Innovation Center for universities that provides training and guidance in utilizing 3D printers in different industry applications.

    makerbot.com

  • 3D Printing Industry: This site hosts a short, free beginner's guide to 3D printing.

    3dprintingindustry.com

  • AM Basics: AM Basics provides a good overview of 3D printing and its implications for industry.

    additivemanufacturing.com

  • Tinkercad: Tinkercad is a free, easy-to-use CAD app that's available online.

    tinkercad.com

Keeping Up with Technology Trends

There is no shortage of technologies yet to be fully utilized, and that won't change. If you intend to work and stay in tech, staying on top of, or even ahead of, the technology curve is important. The last thing you want to do is invest too much time and money in a technology area that is becoming obsolete. How, then, can you stay informed without feeling completely overwhelmed?

The first thing to realize is that you don't have to be an expert in every single technology area. It's impossible to do so. The best approach is to have a general understanding of what is out there, and you can do a deeper dive into the areas that interest you or resonate with you the most.

You want to strive to be a T-shaped tech professional, with the benefits of both generalization and specialization (see Figure 4.8). This means you have a broad understanding of current and emerging technology areas and skills while also being a subject-matter expert in one or two key areas. Not only is this attractive to prospective employers, but it gives you a variety of topics you can speak to beyond just one.

Figure 4.8: The T-shaped professional

Information Sources

Here are some information sources:

  • Your network: Your network can be an incredible source of information on what's new and can perhaps help you sift through the hype, which there is no shortage of in tech.
  • MIT Technology Review: Published by the Massachusetts Institute of Technology, the bimonthly magazine offers the latest information on emerging technologies. It also maintains several free newsletters, ranging in topics from AI and blockchain to space technology. Articles on the website are free to access. A yearly issue is dedicated to the top 10 technologies they believe will have the most impact in the future.

    technologyreview.com

  • TED Conferences: Short for “technology, entertainment, design,” TED conferences are talks on a variety of tech topics. While the conferences occur all over the world, video versions of TED Talks are available on the website for free.

    ted.com

  • ThoughtWorks' Technology Radar: ThoughtWorks, a global technology consulting firm, publishes a free semiyearly report that offers the company's thoughts on what technologies you should be paying more attention to and which ones may not be worth investing in.

    thoughtworks.com/radar

  • Wired magazine: Published monthly, Wired magazine covers emerging technologies and their effects on business and our society.

    wired.com

  • Wall Street Journal (WSJ) Tech: Although much of the content here is written from a business perspective, WSJ Tech does provide the latest in tech news and where tech investments are being made for the future. While WSJ requires a subscription, WSJ Tech Weekly is a free newsletter filled with tech news highlights.

    wsj.com/news/technology

Tools

The following are tools to use:

  • Google Alerts: Google Alerts is a free tool where you can tell Google to email you every time new and relevant articles, blogs, etc., are published on a search term you enter. You can set up alerts on phrases, companies, and people—just about anything. You can specify which sources you are interested in as well as how often you want to be notified.

    google.com/alerts

  • Social media: Using LinkedIn and Twitter specifically, you can follow certain hashtags that are of interest. The following are examples of hashtags you can subscribe to on LinkedIn:

    #artificialintelligence

    #cloudcomputing

    #blockchain

    #future

    #iot

    #innovation

    #technology

When you sign up for them, your newsfeed will start to have content on those subject areas. You can unsubscribe from them at any time. Also, on both platforms, you can follow tech leaders and companies.

Summary

  • An emerging technology is one that is in existence but its full potential has not been completely realized.
  • Emerging technologies can have disruptive effects to jobs and industries. However, emerging technologies can also produce new jobs and industries. We are not at a place in our society where technology will completely replace humans.
  • Staying aware of technology trends will be an important part of your tech career. In addition to keeping your skills sharp, pay attention to which technologies are becoming popular.