CHAPTER FOUR

COMPETING ON ANALYTICS WITH INTERNAL PROCESSES

FINANCIAL, M&A, OPERATIONS, R&D, AND HUMAN RESOURCE APPLICATIONS

Analytics can be applied to many business processes to gain a competitive edge. We’ve divided the world of analytical support for business processes into two categories: internal and external. Chapter 5 will address external applications—customer and supplier-driven processes—and this one will focus on internal applications (refer to figure 4-1). It’s not always a perfectly clean distinction; in this chapter, for example, the internal applications sometimes involve external data and entities, even though they are not primarily about supply and demand, customers, or supply chains. But the focus here is on such internally facing functions as general management, finance and accounting, operations, R&D, and human resource management. These are, in fact, the earliest applications of “decision support.” The original intent was that this data and these systems would be used to manage the business internally. Only more recently have they been applied to working with customers and suppliers, as companies have accumulated better data about the outside world with which they interact.

FIGURE 4-1


Application domains for analytics



The challenge, then, is not simply to identify internal applications of business analytics but to find some that are clearly strategic and involve competitive advantage. Any company, for example, can have a financial or operational scorecard, but how does it contribute to a distinctive, strategic capability? Customer-related applications are more likely to be strategic almost by definition; internal applications have to work at it to make a strategic impact. They have to lead to measurable improvements to financial or operational performance. In the box “Typical Analytical Techniques for Internal Processes,” we list some common approaches.

Financial Analytics

Despite being a quantitative field by nature, finance has trailed other functions like marketing, supply chain, operations, and even human resources in employing advanced analytics to make key decisions. Chief financial officers (CFOs) are generally thought of as the “numbers people” in their organizations. They are often articulate advocates for using predictive and prescriptive analytics for decision making—but usually when it is in some other department.

Finance groups, naturally, have long used reports, dashboards and scorecards, and online queries in the course of their work. The problem is that these descriptive analytics applications don’t tell the user anything about underlying patterns in the numbers, and they only describe the past. Finance professionals may experiment with an occasional regression model in Excel, but for the finance function to make advanced analytics a core capability—on the same level as external reporting or the closing process—is quite rare.

We remain optimistic that this situation is beginning to change. As CFOs become more comfortable with the use of analytical models in the rest of the business, we expect more innovative analytical applications to be implemented in finance. At Intel, for example, a small number of finance professionals began to advocate for greater use of analytics three years ago. They presented to the senior finance team the idea of building a generalized competency in advanced analytics, and the reaction was positive. One early initiative was to compare Intel’s finance analytics capabilities to those of leading firms in the area, and Intel found that some online or technology-focused firms in Silicon Valley (many of which have strong analytical orientations in general) had more advanced capabilities than it had. Intel started several specific initiatives in the forecasting area, including statistical forecasts of revenue and inventory levels, and prediction of impairments in Intel Capital’s investments. The finance function at Intel has also embarked on a broad effort to educate finance professionals and managers about advanced analytics topics and is planning certification programs for them as well.

Given the amount of data available to finance functions and the level of insight that can be achieved, it seems inevitable that finance’s use of advanced analytics will become more prevalent. In the following pages, we offer some examples of analytical financial applications, since they of course have the most direct tie to financial performance. There are several categories of financial analytics applications, including external reporting, enterprise performance management (management reporting and scorecards), cost management, and risk management.

External Reporting to Regulatory Bodies and Shareholders

External reporting doesn’t lead to competitive advantage under “business as usual” conditions. It’s not normally an advantage to report more quickly and accurately beyond a certain level. Cisco Systems, for example, has touted over the years its ability to close the books instantaneously and to report its results almost immediately at the end of a financial period. But we often wondered why the company bothered; the SEC doesn’t demand instantaneous closes, and it’s not clear what else a company could do with financial results intended for external consumption. But it is a different story with the information used by managers to make better strategic, investment, and operational decisions.

Reporting and scorecards, both financial and operational, are some of the most common applications of business intelligence and decision support. They’re obviously important to managing any business, and they’re increasingly important (with the advent of Sarbanes-Oxley legislation, for example) to keeping senior executives out of jail. While organizations do not compete on the basis of their reporting and scorecards, having systems in place to monitor progress against key operating metrics and against plan is critical to strategy execution.

New regulatory data requirements can become an opportunity to create a rich data source to improve performance. For example, pharmaceutical companies routinely interact with healthcare professionals both as customers and as consultants, researchers, and marketing speakers. Due to the potential for conflicts of interest, more than forty countries have enacted “anti-kickback” regulations that require transparency into these relationships. As a result, companies must aggregate all physician spend data into a single location so that it can be reported appropriately to the correct jurisdiction. According to Everest Group, “The industry is viewing the increased regulatory supervision as a burden. However, the regulatory requirements are putting in place a foundation for data requirements that can be used to drive pharma and life sciences analytics. Organizations can extract additional value in a challenging market, and create competitive differentiation by utilizing this opportunity.”1 Medical device manufacturer Zimmer Biomet, for example, uses MediSpend’s data and analytic dashboards to monitor its compliance with these laws around the globe. But it also leverages the data to create new business models that drive device and drug development.

As we argued in chapter 1, reporting activities have typically not made extensive use of analytics, but there are exceptions. One is the prediction of future performance.

Public companies have to make regular predictions of future performance for investors and analysts. The consequences of poor predictions can be dramatic; investors may severely punish companies that fail to “meet their numbers.” Most companies make straightforward extrapolations of results from earlier periods to later ones. Yet in certain industries with a high degree of change and uncertainty, extrapolations can be problematic and yield poor predictions.

The information technology industry is one of those uncertain industries. Products and customer desires change rapidly, and disproportionate sales volumes take place at the ends of reporting periods. Hewlett-Packard found that it was very difficult to make accurate predictions of revenues in such an environment, and in one quarter of 2001 had a “nearly disastrous” revenue growth forecasting error of 12 percent.2 Hewlett-Packard executives decided to put some of their quantitative researchers in HP Labs onto the problem of creating more accurate revenue forecasts.

The researchers used a Bayesian inference approach (see definition in the box earlier in this chapter) to predict monthly and quarterly revenues from data thus far in the period. After several adjustments, the algorithm yielded significantly better revenue predictions than the more straightforward approach used previously. Hewlett-Packard incorporated the new algorithm into its performance-reporting dashboard, and the company’s (then) CFO, Bob Wayman, noted, “It is reassuring to have a solid projection algorithm. It’s crisper and cleaner, it has rigor and methodology as opposed to my own algorithm.”3
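HP has not published the details of its algorithm, but the general idea of Bayesian revenue forecasting can be sketched simply: start with a prior for the full quarter (say, from the old extrapolation approach) and update it as partial-period revenue arrives. The sketch below uses a conjugate normal model; the parameter values, and the assumption that a known fraction of quarterly revenue is typically booked by a given day, are illustrative rather than HP’s actual model.

```python
import numpy as np

def bayesian_quarter_forecast(mu_prior, sigma_prior, revenue_to_date,
                              frac_booked, obs_noise):
    """Posterior estimate of full-quarter revenue given partial-period actuals.

    Assumes revenue_to_date ~ Normal(frac_booked * R, obs_noise**2), where
    frac_booked is the share of quarterly revenue historically booked by now.
    All figures are in $ millions and are illustrative, not HP's model.
    """
    prior_prec = 1.0 / sigma_prior**2
    like_prec = frac_booked**2 / obs_noise**2
    post_var = 1.0 / (prior_prec + like_prec)
    post_mean = post_var * (mu_prior * prior_prec
                            + frac_booked * revenue_to_date / obs_noise**2)
    return post_mean, np.sqrt(post_var)

# Example: a $10B prior, with 40% of the quarter's revenue typically booked by day 30
mean, sd = bayesian_quarter_forecast(10_000, 800, revenue_to_date=4_600,
                                     frac_booked=0.40, obs_noise=300)
print(f"Forecast: ${mean:,.0f}M +/- {sd:,.0f}M")
```

As the period progresses and the booked fraction rises, the observed data increasingly outweigh the prior, which is what makes a within-period forecast sharper than a simple extrapolation.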

Better prediction of future performance from an operational standpoint also allows companies to take action sooner. Using “near-real-time” operational data, managers can quickly identify emerging trends, make predictions, and take prompt action. For example, during the last recession, Dell executives used predictive models to recognize months before their competitors how thoroughly sales were going to soften. They took preemptive action on prices and products, which resulted in better (or at least, less awful) financial performance during the downturn. More importantly, as the recession ended, they were able to adjust again to boost their market share.

Enterprise Performance Management and Scorecards

Another exception to financial reporting applications involving analytics is when companies try to explain financial performance from nonfinancial factors. A successful enterprise performance management initiative requires companies not just to predict performance accurately but also to come to grips with some broader questions: Which activities have the greatest impact on business performance? How do we know whether we are executing against our strategy? An organization needs quantifiable insight into the operational factors that contribute to business results and a way of measuring progress against them.

Over the past decade or so, the biggest advance in management reporting has been the shift from IT delivering structured reports or fairly static scorecards to giving managers unprecedented access to enterprise data. Information workers and managers are able to access and explore financial and operational information from across the organization using business intelligence applications that incorporate alerts, management dashboards, dynamic filters, and data visualization tools. As analytical tools incorporate artificial intelligence technologies like natural language generation, they are able to help managers understand and interpret the data, too. For example, Credit Suisse integrated Quill (an advanced natural language generation platform from Narrative Science) into HOLT, its investment data analytics platform. Credit Suisse analyzes both proprietary and market data to produce investment research reports that assess company expectations, upside and risk to help analysts, bankers, and investors make long-term investment decisions. Credit Suisse has achieved full analyst coverage on all five thousand companies profiled within its platform, increasing the number of companies that Credit Suisse analyzes by more than 300 percent. Because investment analysts no longer need to write summary reports themselves, they are able to focus more on their clients and on conducting more sophisticated analyses.

These performance management systems report not only on financial performance but also on such nonfinancial domains as customer relationships, employee learning and innovation, and operations. These have been a great step forward in understanding performance. But too many adopters of balanced scorecards still aren’t very balanced, focusing primarily on financial reporting. It’s great to add nonfinancial metrics to a scorecard, but if investors and regulators don’t really care about them, it’s natural to emphasize the financial numbers.

Another problem with most dashboards is that even when companies do include both financial and nonfinancial measures, they don’t relate them to each other. Management professors David Larcker and Chris Ittner studied several companies that had adopted balanced scorecards.4 None of the companies they examined had causal models that related nonfinancial to financial performance.

Nonfinancial or intangible resources (such as human knowledge capital, brand, and R&D capability) are growing in importance to both company business performance and external perceptions of company value.5 Even the most favorable studies only show an explanatory power of approximately 50 percent for the effect of financial measures such as earnings per share, net income, economic profit, or return on invested capital on a company’s market value.6 In some industries, EPS accounts for less than 5 percent of value. We believe that a management team that manages all its sources of value—tangible and intangible, current and future—has a significant advantage over those who do not.

Some companies are working to develop a holistic understanding of both financial and nonfinancial value drivers. A few companies at the frontier are seeking to manage both their current and future shareholder value.7 These organizations are exploring how to infuse their scorecards with data from Wall Street analysts and future-value analytics to gain better insight into the implications of decisions on shareholder value. Companies that develop such a capability might be well on the way to competitive advantage.

Reporting and scorecards are most likely to lead to competitive advantage when the business environment is changing dramatically. In those circumstances, it’s particularly important to monitor new forms of performance. New measures need to be developed, new models of performance need to be created, and new management behaviors need to emerge. The speed and effectiveness of organizational transformation in such periods can certainly be or lead to a competitive advantage. For example, a property and casualty insurance company we worked with needed to transform itself. It was a poor financial performer, with losses over the last four years of well over a billion dollars. It paid out $1.40 in claims and expenses for every premium dollar it took in. The company had monitored its financial results, but it didn’t have a handle on what was creating the poor performance.

As part of a major corporate transformation, the company focused on three key processes: producer (agent) relationships, profitable underwriting, and claims execution. In addition to redesigning those processes, the company created new measures of them and collected them in a balanced scorecard. The scorecard served as a means to rapidly communicate the performance of and changes in the processes, and the success of the change initiative overall, to the management team. The scorecard assessed the company’s ability to deliver on such objectives as:

  • Selecting profitable new markets to enter
  • Attracting and selecting the right customers
  • Driving pricing in accordance with risk
  • Reducing the severity of claims

Reward systems for employees and managers at all levels of the company were changed to tie to performance on these objectives. Some automated analytics were embedded into underwriting systems to speed the process and improve the quality of pricing decisions.

The company used these process changes and reporting approaches to dramatically turn itself around. It began to make substantial amounts of profit and was eventually acquired by another insurance firm for $3.5 billion—up from perhaps zero value a few years earlier.

Cost Management

Although some might question whether cost management can lead to a competitive advantage, there are still a few examples of organizations that have made effective analysis and management of costs a strategic capability.

In health care (at least in the United States), few hospitals even know their costs of treating patients, and are thus rarely able to control them. One exception is University of Utah Healthcare, which includes four academic medical centers and a variety of clinics and centers in the Salt Lake City area. UUHC embarked on a multiyear effort to accurately measure its costs. The project, called Value Driven Outcomes (VDO), used an activity-based costing approach. VDO fully integrates all available data regarding a patient encounter to provide a complete and comprehensive view of the care of patients. UUHC started by developing accurate cost accounting data using the actual acquisition costs of each item used in each patient encounter. Then comprehensive utilization methods were employed to assign all the costs to each patient encounter. Once cost information had been gathered, UUHC also developed key quality metrics and integrated the quality data with the cost and clinical data, so that a comprehensive analysis of cost-effectiveness could be performed.

The hospital was then able to apply statistical methods to answer obvious—but previously impossible-to-answer—questions such as: What is the cost of poor quality? Where is there meaningful variation in cost or quality between physicians? Where is patient care overutilizing labs or imaging with no additional gain in quality or value?
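The mechanics behind those questions are straightforward once item-level costs are captured. The sketch below, with invented field names and figures rather than UUHC’s actual data model, rolls actual acquisition costs and utilization up to a cost per encounter and then compares variation across physicians.

```python
import pandas as pd

# Illustrative item-level utilization data: one row per item used in an encounter
items = pd.DataFrame({
    "encounter_id": [1, 1, 1, 2, 2, 3, 3, 3],
    "physician":    ["A", "A", "A", "B", "B", "A", "A", "A"],
    "item":         ["lab", "imaging", "supply", "lab", "supply",
                     "lab", "lab", "imaging"],
    "unit_cost":    [42.0, 310.0, 18.5, 42.0, 18.5, 42.0, 42.0, 310.0],
    "quantity":     [2, 1, 3, 1, 2, 3, 1, 1],
})

# Activity-based costing: actual acquisition cost times utilization, per encounter
items["cost"] = items["unit_cost"] * items["quantity"]
encounter_cost = items.groupby(["encounter_id", "physician"])["cost"].sum()

# Variation between physicians for comparable encounters
by_physician = encounter_cost.groupby(level="physician").agg(["mean", "std", "count"])
print(by_physician)
```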

UUHC also engaged its physicians in the development and use of the VDO data and analytics tools. The team developed self-service descriptive analytics tools with physician guidance, including continual feedback to ensure that what was being developed would meet physicians’ needs and answer their questions. The results are impressive: while costs at other academic medical centers in the area have risen by 2.9 percent on average annually over the past few years, UUHC’s have declined by 0.5 percent a year.

Energy is another expensive resource for many organizations. Microsoft’s five-hundred-acre corporate campus in Redmond, Washington, contains 14.9 million square feet of office space and labs. Seeking to save energy and reduce utility and maintenance costs, the company considered investing over $60 million to upgrade its incompatible networks of equipment and over thirty thousand sensors. Darrell Smith, director of Facilities and Energy, wanted to avoid a disruptive and expensive construction and replacement project. Additionally, the Microsoft team was concerned about displacing employees and the resulting loss in productivity. So instead, Smith and his team developed an “analytical blanket” to connect and weave together the diverse systems used to manage the campus buildings. The analytical layer enabled the team to string together data from thousands of sensors in the buildings, as well as in equipment such as HVAC, fans, and lighting systems. Soon they were accumulating billions of data points every week. Microsoft describes the benefits this way:

That data has given the team deep insights, enabled better diagnostics, and has allowed for far more intelligent decision making. A test run of the program in 13 Microsoft buildings has provided staggering results—not only has Microsoft saved energy and millions in maintenance and utility costs, but the company now is hyper-aware of the way its buildings perform. . . . It’s no small thing—whether a damper is stuck in Building 75 or a valve is leaky in Studio H—that engineers can now detect (and often fix with a few clicks) even the tiniest issues from their high-tech dashboard at their desks . . . rather than having to jump into a truck to go find and fix the problem in person.8

For example, in one garage, exhaust fans had been running continuously for a year (resulting in $66,000 of wasted energy). Within moments of coming online, the smart buildings solution detected the mistake and the problem was corrected. The software also informed engineers about a pressurization issue in a chilled water system in another building. It took less than five minutes to fix the problem, which saved about $12,000 each year.

The system also prioritizes problems to be fixed. Algorithms can balance out the cost of a fix in terms of the money and energy wasted with other factors such as the impact fixing the problem will have on employees who work in that building. So a lower-cost problem in a research lab with critical operations may be given higher priority than a higher-cost fix that directly affects few. According to Smith, almost half of the issues the system identifies can be corrected in under a minute.
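Microsoft has not published its prioritization logic, but a weighted scoring of wasted cost, wasted energy, and occupant impact captures the idea. In the sketch below, the weights, scaling constants, sample faults, and the bump for critical operations are all assumptions for illustration, not the actual algorithm.

```python
def priority_score(issue, weights=(0.4, 0.2, 0.4)):
    """Rank building faults by wasted money, wasted energy, and occupant impact.
    The weights, scaling constants, and sample faults are illustrative only."""
    w_cost, w_energy, w_people = weights
    impact = issue["occupants_affected"] / 50 + (1.0 if issue["critical_operations"] else 0.0)
    return (w_cost * issue["wasted_dollars_per_day"] / 100
            + w_energy * issue["wasted_kwh_per_day"] / 500
            + w_people * impact)

issues = [
    {"id": "stuck damper, Bldg 75", "wasted_dollars_per_day": 120,
     "wasted_kwh_per_day": 600, "occupants_affected": 0, "critical_operations": False},
    {"id": "leaky valve, Studio H", "wasted_dollars_per_day": 40,
     "wasted_kwh_per_day": 120, "occupants_affected": 0, "critical_operations": False},
    {"id": "overcooled research lab", "wasted_dollars_per_day": 60,
     "wasted_kwh_per_day": 300, "occupants_affected": 40, "critical_operations": True},
]
# The lower-cost lab issue outranks the pricier damper fix because of its impact
for issue in sorted(issues, key=priority_score, reverse=True):
    print(round(priority_score(issue), 2), issue["id"])
```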

Engineers appreciate being able to spend their time anticipating and preventing issues and solving problems instead of gathering data and reacting to emergencies. “I used to spend 70 percent of my time gathering and compiling data and only about 30 percent of my time doing engineering,” notes facilities engineer Jonathan Grove. “Our smart buildings work serves up data for me in easily consumable formats, so now I get to spend 95 percent of my time doing engineering, which is great.”

Microsoft is forecasting energy savings of 6–10 percent per year. And as each new algorithm comes online, the company finds new opportunities to save energy. For example, a “new algorithm for detecting when the air in a given building is being overcooled” can save fifteen minutes of air conditioning for each instance detected. The team seems confident that even greater energy and cost savings are on the horizon.

Risk Management

Identifying and minimizing risk is a priority for every organization. Risk managers have been using data and analytical tools to do their jobs for many years. Recent technical advances and new types of data are reinvigorating the use of predictive analytics to mitigate risk. While the specific risks vary by organization, two of the most active areas have been in fraud detection and cybersecurity.

A growing challenge for financial services firms is to protect their customers and themselves against fraudulent claims or purchases. The Coalition Against Insurance Fraud conservatively estimates that fraud costs insurers $80 billion a year.9 Companies increasingly rely on sophisticated fraud detection algorithms to recognize fraudulent credit card purchases in real time. Credit card companies like Visa and American Express have been using advanced analytics methods for years to identify stolen credit cards and fraudulent purchases in real time. By combining point of sale, authorization, and geolocation information with past purchasing behavior, these companies are getting better at pinpointing suspicious behavior and alerting cardholders of potential problems.
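The card networks’ models are proprietary and far more sophisticated, but the basic pattern of scoring a transaction against a cardholder’s own history can be sketched in a few lines. The features, thresholds, and weights below are illustrative assumptions only.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2)**2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2)**2
    return 6371 * 2 * asin(sqrt(a))

def fraud_score(txn, profile):
    """Toy real-time score combining amount, geography, and merchant history.

    `profile` summarizes the cardholder's past behavior; the weights are
    illustrative, not any card network's actual model.
    """
    score = 0.0
    # Amount far above the cardholder's typical spend
    if txn["amount"] > profile["mean_amount"] + 3 * profile["std_amount"]:
        score += 0.4
    # Geolocation far from the last authorized purchase
    km = haversine_km(txn["lat"], txn["lon"], *profile["last_location"])
    if km > 500:
        score += 0.35
    # Merchant category the cardholder has never used before
    if txn["merchant_category"] not in profile["known_categories"]:
        score += 0.25
    return score  # e.g., alert the cardholder when the score exceeds 0.6

profile = {"mean_amount": 62.0, "std_amount": 40.0,
           "last_location": (41.88, -87.63), "known_categories": {"grocery", "fuel"}}
txn = {"amount": 950.0, "lat": 25.76, "lon": -80.19, "merchant_category": "jewelry"}
print(fraud_score(txn, profile))  # 1.0 -> flag for review
```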

Insurers are fighting back too, using predictive analytics to identify where they need to focus their attention. According to Bill Dibble, senior vice president of claims operations, Infinity Property & Casualty:

We realized that automobile insurance claims could be scored in the same way as consumer credit applications, and that technology could significantly enhance that process by making specific inferences on behavior . . . We developed a process to assign different “scores” on fraud probabilities when claimants first report an accident. By using a rules engine that automatically scores claims based on their fraud probability, we could forward suspicious claims into the hands of investigators within a day or two for deeper analysis. As a result of using this technology, not only have we slashed the time it takes to identify potentially fraudulent claims to within 24 hours—it used to take between 30 and 60 days—but we have also been more successful in catching fraud claims.10

In another example, Infinity analyzed its old adjusters’ reports to develop a new algorithm that resulted in $12 million in subrogation recovery.11
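Infinity has not disclosed its actual rules, but the rules-engine approach Dibble describes, in which each newly reported claim is scored and high scorers are routed to investigators, looks roughly like the following sketch. The specific rules, point values, and referral threshold are invented for illustration.

```python
# A minimal rules-engine sketch in the spirit of Infinity's claim scoring.
# The rules and weights are invented for illustration only.
RULES = [
    ("claim filed within days of policy inception",
     lambda c: c["days_since_policy_start"] < 30, 25),
    ("no police report for a major loss",
     lambda c: c["claimed_amount"] > 10_000 and not c["police_report"], 30),
    ("prior claims history",
     lambda c: c["prior_claims"] >= 2, 20),
    ("soft-tissue injury with attorney involved on day one",
     lambda c: c["injury_type"] == "soft_tissue" and c["attorney_on_fnol"], 25),
]

def score_claim(claim):
    """Return a 0-100 fraud-probability score and the rules that fired."""
    fired = [(name, pts) for name, test, pts in RULES if test(claim)]
    return min(sum(p for _, p in fired), 100), fired

claim = {"days_since_policy_start": 12, "claimed_amount": 18_000,
         "police_report": False, "prior_claims": 1,
         "injury_type": "soft_tissue", "attorney_on_fnol": True}
score, reasons = score_claim(claim)
if score >= 60:   # route to a special investigations unit within a day or two
    print("Refer to SIU:", score, reasons)
```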

IT groups have historically been adept at creating predictive models for other parts of the business, but they have been slower to embrace analytics to manage their own function. That is changing as cybersecurity challenges proliferate and become more daunting. The number of threats to large organizations is growing rapidly, as is the number of hackers who create them and the number of systems at risk from cyberattacks. Data breaches are increasing, according to one report, by 85 percent a year, and in 2016, half a billion personal records were stolen or lost.12

And with the proliferation of connected devices, there is no doubt that the challenge of protecting an organization’s data can only grow. Cybersecurity processes within companies are too often reactive to hacks and breaches; investigation and actions are taken only after (sometimes long after) a problem has occurred. The technology most commonly used to address cyberattacks employs “threat signatures” based on patterns of previous attacks. Of course, these approaches are of little value in preventing new types of attacks. Predictive analytical methods originally developed to detect and prevent credit card fraud—a form of anomaly detection—are now being applied to behaviors in cybersecurity attacks.13 Some cognitive technologies, including deep learning, can also identify anomalies in transaction patterns. These approaches can identify emerging anomalies much faster than using threat signatures, and may be able to identify threats much earlier and prevent substantial hacks and data losses before they occur. Given the sensitivity of cybersecurity issues, humans will still be necessary to confirm and investigate threats, particularly when they are internal. But the amount of investigative labor can be substantially reduced through analytics. Organizations in both public and private sectors today are using analytics and—to a lesser degree—cognitive technologies and automation to improve their cybersecurity programs. It’s unclear when these technical capabilities will be fully mature, but there should be no doubts about their necessity and the likelihood of their ultimate adoption.14
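A minimal sketch of the anomaly-detection idea, as opposed to signature matching, is shown below: fit a model to normal activity and flag behavior that deviates from it. The features and figures are illustrative assumptions, and real deployments would use far richer telemetry.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Illustrative per-user-hour features: login count, MB transferred, failed logins
normal = rng.normal(loc=[5, 80, 0.2], scale=[2, 30, 0.5], size=(5000, 3))
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# A burst of activity that no existing threat signature would match
suspect = np.array([[60, 4200, 14]])
print(model.predict(suspect))            # -1 means anomalous
print(model.decision_function(suspect))  # more negative = more anomalous
```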

Merger and Acquisition Analytics

Mergers and acquisitions (M&A) have historically not been the focus of a lot of analytical activity, perhaps with the exception of detailed cash flow analyses. There is usually little attention paid to operational analytics involving supply chain efficiencies, predicted customer reactions, and impacts on costs within the combined organization. This may be a reason why a high proportion—estimates run as high as 70 to 80 percent—of M&A deals are not terribly successful in terms of producing economic value.

We haven’t found any firm that really differentiates itself on the quality of its M&A analytics. But that might be starting to change. A 2015 Deloitte survey of five hundred corporate executives found 68 percent were using data analytics for M&A (although only 40 percent saw it as a core capability for M&A), and more than 80 percent saw data analytics becoming increasingly important in the future of M&A.15 The most common uses were to understand customers, markets, workforces, and compensation. But some more advanced organizations were using predictive analytics for identifying and realizing potential synergies.

Certainly, some M&A deals must be done so quickly that extensive analytics would be difficult or impossible. When Bank of America was given the opportunity to acquire Fleet Bank, for example, it had about forty-eight hours to make a decision. But for most deals, there’s plenty of time to undertake analyses. At Procter & Gamble, for example, the acquisition of Gillette was considered for more than a year before the deal was announced. P&G’s analysis identified significant savings (in its supply chain, through workforce reductions, and from customer synergies), which were used to determine how much P&G offered for Gillette in the deal. Similarly, CEMEX, the global cement company, uses analytics to quantify expected benefits from increased market share and improved profitability by imposing its processes and systems on the takeover target.

One company attempting to inject more analytical rigor into its M&A integration activity is IBM. IBM has been on an acquisition spree lately, and to improve the odds of achieving its goals, it has developed a machine learning algorithm called M&A Pro. The system produces visualizations quantifying key integration risks, offers qualitative advice, and creates a financial dashboard comparing the performance of past deals against their initial business plans. Paul Price, IBM’s director of M&A Integration, says, “Not everyone is an M&A process expert. But what we have done is create a common language, an Esperanto, for deal execution across the organization . . . Our business now is much more grounded in economic and operational reality.”16

Operational Analytics

One analytical domain that has long existed in companies is operations, especially manufacturing, quality, safety, and logistics. This was the original home, for example, of Total Quality Management and Six Sigma, which, when done seriously, involve detailed statistical analysis of process variations, defect rates, and sources of problems. Manufacturing and quality analytics have had an enormous impact on the global manufacturing industries, but until recently their impact has been less revolutionary for service industries and nonmanufacturing functions within manufacturing companies. For many organizations, it still seems difficult to summon the required levels of discipline and rigor to apply statistical quality control or even a strong process orientation outside of manufacturing. This means, of course, that it becomes difficult for firms to compete broadly on the basis of analytics, since these capabilities are often limited to the manufacturing function and generally focused on achieving productivity improvements rather than on innovations that yield a competitive advantage.

Manufacturing

Real analytical competitors in manufacturing, then, are those that go beyond just manufacturing. There are a few great examples of this approach. One is at a small steel manufacturer in the United States, and it illustrates that analytical competition applies both to smaller firms and to the manufacture of commodity goods. Rocky Mountain Steel Mills, a steel rail-, rod-, and bar-making division of Oregon Steel, faced a critical decision about manufacturing capacity in early 2005. It had shut down its seamless pipe mill in 2003 because of price pressures, but the primary customers for seamless pipe were oil drillers, and by 2005 oil prices had risen enough to make Rob Simon, the company’s vice president and general manager, consider reopening the pipe mill. However, he found the rules of thumb and standard costing/volume analyses previously used for such decisions to be too simplistic for a fast-changing mix of demand, pricing, production constraints, and industry capacity.

Simon decided to turn to a more analytical approach, and Rocky Mountain installed an analytical software tool called Profit InSight. He began to work with monthly analyses to determine whether the plant should be reopened. Potential customers and other managers thought that the high prices for pipe clearly justified the capital outlay that would be needed to reopen, but Simon’s analyses suggested that increased revenues for pipe would be offset by lower production of rod and bar, and would not be profitable. Only when pipe prices continued to rise throughout 2005 did Simon decide to reopen the plant in December of that year. Even when production started, his models suggested holding off on taking orders because prices were predicted to rise. Indeed, by January 2006 they had risen by 20 percent over the previous quarter.
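Profit InSight’s internals are proprietary, but the core of Simon’s question, whether high pipe prices justify reopening when pipe competes with rod and bar for shared capacity, is a classic product-mix optimization. The linear program below uses invented margins, capacities, and demand ceilings purely to illustrate the structure of that trade-off.

```python
from scipy.optimize import linprog

# Contribution margin per ton (illustrative, not Rocky Mountain's figures)
margin = {"pipe": 260.0, "rod": 180.0, "bar": 170.0}
# Shared melt-shop hours per ton and monthly hours available
hours_per_ton = {"pipe": 0.9, "rod": 0.5, "bar": 0.6}
melt_hours_available = 60_000
# Monthly demand ceilings in tons
demand = {"pipe": 40_000, "rod": 70_000, "bar": 55_000}

products = ["pipe", "rod", "bar"]
c = [-margin[p] for p in products]             # linprog minimizes, so negate margins
A_ub = [[hours_per_ton[p] for p in products]]  # shared capacity constraint
b_ub = [melt_hours_available]
bounds = [(0, demand[p]) for p in products]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
plan = dict(zip(products, res.x))
print(plan, "monthly margin:", -res.fun)
```

Even with a high margin on pipe, the optimal plan may allocate shared capacity to rod or bar first when they earn more margin per melt-shop hour, which is the kind of counterintuitive answer Simon’s analyses produced.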

Rocky Mountain estimates that in addition to the higher prices it received, it averted a $34 million loss it would have faced from production constraints had it reopened the plant earlier in 2005. The success with the new pipe mill was also a key factor in a substantial rise in Oregon Steel’s stock price. Profit InSight is now used as a weekly strategic planning tool, and Rocky Mountain Steel Mills has completely abandoned its previous “rhetorical” sales planning and forecasting approach for the new analytical methods. The new methods are measurably superior, but Simon still had to demand that everyone listen to what the analyses show and quit second-guessing them with the old approaches.

Engineers involved in product design have been using computer-aided design (CAD) for decades. But recent advances using parametric modeling permit more flexible, individually customized designs that also cut manufacturing time. Firewire Surfboards, a manufacturer of high-performance, premium surfboards, wanted to give customers more freedom to customize their boards without sacrificing board performance or overwhelming its design and manufacturing processes. Firewire boards are constructed through proprietary methods and a combination of high-tech materials previously not offered by other commercial surfboard manufacturers. But Firewire’s CEO Mark Price knew that this wasn’t enough to be a market leader. Hardcore surfers want customized boards—made for them specifically to suit their personal style and local wave conditions. But the complexity of Firewire’s production process made customization through its computer-aided design system extremely labor-intensive. Firewire worked with ShapeLogic and Siemens’ NX 3D CAD software to develop advanced parametric models of its stock boards and add specific rules that control how the customer’s changes affect the board’s curves to ensure peak performance.

Online customers begin by selecting one of Firewire Surfboards’ standard models and then altering the design to fit their needs. Once the customer orders a custom board, a precise solid model is sent directly to Firewire’s factory, where it is used to run the machines that manufacture the board. The model makes it possible to quickly machine custom boards to about 97 percent of their net shape, which minimizes the required finishing processes, manufacturing time, and costs.17 As more powerful and sophisticated (yet consumer-friendly) CAD solutions become more accessible, we expect customer customization of bicycles, sports equipment, automobiles and even medical devices will become commonplace.

For internet-based businesses, operations means churning out the basic service for which customers visit a website. Successful online businesses use analytics to test virtually all aspects of their sites before implementing them broadly. For example, because the primary reason customers visit Google is to use its search capabilities, the company has a very extensive program of testing and analytics with regard to its search engine. Google employs a wide range of operational and customer data and analytics to improve search attributes, including relevance, timeliness, and the user experience. The company developed many of its own proprietary measures of search relevance. Most of the metrics are gathered in an automated fashion, such as the percentage of foreign results, how deeply users go in the retrieved items list, the percentage of users who go to each page of the search result, and the measures of search latency or timeliness. But Google also collects human judgments on the search process, and even observes individual users (at Google headquarters and in users’ homes) as they use the site for individual queries and entire sessions. One technique employed is eye tracking, from which “heat maps” are created showing which areas of a page receive the most attention.

Google is heavily committed to experimentation before making any change in its search site. As Google’s Bill Brougher put it:

Experiments are a requirement of launching a new feature. It’s a very powerful tool for us. We have been experimenting for years and have accumulated a lot of institutional knowledge about what works. Before any new feature ships, it has to pass through a funnel involving several tests. For example, any change to our search algorithms has to be tested against our base-level search quality to make sure the change is a substantial improvement over the base. A little blip in quality isn’t significant enough to adopt.18

Google’s methods for analytical operations are as rigorous as any firm’s, and the nature of the business makes a large amount of data available for analysis.
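Google’s internal quality metrics and testing infrastructure are proprietary, but the statistical gate Brougher describes, in which “a little blip in quality isn’t significant enough to adopt,” resembles a standard two-sample comparison. The sketch below runs a two-proportion z-test on an assumed metric (the share of sessions with a click on a first-page result) using invented counts.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference in proportions between control (a)
    and treatment (b). The metric and counts used below are illustrative."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_b - p_a, z, p_value

# Control vs. new ranking feature: share of sessions with a first-page click
lift, z, p = two_proportion_ztest(41_800, 50_000, 41_950, 50_000)
print(f"lift={lift:.4f}, z={z:.2f}, p={p:.3f}")  # a small lift that does not clear the bar
```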

Another key aspect of manufacturing analytics is to ensure that the right products are being manufactured. We’ll refer to it as the configuration problem—making sure that the products offered to the market are those that the market wants. The configuration problem, like the ones described earlier at Rocky Mountain Steel and Firewire Surfboards, is cross-functional; it takes place at the intersection of sales and manufacturing, and usually also involves the company’s supply chain, financial, and even human resource processes. To compete on the basis of configuration is thus by definition an enterprise-wide activity. Configuration is highly analytical. It involves predictive modeling of what customers want to buy, as well as complex (usually rule-based) analysis of what components go with what other components into what finished products.

What companies compete on the basis of configuration? Some high-technology companies, such as Dell, are known for their configurable products. Wireless telecommunications companies may have many different service plans; some have developed automated analytical applications to find the best one for each customer. They also tailor their services to each corporate account. Automobile companies need to compete on configuration, though US and European manufacturers have traditionally done it poorly. Since manufacturing a car from scratch to a customer’s specification is viewed as taking too long (at least outside of Japan, where it is commonly done), car companies have to forecast the types of vehicles and options customers will want, manufacture them, and send them to dealers. Far too often, the mixes of models and option packages have not been what customers want, so cars have to be substantially discounted to be sold during promotions or at the end of a model year. The mismatch between consumer desires and available product has been one of the biggest problems facing Ford and General Motors.

Both companies are trying to do something about configuration, but Ford is probably the more aggressive of the two. The company has shifted its focus from producing whatever the factory could produce and worrying later about selling it, to trying to closely match supply and demand. As part of this initiative, Ford is using configuration software to maintain rules about options and components, which will reduce the number of production mistakes and more closely match dealer orders to production schedules. Ford’s Smart Inventory Management System (SIMS) optimizes inventory for nearly three thousand Ford and Lincoln dealerships in North America. SIMS uses advanced analytics to produce dealer-specific vehicle order recommendations to ensure that dealers have the right number and mix of inventory to accommodate customer preferences and demand. As a result, annual revenue has increased and Ford dealerships are confidently making smart and cost-effective inventory ordering decisions.19 Ford has not yet entirely mastered the art of competing on configuration, but it is clearly making strides in that direction.

Quality

Analytics can also be applied to assess manufactured quality. Honda, for example, has long been known for the quality of its automobiles and other products. The company certainly has analytical individuals in its manufacturing quality department. However, it goes well beyond that function in identifying potential quality problems. Honda instituted an analytical “early warning” program to identify major potential quality issues from warranty service records. These records are sent to Honda by dealers, and they include both categorized quality problems and free text. Other text comes from transcripts of calls by mechanics to experts in various domains at headquarters and from customer calls to call centers. Honda’s primary concern was that any serious problems identified by dealers or customers would be noticed at headquarters and addressed quickly. So Honda analysts set up a system to mine the textual data coming from these different sources. Words appearing for the first time (particularly those suggesting major problems, such as fire) and words appearing more than predicted were flagged for human analysts to look at. Honda won’t go into details about any specific problems it’s nipped in the bud, but says that the program has been very successful.
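Honda won’t discuss specifics, but the flagging logic as described, surfacing words that are new or that appear far more often than their historical rate, can be sketched with simple term counts. The tiny corpus, the ratio threshold, and the absence of a stop-word list below are illustrative simplifications.

```python
from collections import Counter

def flag_terms(current_texts, historical_counts, historical_total, ratio_threshold=3.0):
    """Flag words that are new this period or appear far more often than their
    historical rate. A real system would filter stop words and use statistical
    tests; the threshold and corpus here are illustrative."""
    counts = Counter(w for t in current_texts for w in t.lower().split())
    total = sum(counts.values())
    flags = []
    for word, n in counts.items():
        hist_rate = historical_counts.get(word, 0) / historical_total
        if hist_rate == 0:
            flags.append((word, n, "new term"))
        elif (n / total) / hist_rate >= ratio_threshold:
            flags.append((word, n, "spiking term"))
    return flags

history = Counter({"brake": 40, "noise": 120, "rattle": 55, "leak": 30})
reports = ["engine fire after short drive", "brake noise on cold start",
           "smelled smoke near fuse box", "fire warning light"]
print(flag_terms(reports, history, historical_total=sum(history.values())))
```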

Toshiba Semiconductor Company is another business that has made extensive use of analytics—in particular, visual representation of statistical analysis—in manufacturing quality. The initial applications focused on advanced analysis for new product and technology development, but they expanded quickly into other areas such as sales, marketing, development, production, and quality assurance. The semiconductor business unit’s executives are strong advocates of analytics, and have led the company with this concept for over fifteen years. Toshiba’s overall approach is encompassed by a broader initiative entitled Management Innovation Policy and Activity.

The visual analytics approach was first used by engineers in several semiconductor fabrication plants (fabs) for yield analysis—a key problem in the industry. According to Shigeru Komatsu, the company’s chief knowledge officer (it is rare for analytics to be addressed in such a role, but they are at Toshiba Semiconductor), “We have worked on standardizing performance indicators, we have built shared databases, we have made efforts to share analysis cases and results, and we have implemented analytic software such as Minitab and [TIBCO] Spotfire DecisionSite in order to increase our efficiency in analytics.”20 Toshiba Semiconductor continues to invest in yield analytics, most recently by incorporating artificial intelligence into analytical models used to determine the root cause of defects and further boost production quality.21

Safety

Safety was not among the earliest areas to apply data and analytics, but its use there is growing rapidly today. It turns out that certain types of accidents are—at least to some degree—predictable. They are a function of the people, equipment, and company settings involved. If you have data on past safety incidents and the attributes associated with them, it’s not a huge stretch to predict when they will happen in the future, and intervene to try to prevent them.

Safety analytics is one of the specialties of the boutique analytics consulting firm First Analytics, which Tom helped to found in 2009. A manager at a large railroad read the original “Competing on Analytics” article, and contacted First Analytics CEO Mike Thompson in 2010. They began discussing how the company could improve its safety using analytics.

The railroad manager explained that safety was a top priority for the company and that it had improved considerably on this front, but it was getting harder to keep improving. He said the company had already used some data to identify likely risks, but there was a lot more that could be explored.

The railroad and First Analytics began with a proof of concept project to take existing data on the company’s train operators and score how likely they were to be at risk. The available data included location, job position, weather conditions, work schedule, absenteeism, scores on rules tests and training exercises, previous rules violations, and more. The data eventually came from about twenty different databases across the company. The railroad had previously used a risk scoring system based on reasonable logic, but the new one based on analytics improved upon it dramatically.
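The railroad’s model is not public, but the general shape of such a risk score, a supervised model trained on operator attributes and past incidents, can be sketched as follows. The synthetic features, coefficients, and labels below stand in for the roughly twenty source databases described above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 4000
# Synthetic operator-period features: schedule irregularity, absences,
# rules-test score, prior violations (stand-ins for the real data sources)
X = np.column_stack([
    rng.normal(0, 1, n),      # schedule irregularity
    rng.poisson(1.5, n),      # absences in the period
    rng.normal(85, 8, n),     # rules-test score
    rng.poisson(0.3, n),      # prior violations
])
logit = -4 + 0.6 * X[:, 0] + 0.35 * X[:, 1] - 0.03 * (X[:, 2] - 85) + 0.9 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # synthetic incident labels

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)
risk = model.predict_proba(X)[:, 1]
print("Top decile cutoff:", np.quantile(risk, 0.9))  # focus supervisor attention here
```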

Since the proof of concept had worked well, the railroad worked with First Analytics to expand the analysis to other safety areas; for example, an analysis was created to identify and prioritize the most at-risk railroad grade crossings. The personnel safety analytics were extended beyond those operating the trains to the personnel maintaining the tens of thousands of miles of track. The overall result of these efforts was a dramatic improvement in safety. The company’s vice president of safety recently discussed the improvements with customers:

Big data. We’re into it big time. We think it’s driving these last eighteen months [of safety results]. There are twenty-five hundred operating managers . . . there’s no way we can look, spend time with, teach all the employees, all the time, every day. This is about focusing management attention on those who exhibit more risk. We’re on our fourth iteration of this model; we’re constantly fine-tuning it . . . If you were to take a look at our control charts, it’s been a nice downward trend for a decade. Since we turned a lot of this on, it’s been a step function: a full standard deviation from our normal run rate.22

Some companies in other transportation industries have adopted similar approaches to safety analytics. This is facilitated by the increasing ease of capturing driving data. Schneider National, Inc., for example, a large trucking firm, captures driver behaviors such as speed, acceleration and deceleration, and driving times. A predictive safety algorithm alerts supervisors that drivers are at risk for accidents even before they have had one.

Other industries that have taken aggressive approaches to safety analytics include mining, energy, and manufacturing. As sensors become more pervasive, we’re likely to see many more companies and industries adopt these approaches.

Logistics

When you invest in operational analytics at the level of hundreds of millions of dollars—and more importantly, deliver value at multiples of that sum—it’s safe to assume that you are competing on analytics. UPS has a long history of measurement and improvement through industrial engineering. Today, UPS captures information on the 18.3 million packages and documents, on average, that it delivers daily, and it receives 69.4 million tracking requests a day. The company has a deep, long-standing commitment to using analytics for decision making.

Jack Levis, the company’s senior director of process management and head of its Operations Research and Advanced Analytics groups, leads the ORION project. ORION, an acronym that stands for On-Road Integrated Optimization and Navigation, may be the largest commercial analytics project ever undertaken.23 ORION is a prescriptive analytics logistical model for UPS’s fifty-five thousand drivers in the United States (the international rollouts will come soon). Before drivers start their routes, ORION analyzes the packages to be delivered that day and determines an optimal routing for the “package cars.” Handheld computers tell the drivers where to go next.

UPS is a big company, and the benefits of ORION are commensurately large. Jack Levis likes to say, “Big savings come from attention to detail.” Shortening each driver’s route by one mile daily translates into $50 million to the UPS bottom line annually. Every minute saved per driver daily is worth $14.6 million, and preventing one minute of idle time daily saves $515,000. As a result, what seem like small incremental improvements can yield big savings. For example, savings in driver productivity and fuel economy from driving more efficient routes add up to about $400 million a year. By cutting routes just a fraction of a mile here and there, UPS is driving 100 million fewer miles, with a resulting reduction in carbon emissions of 100,000 metric tons a year. You don’t see those levels of benefit from an analytics project very often, and these have been confirmed through intensive measurement and reported to Wall Street analysts.
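A back-of-the-envelope check shows why the per-mile figure is plausible. The driver count and dollar figure come from UPS; the assumption of roughly 250 delivery days a year is ours.

```python
drivers = 55_000                    # US drivers on ORION (from the text)
delivery_days = 250                 # assumed working days per year
savings_per_mile_cut = 50_000_000   # $ per year for one mile less per driver per day

miles_saved = drivers * delivery_days            # one mile per driver per day
cost_per_mile = savings_per_mile_cut / miles_saved
print(f"{miles_saved:,} driver-miles -> about ${cost_per_mile:.2f} per mile")
# About 13.8 million driver-miles a year, or roughly $3.60 in cost per driver-mile
```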

ORION required a sustained commitment on the part of UPS management: the system took more than a decade from inception to full rollout and more than $250 million of investment. So the company clearly went all in on this project. What took so long? First, the sheer volume of data and the complexity of the math needed to optimize routes are truly staggering. Consider that for a single driver making 120 deliveries, the number of potential routes is 120 factorial, a number far greater than the age of the earth in seconds.24 Now imagine computing the optimal route for every driver, every day.
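The comparison is easy to verify: the number of orderings of 120 stops is 120!, about 6.7 × 10^198, which dwarfs the age of the earth measured in seconds (roughly 1.4 × 10^17).

```python
import math

routes = math.factorial(120)                         # possible delivery sequences for 120 stops
age_of_earth_seconds = 4.54e9 * 365.25 * 24 * 3600   # ~1.4e17 seconds

print(f"120! ~ 10^{len(str(routes)) - 1}")           # about 10^198
print(f"Age of earth ~ {age_of_earth_seconds:.1e} seconds")
```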

The optimization algorithm itself was difficult to develop, but that aspect of the work paled in comparison to the other challenges. UPS had to develop its own maps to ensure that the drivers would be directed to the right place every time, and to the correct location for package drop-off or pickup. No commercially available maps could do that for the 250 million different locations to which UPS delivers. Next, telematics sensors were installed in more than forty-six thousand company trucks, which track metrics including speed, direction, braking, and drivetrain performance.

But it was the change management challenges that were the most complex to address. Imagine communicating and inculcating a new way of performing a core daily task to fifty-five thousand skilled union workers. UPS devoted six days of training and support to each driver. Most of the drivers wanted to know how the system worked before they would give up their time-honored routes, and so considerable effort was devoted to turning the “black box” algorithm into a “glass box.” To their credit, most of the drivers were enthusiastic about the new approach once they had experienced it.

ORION’s benefits to date are only the beginning for UPS. There is, of course, the global rollout of these tools. And to maintain a level of simplicity for drivers, ORION only optimizes routes at the beginning of the workday. More sophisticated programs down the road will reoptimize during the day, taking factors such as traffic patterns and weather into account. UPS plans to continue to extend ORION’s capabilities. Levis explains that future enhancements will also make other decisions to improve customer service without sacrificing efficiency. “If a customer calls with a request while drivers are on road, ORION will look at all drivers and determine the best choice. ORION will then put it in their handheld computer and adjust their route accordingly.”

There aren’t many companies that have made bets on analytics to this degree. But with the ORION initiative, UPS has truly transformed itself from a trucking company that uses technology to a “technology company with trucks.”

Managing the logistics of a global supply chain is another fertile area for investing in analytics. Caterpillar’s Global Supply Network Division (GSND) is responsible for overseeing an interdependent supply network with over eleven thousand suppliers and 150 facilities. But for thirty years, Caterpillar’s supplier performance appeared stagnant, with no sustainable improvements. GSND personnel were hampered by outdated manual processes, data unavailability, and network complexity. Users became accustomed to making decisions on assumptions and incomplete data. “We were trying to manage a supply network in the dark on spreadsheets.”25

To solve this problem, in 2012 Caterpillar began to develop the Assurance of Supply Center (ASC). The ASC is a data analytics and visualization platform that captures and transforms supply network data from dozens of systems into a suite of powerful business tools that are used to drive both everyday decisions and strategic network design decisions. Three years later, the transformation has been remarkable. CEO Doug Oberhelman told shareholders, “[ASC] simplifies the supply network—a network that involves thousands of suppliers shipping more than a million parts and components every year. Now Caterpillar can see orders from production to delivery—by facility, business unit and cost.”26

Using mobile devices, users are able to see data on inventory, transportation, suppliers, performance, network footprint, defects, and tariffs. A shipment occurs every ninety seconds, producing millions of new data points every day that are analyzed and visualized to support decisions by over 10,000 users. The system incorporates a library of over 100 million data points feeding 45,000 predictive and prescriptive models, including data on 640,000 part numbers from over 7,000 suppliers, shipping to 127 facilities across the globe. All this data enables users to answer important questions such as:

  • Where is my inventory?
  • Are we ready for an upturn?
  • Why is supplier X performing poorly?
  • What is the network impact of a major natural disaster?
  • Factoring in projected demand, consumption, and performance, what should our parts inventory be for all part numbers next year?
  • How could I change my network to make it more cost-effective and profitable?

Caterpillar shares its data and insights with suppliers and works with them to improve their performance. In just three years, the results have been dramatic: in 2012, 67 percent of schedules were shipped on time; now it is 93 percent. Quality defects have been reduced by 50 percent. The organization now has the ability to keep the network moving even when disaster strikes. Users are able to resolve network performance issues in minutes rather than months. And ASC has become a continuing source of competitive advantage for Caterpillar as it continues to introduce more innovative analytic solutions to its business.

Research and Development Analytics

Research and development (R&D) has been perhaps the most analytical function within companies. It was the primary bastion of the scientific method in business, featuring hypothesis testing, control groups, and statistical analysis.

Of course, some of this highly analytical work still goes on within R&D functions, although much of the basic research in R&D has been supplanted by applied research (which can still use analytical methods) and the creation of extensions of existing products. In several industries, research has become more mathematical and statistical in nature, as computational methods replace or augment traditional experimental approaches. For example, automaker Tesla’s connected cars share a steady stream of information with the company, which it uses to identify problems and create software fixes that are then automatically downloaded.

We’ll describe the analytical environment for R&D in one industry that’s changing dramatically with regard to analytics. In the pharmaceutical industry, analytics—particularly the analysis of clinical trials data to see whether drugs have a beneficial effect—always have been important. Over the last several years, however, there has been a marked growth in systems biology, in which firms attempt to integrate genomic, proteomic, metabolic, and clinical data from a variety of sources, create models and identify patterns in this data, correlate them to clinical outcomes, and eventually generate knowledge about diseases and their responses to drugs. This is a considerable challenge, however, and firms are just beginning to address it. We interviewed three pharmaceutical firms—one large “big pharma” company and two smaller, research-oriented firms—and found that all of them have efforts under way in this regard, but they are a long way from achieving the goal. This field is rapidly changing, however, and several firms are now attempting to use artificial intelligence—specifically IBM’s Watson—to help develop new drug compounds.

Analytics is also being used effectively to address today’s challenges in R&D, and this is one way that Vertex Pharmaceuticals, Inc. competes. Vertex, a global biotech firm headquartered in Boston, Massachusetts, has taken a particularly analytical approach to R&D, and its results are beginning to show the payoff. Vertex’s co-founder and retired CEO, Joshua Boger, is a strong believer in the power of analytics to raise drug development productivity. As early as 1988 (when he left Merck & Co. to found Vertex) he argued that, “What you need in this business is more information than the other guy. Not more smarts. Not more intuition. Just more information.”27

Vertex has undertaken a variety of analytical initiatives—in research, development, and marketing. In research, Vertex has focused on analyses that attempt to maximize the likelihood of a compound’s success. This includes developing multiple patentable lead series per project and ensuring that compounds have favorable drug-like attributes. Vertex’s approach to drug design is known as rational or structural. This approach seeks to “design in” drug-like properties from the beginning of a drug development project, and it enables Vertex to determine as early as possible whether a compound will have drug-like attributes.

More of Vertex’s analytics efforts have gone into the development stage of R&D, where its analyses indicate that most of the industry’s cost increases have taken place. One particularly high cost is the design of clinical trials: poor trial design leads either to ambiguous results or to overly large trials, both of which cause substantial delays and raise costs. Vertex has addressed this challenge by developing new trial simulation tools that enable it to design more informative and effective clinical trials in substantially less time. Vertex can now perform trial simulations hundreds of times faster than was previously possible, and the simulations reduce the risk of failed or ambiguous trials caused by faulty design. This advantage allows Vertex to optimize trial design in a fraction of the time required by industry-standard design tools, thereby shortening trial cycle time.
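Vertex’s simulation tools are proprietary, so the sketch below (in Python) is only meant to convey the general idea of trial simulation: run a candidate design thousands of times under an assumed treatment effect and see how often it produces a clear answer. The effect size, sample sizes, and success criterion are illustrative assumptions, not Vertex’s.

```python
import numpy as np
from scipy import stats

def simulated_success_rate(n_per_arm, effect_size, sd=1.0, alpha=0.05,
                           n_sims=2000, seed=42):
    """Estimate how often a two-arm trial of a given size yields a clear result.

    A simulated trial "succeeds" when a two-sample t-test on the simulated
    outcomes is significant at the chosen alpha level.
    """
    rng = np.random.default_rng(seed)
    successes = 0
    for _ in range(n_sims):
        control = rng.normal(0.0, sd, n_per_arm)
        treated = rng.normal(effect_size, sd, n_per_arm)
        _, p_value = stats.ttest_ind(treated, control)
        if p_value < alpha:
            successes += 1
    return successes / n_sims

# Compare candidate designs: how many patients per arm are needed before an
# assumed (hypothetical) effect of 0.4 standard deviations is reliably detected?
for n in (50, 100, 150, 200):
    print(f"{n} per arm: {simulated_success_rate(n, effect_size=0.4):.0%} of simulated trials succeed")
```

Run across a grid of candidate designs, a loop like this shows which sample sizes risk an ambiguous trial and which would waste money on an oversized one; a production-grade simulator adds realistic endpoints, dropout, and interim analyses, and runs far faster.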

Clinical trial operations also account for some of the largest cost increases across the pharmaceutical industry. Trial operations, like trial design, take place within the development stage of R&D, and operational activities that are not automated generate significant costs. Vertex uses analytics to automate and enhance clinical trial operations; examples include tools for patient accrual and electronic data capture (EDC).

Whether in R&D or elsewhere, the company begins by identifying the right metric for the phenomenon it needs to optimize; analysts then determine how to obtain the appropriate data and what analyses to conduct. With these findings, Vertex constantly compares itself to competitors and to best-practice benchmarks for the pharmaceutical industry. “We compete on analytics and culture,” says Steve Schmidt, Vertex’s former chief information officer. “We encourage fearless pursuit of innovations but we ruthlessly measure the effect these innovations have on our core business. We’re always looking for new meaningful analytic metrics, but where we look is driven by our strategy, our core corporate values and strengths, and our understanding of the value proposition to our business.”28 Vertex is a great example of applying analytics to product R&D, and as a result the company has an impressive array of new drugs in all stages of development.

The pharmaceutical industry is also pursuing analytical approaches that don’t even involve the laboratory. So-called in silico research uses computational models of both patients and drugs to simulate experiments more rapidly and cheaply than they could be performed in a lab. One systems biology firm, Entelos, Inc., has built computational platforms that simulate diseases and treatments in areas including cardiovascular disease, diabetes, inflammation, and asthma. Entelos partners with pharmaceutical companies and other research organizations to identify and test new compounds. The goal is to use computational simulations to reduce the very high cost, long cycle times, and high failure rates of conventional laboratory research in the pharmaceutical industry. One collaboration on a diabetes drug between Entelos and Johnson & Johnson, for example, led to a 40 percent reduction in time and a 66 percent reduction in the number of patients needed in an early-phase clinical trial.29
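Entelos’s platforms are far more sophisticated than anything that fits on a page, but a toy “virtual patient” conveys what in silico experimentation means: a mechanistic model is written down as equations, and candidate doses are tried against it in software rather than in people. Every parameter and equation below is invented for illustration and drawn from no real platform.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy "virtual patient": one-compartment drug kinetics plus a disease biomarker
# whose production is inhibited as drug concentration rises. All parameter
# values are invented for illustration.
KE = 0.3                 # drug elimination rate (1/hour)
KIN, KOUT = 10.0, 0.1    # biomarker production and turnover rates
EMAX, EC50 = 0.8, 2.0    # maximum inhibition and half-maximal concentration

def virtual_patient(t, y):
    conc, biomarker = y
    inhibition = EMAX * conc / (EC50 + conc)
    return [-KE * conc,                                 # drug washes out
            KIN * (1 - inhibition) - KOUT * biomarker]  # biomarker responds

def in_silico_experiment(dose):
    """Simulate 48 hours after a single dose; report peak biomarker suppression."""
    baseline = KIN / KOUT
    sol = solve_ivp(virtual_patient, (0, 48), [dose, baseline],
                    t_eval=np.linspace(0, 48, 200))
    return 1 - sol.y[1].min() / baseline

for dose in (1, 5, 20):
    print(f"dose {dose}: {in_silico_experiment(dose):.0%} peak suppression")
```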

Of course, R&D today involves not only product innovation but also innovation in other domains: processes and operations; business models; customer innovations such as marketing, sales, and service; and new management approaches. In a very important sense, the idea behind this book is that each area in which an organization does business can be one in which R&D is conducted. In chapter 3, we noted how Capital One identifies new offerings through its test-and-learn market research approach. At internet-oriented firms such as Amazon, Facebook, and Google, every change to a web page is treated as a small R&D project. What’s the baseline case for measures such as page views, time spent on the site, and click-throughs? How does the change work on a small scale? How does it work when it is scaled more broadly? This test-and-learn approach to operational R&D is just as important as R&D on products.

Operational, business-model R&D also doesn’t have to involve the internet. In health care, for example, despite the seemingly scientific nature of medicine, several studies suggest that only one-quarter to one-third of medical decisions are based on science. A growing industry of health care providers, insurance companies, and third-party data and analytical service providers is working to make health care more efficient and effective through analytics. One of the most aggressive adopters of this approach is Intermountain Healthcare, a Utah-based association of twenty-two hospitals. Brent James, a physician with a master’s degree in statistics, began proposing small “science projects” to see which clinical interventions led to the best outcomes at Intermountain. These projects eventually evolved into a broad set of ten clinical programs based on research and evidence, each supported by information systems that incorporate recommended care protocols and keep track of patient costs. Intermountain has become a model for providing effective health care at reasonable cost, and it has trained many other physicians and administrators around the world.

A key aspect of analytical health care is population health—the analysis of health outcomes for groups of people. One increasingly common approach, for example, is to predict the likelihood that members of a health plan will develop a higher risk of more severe disease over time. Healthways is one company that works with insurers to make such predictions and to identify ways to improve health outcomes and thereby reduce the likely cost to the insurer. Healthways uses data on members’ demographics, claims, prescriptions, and lab procedures to predict—using artificial intelligence in the form of neural networks—which members will be at highest risk for high total medical expenditures over the next year. Healthways employs more than fifteen hundred registered nurses who then provide telephone and direct-mail interventions from one of its ten call centers nationwide to help members develop healthy behaviors that reduce the severity of disease, improve outcomes, and lower the cost to the health plan. This approach to risk management can also reduce ongoing health maintenance costs and lower the risk of disease recurrence.30 The health insurance giant United Healthcare goes a step further: it not only determines whether a patient is at risk of acquiring a particular disease but also uses analytics to determine how likely that patient is to adopt and follow disease management interventions.
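Healthways’ actual model is proprietary; the following is a minimal sketch of the general approach—training a small neural network on member-level features and ranking members by predicted risk—using the open-source scikit-learn library. The file name, feature columns, and outcome definition are all hypothetical.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical member-level table: one row per health-plan member.
members = pd.read_csv("members.csv")
features = ["age", "num_claims_12m", "num_prescriptions_12m",
            "num_lab_procedures_12m", "chronic_condition_count"]
X = members[features]
y = members["high_cost_next_year"]   # e.g., 1 if the member landed in the top cost decile

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# A small feed-forward neural network stands in for the "neural network
# technology" described in the text.
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500))
model.fit(X_train, y_train)

# Rank members by predicted risk so that nurse outreach starts at the top.
scored = X_test.assign(risk_score=model.predict_proba(X_test)[:, 1])
outreach_list = scored.sort_values("risk_score", ascending=False).head(100)
```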

Human Resource Analytics

The final internal analytical application we’ll discuss in this chapter is human resources. Once a lagging area for analytics, HR is now a place where leading-edge companies apply sophisticated methods of analyzing employee data in order to get the most value from their people. Google, eBay, Walmart, and others are using predictive analytics to increase productivity, to compensate and incent their employees appropriately, to help employees succeed in their jobs, and to reduce attrition.

As with other parts of organizations, the tools for employing analytics in HR are becoming widely available. Most large organizations now have in place human resource information systems (HRIS), which record basic HR transactions such as hiring date, compensation, promotions, and performance ratings. Some go well beyond that level, and record skill levels on a variety of aptitudes and learning programs undertaken to improve those skills. Companies increasingly have the ability to relate their investments in human capital to their returns on financial capital. Whether they have the desire, however, is another question. People may be “our most important asset,” and even our most expensive asset, but they are rarely our most measured asset. Many companies may be beginning to employ HR analytics, but they are hardly competing on them.

The most conspicuous exception, of course, is in professional sports. Baseball, football, basketball, and soccer teams (at least outside the United States) pay high salaries to their players and have little other than those players to help them compete. Many successful teams are taking innovative approaches to the measurement of player abilities and the selection of players for teams. We’ve already talked about the analytical approach to player evaluation in baseball that was well described in Michael Lewis’s Moneyball. In American professional football, the team that most exemplifies analytical HR is the New England Patriots, the 2017 Super Bowl champions and winners of five Super Bowls over the last fifteen years.

The Patriots take a decidedly different approach to HR than other teams in the National Football League (NFL). They don’t use the same scouting services that other teams employ. They evaluate college players at the smallest, most obscure schools. They evaluate potential draft selections on the basis of criteria that other teams don’t use—intelligence, for example, and a low level of egotism. As head coach Bill Belichick puts it: “When you bring a player onto a team, you bring all that comes with him. You bring his attitude, his speed, strength, mental toughness, quickness. Our evaluation is comprehensive. Scott Pioli [then vice president of player personnel] and the scouting department do a great job of getting a lot of details by going into the history of a player, his mental and physical makeup as well as attitude and character. He eventually receives one grade and that establishes his overall value to the team.”31 Belichick often refers to the nonphysical attributes of players as “intangibles,” and he is perfectly comfortable discussing them with players and media representatives. Anyone who witnessed the Patriots’ unprecedented overtime win of the 2017 Super Bowl probably knows that one particular intangible—mental toughness—was a key element in their victory. The Patriots measure and actively develop this trait in their players with the help of behavioral and emotional intelligence testing using the Troutwine Athletic Profile (TAP). Belichick says, “I’ve been familiar with [the TAP] for over 15 years . . . I’m telling you from experience this gives you tremendous insight into players, yourself, and your program.”32

The Patriots manage the potential player data in a Draft Decision Support System, which is updated daily with new reports from scouts. Cross-checkers at the team’s offices verify scout rankings by comparing West Coast scout ratings with similar East Coast ratings (e.g., how does the 6'7" UCLA receiver compare with the 6'7" Georgia Tech receiver?). No detail that can provide an edge is overlooked.

With the success of analytics in other domains, business leaders have high (but thus far mostly unmet) expectations for how analytics should revolutionize HR as well. Many HR departments have taken small steps to catch up to their peers in other parts of the organization. Companies are measuring more consistently across global HR processes and putting the information into systems. There are varying approaches to quantitative HR, including 360-degree evaluations, forced rankings, attrition predictions, and so forth. None of this is “rocket science,” but it connotes a much more disciplined and methodical approach. At American Express, which has employees in eighty-three countries, for example, an HR executive in Asia commented, “Everything that we touch has a metric attached. Whatever we do we use globally consistent processes, measurements and databases. We do things in a methodical, thought out, consistent manner, and we have consistent platforms.”33

Other firms are taking more modest steps toward managing their talent analytically. One manufacturing company we interviewed, for example, has developed a “talent management index” from four proprietary measures, and it uses the index to evaluate how each organizational unit manages its investments in human capital. Goldman Sachs, which, like professional sports teams, pays its employees extremely well, is beginning to apply analytical approaches to its workforce. GE, Accenture, Capital One, and Procter & Gamble seek quantitative reasoning abilities in potential recruits. Caesars, another analytical competitor in general, also uses HR analytics extensively in the recruiting process.

Digital companies are adopting more analytical techniques to retain their highly skilled and mobile workforces. In 2014, when eBay was preparing to spin off PayPal, managers were concerned about the effect on their workforce. Management asked Katie Meger, senior manager of talent analytics at eBay (and a psychology PhD), to help identify at-risk employees and determine the best ways to prevent unwanted attrition. As the Wall Street Journal reported, Meger developed a predictive model using survival analysis, a statistical technique originally developed to model the time until death occurs:

At the time, as eBay prepared to spin out PayPal, managers lobbied for more money to keep good employees from leaving, she recalled. But compensation wasn’t in the top five variables that indicated someone was at risk for quitting, the model showed. It also revealed that attrition was contagious, especially on small teams, she said. One remedy eBay enacted: If an employee resigned, HR software would automatically send an email to his former manager explaining the contagion factor and offering suggestions for staying close to remaining employees.34
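The Journal’s account does not describe Meger’s model in detail. In practice, survival analysis applied to attrition usually means fitting a time-to-event model—such as a Cox proportional hazards model—to employee tenure data and reading off which factors raise or lower the hazard of quitting. The sketch below uses the open-source lifelines library with entirely hypothetical columns.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical HR extract: one row per employee, tenure in months, and a flag
# for whether the employee has already left. Column names are illustrative.
employees = pd.read_csv("employees.csv")
covariates = ["compensation_percentile", "team_size", "recent_team_departures",
              "months_since_promotion", "manager_tenure_months"]

cph = CoxPHFitter()
cph.fit(employees[covariates + ["tenure_months", "left_company"]],
        duration_col="tenure_months", event_col="left_company")
cph.print_summary()   # hazard ratios show which factors most raise the risk of quitting

# Score current employees: a higher partial hazard means higher estimated attrition risk.
current = employees[employees["left_company"] == 0].copy()
current["attrition_risk"] = cph.predict_partial_hazard(current)
watch_list = current.sort_values("attrition_risk", ascending=False).head(50)
```

In a result like the one described above, the hazard ratio for compensation would be modest while variables such as recent departures on a small team would stand out—though the data here is, again, invented.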

Another factor driving the shift to HR analytics is increasing rigor in staffing and recruiting processes. Companies are increasingly viewing these processes as activities that can be measured and improved; people are almost becoming just another supply chain resource. This is particularly noticeable in relationships with external staffing firms. Apex Systems, a division of On Assignment, is a major IT staffing firm. It has observed a long-term move among its clients toward more rigorous processes and metrics, and it’s trying to stay ahead of the trend by adopting more analytical measures and management approaches in its own business. Apex tracks a variety of staffing metrics, including:

  • Time to respond with a first, second, and third qualified candidate
  • Number of candidates a customer sees
  • Frequencies of payroll errors or customer invoice defects
  • Speed of customer issue resolution
  • Overall customer satisfaction levels

Apex’s customers are increasingly using analytics themselves to track the efficiency and effectiveness of staffing processes, so the company needs to establish and understand its own analytics to stay ahead of demand.
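As a purely illustrative sketch of what this kind of measurement looks like in practice, a staffing firm might compute a few of the metrics above from a candidate-submission log. The tables and column names here are hypothetical, not Apex’s.

```python
import pandas as pd

# Hypothetical candidate-submission log: one row per candidate submitted
# against a job requisition. Table and column names are invented.
submissions = pd.read_csv("submissions.csv",
                          parse_dates=["req_opened_at", "submitted_at"])

# Days to respond with the first, second, and third qualified candidate per requisition.
qualified = (submissions[submissions["qualified"]]
             .sort_values("submitted_at")
             .assign(days_to_submit=lambda d: (d["submitted_at"] - d["req_opened_at"]).dt.days))
qualified = qualified.assign(rank=qualified.groupby("req_id").cumcount() + 1)
time_to_nth = (qualified[qualified["rank"] <= 3]
               .pivot_table(index="req_id", columns="rank", values="days_to_submit"))
print(time_to_nth.mean())   # average days to 1st, 2nd, 3rd qualified candidate

# How many distinct candidates each customer sees.
print(submissions.groupby("customer")["candidate_id"].nunique().describe())

# Payroll/invoice defect frequency from a second hypothetical table.
invoices = pd.read_csv("invoices.csv")
print(invoices["has_defect"].mean())
```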

Three companies that rival the New England Patriots as analytics powerhouses for talent management are Google, Capital One, and Walmart. Google’s highly analytical culture and practices are evident in its human resources function. Google’s People Operations group operates very differently from the typical HR department; its motto is “All people decisions should be informed by data and analytics.” To achieve that goal, Google created a people analytics function with its own director and a staff of over sixty researchers, analysts, and consultants who study employee-related decisions and issues.

In naming Laszlo Bock, Google’s former vice president of people operations, the HR Professional of the Decade in 2016, ERE Recruiting notes that Bock views data as a way to seek truth and unique insights. In contrast to the typical HR organization, Bock’s former group behaves like an applied R&D function that experiments and finds solutions to people-management challenges. People Operations frequently goes well beyond the usual HR metrics and scope, using predictive analytics to find ways to improve managerial performance, recruiting, retention, and employee well-being and health. Its researchers even found ways for employees to eat better and consume fewer calories: data and experiments on improving employee health led the company to switch to smaller plates in its eating facilities.35

The People and Innovation Lab (PiLab) is the R&D group within People Operations that conducts focused investigations on behalf of internal clients. As Google analyzes different HR issues, it has often moved in new directions as a result. The PiLab has determined what backgrounds and capabilities are associated with high performance and what factors are likely to lead to attrition—such as an employee’s feeling underutilized at the company. It has set the ideal number of recruiting interviews at four, down from the previous average of ten. One such project was Google’s Project Oxygen—so named because good management keeps the company alive. The objective of this project was to determine the attributes of successful managers. The PiLab team analyzed annual employee surveys, performance management scores, and other data to divide managers into four groups according to their quality. It then interviewed the managers with the highest and lowest scores (interviews were double-blind—neither interviewer nor managers knew which category the managers were in) to determine their managerial practices. Google’s HR analytics team ultimately identified eight behaviors that characterized good managers and five behaviors that every manager should avoid. A year after the team shared its findings, Google measured significant improvement in 75 percent of low-performing managers.
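Project Oxygen involved far more than simple scoring, but its basic analytical move—combining survey and performance data into a composite, splitting managers into quality groups, and pulling the extremes for interviews—can be sketched in a few lines. The file and column names are hypothetical, not Google’s actual fields.

```python
import pandas as pd

# Hypothetical manager-level table combining upward-feedback survey scores and
# performance ratings; column names are invented.
managers = pd.read_csv("managers.csv")
managers["composite"] = managers[["upward_feedback_score",
                                  "performance_score"]].mean(axis=1)

# Split managers into four quality groups.
managers["quartile"] = pd.qcut(managers["composite"], 4,
                               labels=["bottom", "third", "second", "top"])

# Pull the extremes as the pool for double-blind interviews about their practices.
interview_pool = managers[managers["quartile"].isin(["top", "bottom"])]
print(interview_pool.groupby("quartile", observed=True)["composite"].describe())
```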

Google’s talent value models address questions such as “Why do employees choose to stay with our company?” to calculate what employees value most; Google then applies those insights to boost retention rates, design personalized performance incentives, decide whether to match a competitor’s job offer, or determine when to promote someone. Google also uses employee performance data to determine the most appropriate ways to help both high- and low-performing employees become more successful. Bock told us, “We don’t use performance data to look at the averages but to monitor the highest and lowest performers on the distribution curve. The lowest 5 percent of performers we actively try to help. We know we’ve hired talented people, and we genuinely want them to succeed.” The company’s hypothesis was that many of these individuals might be misplaced or poorly managed, and a detailed analysis supported that idea. Understanding individuals’ needs and values allows the People Operations team to successfully address a number of difficult situations—and the team has the data to prove it. Bock noted, “It’s not the company-provided lunch that keeps people here. Googlers tell us that there are three reasons they stay: the mission, the quality of the people, and the chance to build the skill set of a better leader or entrepreneur. And all our analytics are built around these reasons.”36

Capital One uses analytics extensively in the hiring process, requiring potential employees at all levels to take a variety of tests that measure their analytical abilities. The company uses mathematical case interviewing, a variety of behavioral and attitudinal tests, and multiple rounds of interviewing to ensure that it gets the people it wants. The process applies at all levels—even to the senior vice presidents who head business functions. For example:

When Dennis Liberson flew into Washington to interview for the top human resources position at Capital One Financial Corp., he was told that his interviews with the 16 top executives would have to wait. First, he was whisked off to a hotel room to take an algebra exam and write a business plan. Liberson, who jokes nearly seven years later that he might have been “the only HR guy who could pass their math test,” got the job and is now one of the company’s executive vice presidents. He also got an early taste of Capital One’s obsession with testing.37

Liberson is no longer the head of HR, but the focus on testing remains. A candidate for a managerial role at Capital One, for example, is still asked to review a set of financial statements and charts for a publishing company and then answer questions like this one:

What was the ratio of sales revenue to distribution costs for science books in 2016? (Round to nearest whole number):

  1. 27 to 1

  2. 53 to 1

  3. 39 to 1

  4. 4 to 1

  5. Cannot say

Without some quantitative skills, even the most senior executive need not apply.

Walmart employs a lot of people—over 2.2 million associates worldwide. Saba Beyene heads Walmart’s global People Analytics team of seventy dedicated analysts. Gaining insight into the business implications of staff turnover was a big priority for the group: “Like all retail organizations, Walmart has a big turnover issue. We were trying to understand what we could do to make the organization understand that this is big. They were thinking if somebody moves I will get to hire somebody at a lower cost. What they did not understand is the cost of hiring, the cost of on-boarding, the cost of training. We were able to quantify all that and make it clear that when you lose an associate within 90 days or less they did not get their money back from hiring that associate.”38
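The specific figures behind Walmart’s analysis aren’t public, but the underlying arithmetic is straightforward: put a dollar value on hiring, onboarding, and training, and multiply by the number of associates who leave before those investments pay back. A back-of-the-envelope sketch with entirely invented numbers:

```python
# Back-of-the-envelope model of the turnover cost argument described above.
# Every figure here is an invented illustration.
HIRING_COST = 2_500      # sourcing, screening, and interviewing per hire
ONBOARDING_COST = 1_500  # orientation, paperwork, equipment
TRAINING_COST = 2_000    # trainer time plus lost productivity while ramping up

def cost_of_early_turnover(hires_per_year, early_attrition_rate):
    """Cost of associates who leave within their first ninety days, assuming the
    hiring, onboarding, and training investment is never recouped for them."""
    cost_per_hire = HIRING_COST + ONBOARDING_COST + TRAINING_COST
    early_leavers = hires_per_year * early_attrition_rate
    return early_leavers * cost_per_hire

# Example: 500,000 hires a year with 15 percent leaving inside ninety days (both hypothetical).
print(f"${cost_of_early_turnover(500_000, 0.15):,.0f} in unrecovered investment")
```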

In light of these findings, it is not surprising that Walmart took a closer look at compensation for new hires. The company is investing billions to improve its training processes, increase starting salaries, and give raises to associates who have been with the company for six months.

Overall, however, other than these few companies and professional sports teams, few organizations are truly competing on HR analytics. Perhaps this emphasis will come with time, but what seems to be lacking most is the management desire to compete on HR in the first place. Perhaps as people costs continue to rise and constitute higher percentages of organizations’ total costs, and as executives realize that their people are truly their most critical resource, analytics will catch on and proliferate in HR.

This chapter has considered a wide variety of internal applications of analytics. In each case, our objective was to illustrate not just that analytics are possible in a particular function but that they can be the basis for a different approach to competition and strategy. We hope these examples will drive senior executives to think about their own strategies and how they perform their internal activities. Chapter 5, which addresses the use of analytics in external (e.g., customer and supplier) relationships, offers even more possibilities for competition.
