CHAPTER SEVEN

MANAGING ANALYTICAL PEOPLE

CULTIVATING THE SCARCE INGREDIENT THAT MAKES ANALYTICS WORK

When most people visualize business analytics, they think of computers, software, and printouts or screens full of numbers. What they should be envisioning, however, are their fellow human beings. It is people who make analytics work and who are the scarce ingredient in analytical competition.

Analytical Urban Legends

This assertion is contrary to some analytical urban legends, so let us dispel those now. Years ago, we began hearing extravagant tales of software that would eliminate the need for human analysts. The most popular concerned a data mining episode involving diapers and beer. The gist of the story was that some grocery retailer had turned loose its powerful data mining software on a sales database, and it had come up with an interesting finding. Male customers who came in to buy beer for the weekend also tended to remember that their wives had asked them to buy diapers (some versions of the story switched around the primary shopping intent), so they put both products in their shopping carts. The retailer quickly moved diapers over by the beer (or vice versa), and sales exploded.

We chased this one down, and the most credible version of the story happened at Osco, a drugstore chain. Some of the company’s data analysts do dimly remember seeing a correlation between diaper and beer sales in their stores. But the analysts had told the software where and how to look for the relationship; it wasn’t just stumbled on by an enterprising young computer. Most importantly, the finding was deemed an anomaly, and diapers and beer were never put in proximity in the Osco stores (not all of which could even sell beer).
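The kind of relationship the Osco analysts examined is a classic association rule, usually summarized by two numbers: support (how often the two products appear in the same basket) and confidence (how often the second product appears given the first). The sketch below illustrates the calculation on a handful of invented transactions; the data, and the beer-and-diapers rule itself, are purely hypothetical here, not Osco's figures.

```python
# Hypothetical market-basket data, illustrating the support/confidence
# arithmetic behind a "diapers and beer" style association rule.

def rule_stats(transactions, antecedent, consequent):
    """Support and confidence for the rule: antecedent -> consequent."""
    n = len(transactions)
    both = sum(1 for t in transactions if antecedent in t and consequent in t)
    ante = sum(1 for t in transactions if antecedent in t)
    support = both / n                          # share of all baskets with both items
    confidence = both / ante if ante else 0.0   # P(consequent present | antecedent present)
    return support, confidence

baskets = [
    {"beer", "diapers", "chips"},
    {"beer", "diapers"},
    {"beer", "pretzels"},
    {"diapers", "wipes"},
    {"milk", "bread"},
]

support, confidence = rule_stats(baskets, "beer", "diapers")
print(f"support={support:.2f} confidence={confidence:.2f}")
# prints: support=0.40 confidence=0.67
```

Note that the software can compute these numbers for every pair of products, but it takes a human to decide which rules are worth acting on and which, as at Osco, are anomalies.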

The legend is worth discussing, however, for a few lessons it provides. While data mining software is a wonderful thing, a smart human still needs to interpret the patterns that are identified, decide which patterns merit validation or subsequent confirmation, and translate new insights into recommendations for action. Other smart humans need to actually take action. When we studied more than thirty firms with a strong analytical capability in 2000, we found that a heavy dose of human skills was present at each of the firms, and the analytical competitors we’ve studied over the years certainly have lots of smart analysts on board.1

The other key lesson of the diapers and beer legend is that analytics aren’t enough, even when orchestrated by a human analyst. In order for analytics to be of any use, a decision maker has to make a decision and take action—that is, actually move the diapers and beer together. Since decision makers may not have the time or ability to perform analyses themselves, such interpersonal attributes as trust and credibility rear their ugly heads. If the decision maker doesn’t trust the analyst or simply doesn’t pay attention to the results of the analysis, nothing will happen and the statistics might as well never have been computed.

We found another good example of this problem in our previous study of analytical capability. We talked to analysts at a large New York bank who were studying branch profitability. The analysts went through a painstaking study in the New York area—identifying and collecting activity-based costs, allocating overheads, and even projecting current cost and revenue trends for each branch into the near future. They came up with a neat, clear, ordered list of all branches and their current and future profitability, with an even neater red line drawn to separate the branches that should be left open from those that should be closed.

What happened? Nary a branch was shut down. The retail banking executive who had asked for the list was mostly just curious about the profitability issue, and he hardly knew the analysts. He knew that there were many political considerations involved in, say, closing the branch in Brooklyn near where the borough president had grown up, even if it was well below the red line. Analytically based actions usually require a close, trusting relationship between analyst and decision maker, and that was missing at the bank. Because of the missing relationship, the analysts didn’t ask the right questions, and the executive didn’t frame the question for them correctly.

There are really three groups, then, whose analytical skills and orientations are at issue within organizations. One is the senior management team—and particularly the CEO—which sets the tone for the organization’s analytical culture and makes the most important decisions. Then there are the professional analysts, who gather and analyze the data, interpret the results, and report them to decision makers. The third group is a diverse collection we will refer to as analytical amateurs, a very large group of “everybody else,” whose use of the outputs of analytical processes is critical to their job performance. These could range from frontline manufacturing workers, who have to make multiple small decisions on quality and speed, to middle managers, who also have to make middle-sized decisions with respect to their functions and units. Middle managers in the business areas designated as distinctive capabilities by their organizations are particularly important, because they oversee the application of analytics to these strategic processes. IT employees who put in place the software and hardware for analytics also need some familiarity with the topic. We’ll describe each of these groups in this chapter.

Before we go any further, however, it is important to point out that the role of humans in analytics is changing somewhat, and is likely to change more in the near future. One key development is that machine learning and other intelligent technologies are changing how analytical models are generated. A human analyst might be able to generate a few new models per week, but a machine learning system could easily generate tens of thousands of models per week. Thus far, however, machine learning models still need humans to kick them off, point them in the direction of the right data and the variables to be predicted, and ensure that the resulting models make sense. We don’t think that machine learning models have inhibited employment for quantitative analysts and data scientists yet, but they may in the future.
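The division of labor described above can be made concrete with a toy sketch: the "machine" mechanically enumerates a large family of candidate models, while the human's contribution is to point it at the data, name the variable to be predicted, and sanity-check the winner. Everything below (the churn data, the single-variable threshold models) is invented for illustration, not a description of any real system.

```python
# Toy sketch of machine-generated models: enumerate many candidate
# single-variable threshold classifiers and keep the most accurate one.
# The data and model family are hypothetical.
import itertools

# Each row: (tenure_months, monthly_spend); label 1 = customer churned.
rows = [(2, 90), (3, 80), (5, 75), (24, 40), (30, 35), (36, 20)]
labels = [1, 1, 1, 0, 0, 0]

def accuracy(feature_idx, threshold, flip):
    """Accuracy of the model 'predict 1 when feature < threshold' (or its flip)."""
    preds = [(row[feature_idx] < threshold) != flip for row in rows]
    return sum(int(p) == y for p, y in zip(preds, labels)) / len(rows)

# Machine-style search: every (feature, threshold, direction) combination
# is a distinct candidate model -- trivially scaled to thousands of candidates.
candidates = itertools.product(range(2), range(0, 100), [False, True])
best = max(candidates, key=lambda c: accuracy(*c))
print("best model:", best, "accuracy:", accuracy(*best))
```

Here the search finds a perfect rule on six rows, which is exactly why the human review step matters: with enough machine-generated candidates, some will fit the sample by chance, and someone has to ask whether the winning model makes business sense.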

The other technological factor that’s driving change for humans in the world of analytics is automation of decisions and actions. While an autonomous system is unlikely to close a set of bank branches without human intervention, complex decisions and digital tasks can both be performed by machines. They can approve insurance policies, authorize loans, reboot computer servers, and replace and mail a lost ATM card. It’s likely that the decisions taken over by analytics and computers will be tactical and repetitive ones, but these probably constitute the bulk of decisions in many organizations. We don’t think high-level managers will lose their jobs to autonomous systems, but it’s likely that some employees will. The good news here is that an automated decision system is unlikely to ignore an analytical result. Humans who ignore analytics in the future will do so at their own peril.

Senior Executives and Analytical Competition

As if CEOs, presidents, COOs, and other senior executives weren’t busy enough already, it is their responsibility to build the analytical orientation and capabilities of their organizations. If the CEO or a significant fraction of the senior executive team doesn’t understand or appreciate at least the outputs of quantitative analysis or the process of fact-based decision making, analysts are going to be relegated to the back office, and competition will be based on guesswork and gut feel, not analytics. Fact-based decision making doesn’t always involve analytics—sometimes “facts” may be very simple pieces of evidence, such as a single piece of data or a customer survey result—but the desire to make decisions on the basis of what’s really happening in the world is an important cultural attribute of analytical competitors.2

For an example, take Phil Knight, the founder and chairman emeritus of Nike. Knight has always been known as an inspirational but intuitive leader who closely guarded the mythical Nike brand. Perhaps needless to say, it didn’t take a lot of analytics to come up with the famous “swoosh.” At the beginning of 2005, however, Knight brought in William Perez, formerly head of S. C. Johnson & Son, as CEO of Nike. Perez, accustomed to the data-intensive world of Johnson’s Wax, Windex, and Ziploc bags, attempted to bring a more analytical style of leadership into Nike. He notes about himself, “I am a data man—I like to know what the facts are. If you come from the world of packaged goods, where data is always valuable, Nike is very different. Judgment is very important. Feel is very important. You can’t replace that with facts. But you can use data to guide you.”3

Perez attempted, for example, to move Nike into middle-tier retail environments, where his data suggested that footwear growth was fastest. In response to arguments from Knight and other Nike executives that such a move would weaken the brand, Perez pointed to companies such as Apple that sell successfully in Walmart without diluting their brands. But these and other clashes eventually led Knight and the board of directors to remove Perez little more than a year later.

The good news is that Perez’s departure only delayed the rise of analytics at Nike. Over the last several years, the company has become much more analytical. It uses data and analytics to influence shoe design, marketing programs, outlet store locations, logistics, and many other types of decisions. Nike has perhaps the world’s largest analytics group that’s focused on sustainability. The company may have gotten there faster if Perez had stayed, but it is definitely moving toward analytical competitor status.

The general lesson here, however, is that if a CEO can’t move the culture in a more analytical direction, a middle or junior manager would have even less of a chance of doing so. In fact, we’ve seen several companies in which fairly senior functional managers—corporate heads of marketing or technology, for example—were trying to bring a more analytical orientation to their firms and faced substantial obstacles. One, for example, a senior vice president of sales and marketing at a technology company, was known as a true “data hound,” bringing piles of statistical reports to meetings and having perfect command of his own (and other managers’) statistics. The sales and marketing organizations slowly began to become more data focused, but for years the overall company culture continued to emphasize chutzpah and confidence more than correlations. Again, that company eventually embraced analytics (it helped that the “data hound” kept getting promoted, eventually becoming president), but a more receptive culture would have allowed it to happen faster.

While there’s no doubt that almost any employee can move an organization in a more analytical direction, it takes top management commitment for a company to become an analytical competitor. In fact, we didn’t find a single stage 4 or 5 analytical competitor where either the CEO or a majority of the senior management team didn’t believe strongly in analytics as a primary competitive resource. Senior executive support is even important at stage 3, when organizations begin to aspire to analytical competition.

Characteristics of Analytical Leaders

What are the traits that senior executives and other analytical champions in an analytical competitor should have? A few key ones are described next.

They should be passionate believers in analytical and fact-based decision making

You can’t inspire others to change their behavior in a more analytical direction if you’re not passionate about the goal. A truly committed executive would demonstrate fact-based and analytical decision making in his or her own decisions and continually challenge the rest of the organization to do the same. For example, whenever Barry Beracha, previously CEO of the private-label baker Earthgrains (which was acquired by Sara Lee Bakery Group), needed to make a decision, he sought out the right data. He insisted that the entire organization needed better data, and led the company to implement a new ERP system to create it. After it was available, he pressed his employees to use it when deciding what products to keep and what customers to serve. He was so passionate about data-based decisions that his employees referred to him as a “data dog”—to his delight.

They should have some appreciation of analytical tools and methods

The senior executives of analytical competitors don’t necessarily have to be analytical experts (although it helps!). As Professor Xiao-Li Meng—formerly the chair of the statistics department at Harvard and now dean of the Graduate School of Arts and Sciences—points out, you don’t need to become a winemaker to become a wine connoisseur.4 Management users of data analytics do need to have an awareness of what kinds of tools make sense for particular business problems, and the limitations of those tools. Just as a politician analyzing polls should know something about confidence intervals, the CEO making a decision about a plant expansion should know something about the statistical and qualitative assumptions that went into predicting demand for the goods the plant will make.

They should be willing to act on the results of analyses

There is little point in commissioning detailed analytics if nothing different will be done based on the outcome. Many firms, for example, are able to segment their customers and determine which ones are most profitable or which are most likely to defect. However, they are reluctant to treat different customers differently—out of tradition or egalitarianism or whatever. With such compunctions, they will have a very difficult time becoming successful analytical competitors—yet it is surprising how often companies initiate analyses without ever acting on them. The “action” stage of any analytical effort is, of course, the only one that ultimately counts.

They should be willing to manage a meritocracy

With widespread use of analytics in a company, it usually becomes very apparent who is performing and who isn’t. Those who perform well should be rewarded commensurately with their performance; those who don’t perform shouldn’t be strung along for long periods. As with customers, when the differences in performance among employees and managers are visible but not acted on, nothing good results—and better employees may well become disheartened. Of course, the leaders of such meritocratic firms have to live and die by this same analytical sword. It would be quite demoralizing for a CEO to preach the analytical gospel for everyone else but then to make excuses for his or her own performance as an executive.

How Does Analytical Leadership Emerge?

Some organizations’ leaders had the desire to compete analytically from the beginning. Amazon was viewed by founder Jeff Bezos as competing on analytics from its start. Its concept of personalization was based on statistical algorithms and web transaction data, and it quickly moved into analytics on supply chain and marketing issues as well. Amazon used analytics to determine the timing and extent of its holiday advertising strategy, outspending Walmart in October and November 2016. Capital One, Netflix, and Google were also analytical from the beginning because their leaders wanted them to be. The visions of the founders of these startup businesses led to analytical competition.

In other cases, the demand for analytical competition came from a new senior executive arriving at an established company. Gary Loveman at Caesars, and Tom Ricketts, the new owner of the Chicago Cubs, brought with them an entirely new analytical strategy.

Sometimes the change comes from a new generation of managers in a family business. At the winemaker E. & J. Gallo, when Joe Gallo, the son of one of the firm’s founding brothers, became CEO, he focused much more than the previous generation of leaders on data and analysis—first in sales and later in other functions, including the assessment of customer taste. At the National Football League’s New England Patriots, the involvement in the team by Jonathan Kraft, a former management consultant and the son of owner Bob Kraft, helped move the team in a more analytical direction in terms of both on-field issues like play selection and team composition and off-field issues affecting the fan experience.

The prime mover for analytical demand doesn’t always have to be the CEO. At Procter & Gamble, for example, the primary impetus for more analytics at one point came from the firm’s two vice chairmen. One of them, Bob McDonald, became CEO and accelerated P&G’s analytical journey. And Jonathan Kraft is the president of the Patriots, not the CEO.

In addition to the general characteristics described earlier in the chapter (which are generally relevant for the CEO), there are specific roles that particular executives need to play in analytical competition. Three key roles are the chief financial officer (CFO), the chief information officer (CIO), and the chief data and analytics officer (CDAO).

Role of the CFO

The chief financial officer in most organizations will have responsibility for financial processes and information. Therefore, analytical efforts in these domains would also be the CFO’s concern. Since most analytical projects should involve some sort of financial information or returns, the CFO is at least a partial player in virtually all of them.

We have found several companies in which the CFO was leading the analytical charge. In order to play that role effectively, however, a CFO would have to focus on analytical domains in addition to finance and accounting. For example, at a large insurance company, the CFO had taken responsibility for analytics related to cost control and management but also monitored and championed analytical initiatives in the claims, actuarial, and marketing areas of the company. He also made it his responsibility to try to establish in the company’s employees the right overall balance of intuitive versus analytical thinking.

At Deloitte’s US business, the person leading the charge on analytics (at least for internal consumption) is Frank Friedman, the CFO. He has assembled a group of data scientists and quantitative analysts within the Finance organization. They are working with him to address several initiatives, including optimized pricing, predictive models of performance, identifying services that help sell other services, and factors that drive receivables. They have also worked to predict which candidates will be successful recruits to the firm.

Another CFO (technically a senior vice president of finance) at a retail company made analytics his primary focus, and they weren’t even closely connected to finance. The company had a strong focus on customers and customer orientation, and he played a very active role in developing measures, systems, and processes to advance that capability. The company already had good information and analytics on such drivers of its business as labor, space allocation, advertising, and product assortment. His goal was to add the customer relationship and customer segment information to those factors. Since his role also incorporated working with the external financial community (Wall Street analysts, for example), he was also working on making the company’s analytical story well known to the outside world. He also viewed his role as including advocacy of a strong analytical orientation in a culture where it wasn’t always emphasized. He noted, “I’m not the only advocate of analytics in the company—I have a number of allies. But I am trying to ensure that we tell our stories, both internally and externally, with numbers and analytics.”

At Bank of America, CFO (2005–2006) Al de Molina viewed himself as a major instigator of analytical activity. The bank had tried—and largely failed with—a big data warehouse in the early 1990s, so managers were generally wary of gathering together and integrating data. But in his previous job as head of the treasury function, de Molina felt that in order to accurately assess the bank’s risks, it needed to consolidate information about assets and rates across the bank. Since the bank was growing rapidly and was assimilating several acquisitions, integrating the information wasn’t easy, but de Molina pushed it anyway. The CFO also took responsibility for analytics around US macroeconomic performance. Since it has a wealth of data on the spending habits of American consumers, Bank of America could make predictions on the monthly fluctuations in macroeconomic indicators that drive capital markets. This had obvious beneficial implications for the bank’s risks. Both the interest rate risk and the macroeconomic analytics domains are obvious ones for a CFO’s focus. De Molina largely deferred to other executives where, for example, marketing analytics were concerned. De Molina left Bank of America and eventually was named CEO of GMAC, now Ally Financial.

Role of the CIO

The CEO or another top operating executive will have the primary responsibility for changing the culture and the analytical behaviors of employees. But CIOs play a crucial role in this regard too. They can work with their executive peers to decide what behaviors are necessary and how to elicit them.

At the telecommunications firm Verizon, one CIO made exactly this kind of change in analytical culture his objective. Verizon and other firms arising out of the “Bell System” have long been analytically oriented, but decisions were generally made slowly and were pushed up the organizational hierarchy. While CIO at Verizon from 2000 to 2010 (he later became CEO of Juniper Networks and Coriant, another telecom equipment firm), Shaygan Kheradpir attempted to change this culture through continual exposure to information. He created a form of continuous scorecard in which hundreds of performance metrics of various types were broadcast to PCs around the company, each occupying the screen for fifteen seconds. The idea was to get everyone—not just senior executives—focused on information and what it means, and to encourage employees at all levels to address any issues that appeared in the data. Kheradpir felt that the use of the scorecard changed Verizon’s culture in a positive direction.

Of course, the most traditional approach to analytics for CIOs is through technology. The CIO must craft an enterprise information strategy that serves the needs of everyone in the organization. This includes much more than running the enterprise transaction applications, management reporting, and external websites. A technology infrastructure must be capable of delivering the data, analytics, and tools needed by employees across the organization. Chapter 8 discusses analytical technology, and it should certainly be apparent from reading that chapter that both an architect and a leader are necessary. Those roles may not have to be played by the CIO personally, but the person(s) playing them would in all likelihood at least report to the CIO.

The CIO may also provide a home and a reporting relationship for specialized analytical experts. Such analysts make extensive use of IT and online data, and they are similar in temperament to other IT people. Some of the analytical competitors where analytical groups report to the office of the CIO include Procter & Gamble, the trucking company Schneider National, Inc., and Marriott. Procter & Gamble, for example, consolidated its analytical organizations for operations and supply chain, marketing, and other functions. This allowed a critical mass of analytical expertise to be deployed to address P&G’s most critical business issues by “embedded” analysts within functions and units. The group reported to the CIO and was part of an overall emphasis within the IT function on information and decision making (in fact, the IT function was renamed “information and decision solutions” at Procter & Gamble). Then-CIO Filippo Passerini worked closely with vice chairman, later CEO, Bob McDonald to architect a much more analytical approach to global decision making at the company. They developed a series of innovations, including “business sphere” rooms for data-driven decision making, and “decision cockpits” with real-time data for over fifty thousand employees.

CIOs wanting to play an even more valuable analytical role than simply overseeing the technology will focus on the I in their titles—the information. Analytical competition, of course, is all about information—do we have the right information, is it truly reflective of our performance, and how do we get people to make decisions based on information? These issues are more complex and multifaceted than buying and managing the right technology, but organizations wishing to compete on analytics will need to master them. Research from one important study suggests that companies focusing on their information orientations perform better than those that address technology alone.5 The authors of the study argue that information orientation consists of information behaviors and values, information management practices, and information technology practices—whereas many CIOs address only the latter category. While that study was not primarily focused on analytics, it’s a pretty safe bet that information orientation is highly correlated with analytical success.

Role of the CDAO (Chief Data and Analytics Officer)

As we mentioned in chapter 2, many analytical competitors have created a new role, the chief data and analytics officer (or sometimes only chief analytics officer or chief data officer, still with analytics responsibilities). The CDAO is responsible for ensuring that the enterprise has the data, organizational capabilities, and mindset needed to successfully compete on analytics. Gartner describes the role this way: “The CDO is a senior executive who bears responsibility for the firm’s enterprise wide data and information strategy, governance, control, policy development, and effective exploitation. The CDO’s role will combine accountability and responsibility for information protection and privacy, information governance, data quality and data life cycle management, along with the exploitation of data assets to create business value.”6

The CDAO serves as the champion and passionate advocate for the adoption of big data analytics in the organization. The analysts and data scientists in a firm may report directly to the CDAO, or they may have a matrixed reporting relationship. At a minimum, the CDAO keeps data scientists and other analysts productively focused on important business objectives, clears bureaucratic obstacles, and establishes effective partnerships with business customers. Many CDAOs tell us that they spend half their time “evangelizing” for analytics with the business community.

Depending on the organization and its strategic priorities, the CDAO may report variously to the CEO, COO, CIO, chief risk officer, or the chief marketing officer. Since the CDAO does not directly own business processes, he or she must work closely with the rest of the management team to embed data analytics into decision making and operations. And this executive must ensure that analysts’ insights are put into practice and produce measurable outcomes. If data management and analytics are combined into one CDAO role, it’s important for the incumbents to carve out time for both defense—data security, privacy, governance, etc.—and offense, which includes the use of analytics to create business value.7

What If Executive Commitment Is Lacking?

The enemies of an analytical orientation are decisions based solely on intuition and gut feel. Yet these have always been popular approaches to decision making because of their ease and speed and a belief that gut-feel decisions may be better. As we noted in chapter 6, the presence of a committed, passionate CEO or other top executive can put the organization on the fast track to analytical competition. But for those organizations without sufficient demand for data and analysis in executive decision making, the obvious question is whether such demand can be stimulated. If there is no senior executive with a strong analytical orientation, must the organization wait for such a manager to be appointed?

If you don’t have committed executives, it’s going to be difficult to do much as an outright analytical competitor, but you can lay the groundwork for a more analytical future. If you’re in a position to influence the IT infrastructure, you can make sure that your technical platforms, transaction systems, data, and business intelligence software are in good shape, which means that they reliably produce data and information that is timely and accurate. You can obtain and encourage the use of analytical software, programming languages, and data visualization tools. If you head a function or unit, you can make headway toward a smaller-scale analytical transformation of your part of the business. If you are really smart, influential, and politically astute, you might even plot an analytical coup and depose your non-analytical rulers. But needless to say, that’s a risky career strategy.

There are approaches that can be taken to stimulate demand for analytics among executives. These would generally be actions on the prove-it detour described in chapter 6. At one pharmaceutical firm where we interviewed several IT executives, there was generally little demand from senior executives for analytical decision making, particularly in marketing. IT managers didn’t have access to the decisions marketers were trying to make, and the marketing executives didn’t know what data or analysis might be available to support their decisions. However, two external events offered opportunities to build analytical demand. One marketing manager discovered a vendor who showed how sales data could be displayed graphically in terms of geography on an interactive map. The company’s IT executives felt that the display technique was relatively simple, and they began to offer similar capabilities to the manager to try to build on his interest and nurture the demand for marketing analytics.

A second opportunity was offered by an external study from a consulting firm. One outcome of the study would be a new set of performance indicators. The IT group planned to seize on the indicators and offer more analysis and related data to the management team. These IT managers refused to wait until more analytically oriented senior executives happened to arrive at the company.

Analytical Professionals and Data Scientists

There is an old joke about analytical professionals. It goes this way:

Question: What did the math PhD say to the MBA graduate?

Answer: Would you like fries with that?

That joke is now totally obsolete, as demand for analytical talent has skyrocketed. Data science, math, and other analytical professionals are being avidly recruited to play key roles in helping companies compete on analytics.

In addition to committed executives, most of the analytical competitors we studied had a group of smart and hardworking analytical professionals within their ranks. It is the job of these professionals to design and carry out experiments and tests, to define and refine analytical algorithms, and to perform data mining and statistical analyses on key data. Analytical pros create the predictive and prescriptive analytics applications used in the organization. In most cases, such individuals would have advanced degrees—often PhDs—in such analytical fields as statistics, data science, econometrics, mathematics, operations research, logistics, physics, and marketing research. As they become more widely available, they are being joined by a new generation of analysts with master’s degrees in applied analytics, informatics, and data science. In some cases, where the company’s distinctive capabilities involve a specialized field (such as geology for an oil exploration firm), the advanced degree will be in that specialty.

One great example of this type of person we found in our research is Katrina Lane. We first encountered her as vice president of channel marketing at Caesars Entertainment. There, Lane had the job of figuring out which marketing initiatives to move through which channels, including direct mail, email, call centers, and so forth. This is a complex field of business that hasn’t been taught in most business schools, so Lane had to figure a lot out on her own. Fortunately, her skills were up to the task. To start with, she has a PhD in experimental physics from Cornell. She was head of marketing for a business unit of the May Department Stores Company and a consultant with McKinsey & Company’s marketing and sales practice. How common is such a combination of skills and experience? Not very, which is why assembling a capable group of analytical professionals is never easy. The rarity of her skills also explains why Lane was promoted to chief technology officer at Caesars, and then recruited to be an executive vice president and general manager of marketing and operations for Consumer Cards at American Express. Now she is the VP of Global Delivery Experience for Amazon. She clearly excels in highly analytical management roles in highly analytical companies.

With her PhD in experimental physics, Lane would today be hired as a “data scientist,” a role that involves bringing structure to unstructured data, creating sophisticated models to analyze it, and interpreting the results for their implications for key decisions and business directions. Data scientists, whom Tom and his coauthor D. J. Patil (until recently, the chief data scientist in the White House) described as holding “the sexiest job of the 21st century,” are in hot demand.8 Some starting salaries for data scientists exceed $200,000. Their earliest employers were Silicon Valley startups, but now they’re being hired by large traditional firms as well. Procter & Gamble, for example, went from one data scientist in 2013 to over thirty in 2017. GE hired a couple hundred of them for its GE Digital operation in the San Francisco Bay area. There simply aren’t enough to go around.

Even Google, which is one of the most desired employers on the planet right now, has challenges getting this sort of talent. It offers generous salaries, stock options, and what is reputed to be the best cafeteria food anywhere. Yet UC Berkeley professor Hal Varian, who has worked with Google as a consultant since 2002, notes the difficulty of hiring analytical professionals and data scientists there: “One point that I think needs more emphasis is the difficulty of hiring in this area. Given the emphasis on data, data warehousing, data mining and the like you would think that this would be a popular career area for statisticians. Not so! The bright ones all want to go into biotech, alas. So it is quite hard to pull the talent together, even for Google.”9

Assuming you can find them, how many of these people are necessary? Of course, the answer depends on what an organization is attempting to do with analytics. In the companies we studied, the numbers range from about a dozen analytical professionals and data scientists to several hundred. GE had a goal of hiring four hundred data scientists for its software and analytics business based in the San Francisco area. We’re not sure of the exact number the company hired, but there were at least two hundred in this central organization, and now other business units are hiring their own. Procter & Gamble has two hundred or so. Google has about five hundred people with the “quantitative analyst” title, and thousands who do some sort of analytical work.

How are they organized? Most companies have centralized them to some degree, although organizational structures fluctuate over time. Procter & Gamble, for example, took analytical groups that had been dispersed around the organization, and combined them to form a new global analytics group as part of the IT organization. Then it decentralized them a bit while retaining a dotted line to the chief data officer. AIG created a centralized sixty-person science office for advanced analytics, but then decentralized it for greater responsiveness to the business.

Another logical alternative as an organizational home for these high-powered analysts would be the business function that is the primary competitive thrust for the organization. For example, Caesars keeps most of its “rocket scientists” (including Katrina Lane at the time) in the marketing department, because customer loyalty programs and improved customer service are the primary orientation of its analytics.

One argument in favor of central coordination is that at the most advanced stages of analytics, extensive knowledge of specialized statistical methods is required. As a result of what we learned about the companies we studied, we believe it is impractical for these advanced skills to be broadly distributed throughout the organization. Most organizations will need to have groups that can perform more sophisticated analyses and set up detailed experiments, and we found them in most of the companies we interviewed. It’s unlikely, for example, that a “single echelon, uncapacitated, nonstationary inventory management algorithm,” employed by one analytical competitor we studied in the supply chain area, would be developed by an amateur analyst. You don’t learn about that sort of thing in a typical MBA program.

Most analytical competitors we have studied are evolving toward a hub-and-spoke organization reporting to the CDAO. Analysts are allocated among business units, corporate, and functions so they can specialize and work closely with decision makers. The centralized hub is responsible for knowledge sharing, spreading best practices, analytics training, career path development, and common standards and tools. Often, the most highly skilled analysts and data scientists are centralized in the hub so they can be strategically deployed to work on the enterprise’s most pressing projects.

Regardless of where the professional analysts are located in their firms, many of the analytical competitors we interviewed stressed the importance of a close and trusting relationship between these analysts and the decision makers. As the head of one group put it, “We’re not selling analytics, we’re selling trust.” The need is for analytical experts who also understand the business in general and the particular business need of a specific decision maker. One company referred to such individuals as “front room statisticians,” distinguishing them from “backroom statisticians” who have analytical skills but who are not terribly business oriented and may also not have a high degree of interpersonal skills.

In order to facilitate this relationship, a consumer products firm with an IT-based analytical group hires what it calls “PhDs with personality”—individuals with heavy quantitative skills but also the ability to speak the language of the business and market their work to internal (and, in some cases, external) customers. One typical set of job requirements (listed on Monster.com for a data scientist in Amazon’s Advertising Platform group) reads:

  • PhD in CS [computer science] machine learning, operational research, statistics or in a highly quantitative field
  • 8+ years of hands-on experience in predictive modeling and analysis
  • Strong grasp of machine learning, data mining and data analytics techniques
  • Strong problem solving ability
  • Comfortable using Java or C++/C. Experience in using Perl or Python (or similar scripting language) for data processing and analytics
  • Experience in using R, Weka, SAS, Matlab, or any other statistical software
  • Communication and data presentation skill

Some of these skills overlap with those of traditional quantitative analysts, but some don’t. Data scientists tend to have more computer science–oriented backgrounds, whereas analysts tend to have a more statistical focus. Data scientists are also more likely to be familiar with open-source software and machine learning, and perhaps more likely to have a PhD in a scientific discipline. There are also differences in culture and attitude to some degree. A table of typical differences between these two groups is provided in table 7-1, which is based on a survey Jeanne led in 2014. Over time (and with the popularity of the data scientist title), however, these categories have become somewhat intermingled, and the differences may be diminishing.

TABLE 7-1

Analysts and data scientists: not quite a different species

Data structure
  Analysts: Structured and semistructured, mostly numeric
  Data scientists: All, predominantly unstructured

Data types
  Analysts: Mostly numeric
  Data scientists: All, including images, sound, and text

Preferred tools
  Analysts: Statistical and modeling tools, on data usually residing in a repository such as a data warehouse
  Data scientists: Mathematical languages (such as R and Python), machine learning, natural language processing, and open-source tools; data on multiple servers (such as Hadoop)

Nature of assignment
  Analysts: Report, predict, prescribe, optimize
  Data scientists: Explore, discover, investigate, visualize

Educational background
  Analysts: Operations research, statistics, applied analytics
  Data scientists: Computer science, data science, symbolic systems, cognitive science

Mindset (percentage of each group)
  Entrepreneurial: Analysts 69%, data scientists 96%
  Explore new areas: Analysts 58%, data scientists 85%
  Insights outside of projects: Analysts 54%, data scientists 89%
Source: Jeanne G. Harris and Vijay Mehrotra, “Getting Value from Your Data Scientists,” MIT Sloan Management Review (Fall 2014).

Whatever the role is called, business relationships are a critical component of it. At Wells Fargo, a manager of a customer analytics group described the relationships his group tries to maintain: “We are trying to build our people as part of the business team; we want them sitting at the business table, participating in a discussion of what the key issues are, determining what information the business people need to have, and recommending actions to the business partners. We want this [analytical group] to be more than a general utility, but rather an active and critical part of the business unit’s success.”10

Other executives who manage or have managed analytics groups described some of the critical success factors for them in our interviews:

  • Building a sustainable pipeline. Analytical groups need a pipeline of projects, client relationships, and analytical technologies. The key is not just to have a successful project or two but to create sustainability for the organization over time. It’s no good to invest in all these capabilities and have the analytical initiative be in existence for only a few years. However, it takes time to build success stories and to have analytics become part of the mythology of the organization. The stories instill a mindset in the company so that decision makers have the confidence to act.
  • Relationship to IT. Even if an analytical group isn’t officially part of an IT organization, it needs to maintain a close relationship with it. Analytical competitors often need to “push the frontier” in terms of IT. One analytical group’s manager in a consumer products firm said that his group had been early users of supercomputers and multiuser servers and had hosted the first website for a product (which got thousands of hits on the first day). Exploration of new information technologies wasn’t an official mission for the group but one that the company appreciated. It also helped capture the attention and imagination of the company’s senior management team.
  • Governance and funding. How the group is directed and funded is critical, according to the executives we interviewed. The key is to direct analytical groups at the most important problems of the business. This can be done through mandate or advice from some form of steering committee, or through the funding process. When steering committees are made up of lower-level executives, the tendency seems to be to suboptimize the use of analytical resources; more senior and strategic members set more strategic priorities for analytical professionals. Funding arrangements can likewise push a group toward strategic or tactical targets. One group we interviewed was entirely paid for by corporate overhead, which meant that it didn’t have to go “tin cupping” for funding and work on unimportant problems that somehow had a budget. Another group did have to seek funding for each project, and said it occasionally had to ask senior management for permission to turn down lucrative but less important work.
  • Managing politics. There are often tricky political issues involved in analytical professionals’ work, ranging from whose projects the group works on to what the function is called. One company called its analytical group Decision Analysis, only to find that executives objected because they felt it was their job to make decisions. The group then changed its name to Options Analysis. It can also be politically difficult simply to employ analysts. As one head of an analytical group put it: “So I go to Market Research and say, ‘I have a better way to evaluate advertising expenditures.’ They don’t necessarily react with joy. It’s very threatening to them. It makes it particularly difficult if you are asking them to pay you to make them look bad!”11

    Heads of analytical groups have to be sensitive to political issues and try to avoid political minefields. The problem is that people who are good at analytics do not often have patience for corporate politics! Before any analysis, they should establish with the internal client that both client and analyst have no stake in any particular outcome, and they will let the analytical chips fall where they may. CEOs can help their analytical professionals by making it clear that the culture rewards those who make evidence-based decisions, even when they go against previously established policies.

  • Don’t get ahead of users. It’s important for analytical professionals to keep in mind that their algorithms and processes must often be implemented by information workers, who may be analytically astute but are not expert statisticians. If the analytics and the resulting conclusions are too complex or laced with arcane statistical terminology, they will likely be ignored. One approach is to keep the analytics as simple as possible or to embed them into systems that hide their complexity. Another is to train users of analytical approaches as much as possible. Schneider National’s analytics group has offered courses such as “Introduction to Data Analysis” and “Statistical Process Control” for users in various functions at the company. It’s not a formal responsibility for the group, but the courses are popular, and the group feels it makes their job easier in the long run.

Offshore or Outsourced Analytical Professionals

With expert analytical professionals in short supply within US and European companies, many companies are considering the possibility of outsourcing them or even going to India or China to find them. It’s certainly true that an increasing number of firms offer “knowledge process outsourcing” in analytical fields including data mining, algorithm development, and quantitative finance. In India, firms such as Mu Sigma, Evalueserve, and Genpact have substantial practices in these domains. Genpact did work in analytical credit analysis when it was a part of GE Capital, and now also offers services in marketing and sales analytics. Most major consulting firms, including Accenture, Deloitte, and IBM, have large analytics groups based in India.

However, it is difficult for analysts to develop a trusting relationship with decision makers from several thousand miles away. It is likely that the only successful business models for this type of work will combine onshore and offshore capabilities. Onshore analysts can work closely with decision makers, while offshore specialists can do back-office analytical work. If a particular analytical application can be clearly described by the business owner or sponsor before it is developed, there is a good chance that development of the relevant algorithms could be successfully outsourced or taken offshore.

Analytical Amateurs

Much of the daily work of an analytically focused strategy has to be implemented by people without PhDs in statistics or operations research. A key issue, then, is how much analytical sophistication frontline workers need in order to do their jobs. Of course, the nature and extent of needed skills will vary by the company and industry situations. Some firms, such as Capital One, hire a large number of amateur analysts—people with some analytical background (perhaps MBAs), but mostly not PhD types. At one point, when we looked at open jobs on Capital One’s website, there were three times as many analyst openings as there were jobs in operations—hardly the usual ratio for a bank. Depending on its particular analytical orientation, a company simply needs to determine how many analytical amateurs it needs in what positions. Some may verge on being professionals (we call them analytical semi-professionals); others may have very limited analytical skills but still have to work in business processes that are heavily based on analytics.

The Boston Red Sox’s situation in 2003, which we described in chapter 1, is an example of needing to spread analytical orientations throughout the organization. For more business-oriented examples, we’ll describe two organizations that are attempting to compete on supply chain analytics. One, a beer manufacturer, put in new supply chain optimization software to ensure that it manufactured and shipped the right amount of beer at the right time. It even created a new position, “beer flow coordinator” (if only we had such a title on our business cards!), to use the system and oversee the optimization algorithms and process. Yet the company’s managers admitted that the beer flow coordinators didn’t have the skills to make the process work. No new people were hired, and no substantial training was done. The new system, at least in its early days, was not being used. The company was expecting, one might say, champagne skills on a beer budget.

At a polymer chemicals company, many of the company’s products had become commoditized. Executives believed that it was important to optimize the global supply chain to squeeze maximum value and cost savings out of it. The complexity of the unit’s supply chain had significantly increased over the previous couple of years. Responding to the increased complexity, the organization created a global supply chain organization, members of which were responsible for the movement of products and supplies around the world. In the new organization, someone was responsible for the global supply chain, there were planning groups in the regions, and then planners in the different sites. The greatest challenge in the supply chain, however, involved the people who did the work. The new roles were more complex and required a higher degree of analytical sophistication. The company knew that the people performing the previous supply chain roles didn’t have the skills to perform the new analytical jobs, but it kept them anyway. At some point, the company plans to develop an inventory of skills needed and an approach to developing or hiring for them, but thus far the lack of skills remains a bottleneck in the implementation of its new logistical process.

When a company is an analytical competitor, it will need to ensure that a wide variety of employees have some exposure to analytics. Managers and business analysts are increasingly being called on to conduct data-driven experiments, interpret data, and create innovative data-based products and services. Many companies have concluded that their employees require additional skills to thrive in a more analytical environment. An Avanade survey found that more than 63 percent of respondents said their employees need to develop new skills to translate big data analytics into insights and business value.12 Anders Reinhardt, formerly head of Global Business Intelligence for the VELUX Group—an international manufacturer of skylights, solar panels, and other roof products based in Denmark—is convinced that “the standard way of training, where we simply explain to business users how to access data and reports, is not enough anymore. Big data is much more demanding on the user.”13

To succeed as an analytical competitor, information workers and decision makers need to become adept at three core skills:14

  • Experimental: Managers and business analysts must be able to apply the principles of scientific experimentation to their business. They must know how to construct intelligent hypotheses. They also need to understand the principles of experimental testing and design, including population selection and sampling, in order to evaluate the validity of data analyses. As randomized testing and experimentation become more commonplace in the financial services, retail, and telecommunications industries, a background in scientific experimental design will be particularly valued. Google’s recruiters know that experimentation and testing are integral parts of their culture and business processes. So job applicants are asked questions such as “How many tennis balls would fit in a school bus?” or “How many sewer covers are there in Brooklyn?” The point isn’t to find the right answer but to test the applicant’s skills in experimental design, logic, and quantitative analysis.
  • Numerate: Analytical leaders tell us that an increasingly critical skill for their workforce is to become more adept in the interpretation and use of numeric data. VELUX’s Reinhardt explains that “Business users don’t need to be statisticians, but they need to understand the proper usage of statistical methods. We want our business users to understand how to interpret data, metrics, and the results of statistical models.” Some companies, out of necessity, make sure that their employees are already highly adept at mathematical reasoning when they are hired. Capital One’s hiring practices are geared toward hiring highly analytical and numerate employees into every aspect of the business. Prospective employees, including senior executives, go through a rigorous interview process, including tests of their mathematical reasoning, logic, and problem-solving abilities.
  • Data literate: Managers increasingly need to be adept at finding, manipulating, managing, and interpreting data, including not just numbers but also text and images. Data literacy is rapidly becoming an integral aspect of every business function and activity. Procter & Gamble’s former chairman and CEO Bob McDonald is convinced that “data modeling, simulation, and other digital tools are reshaping how we innovate.” And that changed the skills needed by his employees. To meet this challenge, P&G created “a baseline digital-skills inventory that’s tailored to every level of advancement in the organization.” The current CEO, David Taylor, also supports and has continued this policy. At VELUX, data literacy training for business users is a priority. Managers need to understand what data is available, and to use data visualization techniques to process and interpret it. “Perhaps most importantly, we need to help them to imagine how new types of data can lead to new insights,” notes Reinhardt.15
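The experimental skill described above can be made concrete with a small sketch: a two-proportion z-test of the kind a manager might use to judge whether a marketing test beat its control. The campaign figures below are invented for illustration, and a real analysis would also consider test power and sample-size planning up front.

```python
from math import erf, sqrt

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Compare response rates from a control group (a) and a test group (b)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / std_err
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical campaign: 200 of 10,000 control customers responded,
# versus 240 of 10,000 customers who received the new offer.
z, p = two_proportion_z_test(200, 10_000, 240, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p near 0.05: suggestive, not conclusive
```

An experimentally literate manager reads this output with appropriate caution: a p-value hovering near the conventional 0.05 threshold is a reason to keep testing, not a mandate to roll out the new offer.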

Depending on the business function, additional expertise may be needed. Most IT people, for example, should have some sense of what analyses are being performed on data, so that they can ensure that IT applications and databases create and manage data in the right formats for analysis. HR people need to understand something about analytics so that they can hire people with the right kinds of analytical skills. Even the corporate legal staff may need to understand the implications of a firm’s approach to analytical and automated decision making in case something goes awry in the process.

Firms that have upgraded the analytical skills of employees and managers are starting to see benefits. For example, at a consumer products firm with an analytical strategy, they’re seeing a sea change in middle managers. Upper middle management has analytical expertise, either from mathematical backgrounds or from company experience. Two of the central analytical group’s key clients have new managers who are more analytical. They were sought out for their analytical orientations and have been very supportive of analytical competition. The analytical managers are more challenging and drive the professional analyst group to higher levels of performance. The senior management team now has analytical discussions, not political ones.

Tools for Amateurs

One of the issues for amateur analysts is what IT tools they use to deal with analytics. There are three possible choices, and none seems ideal. One choice is to give them powerful statistical analysis tools so that they can mine data and create powerful algorithms (which they are unlikely to have the skills to do). A second choice is to have the prescriptive models simply spit out the right answer: the price that should be charged, the amount of inventory to be shipped, and so on. While we think this may be the best of the three options, it may sometimes limit the person’s ability to use data and make decisions. The third option, which is by far the most common, is to have amateurs do their analytics on spreadsheets.

Spreadsheets (by which we really mean Microsoft Excel, of course) are still the predominant tool by which amateurs manipulate data and perform analytics. Spreadsheets have some strengths, or they wouldn’t be so common. They are easy to use (at least the basic capabilities); the row-and-column format is widely understood; and they are inexpensive (since Excel comes bundled with widely used office productivity software). Yet as we point out in chapter 2, spreadsheets are a problematic tool for widespread analytical activity. It’s very difficult to maintain a consistent, “one version of the truth” analytical environment across an enterprise with a large number of user-managed spreadsheets. And spreadsheets often have errors. Any firm that embraces spreadsheets as the primary tool for analytical amateurs must have a strong approach to data architecture and strong controls over analytics.

An intermediate approach would be to give amateur analysts the ability to view and analyze data, while still providing a structure for the analytical workflow. Vendors of business intelligence and data visualization software make such a workflow available. They allow the more analytically sophisticated users to do their own visual queries or create visual reports, while letting less sophisticated users observe and understand some of the analytical processes being followed. These tools are becoming increasingly popular and are leading to the democratization of analytics. At some point, we may even see the emergence of the “citizen data scientist,” for whom most of the difficult data management and analysis tasks are done by intelligent machines.

Autonomous Decision Making

Another critical factor involving analytical amateurs that must be addressed is how highly automated a solution for a given problem should be.16 As it becomes possible to automate more and more decisions, it is increasingly important for organizations to address which decisions have to be made by people and which can be computerized. Automated decision applications typically run with minimal human intervention: they sense online data or conditions, apply analytical algorithms or codified knowledge (usually in the form of rules), and make decisions.

Fully automated applications are configured to translate high-volume or routine decisions into action quickly, accurately, and efficiently because they are embedded into the normal flow of work. Among analytical competitors, we found automated decision technologies being used for a variety of operational decisions, including extension of credit, pricing, yield management, and insurance underwriting. If experts can readily codify the decision rules and if high-quality data is available, the conditions are ripe for automating the decision.17 Bank credit decisions are a good example; they are repetitive, are susceptible to uniform criteria, and can be made by drawing on the vast supply of consumer credit data that is available.
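As a sketch of what “codifying the decision rules” can look like in practice, here is a toy rules function for a consumer credit decision. The thresholds and field names are our own invention, not any real bank’s policy; production systems would combine such rules with scores from statistical models and far richer data.

```python
def credit_decision(credit_score, annual_income, requested_amount):
    """Return an automated decision, escalating ambiguous cases to a person.

    Illustrative thresholds only -- not an actual underwriting policy.
    """
    if credit_score < 580:
        return "decline"
    if credit_score >= 720 and requested_amount <= 0.3 * annual_income:
        return "approve"
    # Neither clearly good nor clearly bad: route to a human underwriter,
    # preserving a role for people in edge cases.
    return "refer"

print(credit_decision(760, 90_000, 20_000))  # approve
print(credit_decision(540, 90_000, 20_000))  # decline
print(credit_decision(650, 50_000, 30_000))  # refer
```

Note that even this toy example reserves a “refer” outcome: the point of automation is to handle the high-volume, clear-cut cases, not to eliminate human judgment from the ambiguous ones.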

Still, some types of decisions, while infrequently made, lend themselves well to automation—particularly cases where decision speed is crucial. For example, in the electrical energy grid, quick and accurate shutoff decisions at the regional level are essential to avert a systemwide failure. The value of this rapid response capability has often been demonstrated in large power outages, when automated systems in regions of the United States have been able to respond quickly to power surges to their networks by shutting off or redirecting power to neighboring lines with spare capacity. It is also evident in some of today’s most advanced emergency response systems, which can automatically decide how to coordinate ambulances and emergency rooms across a city in the event of a major disaster.

Autonomous decision-making applications have some limitations, however. Even when fully automating a decision process is possible, fiduciary, legal, or ethical issues may still require a responsible person to play an active role. Also, automated decisions create some challenges for the organization. Because automated decision systems can lead to the reduction of large staffs of information workers to just a handful of experts, management must focus on keeping the right people—those with the highest possible skills and effectiveness. This expert-only approach, however, raises the question of where tomorrow’s experts will come from.

The Override Issue

A related issue is how amateurs should deal with automated decisions with which they don’t agree. Some firms, such as Caesars, discourage employees from overriding their automated analytical systems, because they have evidence that the systems get better results than people do. A hotel manager, for example, is not allowed to override the company’s revenue management system, which figures out the ideal price for a room based on availability trends and the loyalty level of the customer.

Marriott, as we’ve described in chapter 3, has similar revenue management systems for its hotels. Yet the company actually encourages its regional “revenue leaders” to override the system. It has devised ways for regional managers to introduce fast-breaking, anomalous information when local events unexpectedly affect normal operating data—such as when Houston was inundated with Hurricane Katrina evacuees. The revenue management system noticed that an unexpected number of people wanted Marriott rooms in Houston in August, and on its own it would have raised rates. But Marriott hardly wanted to discourage evacuees from staying in its Houston-area hotels, so revenue leaders overrode the system and lowered rates. Marriott executives say that such an approach to overriding automated systems is part of a general corporate philosophy. Otherwise, they argue, they wouldn’t have bothered to hire and train analytically capable people who make good decisions.

Why these two different philosophies? There are different systems involved, different business processes, and different levels of skill. Companies with a high skill level among analytical amateurs may want to encourage overrides when people think they know more than the system. Partners HealthCare’s physicians, who are often also professors at Harvard Medical School, are encouraged to override automated decision systems when doing so is in the best interest of the patient. With such highly trained experts involved in the process, the best result is probably from the combination of humans and automated decision rules.

Companies that feel they have most of the variables covered in their automated analytical models—and that have lower levels of analytical skills at the front line—may prefer to take a hard line on overrides. To some degree, the question can be decided empirically—if overrides usually result in better decisions, they should be encouraged. If not, they should be prohibited most of the time. If a company does decide to allow overrides, it should develop some systematic means of capturing the reasons for them so that the automated model might be improved through the input. At Partners, for example, physicians are asked to give a reason when they override the automated system, and physicians who constantly override a particular system recommendation are interviewed about their reasoning.
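The practice of systematically capturing override reasons can be sketched as a simple log. The class and field names below are our own invention, not Partners’ actual system; the idea is merely to record every override with its stated reason and to flag decision makers who repeatedly reject the same recommendation for follow-up conversation.

```python
from collections import Counter

class OverrideLog:
    """Record overrides of automated recommendations, with reasons."""

    def __init__(self, review_threshold=3):
        self.entries = []        # full audit trail of overrides
        self.counts = Counter()  # (user, recommendation) -> override count
        self.review_threshold = review_threshold

    def record(self, user, recommendation, action_taken, reason):
        if action_taken == recommendation:
            return False         # not an override; nothing to log
        self.entries.append({"user": user, "recommended": recommendation,
                             "taken": action_taken, "reason": reason})
        self.counts[(user, recommendation)] += 1
        return True

    def needs_review(self):
        """Users who repeatedly override the same recommendation."""
        return [pair for pair, n in self.counts.items()
                if n >= self.review_threshold]

log = OverrideLog(review_threshold=2)
log.record("dr_smith", "prescribe drug A", "prescribe drug A", "")        # followed advice
log.record("dr_smith", "prescribe drug A", "prescribe drug B", "allergy")
log.record("dr_smith", "prescribe drug A", "prescribe drug B", "allergy")
print(log.needs_review())  # [('dr_smith', 'prescribe drug A')]
```

The audit trail serves two purposes at once: it feeds reasons back to the modelers so the automated recommendation can improve, and it identifies the chronic overriders worth interviewing.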

Whatever the decision on people versus automation, the key message of this chapter is that the human resource is perhaps the most important capability an analytical competitor can cultivate. When we asked analytical competitors what’s hard about executing their strategies, most said it was getting the right kinds of analytical people in sufficient numbers. Hardware and software alone can’t come close to creating the kinds of capabilities that analytical strategies require. Whether we’re talking about senior executives, analytical professionals and data scientists, or frontline analytical amateurs, everyone has a job to do in making analytical competition successful.
