CHAPTER 3

The Impact of Marketing Analytics

Show Me the Money

The first two chapters hopefully made a convincing argument for marketing analytics. It’s not enough, however, to collect all the right data to input into a legitimate marketing analytics process. The key to success with marketing analytics is in the application. If the results and insights from the process are not used, then a marketing analytics initiative is of little value. To benefit the whole organization, the analytics must influence the thinking, the decisions, and ultimately the way marketing works.

To look at marketing analytics as just a performance management tool misses the point of the process. It is true that a proper implementation of analytics can help marketing work better: more efficiently and more effectively. Although the analytics process should provide the data that helps marketing do more with less, ultimately analytics are about making money. The highest and best use of the analytics process is to help marketing operate as a revenue center. When the CEO says, in so many words, “show me the money,” marketing is able to do so when it has the right analytics data and uses it wisely.

Note  Analytics, though useful for improving marketing’s efficiency and effectiveness, are ultimately about helping the marketing function operate as a revenue center.

For marketing organizations that have an analytics process, there are certainly some extremes to avoid. The first extreme is to have an analytics process but then do nothing with the output. The second is to have an analytics process and expect its output to serve as the primary source of leadership for the marketing team—this is the job of the CMO or marketing VP. Analytics are not a substitute or replacement for leadership. The place to situate a marketing analytics process is between these extremes. This is a place where the analytics are real, they have influence, and they serve as a reference point to help the CMO validate decisions and refine strategies.

Chapter 2 presented the state of marketing analytics, along with prevailing attitudes toward them and the challenges they present. How are organizations doing in the real world when it comes to finding this analytics “happy place” between the extremes just described? Figure 3-1 shares study results1 that reveal what marketing organizations are doing with the analytics data they measure and track.


Figure 3-1. Marketing analytics data use

Figure 3-1 reveals that almost one-fourth of the marketing organizations that have analytics data occupy the first of the two extremes: they make no apparent use of it. For these organizations, analytics is nothing more than busy work. Within the data lies the power to improve performance or at least prove marketing’s contribution, but it goes unused. This lack of use indicates that marketing either feels it is already performing well enough and doesn’t need the boost it would get from using the data, or it doesn’t care.

What’s best is for marketing and the CMO to embrace analytics as a continuous improvement strategy. In the absence of any pressure from above, the ideal marketing team seeks to better itself and uses analytics as a tool to manage how it works. Absent this zeal for improvement, company executives can also mandate the establishment of metrics and the collection of data around those metrics for the purposes of accountability. Either way, for the 24 percent who do not use the analytics data at their disposal, no such accountability exists to encourage use of that resource.

For the remaining 76 percent of the sample represented in Figure 3-1, distributed across four response options, some level of analytics data use is indicated. None of these responses is negative, although some represent more advanced levels of use than others. Of these four response options, two of them—CMO and marketing team reviews of the data—are internal uses of analytics process output.

Using and reporting analytics data just within the marketing department is not a bad thing. Valid analytics data has the same power to influence actions even if that data never reaches outside the marketing department. In Figure 3-1, almost half of marketing organizations keep their analytics data within the family. Most common is that members of the marketing team only (38 percent) have access to and use the data. Less common is for the CMO only (10 percent) to see and use the data.

From a macro perspective, one of the primary uses of analytics data is to inform and validate decision making. Even when kept inside the department, the marketing analytics data loses none of its power to do this. However, another primary use of analytics data is as an accountability mechanism. Reporting analytics data outside the marketing department, usually to executives and other stakeholders, establishes a healthy level of accountability. When the data stays within the marketing department, it loses most of its leverage as an accountability mechanism. Lower levels of analytics maturity are commonly associated with keeping the data within the department.

Note  Analytics data is robbed of much of its power as an accountability mechanism when it stays within the marketing department.

The highest level of marketing analytics maturity is associated with organizations that use the analytics data to report performance externally. Figure 3-1 shows two categories of response that represent external reporting of analytics data: to the executive team and to anyone in the organization. Making up the larger of these groups is the 18 percent of typical marketing organizations that report data to the executive team, and this is the most important level of accountability. When this happens, hopefully one result is an ongoing, healthy dialogue between the CMO and CEO about what marketing is doing and how it is contributing. Analytics data is a wonderful catalyst for these discussions, particularly when the CEO expects this type of accountability and the CMO willingly delivers it.

Constituting the smaller of the two groups that report analytics data externally are the marketing organizations that practice complete transparency. They make the analytics data available to anyone in the organization who wants to review it. This “nothing to hide” attitude about the work of marketing is a commendable position to which many more organizations should aspire but few attain. The marketing teams that practice this level of transparency are generally associated with the highest level of analytics maturity.

The Impact of Analytics

Half the money I spend on advertising is wasted; the trouble is I don’t know which half.

—John Wanamaker, 19th-century retail pioneer

It’s true that analytics can help marketing improve its performance. With the proper selection of metrics, the collection of data, the diligent analysis and reporting of it, and the resulting decisions, analytics are a potent performance enhancer. But what kind of performance, and how do the improvements come about? If analytics are a means to an end, and that end is better revenue performance, how do analytics get us from here to there?

The end game for marketing analytics is revenue, and with revenue impact and expenses known, it is possible to derive ROI for marketing, a topic of further discussion in Chapter 6. How, then, does the analytics process help create revenue? Understanding how marketing analytics affect revenue is the fundamental starting point for many analytics initiatives.

To understand this, consider the journey that most prospects take toward becoming customers. There is an initial recognition of need, followed by a process of discovering potential solutions, consideration of those solutions, and ultimately a decision. Marketing 101 teaches us that prospects must be aware of solutions so they can consider them and ultimately make a purchase. Much of marketing’s work, therefore, is about creating awareness as early as possible in this buying cycle.

Generating this awareness happens a number of ways. In the earliest days of marketing, there were fewer tools and channels for generating awareness. Marketing and advertising were synonymous, and print was the primary channel. Even then, however, there was a need for analytics data.

John Wanamaker, quoted above, understood the limits that a lack of analytics data about advertising efforts imposed. The only thing he had to help him optimize his advertising was his intuition.

Fast forward to the 21st century, when marketers have at their disposal myriad channels, both traditional and digital, to create awareness. There are so many possible combinations of message, creative, and channel that there is no hope of making intelligent decisions about optimizing the media mix without analytics. Analytics are indispensable to the modern marketer.

The modern marketer’s awareness efforts take the form of campaigns whose desired output is leads: individuals or businesses that have identified themselves as having some level of need or interest in a solution, and who have also surrendered contact information. For most companies, leads are the fuel for the sales engine. It’s marketing’s job to ensure that a steady flow of leads is coming in that are nurtured, qualified, and then passed along to the sales team. Most of the time, the lead generation process is ground zero for marketing analytics.

The right combination of testing and analytics can make the difference between a lead generation campaign that is a flop and one that is stellar. Because marketers don’t have unlimited budgets for lead generation, analytics are an important tool for helping them understand which campaigns to expand, which ones to revise, and which ones to abandon. The analytics provide the information marketers need to optimize campaigns, which in turn generates more leads, which the sales team turns into more revenue.

A wonderful example of how analytics help marketers execute more effective awareness and lead generation campaigns comes from the 2008 U.S. presidential election.2 In 2007, then-candidate Barack Obama was competing for the Democratic Party nomination and trailing by double digits in the polls. Dan Siroker, director of analytics for the campaign, recognized that every visitor to the Obama website was an opportunity to raise awareness and campaign funds. The home page of the website featured a large graphic of candidate Obama, and at the bottom of the page was a call-to-action button. The question facing Siroker was, which combination of media and call-to-action buttons would produce the best results?

Siroker set up a series of A/B tests, also called landing page split tests, to determine which combination of six images and four call-to-action buttons was most effective. The images all featured the candidate in various settings. The call-to-action button options were labeled “JOIN US NOW,” “LEARN MORE,” “SIGN UP NOW,” and “SIGN UP.” Siroker’s goal was to determine which of the 24 possible combinations of image and button performed the best, so he set up an experiment to show the various combinations to website visitors and collected the data.
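In a modern stack, a testing tool handles the split test itself, but the bucketing logic is simple enough to sketch. The following Python sketch is illustrative only (the variant labels and hashing scheme are assumptions, not how the campaign actually implemented it); it assigns each visitor deterministically to one of the 24 image/button combinations:

```python
import hashlib
from itertools import product

# Hypothetical variant labels: six candidate images crossed with the
# four call-to-action button labels described in the text.
images = [f"image_{i}" for i in range(1, 7)]
buttons = ["JOIN US NOW", "LEARN MORE", "SIGN UP NOW", "SIGN UP"]
variants = list(product(images, buttons))   # 6 x 4 = 24 combinations

def assign_variant(visitor_id: str):
    """Deterministically bucket a visitor into one of the 24 combinations."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same visitor always sees the same page, a standard split-test property.
print(assign_variant("visitor-42"))
```

Deterministic hashing (rather than random assignment on every page load) matters because a returning visitor who sees a different page each time would contaminate the measured sign-up rates.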

The metric he used to measure success was sign-up rate—the rate at which a test combination of image and call-to-action button persuaded visitors to provide their email address to the campaign. After gathering enough data, the winning combination was determined, and it was not the combination the staff initially favored.

The page with the top-performing combination had a sign-up rate of 11.6 percent, compared to the original sign-up rate of 8.3 percent. This represents an improvement of roughly 40 percent. Over the course of the campaign, that difference translated into almost 2.9 million additional email sign-ups, with an average donation per sign-up of $21. The analytics-optimized web page generated an additional $60 million for the campaign.
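The reported figures are easy to verify with a quick back-of-the-envelope calculation (the 2.88 million sign-up count here is an assumption derived from the rounded numbers above):

```python
baseline_rate = 0.083        # original page sign-up rate (8.3%)
winner_rate = 0.116          # best image/button combination (11.6%)
lift = (winner_rate - baseline_rate) / baseline_rate
print(f"relative improvement: {lift:.1%}")      # roughly 40%

extra_signups = 2_880_000    # additional sign-ups (~2.9 million, per the text)
avg_donation = 21            # average donation per sign-up, in dollars
print(f"additional funds: ${extra_signups * avg_donation:,}")   # about $60 million
```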

In this example, analytics made a difference on a grand scale by illuminating critical information about the performance of web landing pages that otherwise would have remained in the dark. There certainly is a cost to taking the approach this example describes: it requires time, effort, and a little bit of technology, and this investment is often a barrier to getting it done. The return on the investment, however, almost always exceeds the cost. The value of being well informed is almost always higher than the cost of remaining ignorant. As Mike Krass, CEO of MKT Media Group, put it, “The price of light is less than the cost of darkness.”

Analytics will certainly help marketers improve their calls to action and campaign landing pages on their websites, but their impact goes beyond that. Analytics can make an impact in these areas as well:

  • Brand recognition: Understand the mindshare your brand enjoys and the sentiments customers have toward it.
  • Content: Know with certainty which of your marketing content is most widely consumed, shared, and produces the best conversion.
  • Channel optimization: Compare performance of various marketing channels, such as email or pay-per-click, to improve their performance or invest only in those that perform best.
  • Customer understanding: Gain a deeper understanding of customer behavior to better understand their needs and preferences.
  • Predictive intelligence: Accurately predict early in the buying cycle which customers will buy and when.

This list represents just some of the areas where analytics can have a significant impact, not just on the marketing function but across the entire organization. Analytics can have this impact, however, only if they are based on the right metrics, if those metrics are diligently tracked, reported, and used to influence decision making, and if the organization views the entire process as credible. Even when the analytics process is rock solid, if the organization has no faith in the process, the analytics won’t impact the organization the way they should.

Analytics Credibility

It’s one thing to have a marketing analytics process at some level of maturity, and quite another to have the basic output of that process—the data—enjoy credibility. Decisions are made on this data, so if the hope is to have confidence in those decisions, the analytics process and the data it produces must be sound. The marketing analytics process has to have integrity and credibility so that the entire organization will have faith in the process and the decisions born out of that process.

Just because a marketing analytics process collects and reports data across a set of metrics doesn’t mean that everyone is going to believe the numbers. Doubt about the accuracy and credibility of the analytics data can exist inside and outside the marketing organization. Sometimes doubt creeps in because the process is suspect, or the input data has quality issues. Other times, the beneficiaries of the data simply don’t want to believe what the data reveals.

It’s not uncommon for people inside or outside the organization to view the data that marketing collects, maintains, and uses as suspect. Much of the data on which marketing relies is stored in CRM and marketing automation systems. This data finds its way into these systems through a number of paths, and each path presents challenges with respect to maintaining accuracy and quality.

Ideally, an organization will have a process or mechanism for ensuring data quality or cleaning the data, because the longer dirty data is allowed to reside in a system, the higher the probability that quality issues will compromise the analytics, campaigns, people, or other systems that use it. Most organizations, however, don’t have a formal data cleaning process, and the lack of one is at the root of analytics credibility issues. Figure 3-2 summarizes the state of sales and marketing data quality.3


Figure 3-2. Sales and marketing data cleanliness

As Figure 3-2 depicts, only 36 percent of organizations indicated that their sales and marketing data was “clean” or “very clean.”4 Since the analytics process is built on the foundation of data, the implication for the other 64 percent is that they are bound to see their analytics process suffer a credibility gap, particularly if it is known or believed that the data quality is low. Even having clean data is no assurance that a marketing analytics process will enjoy credibility.

Still, having clean data is a prerequisite for an effective, credible analytics process. The current data quality status in a majority of organizations should create a sense of urgency for cleaning up this data. This urgency is only heightened when data volume growth is factored in, because for most companies, that volume grows with every sales call, campaign, webinar, or trade show, compounding the problem.

Even if the data volume isn’t growing, static data doesn’t remain valid indefinitely. Data quality decays as contacts change companies, email addresses, roles, titles, and responsibilities. Marketing analytics data certainly consists of more than prospect or customer contact data, but most of the metrics marketing can track ultimately lead back to the customer and are therefore linked to a customer contact record. When they are, organizations need linkages to accurate customer and prospect data. To do nothing about data quality is a default commitment to poor data quality. Organizations must have some level of intentionality and process discipline to maintain data quality at any level or (ideally) improve it.

A last thought about data quality: regardless of how pristine the input data for the marketing analytics process is, those who are (or would be) critics of the process must understand the level of cleanliness. There’s a public relations burden the marketing team must shoulder in this regard. If marketing is going to preserve or promote the credibility of its analytics process, it must ensure that the data is clean and that everyone knows it is clean. A flawed perception of data quality can sink the credibility of any analytics process.

A lack of faith or credibility in the process, no matter how much care went into selecting and tracking proper metrics, severely hobbles a marketing analytics effort. The seriousness of the credibility issue with analytics is a very real obstacle on the analytics landscape. Figure 3-3 provides a snapshot5 of where organizations land on the spectrum of marketing analytics credibility.


Figure 3-3. Marketing analytics credibility perception

The data represented in Figure 3-3 gets to the heart of how much trust exists in the marketing analytics process. Considering how the analytics process should work—producing data that helps marketing keep its efforts aligned with strategy and performing well—a critical success factor is trust. What value is the analytics process if it isn’t trusted enough to influence big decisions? The output from the process, therefore, must influence decision making. It can only have influence if the process stakeholders have confidence in the process and its output.

Currently, just one-third of companies say they attach “good” or “high” credibility to their marketing analytics. Credibility perceptions of “moderate,” “low,” or “no” credibility indicate skepticism. This credibility data reveals that it isn’t enough to have a sound marketing analytics process; the process has to enjoy the confidence of its stakeholders so that analytics can influence decisions. The true litmus test for a marketing analytics process is this: is the company willing to make serious decisions and commit resources based on the guidance the process provides? Chapter 5 details a plan for helping organizations implement a sound marketing analytics process, and part of that plan includes efforts to create trust and confidence in the process output.

Analytics and Decision Making

Ultimately, a marketing analytics process should provide data that facilitates better decision making. As discussed in Chapter 1, analytics exist in marketing organizations not to make decision making obsolete, but to inform decisions. After decisions are made, analytics have another use: to determine whether those decisions were the right ones. The most mature marketing analytics processes use analytics in both the pre– and post–decision-making phases.

To understand how analytics can inform and then validate decisions, let’s walk through a hypothetical decision-making situation for a marketing team confronted with a challenge. Imagine that you are the CMO. During a meeting of the executive team, the chief financial officer (CFO) reports that revenue projections for the quarter show a “miss” by half a million dollars. After some discussion, the CEO turns to you, the CMO, with a charge to figure out how to make up the projected revenue shortfall.

If you were in a marketing organization that did not use analytics, or didn’t use them much, what would you do? You would still make decisions, and you would use your experience and intuition. Sometimes this approach would serve the organization well, but other times it wouldn’t. It’s a roll of the dice, and in situations like these you wish an analytics process were already in place to provide some insight.

After the meeting with the CEO, your first stop is the VP of sales. Together, the two of you must determine where the revenue will come from: new customers, existing ones, or a combination of both. Because the company has implemented CRM and marketing automation systems, the metrics you need to make a wise decision are available. You know that the average sales cycle to take a new, qualified prospect to close is six months. Furthermore, the analytics tell you that your sales team successfully converts 30 percent of the qualified prospects it engages into sales. With an average sale of $5,000, you’ll need 100 new sales to close the revenue gap. Given your sales conversion rate, the marketing team will need to identify roughly 330 new, qualified prospects to get $500,000 in new revenue.
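The funnel arithmetic in this paragraph can be sketched directly (the exact figure is 334 prospects; the round number 330 is used in the narrative):

```python
import math

revenue_gap = 500_000     # projected quarterly shortfall, in dollars
avg_sale = 5_000          # average deal size
close_rate = 0.30         # qualified prospect -> closed sale

sales_needed = revenue_gap // avg_sale                    # 100 deals
prospects_needed = math.ceil(sales_needed / close_rate)   # 334, ~330 in round numbers
print(sales_needed, prospects_needed)
```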

Here’s the problem: you don’t have time to generate 330 new, qualified leads, and take at least 100 of them through the six-month sales cycle before the end of this quarter. Therefore, you continue your deliberations with the VP of sales by reviewing the opportunities in the sales pipeline that are already in process. Again, because you use the proper systems—in this case, CRM—you have the data available.

Working with the VP, you analyze the pipeline data, removing the leads that are already projected to convert this quarter. After doing this, you discover that there are 285 qualified leads languishing in the sales pipeline. These are opportunities that entered the pipeline as qualified leads and have already gone through part of the buying cycle, but for various reasons have neither warranted the attention of the sales team nor been nurtured by marketing.

You and the VP determine that these 285 existing qualified leads represent the best opportunity to make up the $500,000 revenue shortfall by the end of the quarter. One concern is that the average conversion rate tells you to expect to turn 30 percent of these leads into sales. At that conversion rate, with just 285 leads you’ll gain about 85 sales, or $425,000, falling $75,000 short of filling the revenue gap. Past experience has taught you that taking the same approaches produces the same results. You think out loud for a few minutes about trying some new approaches to nurture these neglected leads in the pipeline. Leaving the VP’s office, you pledge to put your team to work on a plan to convert these 285 leads at a higher rate.

Once back in your office, you gather your team and discuss the challenge. Typically, your lead nurturing efforts consist of emails linked to landing pages that offer some sort of premium content, such as a white paper or case study. Your team has developed a reputation for developing compelling content that is also visually appealing. The Achilles heel, however, has been getting prospects to the landing pages to get this content, which is “gated” or kept behind a form requiring the visitor to surrender their contact information.

With your team, you review the results of several previous email campaigns. For each campaign, you have analytics data that includes open rates, bounce rates, click-through rates, and even which recipients opened a message. As you review this data, you confirm what you suspected: your open rates have historically been strong. This is a benefit of the efforts you make to ensure the quality of the data in your CRM and marketing automation systems. What the analytics also show, however, is that the click-through rates are lower than industry averages, and they are declining. In other words, the response to your promotional email has been strong, but not enough recipients are clicking the links in the email to the landing pages you’ve created. This is troubling, but at least the analytics are telling you where the problem is.

The next thing you and the team review is the web analytics data for the landing pages. Here you discover another problem. Too few of the email recipients who take the time to click through to the landing pages are choosing the call-to-action button, usually labeled something like “Download Case Study” or “Sign Up Now.” This data confirms what your instincts have suggested for a few months: you’re experiencing eroding performance of the tried-and-true nurturing methods that have served you well. Until now, you’ve never had a projected revenue shortfall to create a sense of urgency about monitoring these analytics more closely.

As you and the team ponder the implications of what the analytics are telling you, there is discussion about how to nurture these 285 leads differently, in a way that produces a conversion rate in excess of 30 percent. If you can put together a campaign that produces a 35 percent conversion rate, you’ll get the 100 sales needed to generate the half million in revenue. The arithmetic is straightforward. The big question is, what approach will do this?
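That straightforward arithmetic, restated as a quick calculation using the figures already established:

```python
pipeline_leads = 285
avg_sale = 5_000
revenue_gap = 500_000

# At the historical 30 percent rate, the pipeline alone falls short.
base_revenue = int(pipeline_leads * 0.30) * avg_sale   # 85 sales -> $425,000
shortfall = revenue_gap - base_revenue                 # $75,000 short

# Lifting conversion to 35 percent closes the gap.
lifted_revenue = round(pipeline_leads * 0.35) * avg_sale   # ~100 sales -> $500,000
print(shortfall, lifted_revenue)
```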

A member of your team has recently been experimenting with interactive video, conducting some A/B testing on various landing page configurations. Interactive video is marketing video content that features call-to-action links built right into the video itself. The videos are usually less than two minutes in duration, and when a video ends, the viewer is presented with one or more call-to-action buttons. In a departure from how you typically use landing pages, the team tested interactive video that was not “gated”—in other words, visitors to the landing page did not have to first enter their contact information. Instead, at the end of the video, one of the calls to action allowed the viewer to request further information and contact from a sales representative.

Offering ungated content is a departure from the method your team has used in the past. Web forms have always been part of your campaign landing pages because they allow you to capture leads for the sales team. However, this case is different. You’re not trying to capture leads but to accelerate the buying cycle for leads you already have. The analytics from the interactive video testing look very promising: visitors clearly prefer video, and they are opting to click the “Contact Me” button at the end of the video at twice the rate you’ve experienced for other forms of content. You make the decision to build a campaign using interactive video.

The next big campaign decision is to determine what the video will feature. Video marketing best practices indicate that videos of 90 seconds or shorter are the most successful. Again, you and your team review analytics data to determine what type of marketing messages might work best for qualified prospects who are later in the buying stage. Analyzing past campaigns, you’ve seen the strongest conversion rates for content that features a case study, a real-world story of a customer that implemented your solution with a description of the benefits they received. Based on the available data, you put the team to work producing a 90-second interactive case study video.

The team works quickly, and the video is ready to deploy in short order. Using your marketing automation system, you prepare the campaign landing page and the email that will go out to the 285 prospects you’ve identified. Using the analytics data in your marketing automation system, you identify the ideal day of the week and time to send the email.

Because you have created a data-driven culture in the marketing team, they are always thinking about how to measure everything they do. This campaign video was created with the ability to track viewing duration and shares. Furthermore, your marketing automation system is fully integrated with the CRM system, so when a qualified prospect views the video, that activity is logged in both systems. In addition, when a prospect views the entire video, an email alert is sent to a member of the sales team. Your campaign will track and report these metrics in real time, allowing the sales team to respond quickly when a prospect’s behavior indicates interest.
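As a rough illustration of the alerting rule just described: a real marketing automation system would supply its own webhook or trigger mechanism, so the event shape, field names, and threshold below are hypothetical.

```python
from dataclasses import dataclass

VIDEO_LENGTH = 90  # seconds; the campaign video's duration, per the text
alerts = []        # stand-in for sending email alerts to the sales team

@dataclass
class VideoEvent:
    """Hypothetical event logged when a prospect watches the campaign video."""
    prospect_id: str
    watched_seconds: int

def handle_view(event: VideoEvent) -> None:
    # A complete view signals strong interest, so notify sales.
    if event.watched_seconds >= VIDEO_LENGTH:
        alerts.append(f"Prospect {event.prospect_id} watched the full video")

handle_view(VideoEvent("p-101", 90))   # full view -> alert
handle_view(VideoEvent("p-102", 45))   # partial view -> no alert
print(alerts)
```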

You launch the campaign. The first few days after the initial email send will tell you if your strategy is working, or if you need to develop a Plan B. The tracking data on the email open rate shows that it is within the normal range, as you expected. What’s most encouraging, however, is that the click-through rate to the campaign landing page is slightly above average. Even better, the landing page call to action to view the case study video is performing much better than previous landing page invitations to consume gated content. So far, the strategy is working beautifully, but the ultimate campaign metric—revenue—is still unknown.

It’s time to check in with the VP of sales. You stop by for a quick update and find the sales bullpen is buzzing! The VP has little time to chat, but for the best of reasons. The interactive video is generating a flurry of inquiries to the sales team through the calls to action options embedded at the end of the video. The case study video is clearly resonating. The VP shares that a number of prospects have resumed a sales dialogue with words to the effect of “we’re having the same problem as the customer in the video—help us solve it.” The sales team is booking demonstrations and having quality interactions with many of the prospects targeted in the campaign.

Within a matter of weeks, the quarterly revenue forecast is revised upward. The VP of sales reports that an additional 105 deals are forecast to close by the end of the quarter, erasing the projected revenue shortfall. You and the marketing team debrief about the campaign, reviewing all the metrics to determine how to improve tactics the next time. One thing everyone agrees on: when you approach the CFO for additional funds to invest more heavily in video marketing, you won’t have any difficulty getting approval given the results of this campaign.

As the preceding scenario illustrates, analytics influence the decision making that takes place within a modern marketing organization in many ways, both large and small. This scenario represents just one way analytics help drive confident decisions whose results you can link directly to revenue.

___________________________

1“Marketing Analytics in 2013: Benchmarks, Insights & Advice,” Demand Metric Research Corporation, March 2013. http://www.demandmetric.com/content/marketing-analytics-benchmark-report

2“How Obama Raised $60 Million by Running a Simple Experiment,” Optimizely blog, November 29, 2010. http://blog.optimizely.com/2010/11/29/how-obama-raised-60-million-by-running-a-simple-experiment/

3“Sales & Marketing Data Quality Benchmark Report,” Demand Metric Research Corporation, November 2013. http://www.demandmetric.com/content/sales-marketing-data-quality-benchmark-report

4Ibid.

5“Marketing Analytics in 2013: Benchmarks, Insights & Advice,” Demand Metric Research Corporation, March 2013. http://www.demandmetric.com/content/marketing-analytics-benchmark-report
