CHAPTER 11
Racial Profiling at Nextdoor
Using Data to Build a Better App and Combat a PR Disaster

Action expresses priorities.

—Mahatma Gandhi

By all accounts, Nirav Tolia seems like a generally jovial guy—and for good reason.

Tolia cofounded the “hyperlocal” social network Nextdoor in 2010 along with David Wiesen, Prakash Janakiraman, and Microsoft veteran Sarah Leary. Today, he serves as its CEO.

If you haven’t heard of it, Nextdoor is a useful app that fills a genuine need that Facebook does not: connecting us with our neighbors. Today, millions of people in over 140,000 “microcommunities” use it.* Tolia has been front and center spreading the company gospel. Media appearances have flowed, including one on Dr. Phil.

Tolia had to be downright giddy on March 3, 2015, when his company announced that it had raised $110 million in venture capital. The deal valued the company at more than $1 billion. Champagne all around: Nextdoor had reached the revered status of unicorn.

A scant three weeks later, all of that celebrating must have seemed like a distant memory. The news site Fusion published an article explaining how Nextdoor “is becoming a home for racial profiling.”1 As Pendarvis Harshaw wrote:

While Nextdoor’s ability to assist in crime-spotting has been celebrated as its “killer feature” by tech pundits, the app is also facilitating some of the same racial profiling we see playing out in cities across the country. Rather than bridging gaps between neighbors, Nextdoor can become a forum for paranoid racialism—the equivalent of the nosy Neighborhood Watch appointee in a gated community.2

Harshaw detailed how presumably white members were using Nextdoor’s crime and safety forum to report “suspicious” activities by African Americans and Latinos. His article included redacted screenshots from ignorant or hateful Nextdoor users, such as the one in Figure 11.1.

The screenshot shows a post titled “casing the house” that reads: “we just had a young AA woman with long dreds very slender and light (should have been in school) knock aggressively on our door and ask for ‘keith’. first time while we were home. she disappeared immediately after. checked all cross streets.”

Figure 11.1 Screenshot from Racially Charged User

Source: Fusion.

Not long after Harshaw’s article went live, Tolia and his senior team entered disaster- and damage-control mode. This was the type of story that had legs. Case in point: Less than two months later, Jennifer Medina of the New York Times continued the thread, reporting that:

. . . as Nextdoor has grown, users have complained that it has become a magnet for racial profiling, leading African-American and Latino residents to be seen as suspects in their own neighborhoods.3

How Nextdoor responded illustrates the importance of reacting quickly and how Agile analytics can be invaluable in this regard.

UNINTENDED BUT FAMILIAR CONSEQUENCES

Why didn’t Tolia and his team see this type of abuse coming?

Such a question might seem obvious, but it is inherently unfair.

As the cliché goes, hindsight is 20/20. No one could have reasonably expected Nextdoor to launch its app with every conceivable feature and safeguard already in place. That’s not very pragmatic and it certainly isn’t very Agile.

In a way, Nextdoor had become a victim of its own success. Racial profiling wouldn’t have become a problem on the app if few people had downloaded and used it. (Yes, there are drawbacks to network effects.) After all, most start-ups fail; few ever attain anywhere near Nextdoor’s level of reach.

As Backchannel’s Jessi Hempel wrote:

Most social web services — like Airbnb or Facebook or Twitter — were launched quickly. Their founding teams—consisting mostly of well-off men (and the occasional woman) from prestigious universities—were not diverse. Those teams hired designers and engineers who looked like them to launch and grow the sites. These companies weren’t thinking about the way bias would influence how people use their services; they were moving fast and breaking things, content to fill in the details later. What’s more, they mostly built advertising businesses that became more successful as people provided them more social data by posting more on the sites. There was little business incentive for them to slow their users down and ask them to think about why and how they were posting—and, in some cases, to post less.4

Like just about all start-ups these days, Nextdoor initially focused on growth. As Figure 11.2 shows, the app included basic reporting functionality, but it wasn’t particularly sophisticated.

The screenshot shows the original app’s posting form: blank fields for a subject line and message, with buttons at the bottom to add a photo, close, or post.

Figure 11.2 Screenshot from Original Nextdoor App

Source: Nextdoor.

Nevertheless, Nextdoor was starting to grow up. It was now a unicorn, not the pipe dream of a few ambitious college students looking, as another hackneyed Silicon Valley phrase goes, to “make the world a better place.” Big-boy companies face big-boy problems.

EVALUATING THE PROBLEM

Nextdoor could have easily taken a laissez-faire stance to charges of allowing racial profiling on its app. Its founders could have rationalized that no site or app is perfect, that fleas come with the dog, and so on.

There’s plenty of historical precedent for maintaining such a position. The tech landscape is littered with founders who claim that their products are merely neutral conduits that facilitate communication or commerce among users. Consider some examples over the years:

  • Google and Yahoo long turned a blind eye to click fraud. Google also took its sweet time addressing copyright-infringement claims on YouTube.
  • eBay launched without any formal buyer protection, and even today, many users find it wanting.
  • Facebook’s leadership allowed fake news to proliferate on the social media site during the 2016 U.S. presidential election.
  • Unlike hotels, Airbnb assumes no responsibility for the safety of its guests. To this end, the company has quietly settled lawsuits that would have resulted in devastating press.*
  • Uber has steadfastly resisted fingerprint-based background checks on drivers, a screening step that taxi companies have accepted for decades.
  • Twitter, for years, let trolls and terrorist groups operate unabated. (For more on this, see “ESP at Twitter” in Chapter 2.)

To its credit, Nextdoor’s actions distinguished the company from its largely apathetic tech brethren. Its management immediately addressed the problem of racial bias head-on. Perhaps Tolia and Janakiraman (Indian Americans) and Leary (a woman) were particularly sensitive to the issue because they didn’t look like most start-up founders. It’s conceivable—maybe even probable—that they would not have moved as quickly had they been white males, à la Travis Kalanick of Uber. The cofounders might have feared that negative press would harm their individual reputations, not to mention Nextdoor’s eye-popping valuation. Maybe it was a combination of all of these things.

Whatever its motivations, Nextdoor moved quickly. The cofounders assembled a small but diverse team to tackle the issue. Members included product head Maryam Mohit, communications director Kelsey Grady, a product manager, a designer, a data scientist, and later, a software engineer. (Chapter 5 showed how Agile teams benefit from different perspectives, skills, and expertise.)

Within five months, the team believed that it had found an answer to its problem. Its three-pronged solution included diversity training for its neighborhood operations team and an update to its community guidelines with an accompanying blog post. The third prong, redesigning the app itself, proved to be the trickiest.

Redesigning the App

Nextdoor understood the intimate nexus among app design, user behavior, and data. The design of any app directly affects how users interact with it as well as the data it generates. Change the app’s design and you probably change user behavior, along with the types of data that users generate.

By way of background, Nextdoor had for years allowed people to flag inappropriate posts, whether for their content or their location. For instance, commercial posts don’t belong in noncommercial areas of the app and website. Nextdoor realized that a binary signal (i.e., flagged or not flagged) was no longer sufficient. To this end, the company added a quick fix in the form of a “report racial profiling” button.
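To make that design change concrete, here is a minimal sketch, in Python, of what moving from a bare boolean flag to an enumerated report reason might look like. The class names, field names, and reason values are hypothetical; Nextdoor has not published its data model.

from dataclasses import dataclass
from enum import Enum

class ReportReason(Enum):
    """Hypothetical reasons a post can be reported; the old model was a bare flag."""
    INAPPROPRIATE_CONTENT = "inappropriate_content"
    WRONG_LOCATION = "wrong_location"      # e.g., a commercial post in a noncommercial area
    RACIAL_PROFILING = "racial_profiling"  # the category added in the quick fix

@dataclass
class PostReport:
    post_id: int
    reporter_id: int
    reason: ReportReason  # replaces a simple flagged: bool
    comment: str = ""     # optional free-text context from the reporter

The payoff is analytical: a boolean says only that something was wrong, while a reason code says what kind of wrong, something that can be counted, trended, and triaged.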

Ultimately, this step was woefully inadequate because many users didn’t understand the new feature. Identifying racial profiling isn’t tantamount to spotting a red Lexus sedan speeding down the street. “Nextdoor members began reporting all kinds of unrelated slights as racial profiling. ‘Somebody reported her neighbor for writing mean things about pit bulls,’ Mohit recall[ed].”5

Much of Nextdoor’s data was text-centric (i.e., unstructured). Especially at first, this type of data doesn’t lend itself to the kind of easy analysis that its more structured equivalent makes possible. The difficulty doubles when the thorny issue of race relations is involved. Tolia and his colleagues understood this and assigned five employees to read through thousands of user posts. The right course of action was anything but obvious.

By looking at the data, the team grasped that it needed to take a step back and answer a core question: What exactly is racial profiling anyway? As Hempel wrote:

The team realized it needed to help users understand when to use race when talking about suspicious or criminal activity. And to do that, they needed to define — very specifically — what constituted racial profiling in the first place. “We could not find a definition of racial profiling that everyone agreed on,” says Tolia. “If you go to the NAACP, if you go to the ACLU, if you go to the White House Task Force on Diversity, if you go to the Neighbors for Racial Justice, none of these people agree on what racial profiling is.”

An overly broad definition of racial profiling would capture far too many false positives. Conversely, an overly granular one would result in legitimate claims slipping through the cracks. Drawing the line would be neither simple nor easy. This recognition led Nextdoor management to ask fundamental questions about what it was trying to achieve and how:

  • What if Nextdoor redesigned its reporting feature to gently guide its users in a specific direction?
  • Could it design the app in a way to minimize and discourage the very behavior that it was attempting to prevent?
  • Would better design ultimately lead to better data?
  • Did one size really fit all? Was it time to separate suspicious activity from crime and safety?

Agile Methods in Action

We’ve seen throughout this book that Agile methods explicitly acknowledge uncertainty: It’s usually impossible to know the path to success in advance. As Silicon Valley serial entrepreneur and academic Steve Blank is fond of saying, “There are no facts inside your building.” The implication is that no one can know what will work in a vacuum. Get outside the building, start testing, gather data, talk to users, and then evaluate your progress.

Nextdoor’s top brass clearly understands this key tenet. To this end, the team developed six different variants of its app and began testing them. Doing so helped the company home in on the answers to key questions:

  • If the app alerted users about the potential for racial bias before they posted, would it change user behavior?
  • Characterizing a person isn’t necessarily easy. How does an application prompt its users for descriptions of others that are full, fair, and, crucially, not based exclusively on race?
  • In describing a suspicious person, how many attributes are enough? Which specific attributes are more important than others?

In keeping with methods espoused in The Lean Startup, the team conducted a series of A/B tests. Blessed with a sufficiently large user base, Nextdoor ran experiments to determine the right answers to these questions. For instance, consider 50,000 users divided into two cohorts (A and B) of 25,000 each. Each cohort would see one version of the Nextdoor app, with slight but important differences in question wording, order, required fields, and the like. As anyone with a modicum of knowledge of survey design knows, even the order of words can sway results.
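For a rough sense of the statistics behind such an experiment, here is a minimal sketch of comparing the rate of problematic posts across two cohorts with a two-proportion z-test. The counts are invented for illustration and do not reflect Nextdoor’s actual numbers or test harness.

from math import sqrt

def two_proportion_z(flagged_a: int, n_a: int, flagged_b: int, n_b: int) -> float:
    """Two-proportion z-test: does cohort B's rate differ from cohort A's?"""
    p_a, p_b = flagged_a / n_a, flagged_b / n_b
    p_pool = (flagged_a + flagged_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    return (p_b - p_a) / se

# Hypothetical cohorts of 25,000 users each: A sees the old form, B a variant.
# Suppose 400 posts in A are flagged for profiling versus 100 in B.
z = two_proportion_z(400, 25_000, 100, 25_000)
print(f"z = {z:.1f}")  # |z| > 1.96 would be significant at the 5 percent level

With cohorts this large, even modest differences between app variants become statistically detectable, which is precisely why a big user base makes this kind of experimentation practical.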

Over the course of three months, Nextdoor’s different permutations started to bear fruit. The team began to clarify the best way to address racial profiling. Certain versions of the app worked far better than others in this regard. Nextdoor was getting close. By August 2015, the team was ready to launch a new posting protocol in its crime and safety section—one far more granular than that shown in Figure 11.2.

As Figure 11.3 shows, users who mention race when posting to “Crime & Safety” forums now must provide additional information. Specifically, Nextdoor requires them to fill in at least two of the following four descriptive categories: hair, top clothing, bottom clothing, and shoes. Figure 11.4 shows the redesigned form; a short sketch of the rule’s logic follows it.

The screenshot shows the redesigned “Incident” page, headed “First, describe the incident,” with a blank text field below, a Cancel button at one end, and a Next button at the other.

Figure 11.3 Screenshot from Nextdoor’s Redesigned App

Source: Nextdoor.

The screenshot shows a “Describe a person” page, with fields for hair, top, bottom, shoes, age, and so on, and an “Add this person” button at the bottom.

Figure 11.4 Screenshot from Nextdoor’s Redesigned App

Source: Nextdoor.
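Here is a minimal sketch of the posting rule just described: if a draft mentions race, require at least two of the four descriptive categories before accepting the post. The keyword list and function names are hypothetical stand-ins; real detection of race mentions is far subtler than matching words.

# Hypothetical keyword check; production systems need far more nuance.
RACE_TERMS = {"black", "white", "hispanic", "latino", "asian", "aa"}

DESCRIPTIVE_FIELDS = ("hair", "top_clothing", "bottom_clothing", "shoes")

def mentions_race(text: str) -> bool:
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return bool(words & RACE_TERMS)

def can_post(description: str, details: dict) -> bool:
    """Enforce the rule: a race mention requires >= 2 of the 4 descriptive fields."""
    if not mentions_race(description):
        return True
    filled = sum(1 for f in DESCRIPTIVE_FIELDS if details.get(f))
    return filled >= 2

# A post that mentions race with only one detail filled in is held back.
assert not can_post("Suspicious black male on the corner", {"hair": "short"})
assert can_post("Suspicious black male on the corner",
                {"hair": "short", "shoes": "white sneakers"})

Note the design choice: rather than blocking or deleting posts outright, the form simply refuses to proceed until the poster supplies a fuller description, nudging users toward describing behavior rather than race.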

Before users can post suspicious activity, Nextdoor intentionally inserts additional friction.

None of this friction is coincidental; every element of the design is deliberate. No, Nextdoor will not eliminate every post by racist users with axes to grind or by generally insensitive folks. It has, however, used data intelligently to redesign its product, with fantastic results. Kudos to the company and its management for taking the issue so seriously and acting accordingly.

RESULTS AND LESSONS

Nextdoor was able to stem the bleeding in a relatively short period of time. (These matters are nuanced and quite difficult to fix.) By taking a data-oriented approach to design (illustrated in Figure 11.5), the company reduced racial profiling on its app by a reported 75 percent. As Kashmir Hill wrote for Fusion:

The screenshot shows a tips page headed “Posting about suspicious activity is tricky, we can help,” with the subheading “Follow these tips to make sure all your neighbors feel safe and respected” and checklist items such as “Focus on behavior. What was the person doing that concerned you and how does it relate to a possible crime?”

Figure 11.5 Screenshot from Nextdoor’s Redesigned App

Source: Nextdoor.

Erasing racism through technology alone is impossible, but Nextdoor found that a few changes to its interface actually did significantly discourage racial profiling by its users. All users who make posts to their neighborhood’s “Crime and Safety” forum are now asked for additional information if their post mentions race. Nextdoor says that the new forms it’s introducing have “reduced posts containing racial profiling by 75% in our test markets.”6

If you think Nextdoor’s new protocol adds friction that reduces the overall number of user posts, you’re absolutely right. Nextdoor reported that suspicious-activity posts dropped by 25 percent. As Tolia points out, though, “many of those posts shouldn’t have been on the site in the first place.”7

A lesser organization would have announced plans to “study the problem” as it continued unabated. Nextdoor unquestionably gained tremendous goodwill among many of its users by addressing what could have been an existential crisis. To be fair, it might have upset and even lost users who wanted to continue racial profiling, but the company is happy to make this trade-off.

CHAPTER REVIEW AND DISCUSSION QUESTIONS

  • How did Nextdoor’s first version of its app enable racial profiling?
  • Was the fix simple? Why or why not?
  • What do Nextdoor’s changes mean in terms of the data that it now collects?
  • Do you think that Nextdoor is finished tinkering with its app? Why or why not?
  • What has the company learned from the experience? Will these lessons inform its future design decisions? Why or why not?

NEXT

We’ve covered a wide range of case studies. It is now time to switch gears and distill some of the lessons from Part Two.

NOTES
