Continuing #StartWithEight: Eight Easy Ways To Minimize The Effects Of Implicit Bias In The VC Review Process


This March, Alpha Edison launched the #StartWithEight initiative, challenging our colleagues in VC to join us in taking meetings with eight women from outside our usual networks. As we noted then, the statistics about women in venture capital are dismal. In 2017, only 2 percent of venture capital funding went to female founders, and only 8 percent of partners at the top venture capital firms were women. Meanwhile, only 18 black female founders had ever raised more than $1M in venture capital for their startups.

These numbers aren’t a secret; it’s widely known that women and minorities face prodigious barriers to entry in VC, but so far, a solution to correct the inequity has been elusive. The #StartWithEight campaign is a first step in correcting the historical imbalance and opening up pipeline access. The next step focuses on the other half of the equation: addressing and reducing bias.

Implicit bias is bad for business

We can’t reduce bias without first identifying it. Implicit bias is defined as an unconscious preference for or aversion to certain groups. To put it another way, no one enters a situation or views another person from a completely neutral position; we all carry subconscious attitudes and associate stereotypes with certain groups. For the most part, the VC industry has adopted a head-in-the-sand posture toward the challenge of implicit bias, but the financial impact is both measurable and vast. As noted, 98 percent of all VC funding goes to male founders, yet a 2015 study by First Round Capital of the more than 300 companies in its portfolio found that teams with female founders outperformed all-male founding teams by 63 percent.

For racial and ethnic diversity, a 2015 McKinsey report found that companies in the top quartile for diversity are 35 percent more likely to have financial returns above their respective national industry medians. That Why Diversity Matters report also found that for every 10 percent increase in racial and ethnic diversity on the senior-executive team, earnings before interest and taxes (EBIT) rise 0.8 percent.

Small interventions make a large difference

Dealing with the toxic influence of social stereotypes in the venture capital review process seems like a Herculean task. Even those of us with the strongest intentions shrink when faced with the enormity of the many societal, historical, and justice issues entangled with these biases.

And while gender and ethnicity are often top of mind, other biases around sociodemographic criteria (the schools people attend, the places they’ve lived, and their economic and social class), not to mention past performance (a.k.a. anchoring), also plague the VC deal-making process. In short, VCs lean toward the familiar and the preferred in the startups they fund, and our lack of progress in changing the status quo is driven in part by two factors:

  1. We focus on reducing the causes of implicit bias instead of adapting our procedures to minimize its impact.
  2. We overestimate the level of effort needed to correct for implicit bias.

But these ideas are misguided. To reduce the impact of implicit bias in VC, we don't have to root out all bias from our hearts and minds, or drastically change the way we choose which companies to fund. Nor do we have to do extensive soul-searching about where our implicit biases come from (though that may be useful work to do for other reasons). Research shows that some simple interventions can greatly reduce the impact of implicit bias on any decision-making process, from orchestra auditions to referees’ calls at sporting events. And most of them can also be applied to VC.

#StartWithEight Anti-Bias Strategies

The only way to level the playing field is to tackle these biases head on. Here are our eight strategies to reduce various kinds of implicit bias in VC selection and decision-making processes.

1. Be aware of your bias

Simple awareness of bias is usually the first step to reducing it—and in some cases, it may be enough on its own. In one article published in 2014, researchers looked at the effects of an earlier study showing racial bias among referees in the National Basketball Association (NBA). That study, which used data from 1991 to 2002, “found that personal fouls are more likely to be called against basketball players when they are officiated by an opposite-race refereeing crew than when officiated by an own-race refereeing crew.” Though the study period ended in 2002, the results were not widely publicized until 2007, when they landed on the front page of the New York Times and were covered by the major television networks.

For the 2014 study, researchers conducted the same tests for own-race bias both before the earlier study was released (2003 to 2006) and after the media coverage (2007 to 2010). During the first period, they found “continued own-race bias” at rates similar to those in the original study. However, in the second period, they found no bias. They concluded that the “results suggest that raising awareness of even subtle forms of bias can bring about meaningful change.”

This insight has many applications outside of NBA game play. I managed research funding programs at the National Science Foundation (NSF) for five years before joining Alpha Edison, and NSF has developed its own range of strategies for tackling bias. The world of scientific peer review shares some similarities with venture investing: when applying for federal grants, research teams submit a written proposal (which includes a detailed “biosketch” of each researcher) to a panel of experts who make recommendations to the NSF about which proposals to fund. For obvious reasons, this process is vulnerable to implicit bias. To reduce that bias, some NSF program officers insert a slide about implicit bias in decision-making into the presentation they use for panelist training. This reminder makes panelists conscious of their unconscious bias and thereby reduces its impact. It’s a quick, simple intervention that could also be repurposed for a venture capital context. For instance, a firm could insert a reminder about implicit bias into written materials reviewed before pitch meetings or other occasions for group decision-making, and make a brief verbal announcement in the opening remarks of those discussions.

2. Take your time

Research has found that people in certain mental states are more likely to apply stereotypes than those who aren’t. One 2009 study found that negative emotions like anger and disgust can produce stereotypical judgements, even when the source of those emotions has nothing to do with the groups being judged. Stress and energy level are also contributing factors: recent studies have found that fatigue and overwork can increase implicit bias among police officers and emergency room physicians. The National Center for State Courts addresses these factors, among others, in its own strategies for reducing bias among judges.

This makes sense because stereotypes are a kind of cognitive shortcut. Making a snap judgement based on someone’s race or gender takes less mental work than making a considered decision based on their unique circumstances. Anything that saps a person’s mental energy and attention—from a heavy workload to loud construction noise—makes it more likely that they’ll engage in biased decision-making.

This means that, strangely enough, VCs can counteract the tendency to fall back on stereotypes by keeping up with self-care. For instance, they can make sure to get a good night’s sleep the evening before they hear a female entrepreneur’s pitch or use mindfulness techniques to make sure they go into the meeting feeling calm and collected. They can also give themselves time to think over their decisions, so they’re not coming out of a long, tiring meeting and immediately deciding “yes” or “no.” A well-rested, focused VC is more likely to have the mental capacity necessary to make good, unbiased decisions.

3. Blind your view

American orchestras were once dominated by male musicians. A 2000 study found that of the top five orchestras in the country, none were more than 12 percent female until around 1980. Things began to change in the 1970s and 1980s when orchestras opened their auditions to a wider pool of musicians and made their audition processes blind: that is, the identity of a musician was hidden from the jury. The researchers found that the blind audition process could explain 25 percent of the increase in the proportion of orchestra members who were female from 1970 to 1996. The Baltimore Symphony Orchestra recently reported that in the 2016-2017 season, women comprised 40 percent of orchestra musicians in the U.S.

Today, companies like GapJumpers and Search Party create the hiring equivalent of the “blind audition” used so effectively by orchestras, for clients including Google and Dolby. VCs could adopt similar tactics for their own hiring, for instance by anonymizing names on resumes to hide ethnicity and gender. There’s also an opportunity for “blind auditions” early in the pitch process, when a VC is reviewing a deck or other written materials about a company to decide whether or not to bring the founders in for a meeting. If names and photos are deleted from decks, VCs can do a “blind review” of the startup’s merits and viability.

4. Be careful about questions

At the 2017 TechCrunch Disrupt New York conference, a team of scientists observed Q&A interactions between 140 venture capitalists (40 percent of them female) and 189 entrepreneurs (12 percent female), and then tracked funding rounds of startups that launched at the competition. Though the startups were comparable in quality and capital needs, the scientists found that male-led ventures raised five times more funding than the female-led startups, according to Harvard Business Review.

One key to the imbalance? The researchers observed that male and female entrepreneurs were asked two distinctly different types of questions. VCs asked men “promotion-oriented” questions about their “hopes, achievements, advancement, and ideals,” while they asked women “prevention-oriented” questions that focused on “safety, responsibility, security, and vigilance.” The bias was found among both male and female VCs.

The disparity in questioning had a substantial impact on the funding received. Entrepreneurs who answered mostly promotion questions raised about seven times more on average than those who fielded prevention questions. The researchers explained the results this way: “By responding in kind to promotion questions, male entrepreneurs reinforce their association with the favorable domain of gains; female entrepreneurs who respond in kind to prevention questions unwittingly penalize their startups by remaining in the realm of losses.”

Implicit bias shows itself most strongly in instinctive decisions made in the moment, so advance preparation may be the best defense against this type of disparity. Before hearing a female entrepreneur pitch, a VC could sit down and write out several “promotion-oriented” questions and commit to asking them during the meeting. That way, they’ll be less likely to give in to an instinct to ask only about potential losses and other “prevention-oriented” topics. Or, they could come up with a standard list of questions to ask all entrepreneurs who pitch them, standardizing away question bias entirely.

5. Control for likeability

It’s natural to feel drawn towards some people and not towards others. However, relying too much on this kind of natural chemistry when making funding or hiring decisions can open the door to bias. That’s in part because chemistry can be rooted in factors that aren’t so intangible. One 2012 study found that employers were more likely to hire candidates who were “culturally similar to themselves”—that is, who had similar hobbies, experiences, and interests. That means that startup founders who are culturally dissimilar from the average VC may suffer when decisions are based too much on likeability.

That’s not to say likeability is always rooted in bias, or that it shouldn’t be taken into account in funding or hiring decisions at all. However, it’s important to remain aware of its impact on decision making. In an article in Harvard Business Review, behavioral psychologist Iris Bohnet recommends employers rate job candidates on likeability like they would a technical skill. Making likeability an explicit, measured metric makes it more “controllable,” she says, so that it doesn’t get excessive weight. For example, likeability can be an aspect of leadership—but it’s not the only or even the most important aspect. Pulling out likeability as a separate quality makes it less likely that VCs will leap to the conclusion that a likeable founder will also be great at motivating and managing their team.

Talking about likeability directly also brings unconscious preferences more clearly to the surface—and as we saw with the first strategy in this list, sometimes making people aware of their own bias is all that it takes to reduce it.

6. Increase accountability for decision making

People are at their most biased when they are least accountable—that is, when they believe their decisions won’t be made public or that they won’t be asked to justify their reasoning. Studies dating back to the late 1960s have found that even small interventions can encourage decision-makers to think more deeply and carefully about a situation, reducing their reliance on stereotypes and biases. Even something as simple as having another person in the room when a decision is made can change outcomes for the better. Other strategies to reduce bias and increase accountability include:  

  • Making the results of the decision identifiable, so that the decision-maker knows the outcome will be linked to them personally
  • Requiring the decision-maker to give an explanation of their thought process after a decision is made
  • Making the decision-maker aware that their performance will be assessed by a third party based on known ground rules

There are several takeaways here for VCs. To start, each VC should “have another person in the room” as often as possible when making decisions. For instance, they can loop colleagues in on questions they might normally have answered solo, like whether to invite a founder from their network into the office to pitch. When decisions are made as a group, there should be no hiding behind consensus—the final decision should rest with one individual, who should then have to explain their “yes” or “no.”

7. Kick the habit of your individual biases

As detailed in a 2017 Atlantic article, Patricia Devine, the researcher who thirty years ago originally coined the term “implicit bias,” has been working on a set of strategies to end it. In 2012, Devine and her co-researchers published results of a 12-week longitudinal study, in which they developed a set of mental interventions to break the “habit” of implicit race bias. Those interventions included:

  1. Stereotype replacement: recognizing that a response is based on stereotypes, labeling the response as such, and reflecting on why the response occurred—and how to avoid the bias in the future.
  2. Counter-stereotype imaging: imagining counter-stereotypic others—that is, people who defy the usual stereotypes about their race, class, or gender. The strategy makes positive examples mentally accessible.
  3. Individuation: obtaining specific information about group members. This helps people evaluate members of a target group based on personal rather than group-based attributes.
  4. Perspective taking: assuming the first-person perspective of a member of a stereotyped group, which can increase psychological closeness to the stigmatized group.
  5. Increasing opportunities for contact: seeking opportunities to engage in positive interactions with “out-group” members.

These interventions work by raising people’s awareness of their own biases, then giving them the tools to replace biased responses with more nuanced and accurate ones. VCs interested in applying these interventions in their own work can seek out trainings based on Devine’s system, which have already increased hiring of women in some academic departments at the University of Wisconsin-Madison, according to the Atlantic article. They can also work to apply the five interventions on their own, especially interventions 1, 2, and 4—mental exercises which can be performed whenever they have a few moments of quiet in their day.

8. When designing processes, account for the bias of crowds

Devine’s interventions—like most of the strategies discussed in this post—are based on the premise that individual bias is mutable. It fluctuates over time depending on context and situation. By contrast, research shows that when measured at the group level, bias is more stable: all of those individual fluctuations cancel each other out, yielding measurable levels of bias that are consistent over years or even centuries (in the case of some stereotypes against ethnic groups).

Why this discrepancy? In a November 2017 paper, researchers from UNC-Chapel Hill and the University of Richmond posited that implicit bias tests are really tests of situations, not of people. Some situations put biased concepts at the top of people’s minds, making stereotypes and prejudices mentally easier to access—and therefore more likely to affect decision making. For instance, an individual who starts work in an office decorated with pictures of scantily clad models might become more likely to stereotype women as sexual objects at work, regardless of how they behave in other contexts. Group-level bias remains stable as long as the situations encountered by the group in question remain stable (i.e. no one takes the models’ pictures down off the wall).

For example, the web hosting service GoDaddy was once known for risqué advertising featuring scantily clad models. When the company was hit with a sexual harassment lawsuit along with other troubles related to its treatment of women, its upper management thoroughly analyzed HR processes to uncover sources of systemic bias. One change they made was to employee evaluation forms, according to the New York Times. GoDaddy removed open-ended questions like “Is this employee a good communicator?,” which invited subjective judgments about employees’ character, and added specific questions that focused on concrete impact—e.g. asking for documented examples of the employee sharing knowledge with a colleague. That one change purportedly led to a 30 percent increase in the number of promotions of women in a single year—without ever even mentioning the word “bias” out loud.

---

Just as our #StartWithEight campaign is meant to be a first step in bringing more women and minorities into VC, these strategies are a first step for our firm and others in addressing the issue of implicit bias head-on. We understand that changing the culture is not a light undertaking, but we also understand that the status quo cannot continue. It’s time for all of us to look our own bias squarely in the eye and say “no more.” Alpha Edison will be applying these strategies in our own work in the coming months—will you?
