When it comes to hiring, even the most well-intentioned employers are subject to bias. While biases can certainly be explicit, or intentional, the vast majority run much deeper. Implicit bias, as defined by the Kirwan Institute for the Study of Race and Ethnicity in its annual “State of the Science” report, is “the attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious manner.” The plethora of science on the nature of implicit bias boils down to several key qualities. Implicit biases are:
- Unconscious, operating without our awareness
- Pervasive, held by virtually everyone
- Not necessarily aligned with our explicit, stated beliefs
- Malleable, meaning they can be gradually unlearned
They are also not solely enacted by individuals. Rather, implicit biases, explicit biases, and the structural forces that facilitate them reinforce one another.
Implicit biases play out in the workplace in numerous ways, including in hiring practices. One type of implicit bias in particular can impact the diversity of who we hire: affinity bias. Put simply, affinity bias is the tendency to be attracted to people who are like yourself. In the workplace, this can lead to favoritism toward candidates “like you,” indirectly producing a discriminatory preference for less diverse applicants.
You may be reading this and thinking that you’re an exception to the rule—you are actively in favor of a diverse workforce and believe in selecting people who are different from you. But the hard truth is that most of us think that. We carry the “illusion of objectivity,” or the idea that we are free of the biases that we can easily call out in others. Yet, as one Harvard Business Review article puts it, “in reality, most of us fall woefully short of our inflated self-perception.” We can simultaneously think that we are unbiased, ethical, and capable of strong decision-making in the hiring process while also experiencing an unconscious internal predilection towards people like us. In fact, research shows that we can even hold “counter-intentional unconscious biases”, or biases that actively conflict with our own explicit beliefs.
Tackling affinity bias in hiring, then, must begin with recognizing that we are all impacted by implicit stereotyping. From there, we can examine the mechanisms behind it, how it plays out in the workplace, and (finally) how we can challenge it.
Breaking Down Affinity Bias
So where exactly does affinity bias come from, and what do we know about its effects? Social psychologists have asked this question many times, and while research is far from complete, it does tell us a helpful story.
Generally, we are naturally inclined to sort people into those who are and are not like us: in-groups and out-groups. Once we identify with an in-group, we automatically become more attracted to fellow group members. This happens, quite literally, in our brains. Studies show that we use entirely different neural pathways when we perceive the faces, words, and actions of in-group members compared to out-group members. We show less activity in brain areas critical for empathy when we see out-group members in pain, and more activity in brain areas critical for moral sensitivity when we experience threats from an out-group. Our reward system lights up more when we reward in-group members than when we reward out-group members. And we respond to members of other groups with less eye contact and warmth, tensed muscles, an anxious tone of voice, greater physical distance, and more formality.
Research at the level of the neuron might feel too microscopic for your purposes in hiring for an organization. But at bottom, it shows that affinity bias is hardwired into our brains, and therefore shapes every judgment it touches.
Affinity Bias in the Workplace
Let’s step out of the fMRI machine and into the workplace now. How does affinity bias manifest in diversity management? For one, we unconsciously form impressions of others based on our level of affinity towards them. In the long term, this can result in workplace bias, or differences in career outcomes that are unrelated to someone’s qualifications, skills, or other investment in a company. It can also result (to your detriment) in less diverse workplaces.
Traditional workplace environments are designed, in one way or another, to maintain the status quo. This especially applies to decision-makers, like hiring managers, who personally identify with long-standing procedures, practices, and workplace culture. Managers call this hiring technique “culture fit”: a subjective assessment of how well a potential hire “fits in” with the values and practices of the organization. At face value, it makes sense: hire people who will blend in seamlessly with minimal disruption. In practice, however, it can function as a veil for affinity bias.
This can take several forms, in several areas of the workplace. Emphasis on networking, or subjective sourcing from your existing circles of connection, can de facto exclude more diverse and otherwise qualified new candidates. We do more favors for people we know or people who look like us. Seniority systems tend to advance the interests of those historically at the top, and are used as justification for hiring from a limited pool of privileged applicants.
In-group favoritism is strongest when membership confers clear advantages on the majority in power, or as Bielby puts it: “both personal and formal procedures can be and are manipulated by those in positions of privilege to preserve their advantage.” One study, for instance, found that white male workers ardently defend the need for seniority systems in jobs that have historically barred women and minorities from entry. The same goes for placing disproportionate weight on prior experience, which replicates the social conditions that limit minorities’ access to this experience in the first place.
This plays out in the interview as well. Hiring managers hold implicit expectations about what “type” of person will be the best “culture fit” for the company. Workplaces cling to “culturally bound normative expectations” about “professionalism,” self-presentation, “appropriateness,” and familiarity with the unsaid “rules of the game” in that industry—rules that not every job seeker has access to. These subjective measures become easy euphemisms for rejecting difference and, often unconsciously, favoring people like ourselves.
Minimizing Bias Using Data
So what do we do about it? How can we change something that is unconscious? Unconscious biases, including affinity bias, are malleable and can be unlearned—we call this debiasing—when addressed deeply and directly. While you cannot fully eliminate bias from your hiring process, you can call awareness to it, understand it, and work to minimize its impact.
Minimizing bias takes place at two levels, the individual and the organizational. Both are important, because bias is simultaneously internal and structural. It is also intersectional—interacting with multiple identities—and thus complex. Your solutions must take this into account.
Begin with awareness. A 2020 report on the “State of Diversity” notes that 33% of employees think the largest barrier to diversity is a sense that the workplace is already diverse enough. Feelings like this can blind workplaces to the bias enacted within the organization. A great way to challenge this is through data.
Consider taking the Predictive Index test. This test was developed in the 1950s based on psychometric testing and workplace psychology studies of talent optimization. Arnold Daniels, the creator, caught the attention of scientists when he served with a US Army Air Corps combat crew that flew over 30 missions with an amazing zero casualties. What made this diverse group work so well together? The ensuing research resulted in the Predictive Index, backed by over 500 validation studies and used by thousands of organizations worldwide. Importantly, you do not need bottomless resources to use the Predictive Index—even the most low-budget groups can access this helpful test. 70% of test takers say the Index helps remove bias and subjectivity from hiring.
The Predictive Index will tell you where your biases lie. But your data collection need not stop there. Remember—knowledge is power. The more you know about the points of bias in your workplace, the more you can minimize them. Consider a diversity audit, breaking each set of data down by identities (gender, race, age, sexuality, etc.). Collect small-scale data on information like:
- Pay and career advancement patterns
- Feedback from employees about perceptions of barriers to entry
- The breakdown of employees at each level of the company
- The hiring decision-makers
- Diversity of applicants for open positions
- Diversity of representation in social media, website, and marketing content
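For the audit itself, even a small script can turn raw records into the breakdowns described above. The sketch below is a minimal, hypothetical example in Python: it assumes your employee data can be exported as simple records with fields like "level" and "gender" (field names and sample values are illustrative, not prescriptive), and computes the share of each identity group at each level of the company.

```python
from collections import Counter, defaultdict

def representation_breakdown(employees, identity_field, group_field="level"):
    """Count employees by an identity dimension within each group
    (e.g., gender within each company level), then convert counts
    to proportional shares per group."""
    counts = defaultdict(Counter)
    for person in employees:
        counts[person[group_field]][person[identity_field]] += 1
    shares = {}
    for group, counter in counts.items():
        total = sum(counter.values())
        shares[group] = {k: round(v / total, 2) for k, v in counter.items()}
    return shares

# Hypothetical sample records; a real audit would pull these from
# an HR-system export instead.
staff = [
    {"level": "IC", "gender": "woman"},
    {"level": "IC", "gender": "man"},
    {"level": "IC", "gender": "woman"},
    {"level": "Manager", "gender": "man"},
    {"level": "Manager", "gender": "man"},
]

print(representation_breakdown(staff, "gender"))
# {'IC': {'woman': 0.67, 'man': 0.33}, 'Manager': {'man': 1.0}}
```

Running the same function with different `identity_field` values (race, age band, and so on) gives you one comparable table per dimension, which makes gaps between levels, such as an all-male management tier in this toy data, immediately visible.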
Once you’ve got that data, you’re in a position to make changes aimed at reducing affinity bias, especially in the recruitment and hiring process, where decision-makers are most susceptible to selecting people like themselves.
Banaji, M. R., Bazerman, M. H., & Chugh, D. (2003, December). How (un)ethical are you? Harvard Business Review, 3—10. https://hbr.org/2003/12/how-unethical-are-you
Bielby, W. T. (2000). Minimizing workplace gender and racial bias. Contemporary Sociology, 29(1), 120—129. http://newlegalrealism.org/wp-content/uploads/2017/12/Bielby-Minimizing-Workplace-Gender-and-Racial-Bias.pdf
Bodenhausen, Galen V., & Macrae, C. N. (1996). The self-regulation of intergroup perception: Mechanisms and consequences of stereotype suppression. In C. N. Macrae, C. Stangor, & M. Hewstone (Eds.), Stereotypes and stereotyping (pp. 227—253). Guilford Press. https://www.scholars.northwestern.edu/en/publications/the-self-regulation-of-intergroup-perception-mechanisms-and-conse
Bye, H. H., Horverak, J. G., Sandal, G. M., Sam, D. L., & Van de Vijver, F. J. (2014). Cultural fit and ethnic background in the job interview. International Journal of Cross-Cultural Management, 14(1), 7—26. https://www.researchgate.net/profile/Fons-Van-De-Vijver/publication/242334941_Cultural_fit_and_ethnic_background_in_the_job_interview/links/56fb732108ae8239f6dadfc0/Cultural-fit-and-ethnic-background-in-the-job-interview.pdf
Deaux, K. (1984). Blue-collar barriers. American Behavioral Scientist, 27(3), 287—300. https://journals.sagepub.com/doi/abs/10.1177/000276484027003003
Dixon-Fyle, S., Hunt, V., Dolan, K., & Prince, S. (2020, May). Diversity wins: How inclusion matters. McKinsey & Company. https://www.mckinsey.com/featured-insights/diversity-and-inclusion/diversity-wins-how-inclusion-matters
Kang, J., Bennett, M., Carbado, D., Casey, P., Dasgupta, N., Faigman, D., et al. (2012). Implicit bias in the courtroom. UCLA Law Review, 59(5), 1124—1186. https://repository.uchastings.edu/cgi/viewcontent.cgi?article=2315&context=faculty_scholarship
Kacmar, K. M., Delery, J. E., & Ferris, G. R. (1992). Differential effectiveness of applicant impression management tactics on employment interview decisions. Journal of Applied Social Psychology, 22(16), 1250—1272. https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1559-1816.1992.tb00949.x
Molenberghs, P., & Louis, W. R. (2018, October). Insights from fMRI studies into ingroup bias. Frontiers in Psychology, 9(1868), 1—12. https://www.frontiersin.org/articles/10.3389/fpsyg.2018.01868/full
Pittman, T.S. (1998). Motivation. In D. T. Gilbert, S. T. Fiske, & G. Lindzey (Eds.), The handbook of social psychology (Vol. 1). Oxford University Press. http://web.mit.edu/curhan/www/docs/Articles/15341_Readings/Motivation/Pittman_1998_Motivation.pdf
Predictive Index (2022). https://www.predictiveindex.com/
Staats, C. & Patton, C. (2013). State of the science implicit bias review. Kirwan Institute. http://www.kirwaninstitute.osu.edu/reports/2013/03_2013_SOTS-Implicit_Bias.pdf
Unsiloed (2022). https://unsiloed.org/our-services/