Research in Practice

Gender, Race and Power in AI

Artificial intelligence (AI) is on the verge of transforming market research. Companies like Remesh, Canvs and SightX (to name just a few) are using AI to shorten the path to deeper, richer insights. Without a doubt, there are many benefits to automation, machine learning and AI. However, there is also the potential for AI to recommend actions that maximize corporate profits at the expense of making human lives better (see this great article from Andrew Konya of Remesh on the morality of market research in the age of artificial intelligence) or, worse, to exclude, discriminate against or even endanger entire populations.

There’s been much ado recently about AI’s racial and gender homogeneity, and for good reason. A 2019 article in The Guardian called the imbalance both “disastrous” and “at risk of replicating or perpetuating historical biases and power imbalances.” The issue is embedded in every seam and segment of the industry: 80% of AI professors are male, men currently make up 71% of the applicant pool for AI jobs, and the disparity only widens as one inches toward leadership and executive roles. While exact figures for AI professionals disproportionately impacted by race and marginalized gender identity are not yet available, it’s safe to say that their climb is even steeper. And then there are the allegations of abuse and harassment from within the industry. A 2018 class action lawsuit filed against Microsoft exposed the ‘boys club’ atmosphere many have come to associate with AI and the tech industry at large. In 2014, a group of Amazon employees created an AI program to aid in the company’s hiring process, using past job applicants’ resumes as a data pool, and discovered that the system “penalized resumes that included the word ‘women’s.’” To say that AI is unwelcoming and discriminatory toward women and people of color is an understatement.
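
The mechanics behind a failure like Amazon’s are easy to reproduce in miniature. Below is a minimal sketch (synthetic data and a toy classifier, nothing from Amazon’s actual system) of how a model trained on historically skewed hiring outcomes learns to penalize a gendered token:

```python
# Illustrative sketch only -- not Amazon's actual system. It shows the
# mechanism described above: a classifier trained on historically skewed
# hiring decisions learns a negative weight for a gendered token.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Synthetic "historical" resumes: past hires skew male, so the token
# "women's" co-occurs only with rejections in the training data.
resumes = [
    "captain of chess club, software engineering intern",       # hired
    "software engineering intern, hackathon winner",            # hired
    "women's chess club captain, software engineering intern",  # rejected
    "women's coding society lead, hackathon winner",            # rejected
]
hired = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight on the token is negative: the model has encoded the
# historical bias, not any signal about candidate quality.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(weights["women"])  # CountVectorizer drops the possessive "'s"
```

The model has no concept of candidate quality; it simply encodes whatever patterns the historical decisions contain, bias included.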

Implications

What does this mean for market research? AI is one of the hottest topics in the industry, and its potential to enhance and improve data analysis could be considered unparalleled. But with the advent of this new technology comes a new and decidedly stark workplace and hiring culture, one that threatens to set back the progress on diversity and inclusion that many within the industry, including these authors, have so painstakingly fought to advance. We can’t forgo AI altogether, but we can control the hiring and workplace practices within our own industry. Within this emergent technological shift is a two-fold opportunity: to improve diversity and inclusion tactics within market research, and to validate that the AI we utilize doesn’t produce bigger data gaps than it fixes. The question becomes: how do we do this?

Who’s Invited to the Table vs. Who Gets a Seat

Market research, as an industry, appears fairly gender-balanced on the surface. A 2017 study performed by Women in Research tracked gender representation within the industry between 2012 and 2017 and found that 45% of respondents were male and 55% were female. While there was a 6% increase in women entering leadership roles over that time frame, among experienced researchers men still held more of the top (Executive+) positions, and 80% of CEOs were still men. This imbalance of power is the most damaging blow to inclusion: the positions that govern the very policies and provisions that inhibit or incubate worker success are often held by those with the least stake in their revision. Likewise, when women are excluded from senior leadership positions, there is a profound impact on innovation and on product development initiatives that truly serve diverse users (respondents) and research buyers.

AI and its Discontents

This report from the AI Now Institute clearly and concretely homes in on the greatest threat to the public posed by a lack of diversity in AI: the risk of implicit bias embedded in its algorithms. To this point, the piece cites a 2019 study investigating racial bias within a widely used commercial algorithm that determines whether patients will be enrolled in ‘care management’ programs that allocate considerable additional resources, finding that “white patients were far more likely to be enrolled in the program and to benefit from its resources than black patients in a comparable state of health.” When a biased AI system is tasked with work like image recognition and its output is treated as infallible, what emerges is an unadulterated image of the minds behind the algorithm. As Caroline Criado Perez outlines in her book Invisible Women: Data Bias in a World Designed for Men, “AIs have been trained on data sets that are riddled with data gaps – and because algorithms are often protected as proprietary software, we can’t even examine whether these gaps have been taken into account.” In essence, when a single slice of the population is tasked with designing these systems, we can’t be surprised that the results are so blatantly discriminatory and blind to the obstacles faced by marginalized communities.

For market research, an industry that prides itself on highly ethical and intentional data methodology, misrepresentation errors such as these could be fatal. Entrusting a data set to an algorithm has revolutionized, and will continue to revolutionize, the way we as researchers observe data, as well as the speed of our analysis. It would, however, be a huge misstep for industry professionals to trade properly balanced analysis for speed. Further, when we present data filtered through potential AI bias as cold fact, we validate the bias, deepening the divisions already rampant in our world and our communities. A rallying cry for workplace diversity has always been “you’ve got to see it to be it.” When it comes to AI, it’s important that those who decide what we see do so from a diverse and holistic perspective.

How Market Research Can Make a Difference

As AI integrates further into the research process, organizations within the market research industry will be tasked with making definite and forward-thinking decisions about diversity and inclusion, both as workplaces and in regard to research methodology.

Research methodology itself is a place to start. A 2015 study looking at self-report bias in psychological studies found that “the use of the generic masculine in questionnaires affected women’s responses, potentially distorting ‘the meaning of test scores’. The authors concluded that its use ‘may portray unreal differences between men and women, which would not appear in the gender-neutral form or in natural gender language versions of the same questionnaire’.”
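
In practice, wording issues like this can be screened for mechanically before a questionnaire fields. The sketch below is a rough illustration, not a validated instrument; the term list and example survey items are our own placeholder assumptions:

```python
# A rough sketch of a questionnaire-wording audit, following the 2015
# finding above. The term list and example items are illustrative
# assumptions, not a validated instrument.
import re

# Generic-masculine terms mapped to gender-neutral alternatives (illustrative).
NEUTRAL_ALTERNATIVES = {
    r"\bchairman\b": "chairperson",
    r"\bspokesman\b": "spokesperson",
    r"\bmankind\b": "humankind",
    r"\bhe\b": "they",
}

def audit_item(item):
    """Return (flagged term, suggested alternative) pairs for one survey item."""
    found = []
    for pattern, alternative in NEUTRAL_ALTERNATIVES.items():
        match = re.search(pattern, item, re.IGNORECASE)
        if match:
            found.append((match.group(), alternative))
    return found

survey_items = [
    "If a customer is unhappy, he should contact support.",
    "Who acts as the spokesman for your household's purchases?",
]
for item in survey_items:
    for term, alternative in audit_item(item):
        print(f"Flag {term!r} -> consider {alternative!r} in: {item}")
```

A scan like this is no substitute for careful instrument design, but it catches the most obvious slips before a survey goes into the field.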

Organizational recruitment and outreach efforts should be cognizant of their hiring pool; if all of your candidates are white and male, you don’t have a talent problem, you have a recruitment problem. Commit to a transparent hiring and compensation model that prioritizes the path into leadership positions for people of color, women, and other under-represented groups. Hold current executives accountable by enfolding incentive structures into your company’s OKRs to increase hiring and retention of marginalized communities.

Foster spaces inside your workplace for women and people of color to shine. People of color, particularly in tech-focused roles, often mention feeling disconnected and disenfranchised when surrounded by a sea of white faces. Numerous posts on the critically important blog People of Color in Tech echo this sentiment, and the blog itself, which captures the individual experiences of those entering emergent tech fields including AI, should be standard reading for everyone in tech-adjacent industries. Invest in internal diversity initiatives or networking groups that can connect marginalized employees with a larger, local community outside of the office, while inviting the office as a whole to get involved, get educated, and become better allies. “You’ve got to see it to be it” is a common refrain in the call for gender parity, and diversity-focused networking groups and meetups are rare opportunities for women and people of color to learn from those who have experienced similar setbacks and challenges in their career paths. Remember: the issue of workplace (and thus product) diversity isn’t solved by hiring a woman or a person of color, but by addressing structures of power within your organization: who has power, who has access to power, and who feels the brunt of its force.

Lastly, ensure you are cognizant of gender data gaps in both your research and your product development. These data gaps are not generally malicious, or even deliberate, but they exist nonetheless and can have real (sometimes fatal) consequences for marginalized groups.
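
One concrete habit: before analysis, compare your sample’s composition against a population benchmark so a gap is caught rather than silently modeled. Here is a minimal sketch, assuming your respondent file carries a self-reported gender field; the column name and benchmark shares are placeholders to swap for your own study population:

```python
# A minimal sketch of a sample-representation check. The column name and
# the benchmark shares are placeholder assumptions to adapt to your own
# study population; nothing here comes from a real census.
import pandas as pd

def representation_gaps(sample, column, population_shares):
    """Compare subgroup shares in the sample against a population benchmark."""
    observed = sample[column].value_counts(normalize=True)
    rows = []
    for group, expected in population_shares.items():
        share = float(observed.get(group, 0.0))
        rows.append({"group": group,
                     "expected": expected,
                     "observed": round(share, 3),
                     "gap": round(share - expected, 3)})
    return pd.DataFrame(rows)

# Hypothetical respondent file and benchmark shares.
respondents = pd.DataFrame(
    {"gender": ["woman"] * 38 + ["man"] * 58 + ["nonbinary"] * 4}
)
benchmark = {"woman": 0.51, "man": 0.48, "nonbinary": 0.01}
print(representation_gaps(respondents, "gender", benchmark))
```

A check like this doesn’t close a data gap by itself, but it makes the gap visible before it is baked into the analysis.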
