
Google provides information to people searching for answers around the world, but it may also be pushing another agenda. Google wants to appear to be a progressive, feminist company, but that image is largely a front. Behind the scenes, Google's own staffing reflects the company's real stance on gender roles: that women are not capable of handling demanding jobs. As of 2021, only 33.7% of Google's employees were women (Johnson, 2021).

Even in such a progressive era, there continues to be a large gender gap at Google. From 2014 to 2021, the share of women employees rose by only 3.1 percentage points. That figure should be far higher. Over that same period, men still made up at least two-thirds of Google's workforce.

According to Jonathan Strickland (2008), Google uses programs called “spiders” or “crawlers” to scan for your search keywords across thousands of websites. It then ranks those websites by popularity and relevance so that the best results appear at the top of your search. A few other factors also influence what shows up first, such as how many other sites link to a page, how long the page has been online, and whether the site is credible. These tools are what make Google so user-friendly.
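To make that ranking idea concrete, here is a minimal sketch in Python of the link-based scoring scheme (PageRank) that Brin and Page (1998) describe in the paper listed in the references below. It is only an illustration of the general principle, not Google's actual production algorithm, and the toy pages and their links are invented for the example.

def pagerank(links, damping=0.85, iterations=50):
    # 'links' maps each page to the list of pages it links out to.
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start every page with an equal score
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outgoing in links.items():
            targets = outgoing if outgoing else pages  # pages with no links share their score evenly
            share = damping * rank[page] / len(targets)
            for target in targets:
                new_rank[target] += share  # pass a slice of this page's score along each link
        rank = new_rank
    return rank

# Toy web of four pages; "news" is linked to most often, so it ends up ranked highest.
toy_web = {
    "news":  ["blog"],
    "blog":  ["news", "shop"],
    "shop":  ["news"],
    "forum": ["news", "blog"],
}

for page, score in sorted(pagerank(toy_web).items(), key=lambda item: -item[1]):
    print(page, round(score, 3))

In this toy web, the page that the others link to most ends up with the highest score, which is the same intuition behind showing “popular” sites near the top of a results page.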

I was inspired to do some investigating of Google's bias on my own after watching Safiya Noble's (2016) TED talk, “Challenging the Algorithms of Oppression,” about the research behind her book Algorithms of Oppression. In the talk, she discussed how women of color are wrongly depicted by a basic internet search. Noble looked at the search term “black girls,” and the results she found sexualized girls of color: the websites Google suggested in the first few results were porn sites with girls of color as the main focus. Noble's work has inspired a wide debate about whether the Google search engine produces racist and sexist results.

According to Jonathan Cohn's (2019) article, Google made some major adjustments after Noble's book came out, although it has not said that the changes were a response to Noble's research specifically.

 

My Results

I was curious to see how the Google Images algorithm depicts women's and men's working lives across different keyword searches. Specifically, I was looking to see whether the stereotype that women are nurturing and men are tough would appear in occupational searches.

When I searched the keyword “caregiver,” the first 20 images that popped up were exclusively of women, both as the caregiver and as the person receiving care. All of the women receiving care were elderly, and all of the women shown in the first 20 images were white.

When I searched for the term “caretaker,” my results included a little more gender diversity, bringing in two men, as well as more variety in the tasks shown. The task was still mainly elderly care, with a few added images of men mowing lawns or caring for young disabled people.

I also searched for the keyword “worker,” and the same anti-progressive narrative appeared. The first image was of two cartoon women in a group with three men, and the second was of a white woman on her own. The next 20 image results were of men, mostly white, in hard hats with tools. Most of these men were depicted alone, though a few images showed groups of men.

My last comparison was between the words “professor” and “teacher.” The images for “professor” nearly all showed a white man in front of a chalkboard; only seven of the first 50 results showed women, including the first image. When I googled the word “teacher,” almost exclusively women appeared, in smaller classrooms with young kids. In the first 50 images, only four men were depicted, two Black and two white.
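For anyone who wants to repeat this kind of informal audit, the counting itself is simple. The Python sketch below tallies the share of women in a hand-labeled list of results; the label lists are stand-ins that match the counts I reported above (7 of the first 50 “professor” images showing women, and only 4 men among the first 50 “teacher” images), not the actual order Google returned them in.

def share_of_women(labels):
    # Fraction of labeled results that depict women.
    return sum(1 for label in labels if label == "woman") / len(labels)

# Stand-in labels matching the counts reported above, not the real result order.
professor_labels = ["woman"] * 7 + ["man"] * 43   # 7 of the first 50 showed women
teacher_labels = ["woman"] * 46 + ["man"] * 4     # only 4 of the first 50 showed men

print("professor:", share_of_women(professor_labels))  # 0.14
print("teacher:", share_of_women(teacher_labels))      # 0.92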

Why is this important?

The results of these searches are disturbing. The narrative that women's role in society is to be caring, giving, and weak, whereas men are strong, capable breadwinners, is hundreds of years old and anti-progressive. Image results such as these are harmful to the advancement of modern society. They tell women that there is no place for them in the workforce doing jobs of demanding physical or intellectual labor.

The message is that jobs centered on care and nurturing are the only careers open to women. When we search for keywords on the internet, we often read only the first couple of articles or look at only the first few images. When we do this and see, for example, only women as caregivers, our brains unconsciously absorb that information.

This gives the impression that only women pursue that line of work, and that it would be socially wrong for a man to pursue a career in caregiving. The same is true for the results in what Google presents as male-dominated industries; it appears there is no room for women in higher education. When people encounter image results like these, the algorithm is promoting a world in which women are the weaker gender, incapable of providing for themselves or of holding highly physical or highly respected careers.

This is particularly damaging to young children when they search the internet for information about their future career paths. The lack of diversity in images of caregivers reinforces the belief that women are nurturing, compassionate, and emotional by nature. When children see only men in images of workers, little girls are discouraged from pursuing labor-based careers. Nor is this only the case for children.

One article on the under-representation of women in the media found that it causes women and girls to lose self-esteem and a sense of self-worth. In the same article, the author also notes that media representation is how young girls learn to behave socially (Collins, 2011). Girls use representation in the media to learn how to do their makeup and which styles appeal to them. Without seeing other girls who look like them, young girls struggle to find their identity.

 

Change

Google's staff lacks the diversity needed to create diverse results in its search engine. Women make up only 25% of Google's leadership, although that is an increase of 4 percentage points since 2014 (Tiku, 2018). According to Johnson (2021), as of 2021, 66.3% of all Google employees were men.

In an attempt to change, Google says it wants to emphasize the advancement of existing employees, not just the hiring of a diverse staff. The company has outlined ways its actions will become more inclusive and its staff more aware of how to be racially conscious. It will hold training for employees and has pledged to improve the representation of underrepresented groups in its leadership by 30% by 2025 (Langley, 2020). Google has not made any promises to increase gender diversity among its staff, only racial diversity.

Google is taking some steps to change. Jonathan Cohn (2019) notes that Google no longer tries to guide searches by offering autocomplete suggestions for phrases such as “women are…,” “Mexicans are…,” “Muslims are…,” and so on. Also, the first image in all of my searches contained a woman, no matter the search, giving a false sense of diversity when the next woman might not appear for another 30 pictures.

Making the first image one of a woman is a start, but it does not meaningfully diversify the image results as a whole. If the overwhelming majority of images are still of men, putting a single woman first does very little to reverse the stereotype: in my “professor” search, for example, women account for only 7 of the first 50 images, roughly 14%, whether one of them appears first or thirtieth. If anything, the practice only reinforces the idea that women in those fields are still an extreme minority.

 

References

Brin, S., & Page, L. (1998). The anatomy of a large-scale hypertextual web search engine. Computer Networks and ISDN Systems, 30(1-7), 107–117. https://doi.org/10.1016/s0169-7552(98)00110-x

Cohn, J. (2020, August 10). Google's algorithms discriminate against women and people of colour. The Conversation. Retrieved February 15, 2022, from https://theconversation.com/googles-algorithms-discriminate-against-women-and-people-of-colour-112516

Collins, R. L. (2011). Content analysis of gender roles in media: Where are we now and where should we go? Sex Roles, 64, 290–298. https://doi.org/10.1007/s11199-010-9929-5

Illing, S. (2018, April 3). How search engines are making us more racist. Vox. Retrieved February 15, 2022, from https://www.vox.com/2018/4/3/17168256/google-racism-algorithms-technology

Johnson, J. (2021, November 2). Google: Gender distribution of global employees 2021. Statista. Retrieved March 8, 2022, from https://www.statista.com/statistics/311800/google-employee-gender-global/

Langley, H. (2020, June 17). Google pledges to introduce 'racial consciousness' training for employees and boost diversity among its leadership ranks. Business Insider. Retrieved February 15, 2022, from https://www.businessinsider.com/google-promises-several-changes-improve-racial-diversity-across-the-company-2020-6

Noble, S. (2016, June 15). Safiya Noble | Challenging the algorithms of oppression [Video]. YouTube. Retrieved January 11, 2022, from https://www.youtube.com/watch?v=iRVZozEEWlE&t=738s

O'Toole, E. (2017, September 6). Women's intellectual aptitude would blow Aristotle's mind. The Irish Times. Retrieved March 11, 2022, from https://www.irishtimes.com/life-and-style/people/women-s-intellectual-aptitude-would-blow-aristotle-s-mind-1.3210296

Strickland, J. (2008, January 11). Why is the Google algorithm so important? HowStuffWorks. Retrieved February 8, 2022, from https://computer.howstuffworks.com/google-algorithm.htm

Tiku, N. (2018, June 14). Google's diversity stats are still very dismal. Wired. Retrieved February 15, 2022, from https://www.wired.com/story/googles-employee-diversity-numbers-havent-really-improved/

White, K. M. G. (2019, August 19). 'Woke' Google continues to face accusations of racism, sexism. Washington Examiner. Retrieved March 10, 2022, from https://www.washingtonexaminer.com/opinion/woke-google-continues-to-face-accusations-of-racism-sexism