heavysixer

From the 'Male Gaze' to the 'ML Gaze'

The concept of the “male gaze,” theorized by the feminist film scholar Laura Mulvey, refers to the way in which the visual arts, particularly cinema, often represent the world from a patriarchal perspective, objectifying and sometimes sexualizing women for the pleasure and consumption of a male viewer. Similarly, the biases codified into machine learning systems, or the “ML gaze,” can be seen as a reflection of the perspectives of those who design and train these systems, as well as of the source material used for training.

Both the male gaze and the ML gaze reveal the ways in which power and privilege shape and filter the way we see, experience and understand the world. In the case of the male gaze, Mulvey would argue, it is the power and privilege of a patriarchal society that shape the way women are represented and perceived, and once translated into a media artifact, e.g., a film, the power asymmetry becomes fixed and permanent. In the case of the ML gaze, it is the power and privilege of those who design and train the systems, often white and male, that shape the way the systems perceive, categorize and interact with the world.

The consequences of the male gaze and the ML gaze are also similar in that both can reinforce and perpetuate existing societal inequalities. Mulvey might argue that the objectification and sexualization of women in visual media contribute to the devaluation and marginalization of women in society. Similarly, the codification of biases into machine learning systems can lead to discriminatory outcomes, as in the case of facial recognition systems that perform poorly on people with darker skin tones.
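
To make "performs poorly on some groups" concrete, here is a minimal Python sketch of the kind of disaggregated evaluation Buolamwini and Gebru report in Gender Shades: rather than a single aggregate accuracy, the error rate is computed separately per demographic group. The records and group labels below are invented placeholders for illustration, not data from any real system.

```python
from collections import defaultdict

# Hypothetical evaluation records: (true_label, predicted_label, group).
# In the Gender Shades study the groups were intersections of perceived
# gender and skin type; these toy records only illustrate the bookkeeping.
records = [
    ("female", "female", "darker-skinned female"),
    ("female", "male",   "darker-skinned female"),
    ("male",   "male",   "lighter-skinned male"),
    ("female", "female", "lighter-skinned female"),
]

def disaggregated_accuracy(records):
    """Report accuracy separately for each demographic group,
    rather than one aggregate number that can hide disparities."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for true_label, predicted, group in records:
        total[group] += 1
        if predicted == true_label:
            correct[group] += 1
    return {group: correct[group] / total[group] for group in total}

for group, accuracy in disaggregated_accuracy(records).items():
    print(f"{group}: {accuracy:.0%}")
```

A single headline accuracy can look impressive while masking a large gap between the best- and worst-served groups; disaggregating is the first step toward seeing that gap at all.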

The “ML gaze” becomes more concerning when considering newer technologies like autonomous vehicles, algorithmic predictive policing, and hiring or financial lending algorithms. These systems are being designed to make decisions that can have serious and often life-altering consequences, such as whether to arrest someone or whom to hire for a job. If these systems are built with biases, they can perpetuate existing inequalities and discriminate against marginalized groups, as sketched below. This is compounded by the fact that many models are so-called “black boxes” whose decision-making process is opaque and unexplainable.
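
As a hedged illustration of how historical bias propagates, the toy sketch below trains a simple classifier on invented hiring labels that penalize a gender-correlated proxy feature, loosely in the spirit of the Amazon recruiting tool reported by Dastin. The feature names, data, and model choice are assumptions made purely for illustration, not a reconstruction of any real system.

```python
# Toy illustration: biased historical labels are faithfully learned by the model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
years_experience = rng.uniform(0, 10, n)
# Hypothetical proxy feature correlated with gender (e.g. a keyword on a résumé).
gender_proxy = rng.integers(0, 2, n)
# Biased historical decisions: equally qualified candidates with the proxy
# feature were hired less often.
hired = ((years_experience - 3 * gender_proxy + rng.normal(0, 1, n)) > 4).astype(int)

X = np.column_stack([years_experience, gender_proxy])
model = LogisticRegression().fit(X, hired)

print("Learned coefficients (experience, gender proxy):", model.coef_[0])
# The negative coefficient on the proxy shows the historical bias has been
# codified: the trained model will now penalize future applicants the same way.
```

Nothing in the training procedure "corrects" the data; the model simply learns the pattern it is given, which is exactly why biased history becomes biased automation.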

Both the male gaze and the ML gaze are rooted in the idea that there is a single, objective perspective from which to view the world. However, as Mulvey argues, there is no such thing as an objective gaze, because all perspectives are shaped by power and privilege. In the same way, it is important to recognize that there is no such thing as an unbiased machine learning system, since every system is shaped by the biases and perspectives of those who design and train it, as well as by the source material used to train it.

It is crucial to acknowledge and address the ways in which the male gaze and the ML gaze shape our understanding of the world and overtly or inadvertently marginalize people, and to work towards creating more inclusive, equitable and transparent systems. This can be done by diversifying the teams who design and train these systems and by making the models transparent, auditable and explainable.
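
As one sketch of what "auditable" can mean in practice, the snippet below computes selection rates per group and the ratio between them, a simple demographic-parity style check sometimes compared against the "four-fifths rule" used in employment contexts. The group names, decisions, and 0.8 threshold are assumptions for illustration; a real audit would be far broader than one metric.

```python
def selection_rates(decisions):
    """decisions: list of (group, favorable) pairs. Return favorable-outcome rate per group."""
    totals, favorable = {}, {}
    for group, outcome in decisions:
        totals[group] = totals.get(group, 0) + 1
        favorable[group] = favorable.get(group, 0) + int(outcome)
    return {g: favorable[g] / totals[g] for g in totals}

def disparate_impact_ratios(decisions, reference_group):
    """Ratio of each group's selection rate to the reference group's rate."""
    rates = selection_rates(decisions)
    reference_rate = rates[reference_group]
    return {g: rate / reference_rate for g, rate in rates.items()}

# Hypothetical hiring decisions: (applicant group, was the applicant shortlisted?)
decisions = [("group_a", True), ("group_a", True), ("group_a", False),
             ("group_b", True), ("group_b", False), ("group_b", False)]

for group, ratio in disparate_impact_ratios(decisions, reference_group="group_a").items():
    # A ratio well below 0.8 is often treated as a red flag (the "four-fifths rule").
    flag = "  <-- review" if ratio < 0.8 else ""
    print(f"{group}: selection-rate ratio vs. group_a = {ratio:.2f}{flag}")
```

Metrics like this do not make a system fair on their own, but making them routine and public is one concrete form the transparency and auditability argued for above can take.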

  1. Mulvey, L. (1975). Visual pleasure and narrative cinema. Screen, 16(3), 6-18.

  2. Dastin, J. (2018, June 25). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08GD

  3. Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of Machine Learning Research, 81, 1-15. http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf

  4. Crawford, K. (2019). The problem with bias in AI. Harvard Business Review. https://hbr.org/2019/11/the-problem-with-bias-in-ai

  5. Doshi-Velez, F., & Kim, B. (2017). Towards a rigorous science of interpretable machine learning. arXiv preprint arXiv:1702.08608.