Glitch Identity
What is this?
How do machines label us, and how does biased categorization shape our identity? In an era where Generative AI astonishes with its capabilities, its reliance on algorithmic classification also makes it prone to misrecognition. My project, “Glitch Identity,” seeks to reveal and critique how machine learning algorithms reduce human and real-world diversity to oversimplified, stereotyped groups.
Although AI derives impressive capability from massive datasets, it is not immune to the biases embedded in the categories of its algorithmic framework (Crawford and Paglen). The real world is not divided into the distinct categories that AI perceives; with its continuous and non-linear attributes, reality frequently defies rigid classification. To confront this limitation, “Glitch Identity” engages audiences by scanning their faces and then assigning a deliberately randomized identity, such as ‘A dinosaur that likes to dance,’ underscoring the absurdity and inaccuracy intrinsic to these algorithms.
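The randomized-identity step could be sketched as follows. This is a minimal illustration, not the project's actual implementation: the word lists, function name, and the idea of seeding the generator are all assumptions made for the example. The key gesture is that the face scan's result is deliberately discarded, so the label carries no real inference about the person.

```python
import random

# Hypothetical label pool -- the installation's real vocabulary is not specified.
CREATURES = ["dinosaur", "jellyfish", "cactus", "teapot", "comet"]
HABITS = ["likes to dance", "hums in elevators", "collects rain",
          "naps at noon", "argues with mirrors"]

def random_identity(seed=None):
    """Return an arbitrary identity label, e.g. 'A dinosaur that likes to dance'.

    In the installation this would run after a face is detected, but the
    scan itself is intentionally ignored: the label is pure chance.
    """
    rng = random.Random(seed)
    return f"A {rng.choice(CREATURES)} that {rng.choice(HABITS)}"

print(random_identity())
```

The point of the design is visible in the code: no feature of the scanned face ever reaches the label, which is what makes the assigned category an overt parody of classification rather than a covert act of it.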
Race, ethnicity, gender, and other sociocultural constructs have deep and inseparable relationships with technology, since humans are its creators. As Sterne writes, “Technology is part of the domain of human existence, not something outside it” (Sterne, 41). Given this intertwined relationship, computers can become “encoders of culture,” even if “the structure of code works to disavow these very connections” (McPherson, 36). Technology reflects our lives and can alter the way we think; accordingly, image classification technologies can unconsciously mold our self-perception around biased statistical averages. One question then arises: can technology be built to be inclusive and unbiased if it is an inseparable part of society? As technology digitizes and quantifies reality, the conversion should reflect the complexity and authenticity of the real world rather than a caricature soaked in stereotypes. “Glitch Identity” is not just an interactive project but a critical lens through which the implications of technological categorization for identity are explored and challenged, suggesting the necessity of considering the ethics behind it.
Bibliography
Crawford, Kate, and Trevor Paglen. “Excavating AI: The Politics of Training Sets for Machine Learning.” 2019. https://excavating.ai/
McPherson, Tara. “U.S. Operating Systems at Mid-Century: The Intertwining of Race and UNIX.” In Race After the Internet, edited by Lisa Nakamura and Peter A. Chow-White, 21–37. New York, NY: Routledge, 2012.
Russell, Legacy. Glitch Feminism: A Manifesto. London: Verso, 2020.
Sterne, Jonathan. “Analog.” In Digital Keywords: A Vocabulary of Information Society and Culture, edited by Benjamin Peters, 31–44. Princeton, NJ: Princeton University Press, 2016.