Timothy Lee
3 min read · Jan 16, 2021


Week 11 reflections

In thinking about my research proposal, I am still interested in pursuing the topic of racism in the machine: how algorithms encode bias, how they are often deployed against marginalized communities, and how they exacerbate prejudices in social and political institutions. It has become increasingly apparent that as humanity moves toward an automated world built on biased data and assumptions, these are factors and conditions that must be accounted for if we are to establish a truly equitable world of machines.

As a Korean immigrant, growing up in the United States as a child was a confusing and often harrowing experience: I was constantly ‘othered’. My slanted eyes and subtle accent meant I was a perpetual foreigner in the eyes of my White, American peers; at the same time, my thick Western accent and American mannerisms distinguished me from my community back in Korea. A moment that struck me in particular was when I first heard that I was yellow: in Western vocabulary, being yellow meant I was Asian, of East Asian descent to be exact. I looked at my skin and noticed that it didn’t differ much in hue from that of my White friends, and even among my circle of Asian friends none of our complexions were particularly “yellow.” So why was I considered a yellow person?

A brief look into the history of the term’s origin suggests that the label “yellow” was not a reflection of any particular visual or biological basis, but rather a way to classify and categorize the different races, much like the binomial nomenclature used in taxonomy. Another article corroborates this account, suggesting that Asians were initially seen as white, but that over time perception shifted to see them as “yellow”:

“The first suspect implicated in applying the “yellow” label to East Asian faces is the famed Carl Linnaeus (1707–78). At first, Linnaeus used the Latin adjective “fuscus,” meaning “dark,” to describe the skin color of Asians. But in the tenth edition of his 1758–9 “Systema Naturae,” he specified it with the term “luridus,” meaning “light yellow” or “pale.” It was Johann Friedrich Blumenbach (1752–1840) who went beyond the coloring ascribed by Linnaeus to apply the completely different label of “Mongolianness.” Regarded as a founder of comparative anatomy, the German zoologist did more than just use the Latin word “gilvus,” meaning “light yellow,” to describe East Asian skin color: he also implicated the Mongols, a name with troubling and threatening connotations for Europeans with their memories of Attila the Hun, Genghis Khan, and Timur. While the references remained anomalous at first, travelers to East Asia gradually began describing locals there more and more as “yellow.” By the nineteenth century, Keevak argued, the “yellow race” had become a key part of anthropology.”

Regardless of the history of the “yellow” label, it has become associated with violence, discrimination, and exclusion. I am interested in how much “yellow” actually appears in the skin of Asian people versus White people. Artists such as Byron Kim have created artworks around this theme; in his work Synecdoche, Kim creates monochrome panels whose colors were extracted from the skin of the Asian community to suggest the variation and “non-yellowness” of this community. But is there a way to perform this color extraction and comparison with code? As I learn pixel processing, the possibility of writing code that performs these tasks excites me, and it is something I would like to explore for my final project.
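To make that idea concrete, here is a minimal Python sketch of how such an extraction and comparison might begin, using Pillow and NumPy. The image file names and patch coordinates are placeholders I invented for illustration, and the “yellowness” score is only one crude, made-up measure of how close an average skin tone sits to a pure yellow hue; a real study would need careful skin detection and a perceptual color space.

```python
import colorsys

import numpy as np
from PIL import Image


def average_skin_color(path, box):
    """Average RGB color over a (left, upper, right, lower) patch of the image."""
    patch = Image.open(path).convert("RGB").crop(box)
    return np.asarray(patch, dtype=float).reshape(-1, 3).mean(axis=0)


def yellowness(rgb):
    """Crude score in [0, 1]: closeness of the hue to pure yellow (60 degrees), weighted by saturation."""
    h, s, _v = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))
    return max(0.0, 1.0 - abs(h * 360.0 - 60.0) / 60.0) * s


# Placeholder portraits and hand-picked skin patches (left, upper, right, lower).
samples = {
    "friend_a.jpg": (120, 200, 180, 260),
    "friend_b.jpg": (100, 180, 160, 240),
}

for path, box in samples.items():
    rgb = average_skin_color(path, box)
    print(path, "average RGB:", rgb.round(1), "yellowness:", round(yellowness(rgb), 3))
```

Averaging a patch of pixels this way is also essentially how a Synecdoche-style panel could be generated in code: fill a small canvas with the patch’s mean color and line the panels up side by side for comparison.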
