Week 6 reflections

Timothy Lee
Nov 16, 2020

Update on our research project:

Big Brother:

1) Summarise your project in two lines

Our project reviews the basic principles and mechanisms of facial recognition software and explores its modern uses and flaws with regard to its role in systems of surveillance and the racial biases implicit in these algorithms. In addition, we will discuss the current artistic applications of facial and pattern recognition software, including protests as performance, and highlight potential trajectories for future research and development of this technology.

2) What are the overarching areas of research and practice?

Our research will cover topics in surveillance technology, such as pattern recognition, deep learning, biometrics, and racial bias. Although we will not cover the mechanisms and history of facial recognition software in great detail, since a wealth of new observations have been made and discussed in recent years, we will comment on the philosophical and existential implications of reducing the identity of an individual to their biological blueprint. We will look at modern methods of generating facial recognition data, including the generation of Eigenfaces (sketched below), and explore how various artists and political movements have incorporated surveillance technology into their own demonstrations or artworks. Some keywords related to our project proposal include: racial bias, facial recognition, protest technology, surveillance, biometrics, and Eigenfaces.
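For context, Eigenfaces are produced by running principal component analysis over a set of aligned face images: the principal axes of the image data become reusable "face ingredients" against which any new face can be described. A minimal sketch in Python, with entirely illustrative data shapes and variable names, might look like this:

```python
# Minimal Eigenfaces sketch via principal component analysis (PCA).
# Dataset, shapes, and names are illustrative, not a specific library's API.
import numpy as np

def compute_eigenfaces(faces, n_components=16):
    """faces: array of shape (n_images, height * width), one flattened face per row."""
    mean_face = faces.mean(axis=0)        # the "average" face in the dataset
    centered = faces - mean_face          # subtract the mean from every image
    # SVD of the centered data; the rows of vt are the principal axes,
    # i.e. the eigenfaces, ordered by how much variance they explain.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean_face, vt[:n_components]

def project(face, mean_face, eigenfaces):
    """Describe one face as a small vector of weights over the eigenfaces."""
    return eigenfaces @ (face - mean_face)
```

Two faces can then be compared by the distance between their weight vectors, which is the basic matching step in an Eigenfaces-style recognizer. Crucially for our project, the eigenfaces themselves are entirely determined by whichever faces were in the training set.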

3) What are the key questions or concerns you will address?

Our central concerns in this research proposal all revolve around the human touch in training and calibrating machines, particularly in pattern recognition. Because deep learning, a necessity for pattern recognition, requires a database for reference, its algorithms will be implicitly biased by the manual input of the humans who maintain the databases and who scrutinize the accuracy of matches. Although unsupervised machine learning exists, these methods also lend themselves to racial bias, because their original points of reference and modes of analyzing data were most often designed by white men along Western models of thought. As such, we ask: is a truly neutral software possible? What would civilian facial recognition software look like in the absence of the human touch?
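To make the mechanism concrete, a toy simulation (entirely synthetic numbers, not real benchmark data) can show how a single match threshold, tuned on the group a model saw most during training, produces higher error rates for an under-represented group:

```python
# Toy demonstration of how training-set composition skews error rates.
# All scores are synthetic; group labels and shift values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def make_group(n_pairs, score_shift):
    """Simulate face-match similarity scores for one demographic group.
    'score_shift' stands in for a model tuned on other faces: genuine and
    impostor scores drift closer together, making them harder to separate."""
    genuine = rng.normal(0.70 - score_shift, 0.10, n_pairs)   # same-person pairs
    impostor = rng.normal(0.30 + score_shift, 0.10, n_pairs)  # different-person pairs
    return genuine, impostor

threshold = 0.5  # one global match threshold, calibrated on the majority group

for name, shift in [("well-represented group", 0.00),
                    ("under-represented group", 0.12)]:
    genuine, impostor = make_group(10_000, shift)
    false_non_match = np.mean(genuine < threshold)   # real matches rejected
    false_match = np.mean(impostor >= threshold)     # wrong people "matched"
    print(f"{name}: false match rate {false_match:.1%}, "
          f"false non-match rate {false_non_match:.1%}")
```

In this simulation the under-represented group's false match rate climbs from roughly 2% to over 20% under the very same threshold, which is the shape of the disparity documented in the literature we will draw on.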

4) Why is your group motivated to undertake this project?

Technology is becoming increasingly involved in our routine lives, especially as machines replace a significant portion of human labor, and particularly when it comes to purportedly algorithmic actions such as pattern recognition. In light of the many political conflicts happening around the world, with a particular focus on the United States and Hong Kong, there have been documented efforts by the respective governments and law enforcement agencies to rely on facial recognition software and surveillance technologies to counteract protest demonstrations and crime. However, there is also extensive documentation of the racial biases implicit in these technologies, since their design, training, and databases all prime the software to make inaccurate decisions that overwhelmingly disadvantage communities with dark skin tones. Since the legal implications of these false positives and inaccurate matches are profound, it is important to discuss these biases and the challenges of making truly neutral technology. As artists, we are also interested in learning more about this topic, paying particular attention to how it has been used as performance and as a vector for creation.

5) What theories or writers will you use in your work to guide you? Who is already writing about your area of interest? (Use library resources to guide you).

Since we will briefly review the main properties and mechanisms of facial recognition software, we will highlight some published articles written by scientists, designers, and academics across various journals. However, the bulk of our research will draw from active conversations, platforms, and theorists who address the changing nature of artificial intelligence and racial bias, such as Ruha Benjamin, the Gender Shades project by the Algorithmic Justice League, the database of Homeland Security Affairs, and recent journalism covering protest demonstrations.

