Entanglements: an exploration of the digital literary work FISHNETSTOCKINGS

Racial Bias in Motion Tracking


When my family was exploring FISHNETSTOCKINGS, the system did not treat my two children equally. While it picked up the movements of my daughter fairly easily, it did not seem as attentive to the movements of my son. My daughter is Caucasian, and my son is African American. This difference in treatment is not particular to my family.

A GameSpot test of the Microsoft Kinect discovered issues with the system performing facial recognition on dark-skinned users. Such findings are consistent with similar problems in recognition documented by the advocates at the Algorithmic Justice League, founded and coordinated by Joy Buolamwini.

For FISHNETSTOCKINGS, such discrepancies might suggest a need to recalibrate in order to help all participants immerse themselves in the work. Since the projections require low-light conditions, differences in how the system perceives bodies, especially with regard to skin tone, may be magnified. The discrepancy also suggests that though bodies are flattened to two dimensions so they can be recognized and read by the system, bodies still matter. As Buolamwini has said, "Sometimes respecting people means making sure your systems are inclusive" (ajl.org/about).

Just as racial representation is not absent from tales of an imaginary species, as Diana points out, racial recognition does not disappear from our digital re-imaginings, especially when one of the readers, or rather one of the active participants, is a computational system.
