Thursday, 31 May 2018

Making Data Human

This month we had a more thoughtful session contemplating how algorithms and data can oppress people.



We were very lucky to have Ed Fish, an artist who uses technology, algorithms and code to create powerful, contemplative and thought-provoking work.

The slides for this talk are here [link].

A video recording of the talk is here [link].


66,081 and Statistical Numbing

Ed started with a bang. He presented a single number, in a plain white font on a dark background:


That number, 66,081, is the number of civilian casualties of the Iraq War recognised by the Allies. Ed's starting point was his initial response to this number of deaths - or rather, his lack of response. That bothered and intrigued him.

This led to the idea that humans respond differently to, say, 1 death than to 100 deaths. The emotional response to 100 deaths is not a hundred times greater. He reflected on the saying that one death is a tragedy, and a million is merely a statistic.

Ed's motivation was then to understand how and why this difference in response happens - and how to then create art that actually does resonate with those experiencing it in a way that more truly reflects a tragedy.

Ed pointed us to Paul Slovic, an authority on statistical numbing - the very effect of us having a stronger emotional connection, empathy and response to 1 death than to 100 deaths.


Ed then showed a powerful graph demonstrating the non-linear response we humans have to different physical stimuli as they grow.


For example:
  • In a darkened room, a small light makes a significant impression. Doubling the intensity of the light increases our response but doesn't double it, and as the light continues to get brighter, each further increase registers less. The largest change in response is between complete darkness and some light. 
  • Sound is similar to light - small noises against silence are perceived with greater difference than loud sounds compared with even louder sounds.
  • Heaviness works in the opposite way - tiny weights are imperceptible, larger weights are definitely felt, and doubling a weight feels as if it was more than doubled. Electric shocks, like heaviness, are a direct pain sensation, and we don't desensitise to them either.

There is some justification for these effects - evolution required us to be sensitive to small sounds and light - in order to survive in a sometimes hostile natural environment.

From experiments, a similar graph can be constructed for the human response to the number of victims. It shows that our desire to help victims is strongest when there are few victims. When there are many victims, our response doesn't grow to the same extent.


Large numbers of victims are more anonymous - statistics - harder to identify with as individuals.
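This diminishing response is classically modelled in psychophysics by a logarithmic (Fechner-style) law, in which perception grows with the logarithm of the stimulus. The function below is a minimal illustrative sketch of that shape - the formula and constant are textbook assumptions, not figures from Ed's talk:

```python
import math

def perceived_intensity(stimulus, k=1.0):
    # Fechner-style logarithmic response: perception grows with the
    # log of the stimulus, so doubling the stimulus adds a roughly
    # constant increment rather than doubling the response.
    return k * math.log(1 + stimulus)

# The jump from darkness (0) to a dim light (1) feels bigger than
# the jump from that light (1) to one twice as bright (2):
dark_to_dim = perceived_intensity(1) - perceived_intensity(0)
dim_to_bright = perceived_intensity(2) - perceived_intensity(1)
```

Slovic's victim-count curve has the same shape: the first victim moves us most, and each additional victim adds less and less to our response.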


Iraq Body Count Project

Ed contacted the Iraq Body Count Project which tries to maintain a count of civilian deaths as a result of the 2003 Iraq War.

In an attempt to make the huge numbers more meaningful, and to counter the effects of statistical numbing, Ed created an audio work designed to encourage the slow, reflective mode of thinking that Slovic describes.

He extracted key parameters from the body count data, such as victim age, and passed these into the SuperCollider audio programming environment. He used a method called granular synthesis, which takes small sound segments and decomposes them further in order to reconstruct a wider soundscape.

A key compositional decision was not to create an amplitude-based audio work. Instead, granular synthesis based on a sampling of earth being crunched was used to layer sounds into a texture evolving over time.

His piece, For Dead Children and Electronics (2015) is 8 minutes long.


In Ed's own words:

"A sonification of 10340 civilian deaths in Iraq since 2003. Using information provided by Wikileaks and Iraqbodycount.com the age of the fatality dicatates the duration, amplitude and reverberance of the grain. The younger the victim the louder, longer and more reverberant the pop."


Listening to the piece, even for a couple of minutes, was a sombre, thoughtful and reflective experience. Knowing that each sound grain represented a death, and the louder ones were for younger children, forced us to think and understand, in slow time, the reality of the tragedy of the Iraq War.

Ed's work of art grimly succeeded in forcing us to empathise with the human disaster, his work architected and designed to directly counter statistical numbing.

Many of us felt a sadness descend on us as we listened.


Algorithmic Bias

Ed talked next about the tyranny of algorithms that, through their inflexibility or lack of sophistication, cause human misery by imposing the biases of their creators on others, especially those who have different ethnic origins or are transgender.


Ed gave an example of facial recognition software, used by government, trained on a limited diversity of faces. The failure to recognise people of a different ethnic origin is not just an inconvenience. The fact that they are then treated as suspects, or with suspicion, is a major issue. In addition, the wider public seeing these individuals pulled aside for further questioning reinforces the stereotype that people who are Asian, for example, are more likely to be doing something illegal.

The algorithm, badly designed or trained, becomes an unchallengeable tyrant.

As an artist's response to this, Zach Blas developed his own method for capturing the faces of people with diverse backgrounds and face shapes. He crafted the meshes calculated from these captures into masks to be worn. But because these faces were more diverse than the software typically expects, the masks were an uncomfortable fit. To underline the issue - the tyranny - the masks were made of metal, and putting one on was not just uncomfortable, but a literal prison of metal bars.



Are We Complicit?

Ed then took the discussion further and asked the provocative question: are we responsible?

If these algorithms are using our data to act in undesirable ways, should we be more active in preventing our own personal data from being used?

This question led Ed to develop a work of art which focussed on the connection between us, the passive or mildly participating public, and the actions of an algorithm.

His research took him to visualisations of slaves packed inhumanely tight into transport ships. Those slave ships were consciously and intentionally designed for that purpose - to maximise the transportation of slaves with just enough room and air to keep enough of them alive. These macabre dimensions and proportions were used to inform the design of a pyramid sculpture which moved in response to viewers moving around and in proximity to it.


Ed reported that viewers initially couldn't understand how the triangular pyramid moved and changed in response to their own movement. The cause and effect is there - but the culpability isn't clear.

The work starkly illustrates the core idea that each of us, individually, can't easily understand our own effect on the machine, exacerbated by the machine combining data from many of us. And that, in effect, distances our responsibility for the algorithm's actions.

Ed also highlighted the converse - where a mob will deliberately subvert an algorithm by flooding it with biased or malicious data. In this sense, the culpability is real, but each individual participant of these attacks is again anonymous, hiding amongst the mob.


Group Discussion

Ed's talk stimulated a great, passionate discussion covering a broad range of themes, including:

  • Transparency and algorithms - is it as simple as demanding they must be open source? Can most people understand open sourced algorithms?
  • How journalists and artists can better explain tragedies given an understanding of statistical numbing.
  • The algorithm used to create the audio piece, For Dead Children and Electronics (2015).
  • The artist as a privileged person commenting on the lives of less privileged people - is that right? Or is it a duty on privileged artists to use their talents to create art that raises awareness?
  • The challenge of teaching children and younger people about personal data hygiene and privacy. The lack of education on personal data issues - and the need to address this before corporations take the lead in acquiring more personal data.
  • The toll on mental health of hyper-active social media platforms - and the benefits of digital disconnection. The idea of a "medical" limit on social media analogous to limits on alcohol and sugar. 



Further Links

