
Histories of Artificial Intelligence: A Genealogy of Power


Friday 20 November 2020, 15:00–17:00 GMT

Speakers: Prof. Georgina Born (University of Oxford), Prof. Luke Stark (Western University), Dr Eamonn Bell (Trinity College Dublin)

Synopsis

This session convenes experts in the history and anthropology of AI and music. First, Dr Eamonn Bell (Trinity College Dublin) explores how, in the 1960s, Walter Reitman attempted to use algorithmic music composition as evidence of algorithmic creativity and of machinic competency with 'ill-defined problems'. Next, Prof. Luke Stark (Western University) considers how, in the 1970s, Manfred Clynes (who had coined the term 'cyborg' in the early 1960s) drew on gendered conceptions of emotion, along with his own experience as a concert pianist, to argue that human emotions should be understood as universal affective patterns fit for tracing and analysis through what he called 'sentograms'. Finally, Prof. Georgina Born (University of Oxford) offers an initial inquiry into the epistemology and political economy of Music Information Retrieval, the field behind some of the most pervasive and consequential applications of AI to music.

In preparation for this session, we recommend reading Prof. Born's recent article on this topic.

Speakers

Eamonn Bell

Around 1960, Walter Reitman of the Complex Information Processing group at the Carnegie Institute of Technology (now Carnegie Mellon University) made tape recordings of his co-investigator Marta Sanchez 'thinking aloud' as she composed a fugue at the piano keyboard. Reitman used protocol analysis to mine the 150-page transcript of this recording, seeking design inspiration for a new computer model of 'human information-processing' – Argus – intended to complement the then-recent work of his colleagues Herbert Simon and Allen Newell on the General Problem Solver. I relate and contextualise this unusual historical case, which shows how Western art music composition was used in the experimental systems of early 1960s AI research as a proxy for so-called 'ill-defined problems' and as an apodeictic demonstration of supposed algorithmic creativity. Judging by the release of the Google 'Bach doodle' in March 2019, little appears to have changed in how high culture is mobilised in the rhetoric that surrounds AI systems.

Luke Stark

Manfred Clynes, Austro-Australian polymath and inventor, is today best known for coining the term 'cyborg' in the early 1960s with psychiatrist Nathan S. Kline, but he was also a pioneer in developing devices designed to track bio-signals – human physiological data – and to correlate them with purportedly universal human emotional states. Clynes's book Sentics: The Touch of the Emotions, published in 1977, became a touchstone for work in affective computing and HCI seeking to correlate human emotional states with physiological activity. Clynes argued that human emotions should be understood as universal affective patterns expressed by the body, which could be traced and analysed through 'sentograms': patterns of movement measured by a machine of his own design called the 'sentograph'. Clynes's work was tightly connected with his experience as a concert pianist: he understood musical expression both as a mechanism for exploring sentic forms and as indexical to a broader liberal internationalist politics. Clynes hoped that making human emotions legible and comparable would prove their universality and, by extension, provide a basis for easing cultural tensions and international political differences. Clynes's research programme in sentics would prove highly influential on later pioneers of affective computing such as MIT's Rosalind Picard, even if his initial work attracted only a small following. My remarks will examine Clynes's work and legacy through the lens of his focus on music, including the gendered nature of his conceptions of emotion, artistic genius, and individual human subjectivity.

Georgina Born

Music Information Retrieval (MIR) is an international scientific field that has developed since 2000 around the analysis and organisation of digital music data. It applies machine learning to the extraction of musical 'features' in pursuit of audio content analysis, machine listening, and the analysis and simulation of musical style and genre. While MIR is an academic field, its research feeds industry: machine learning techniques 'that treat musical data as any other machine-readable data' (Serra 2013) inform commercial recommendation and music generation systems, and lie behind applications like Spotify, Shazam and Last.fm. I present a (commissioned) provocation addressed to MIR calling for greater diversification of the field in several linked regards: from the demographics of its engineers, to the range of music addressed, to the field's epistemological and ontological assumptions, to its political economy. The upshot is to question how these deficiencies have become engineered into some of the most pervasive and consequential applications of AI to music.
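For readers unfamiliar with MIR, the short sketch below illustrates the kind of 'feature' extraction described above, using the open-source Python library librosa. It is a minimal illustration only: the file path, feature choices, and parameters are assumptions for the example, not details drawn from the talk.

# Minimal, illustrative MIR-style feature extraction (assumes librosa
# is installed; 'song.wav' is a placeholder path).
import librosa
import numpy as np

y, sr = librosa.load("song.wav", sr=22050)  # decode audio to a waveform

# Mel-frequency cepstral coefficients: a compact timbral summary often
# fed to genre-classification and similarity models.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

# Chroma: energy per pitch class, used for harmony and key analysis.
chroma = librosa.feature.chroma_stft(y=y, sr=sr)

# Many pipelines summarise features over time into one vector before
# feeding a downstream learner (e.g. a classifier or recommender).
feature_vector = np.concatenate([mfcc.mean(axis=1), chroma.mean(axis=1)])
print(feature_vector.shape)  # (25,): 13 MFCC coefficients + 12 pitch classes

It is this reduction of music to 'machine-readable data' – features abstracted from any cultural context – that Born's provocation interrogates.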

Join our Mailing List!

To receive further information on all our activities (and learn their online coordinates), please subscribe to the HoAI mailing list.

Email us at: hoai@hermes.cam.ac.uk

Join our Slack Channel!

To participate in conversations with scholars on topics related to your interests, please join our HoAI Slack Channel.