A Feeling for the (Neural) Organism: An Interactive History of Connectionism
Wednesday 28 October 2020, 15:00–17:00 GMT
Co-facilitator: Prof. Michael Castelle (University of Warwick)
Synopsis
Contemporary AI research, having been largely taken over by its formerly marginalized connectionist strand, is more than ever an experimental science; with arguably limited theoretical guidance, practitioners tinker with and contribute to a growing menagerie of arcane architectures, each comprising hundreds of thousands if not billions of individual weight parameters, and each justified primarily by its 'state-of-the-art' accuracy on a variety of standardized datasets such as MNIST and ImageNet.
In this interactive tutorial, you will 1) informally learn the basic structural underpinnings and techniques of today's neural networks, and 2) follow and tweak code to construct and train connectionist models similar to (if not, for various historical reasons, identical to) the Perceptron (ca. 1958), multilayer PDP networks (ca. 1986), the convolutional digit-recognizing LeNet (1989), Elman's simple Recurrent Neural Network (1990), and recent 'deep convolutional' ImageNet competition winners AlexNet/VGG16 (2012, 2014). The goal is to initiate participants into a phenomenological 'feeling for the organism' (Fox Keller, 1984) which, appropriately contextualized, can be extremely useful for qualitative research in the history of AI.
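By way of illustration only (the tutorial's own notebooks and framework are not reproduced here), the following minimal NumPy sketch gives a taste of the earliest model in that lineage: a Rosenblatt-style perceptron trained with the classic error-correction rule on a toy linearly separable problem (logical OR). The dataset, learning rate, and epoch count are arbitrary choices for the sketch.

    import numpy as np

    # Illustrative sketch only: a single-layer perceptron with a threshold
    # activation, trained on logical OR (a linearly separable toy problem).
    rng = np.random.default_rng(0)

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
    y = np.array([0, 1, 1, 1])                                   # OR targets

    w = rng.normal(scale=0.1, size=2)  # weights
    b = 0.0                            # bias
    lr = 0.1                           # learning rate (arbitrary for the sketch)

    for epoch in range(20):
        errors = 0
        for xi, target in zip(X, y):
            prediction = int(np.dot(w, xi) + b > 0)  # threshold activation
            update = lr * (target - prediction)      # perceptron learning rule
            w += update * xi
            b += update
            errors += int(update != 0.0)
        if errors == 0:  # every point classified correctly: stop training
            break

    print("learned weights:", w, "bias:", b)

Later models discussed in the tutorial (PDP networks, LeNet, Elman networks, AlexNet/VGG16) replace the hard threshold and hand-rolled update with differentiable activations and backpropagation, but the hands-on, tweak-and-observe workflow remains much the same.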