Draft Schedule

Room: Lecture Theatre 8

Morning Session

8:45–9:00   Arrivals

9:00–9:45   Data Efficiency and Machine Learning (Neil Lawrence, Computer Science)

9:45–10:30  Jim Blevins (Department of Theoretical and Applied Linguistics, University of Cambridge)

10:30–10:45 Coffee Break

10:45–11:30 Pricing Neural Information (Jim Stone, Psychology)

11:30–12:15 Avoiding Equilibrium in Early Twentieth-Century Russian Prose (Adam Fergus, Literature)

12:15–13:15 Lunch

Afternoon Session


Data Efficiency and Machine Learning

Neil Lawrence

Entropy is central to both information theory and probability, and may hold the key to data-efficient learning. While we have seen great success with the AlphaGo computer program and strides forward in image and speech recognition, our current machine learning systems remain remarkably data-inefficient. A better understanding of how entropy behaves within these systems may point the way towards closing that gap.

Pricing Neural Information

Jim Stone

Neurons process information, and that is pretty much all that they do. When a visual neuron sends information to the brain, that information reduces the brain's uncertainty. Thus, in the context of Shannon's theory of information (1948), entropy (uncertainty) and information are two sides of the same coin. But neurons are expensive; in children, the brain soaks up half of the total energy budget. Accordingly, the efficient coding hypothesis (Barlow, 1961) proposes that, when neurons recode the visual data from the eye into a series of digital neuronal spikes, each spike carries as much information as possible for each Joule of energy expended. In this talk, I will describe how the steep rise in the price that neurons pay for information is due to a fundamental limit set by Shannon's noisy channel coding theorem, and how this may account for key properties of neural information processing.
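As a rough illustration of the quantities involved (a sketch, not material from the talk), the Shannon entropy of a binary event, such as whether a neuron spikes in a given time bin, can be computed directly from Shannon's 1948 definition:

```python
import math

def entropy_bits(p: float) -> float:
    """Shannon entropy (in bits) of a binary event with probability p,
    e.g. whether a neuron fires a spike in a given time bin."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# A spike that occurs half the time is maximally uncertain: 1 bit per bin.
print(entropy_bits(0.5))   # → 1.0
# Rare spikes carry less entropy per bin, which matters when each
# spike costs a fixed amount of energy.
print(entropy_bits(0.05))
```

In the efficient-coding view sketched above, a neuron's code is efficient when the information per spike is high relative to its metabolic cost; the function here only quantifies the uncertainty side of that trade-off.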

Avoiding Equilibrium in Early Twentieth-Century Russian Prose

Adam Fergus

Yevgeny Zamyatin’s We (1920), now known as a precursor to Aldous Huxley’s Brave New World, presents a utopian society as fundamentally dystopic: ultimately, all fantasy and creativity must disappear. In this talk, I shall focus on how Zamyatin’s contemporary, Aleksei Remizov, treats Russian history and the Russian literary canon playfully and subversively, as a counterweight to the threat of creative equilibrium present both in Soviet Russia and in the Russian emigration. Remizov has often been dismissed as whimsical, but a serious moral purpose lies behind his oeuvre: the creative energy of the imagination is essential to his morally alert literary vision. By its very nature, as I shall show, this vision cannot be conclusive.