Program
Draft Version
- Monday, May 23 Morning Session
8:45 - 9:00   Arrivals
9:00 - 9:45   Data Efficiency and Machine Learning (Neil Lawrence, Computer Science)
9:45 - 10:30  Jim Blevins (Department of Theoretical and Applied Linguistics, University of Cambridge)
10:30 - 10:45 Coffee Break
10:45 - 11:30 Pricing Neural Information (Jim Stone, Psychology)
11:30 - 12:15 Avoiding Equilibrium in Early Twentieth-Century Russian Prose (Adam Fergus, Literature)
12:15 - 13:15 Lunch
- Monday, May 23 Afternoon Session
13:15 - 14:00 A Chemist's View of the Importance of Entropy in 2 Examples (Tony Ryan, Chemistry)
14:00 - 14:45 Do archaeologists need thermodynamics? (Roger Doonan, Archaeology)
14:45 - 15:00 Coffee Break
15:00 - 15:45 Can you calculate entropy generation in bio-chemical-techno-socio-political systems? (Alastair Buckley, Physics)
15:45 - 16:30 All I know is that I know nothing, and I can prove it (Iñaki Esnaola, Automatic Control and Systems Engineering)
16:30 - 17:15 Engines and refrigerators could be more efficient if it weren't for that pesky second law (Stephen Beck, Mechanical Engineering)
Abstracts
Data Efficiency and Machine Learning
Neil Lawrence
Entropy is a key component of information and probability. While we've seen great success with the AlphaGo computer program and strides forward in image and speech recognition, our current machine learning systems are incredibly data inefficient. A better understanding of entropy within these systems may provide the key to data-efficient learning.
Pricing Neural Information
Jim Stone
Neurons process information, and that is pretty much all that they do. When a visual neuron sends information to the brain, that information reduces the brain's uncertainty. Thus, in the context of Shannon's theory of information (1948), entropy (uncertainty) and information are two sides of the same coin. But neurons are expensive; in children, the brain soaks up half of the total energy budget. Accordingly, the efficient coding hypothesis (Barlow, 1961) proposes that, when neurons recode the visual data from the eye into a series of digital neuronal spikes, each spike carries as much information as possible for each Joule of energy expended. In this talk, I will describe how the steep rise in the price that neurons pay for information is due to a fundamental limit set by Shannon's noisy channel coding theorem, and how this may account for key properties of neural information processing.
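For readers who want a feel for that limit, here is a minimal Python sketch (not from the talk) that uses the textbook Gaussian-channel capacity as a stand-in for a noisy neuron; the noise level and target rates are arbitrary illustrative values.

    # A minimal numerical sketch (illustrative only, not from the talk).
    # For a Gaussian channel with capacity C = 0.5 * log2(1 + S/N) bits per use,
    # the signal power S needed to reach a target rate grows exponentially,
    # so the energetic price of each extra bit rises steeply.
    noise_power = 1.0  # arbitrary units
    for target_bits in (1, 2, 3, 4, 5):
        signal_power = noise_power * (2 ** (2 * target_bits) - 1)  # invert the capacity formula
        print(f"{target_bits} bit(s) per use needs S/N = {signal_power / noise_power:7.1f}, "
              f"i.e. {signal_power / target_bits:6.1f} power units per bit")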
Avoiding Equilibrium in Early Twentieth-Century Russian Prose
Adam Fergus
Yevgeny Zamyatin’s We (1920), now known as a precursor to Aldous Huxley’s Brave New World, presents a utopian society as fundamentally dystopic: ultimately all fantasy and creativity must disappear. In this talk, I shall focus on how Zamyatin’s contemporary, Aleksei Remizov, treats Russian history and the Russian literary canon playfully and subversively as a counterweight to the threat of creative equilibrium, which was present both in Soviet Russia and in the Russian emigration. Remizov has often been dismissed as whimsical, but a serious moral purpose lies behind his oeuvre: the creative energy of the imagination is essential to his morally alert literary vision. By its very nature, as I shall show, this vision cannot be conclusive.
A Chemist's View of the Importance of Entropy in 2 Examples
Tony Ryan
Solutions of one molecule in another are driven by maximising entropy: think of the dissolution of a coloured molecule in water, where entropy can overcome unfavourable interactions between the molecules. As the interactions between the molecules become more unfavourable they overcome the mixing entropy and separation occurs; think oil and water. But what about more complicated molecules, those that have a water-hating part and a water-loving part? A soap molecule is an oily chain with an ionic group on the end. Water hates the oil but loves the ion, and a single soap molecule in solution makes the water molecules adopt non-random configurations to accommodate the oily chain. To maximise the entropy the oily chains are excluded from the solution while their water-loving ions remain in solution: there is local separation on the molecular scale that maximises the entropy of the water by making little balls of soap containing ~100 molecules. So this morning you relied on the configurational entropy of water to wash before you came to work.
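To make that balance concrete, the short Python sketch below uses the textbook regular-solution model, which is an assumption of this illustration rather than anything from the talk; the interaction parameter chi and the temperature are illustrative values.

    # A minimal regular-solution sketch (illustrative only, not from the talk).
    # Ideal mixing entropy favours dissolution; an unfavourable interaction,
    # summarised by the parameter chi (an assumed, illustrative quantity),
    # can outweigh it and drive separation, as with oil and water.
    import math

    R = 8.314   # gas constant, J/(mol K)
    T = 298.0   # temperature, K
    x = 0.5     # mole fraction of the 50:50 mixture considered here

    s_mix = -R * (x * math.log(x) + (1 - x) * math.log(1 - x))  # ideal entropy of mixing
    for chi in (0.0, 1.0, 3.0):            # chi > 2 is the demixing regime in this model
        h_mix = chi * R * T * x * (1 - x)  # regular-solution enthalpy of mixing
        g_mix = h_mix - T * s_mix          # free energy of mixing per mole
        verdict = "entropy wins, it mixes" if g_mix < 0 else "interactions win, it separates"
        print(f"chi = {chi:3.1f}: dG_mix = {g_mix:+7.1f} J/mol  ({verdict})")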
The structure, properties and dynamics of polymers are dominated by their entropy, more specifically their configurational entropy. We will demonstrate this using a rubber band, which is made of a collection of linked polymer chains. Rubbers have a number of unusual properties: for example, they are perfectly elastic at small strains, and a loaded rubber will shrink on heating. Both of these phenomena are driven by maximum entropy. An unperturbed polymer molecule can be described by a random walk; when you stretch a rubber the walk becomes biased in the stretching direction, and when the stretching force is removed the relaxation back to a random walk provides a restoring force and the rubber returns to its original length. In the classical thermodynamics taught to chemistry students the entropy change is defined by the reversible heat transfer, and we will feel the entropy change as heat is given out and taken in on loading and unloading.
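The entropic spring behind these effects can be sketched with the textbook ideal (Gaussian) chain result f = 3 kB T r / (N b^2); the Python below is an illustration with assumed chain parameters, not part of the demonstration promised in the talk.

    # Entropic restoring force of an ideal (Gaussian) polymer chain,
    # f = 3 * kB * T * r / (N * b**2).  Because the force grows with temperature,
    # a loaded rubber band contracts when heated, as described above.
    # All chain parameters below are illustrative assumptions.
    kB = 1.380649e-23   # Boltzmann constant, J/K
    N = 1000            # number of segments (illustrative value)
    b = 5e-10           # segment length in metres (illustrative value)
    r = 1e-7            # end-to-end extension in metres (illustrative value)

    for T in (280.0, 300.0, 320.0):  # temperatures in kelvin
        force = 3 * kB * T * r / (N * b * b)
        print(f"T = {T:5.1f} K: entropic restoring force = {force:.2e} N per chain")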
Do archaeologists need thermodynamics?
Roger Doonan
Understanding the development of social complexity has often been couched in terms of social evolution, which in turn is often (but not always) seen as allied with Darwinian evolution. Archaeologists have vacillated in their affection for such approaches, with some tying themselves closely to evolutionary biology and others rejecting evolutionary accounts, preferring instead to hold human practice as something special: something that has decoupled itself from evolutionary processes and can only be understood from unique contextual perspectives. This paper questions how accounts of human life have become decoupled from biological and physical systems, and in turn asks how thermodynamics and kinetics might provide useful ways to understand the development of social complexity. Work by Swenson et al. has highlighted how Darwinian natural selection cannot provide a comprehensive theory of evolution, simply because it cannot account for how life itself was the product of evolution. The decoupling of biological and physical systems in this manner is now well rehearsed, and more recently some scholars have turned to exploring how social systems may be reconnected with a general theory of evolution understood as a planetary (or even universal) phenomenon, in which the Earth system evolves as a single global entity. This has obvious implications for our understanding of traditional physical and biological systems, but also for our ideas of how social systems might develop. The paper concludes by suggesting that archaeologists and anthropologists should be more comfortable with ideas of determining conditions, perhaps more familiar to physicists, and should not consider these in tension with histories that seek to reveal the creativity and uniqueness of human communities. It is instead argued that the vital forces for creativity and their myriad manifestations through time are precisely the result of such specific forms of life emerging within systems that are undeniably coupled to thermodynamic processes.
Can you calculate entropy generation in bio-chemical-techno-socio-political systems?
Alastair Buckley
In science, entropy is the driver for all physical, chemical and biological processes. However, the calculation of entropy in these different kinds of processes seems to be done subtly differently. It would be great to figure out a way to calculate entropy generation in systems that involve all kinds of different biological, technological and socio-technical processes. Maybe this kind of calculation would be useful in understanding historical environmental changes and would allow societies to plan more effectively for future change?
All I know is that I know nothing, and I can prove it
Iñaki Esnaola
The data deluge we are witnessing highlights the need for operational definitions of information in complex systems. It comes as no surprise, then, to see that information theory is reaching into domains that go far beyond its original scope. However, when Shannon used entropy to quantify the information content of a message in 1948, he focused on very specific point-to-point communication systems, and the extension of information theory to complex multiterminal systems is far from complete. Moreover, the concept of information, in the wide sense, is traditionally linked to the notion of knowledge about a particular process or system. For that reason, entropy is usually envisioned as a means for developing tools that increase our knowledge-extraction capabilities. In this talk we will argue that Shannon's entropy has a lesser-known, sobering dimension to it: entropy allows us to quantify how fundamentally ignorant we are.
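One way to see entropy as a measure of ignorance is the toy Python calculation below (my illustration, not the talk's argument): the less we know about which outcome will occur, the higher the entropy.

    # Shannon entropy H(X) = -sum p * log2(p) measures how uncertain, i.e. how
    # ignorant, we are about X before observing it.  Knowing nothing beyond the
    # set of four possibilities (the uniform distribution) gives the largest value.
    import math

    def entropy_bits(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(f"certain outcome    : {entropy_bits([1.0, 0.0, 0.0, 0.0]):.2f} bits")
    print(f"mild preference    : {entropy_bits([0.7, 0.1, 0.1, 0.1]):.2f} bits")
    print(f"complete ignorance : {entropy_bits([0.25, 0.25, 0.25, 0.25]):.2f} bits")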
Engines and refrigerators could be more efficient if it weren't for that pesky second law
Stephen Beck
Engineering is the science, or art, of achieving the best compromise to solve problems. Many Engineers are interested in the optimisation of energy conversion. The earliest work on the second law by Carnot defined limits on the conversion of heat to work (e.g. burning coal to pump water). This was later codified by Clausius and Thomson into the second law of thermodynamics, based on heat, work and temperature. Even though these limits are inviolable, there are ways of sidestepping the laws, but many of the routes to Thermotopia are limited by technology.
This understanding of the first and second laws of thermodynamics has led to the development of two main families of devices: heat engines and refrigerators. Stephen will show a few basics of Engineering thermodynamics, provide a physical insight into these limits and show some simple ways of spotting perpetual motion machines.
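For reference, the Carnot limits mentioned above can be evaluated directly; the short Python sketch below uses the standard formulas with illustrative temperatures and is not drawn from the talk itself.

    # Carnot limits: no heat engine between reservoirs at T_hot and T_cold can
    # beat the efficiency 1 - T_cold/T_hot, and no refrigerator can beat the
    # coefficient of performance T_cold / (T_hot - T_cold).
    def carnot_efficiency(t_hot_k, t_cold_k):
        return 1.0 - t_cold_k / t_hot_k

    def carnot_cop_refrigerator(t_hot_k, t_cold_k):
        return t_cold_k / (t_hot_k - t_cold_k)

    # Illustrative temperatures: a boiler at 800 K rejecting heat at 300 K,
    # and a fridge keeping 275 K in a 295 K kitchen.
    print(f"Best possible engine efficiency: {carnot_efficiency(800.0, 300.0):.0%}")
    print(f"Best possible refrigerator COP : {carnot_cop_refrigerator(295.0, 275.0):.1f}")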