Finding networks in biological systems at different scales, from single molecules to groups of organisms. Some history of statistical physics approaches. Experimental progress and its implications. Maximum entropy methods as a link between emerging data and statistical physics models. Agenda for the rest of the lectures.
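The maximum entropy idea in its simplest form: measure a few expectation values, and build the least-structured distribution consistent with them, which always takes a Boltzmann-like form. A minimal sketch for a single spin with a hypothetical measured mean (the number 0.6 is illustrative, not data):

```python
import numpy as np

# Maximum entropy with one constraint: a spin sigma = +/-1 whose measured mean
# is m. The maxent distribution is Boltzmann-like, p(sigma) ~ exp(h*sigma),
# and the "field" h is fixed by matching the constraint <sigma> = tanh(h).
m_observed = 0.6                      # hypothetical measured mean, not real data
h = np.arctanh(m_observed)            # invert the constraint analytically

p_up = np.exp(h) / (np.exp(h) + np.exp(-h))
m_model = p_up - (1 - p_up)           # model prediction of <sigma>; equals tanh(h)
```

With many spins and measured pairwise correlations, the same construction produces an Ising model with fields and couplings as Lagrange multipliers — the link between emerging data and statistical physics models exploited in the lectures that follow.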
The simplest model: constant speeds, local correlations. Correct prediction of long-ranged correlations; do birds know Goldstone's theorem? Checking the sufficiency of local correlations; an aside about model comparison and Occam factors. Varying speeds and a fuller description. Are flocks poised near a critical point?
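The Goldstone logic can be seen in a toy calculation: once the flock picks a flight direction, fluctuations transverse to it cost vanishingly little energy at long wavelengths, so their correlations extend across the whole group. A sketch in the Gaussian (spin-wave) approximation on a 1D ring — a deliberately stripped-down stand-in for the 3D flock, with illustrative parameters:

```python
import numpy as np

def transverse_correlation(N, J=1.0, T=1.0):
    """Spin-wave correlation of direction fluctuations for N spins on a ring
    with nearest-neighbor coupling J; the k = 0 (Goldstone) mode is removed,
    which corresponds to measuring fluctuations about the mean direction."""
    k = 2.0 * np.pi * np.arange(1, N) / N      # all modes except k = 0
    lam = 2.0 * J * (1.0 - np.cos(k))          # mode stiffness; vanishes as k -> 0
    r = np.arange(N)
    # C(r) = (T/N) * sum_k cos(k r) / lambda_k
    return (T / N) * (np.cos(np.outer(r, k)) / lam).sum(axis=1)

def zero_crossing(C):
    """Distance at which the fluctuation correlation first changes sign."""
    return int(np.argmax(C < 0))

C_small, C_large = transverse_correlation(64), transverse_correlation(256)
xi_small, xi_large = zero_crossing(C_small), zero_crossing(C_large)
```

The zero-crossing distance grows in proportion to the system size — a scale-free correlation of fluctuations with no tunable correlation length, which is the signature seen in real flocks.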
The protein folding problem, and its inverse: the ensemble of sequences consistent with structure and function. Early experimental approaches, connections to maximum entropy. Approximation schemes, identifying local interactions; can we fold proteins from sequence statistics? Limitations from data set size, scaling back the problem to something we can solve. Criticality again?
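One popular approximation scheme for identifying local interactions is mean-field inversion: couplings are read off from the inverse of the correlation matrix. A toy sketch with a synthetic sparse interaction matrix (in a Gaussian model the inversion is exact; for real sequence data it is only an approximation, and all numbers below are made up):

```python
import numpy as np

# Plant a sparse "true" interaction matrix (a nearest-neighbor chain), compute
# the correlations it implies in a Gaussian model, then invert the correlation
# matrix to recover the interactions.
n = 8
J_true = np.zeros((n, n))
for i in range(n - 1):
    J_true[i, i + 1] = J_true[i + 1, i] = 0.3   # local couplings only

precision = np.eye(n) - J_true                  # Gaussian precision matrix
C = np.linalg.inv(precision)                    # correlations generated by the model
# Note: C has nonzero entries everywhere -- correlations are long-ranged even
# though the interactions are strictly local, which is why naive covariance
# thresholding fails and inversion is needed.

J_inferred = -(np.linalg.inv(C) - np.eye(n))    # mean-field inversion
err = np.max(np.abs(J_inferred - J_true))
```

The inverse correlation matrix is sparse where the correlation matrix is not — the core trick behind direct-coupling-style analyses of sequence families.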
Some ideas about pattern formation in embryos. Basic facts about the fruit fly. Signals, noise, and information flow in genetic networks. Measuring positional information in real embryos; signatures of optimality. How would we recognize a critical network? Evidence from the fly. Biological function of criticality. Ideas for a new generation of experiments.
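Positional information can be made precise as the mutual information between a nucleus' position and the expression level it reads out. A minimal plug-in estimate on synthetic data — the exponential profile, noise level, and bin counts below are all illustrative choices, not fly measurements:

```python
import numpy as np

# Synthetic "embryos": a smooth gradient g(x) plus readout noise.
rng = np.random.default_rng(1)
n_embryos, n_positions = 200, 50
x = np.linspace(0.0, 1.0, n_positions)
g = np.exp(-x / 0.3) + 0.05 * rng.standard_normal((n_embryos, n_positions))

# Discretize expression, then use I(g; x) = H[g] - <H[g | x]>,
# with positions weighted uniformly.
edges = np.linspace(g.min(), g.max(), 17)
dig = np.digitize(g, edges)

def entropy_bits(labels):
    p = np.bincount(labels) / labels.size
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

H_g = entropy_bits(dig.ravel())
H_g_given_x = np.mean([entropy_bits(dig[:, j]) for j in range(n_positions)])
info_bits = H_g - H_g_given_x
```

Even this crude estimator shows a single noisy gradient carrying a bit or more of positional information; the real measurements require careful control of finite-sampling biases, which the plug-in estimate ignores.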
Reminder of Hopfield and other physics-based models. Minimal implications of widespread, weak correlations in simple networks. The hard work of building accurate models for 10, 50, 100, ... neurons in the retina. From statistical mechanics to thermodynamics. Asides about recording methods and scaling up to bigger networks. First steps in the hippocampus, C. elegans, perhaps more. Critical phenomenology, and ruling out less interesting alternative explanations. Tuning, adaptation, learning.
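A taste of the "hard work" at its smallest scale: fitting a pairwise maximum entropy (Ising) model so that it reproduces measured mean activities and pairwise correlations. For three cells the 2^3 states can be enumerated exactly; the target moments below are synthetic, and at 100+ neurons one needs Monte Carlo or clever approximations instead:

```python
import itertools
import numpy as np

# Enumerate all states of n = 3 spins s_i = +/-1 and fit
# P(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j) by gradient ascent on the
# log-likelihood, whose gradient is just (data moments - model moments).
rng = np.random.default_rng(2)
n = 3
states = np.array(list(itertools.product([-1, 1], repeat=n)), dtype=float)

# "Data": moments of an arbitrary synthetic distribution over the 8 states.
p_data = rng.random(2 ** n)
p_data /= p_data.sum()
m_data = p_data @ states                        # <s_i>
C_data = states.T @ (p_data[:, None] * states)  # <s_i s_j>

h = np.zeros(n)
J = np.zeros((n, n))
for _ in range(20000):
    E = states @ h + 0.5 * np.einsum('ai,ij,aj->a', states, J, states)
    p = np.exp(E - E.max())
    p /= p.sum()
    m_model = p @ states
    C_model = states.T @ (p[:, None] * states)
    h += 0.1 * (m_data - m_model)               # match the means ...
    dJ = 0.1 * (C_data - C_model)               # ... and the pairwise correlations
    np.fill_diagonal(dJ, 0.0)
    J += dJ
```

The log-likelihood is concave, so this moment matching converges; the whole difficulty of the program lies in computing the model moments when enumeration is impossible.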