Easter is a moveable feast, meaning its date is not fixed relative to the standard calendar. In principle, Easter is defined as the Sunday following the full moon after the vernal equinox. But the Church does not use astronomical observation to determine the date of the full moon. Instead, it uses an abstruse theoretical model known as the ecclesiastical full moon. The implementation of this model has led to differences in how the Western and Eastern Churches fix the date of Easter: Eastern Orthodox Christians use an ecclesiastical full moon that occurs four to five days later than the Western one. To further add to the confusion, the Roman Catholic Church has, since 1583, used the Gregorian calendar to fix the date of the vernal equinox (March 21), while the Eastern Churches use the Julian calendar.
The calculation of the date of Easter, a process known as computus, serves to illustrate a simple point: whenever possible, empirical observation (i.e., looking at the fucking moon) is preferable to complex theoretical amphibology.
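For perspective on just how abstruse the theoretical model is, the Western computus is short enough to write down. Here is a sketch of the Anonymous Gregorian algorithm (the Meeus/Jones/Butcher form), which returns the month and day of Western Easter for any Gregorian year — note that not a single line involves looking at the moon:

```python
def gregorian_easter(year):
    """Anonymous Gregorian (Meeus/Jones/Butcher) computus.

    Returns (month, day) of Western Easter for a year in the
    Gregorian calendar.
    """
    a = year % 19                        # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)             # century, year within the century
    d, e = divmod(b, 4)                  # Gregorian leap-century corrections
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # "age" of the ecclesiastical moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7 # days to the following Sunday
    m = (a + 11 * h + 22 * l) // 451
    month = (h + l - 7 * m + 114) // 31  # 3 = March, 4 = April
    day = (h + l - 7 * m + 114) % 31 + 1
    return month, day

print(gregorian_easter(2024))  # (3, 31) -- March 31, 2024
print(gregorian_easter(2025))  # (4, 20) -- April 20, 2025
```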
One man who has spent the last 50 years answering big theoretical questions with beautiful experimental observations is Masakazu Konishi. This delightful fellow established both the songbird and the barn owl as badass models for understanding the neural computations that underlie behavior. My unmitigated enthusiasm for Konishi’s scientific corpus is so powerful that it often results in physical violence toward people I love. Today, with the help of a half-dozen Xanax and a bottle of vermouth, we will calmly focus our attention on some of Konishi’s experiments in the adorable barn owl.
Barn owls have supernatural hearing abilities. They use their auditory system to track down and slaughter tiny rodents under cover of complete darkness. In the 1970s, Konishi caught some barn owls and brought them into the lab. He was interested in owl sound localization: how the brain pinpoints the location of a sound source. Some theorists (like Jeffress) had suggested that the brain might do this by comparing the slight timing differences between the sound signals received by the two ears. For example, when a mouse rustles on the owl’s left side, the sound reaches the left ear some tens of microseconds before it reaches the right. Using custom-built owl headphones, Konishi demonstrated that owls use this very slight timing difference, called the interaural time difference, to compute the location of a sound source along the horizontal plane.
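To get a feel for how slight these timing differences are, a far-field approximation works: the extra path length to the far ear is roughly d·sin(θ), where d is the distance between the ears and θ is the azimuth of the source. A quick sketch (the 5 cm inter-ear distance is a round owl-scale guess, not a measured value):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air
EAR_SEPARATION = 0.05   # meters; rough owl-scale guess, not a measurement

def itd_microseconds(azimuth_deg):
    """Far-field interaural time difference for a source at the
    given azimuth (0 = straight ahead, 90 = directly to one side)."""
    theta = math.radians(azimuth_deg)
    extra_path = EAR_SEPARATION * math.sin(theta)  # meters
    return extra_path / SPEED_OF_SOUND * 1e6       # microseconds
```

A source straight ahead gives an ITD of zero; a source directly to the side gives only about 146 microseconds under these assumptions. The owl's entire horizontal map of the world is squeezed into that sliver of time.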
A few years later, Konishi, together with Eric Knudsen, used electrophysiological recordings to identify neurons in the owl midbrain that fired only when sounds were presented in specific locations around the owl. These space-specific neurons were organized in a highly structured anatomical map, with one axis corresponding to vertical space and the other to horizontal. A similar topographic map of the external world had been found previously in the visual cortex of cats. However, in the visual system, the photoreceptors are organized as a spatial map within the retina, and this spatial structure is simply preserved in the cortex. Unlike an eye, an ear receives no inherently spatial information. In the auditory system, the brain must somehow compute spatial coordinates from the intensity and timing of sound signals received at the ear.
Computation of interaural time difference requires a highly precise comparison of sound signals from the two ears. In the late 1980s, Carr and Konishi identified the neural circuit that implements this computation in the owl brain. They found that neurons in the nucleus laminaris receive temporally phase-locked spikes from both ears (via the magnocellular nuclei). However, the signals from the two ears arrive at slightly different times, depending on how long it takes spikes to travel down the axon from the magnocellular nucleus to the nucleus laminaris. So, for example, a laminaris neuron might receive spikes with a shorter delay from the left ear than from the right. This neuron responds maximally when it receives simultaneous input from both ears: it acts as a coincidence detector. A given neuron therefore fires hardest when the difference in sound arrival time at the two ears exactly cancels the difference in conduction delay to the nucleus laminaris. Because different laminaris neurons receive signals with different conduction delays, the nucleus laminaris contains a physical map of interaural time difference.
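The delay-line logic can be caricatured in a few lines of code. This is a toy sketch, not Carr and Konishi's actual analysis: spike times are in milliseconds, and the function names, coincidence window, and candidate delays are all made up for illustration. Each candidate internal delay stands in for one laminaris neuron; the "neuron" whose internal delay cancels the external interaural time difference collects the most coincidences.

```python
def count_coincidences(left_spikes, right_spikes, internal_delay, window=0.05):
    """Count left/right spike pairs that land within `window` ms of each
    other after the left-ear pathway adds `internal_delay` ms of conduction."""
    shifted = [t + internal_delay for t in left_spikes]
    return sum(1 for s in shifted
                 for r in right_spikes
                 if abs(s - r) <= window)

def best_tuned_delay(left_spikes, right_spikes, candidate_delays):
    """Return the internal delay (i.e., the 'neuron') that fires hardest:
    the one whose conduction delay cancels the external ITD."""
    return max(candidate_delays,
               key=lambda d: count_coincidences(left_spikes, right_spikes, d))

# A sound from the left: the right ear hears each feature 0.2 ms late.
left = [1.0, 2.0, 3.0]
right = [1.2, 2.2, 3.2]
print(best_tuned_delay(left, right, [0.0, 0.1, 0.2, 0.3]))  # 0.2
```

The winning delay reads out the interaural time difference directly, which is why an array of such neurons with graded conduction delays amounts to a physical map of horizontal space.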
Theoretical models are often used to guess at the underlying algorithms that brains use to transform and organize sensory information. But Konishi’s work in the barn owl auditory system is one of very few instances in which humans have identified the actual neural computation that the brain uses to solve a complicated sensory problem. The sickness of this accomplishment is devastating. And under-appreciated. If Masakazu Konishi doesn’t win a Nobel Prize within the next ten years, we should all get owl facial tattoos in protest.