Infinite Ascent.

by CJ Quineson

Notes on senses

they’re more similar than you’d think

Most presentations of sensory neuroscience tackle senses separately. This is fair: their specific mechanisms are quite different, and we understand some senses much better than others. But the senses also share plenty of similarities, and I’d like to explore some of them here.

I am not a neuroscientist. All numbers approximate.

What are the senses?

General and special senses

Consider the five classical senses: hearing, smell, taste, touch, and vision. One of them is the odd one out. Which?

You can single out any sense by its modality, or the type of sensation it produces. Hearing senses sound, vision senses sight. Or you can single out a sense based on which things, or stimuli, it can sense: hearing senses sound waves, vision senses light. The type of stimulus also leads to several two-versus-three splits: two senses require closeness, three work at a distance; two sense chemistry, three sense physical phenomena. But the distinction I'm thinking of involves none of this.

Considering anatomy, there’s one clear answer: touch. Each other sense has a dedicated organ, but touch does not. The other four senses, plus the sense of balance, are called the special senses, and all other senses are called the general senses. The naming is a bit ironic. Hearing, smell, taste, balance, and vision are all special. Touch, and other as-yet unnamed senses, are general.

The sense of touch can encompass a broad range of stimuli, depending on your definition of touch. Many parts of your body can sense light taps and heavy presses, their position relative to the rest of your body, their muscles tensing and stretching, itching and tickling, slow and fast vibration, heat and cold, sharp pain and dull pain and aching pain. Somatosensation is the broad name for the senses encompassing touch–position and temperature–pain.

Interoception

Most of the senses we’ve discussed so far detect stimuli from the external world, making them all examples of exteroception. But we also sense things from inside the body, which is called interoception.

The five special senses are exteroceptive. Somatosensation is mostly exteroceptive; the exception1 is proprioception, our sense of body position and movement. Proprioception is responsible for knowing where our arms are after waking up, or feeling our muscles tense and relax.

The remaining general senses are all interoceptive. We have a sense of thirst and hunger, and we can sense when we need to urinate or defecate. After a quick run, you might sense your breathing or heartbeat. We can sense stomachache and heartburn. Our body regulates its internal temperature and blood pressure by sensing them, which we’re not usually conscious of. All of these are examples of interoception.2

Many of the examples we'll discuss involve exteroception and proprioception, because these are familiar and well-studied. Evidence suggests the same phenomena hold for other kinds of interoception,3 but the research isn't as conclusive.

How does sensory information travel?

Sensory receptors

You might know that the nervous system transmits information via nerves, which conduct electrical signals. So how does something like a sound wave or an odor molecule generate electricity?

Sensation starts from sensory receptors, cells that react to stimuli. In response to stimuli, sensory receptors open and close ion channels, changing the flow of ions across their membranes. The resulting change in membrane potential is an electrical signal called the receptor potential.

The process of converting stimuli to electrical signals is called transduction, and different kinds of sensory receptors transduce in different ways. From a physical perspective, a sensory receptor converts the energy of a stimulus to electrical energy. We can classify sensory receptors into four broad kinds,4 based on the kind of energy each converts.

  • Chemoreceptors convert chemical energy. The clearest examples are taste and smell receptors. But we also have osmoreceptors that detect blood solute concentration to regulate thirst, acid-sensing ion channels that detect carbon dioxide levels to regulate breathing, and pruriceptors that detect histamine (among other chemicals) to cause itching.

  • Thermoreceptors convert thermal energy. We have them on our skin to sense ambient temperature, and inside our body to regulate body temperature. The most studied kinds are TRP channels, like TRPV1 which activates above 43°C, and TRPM8 which activates below 26°C. But TRPV1 also happens to react to capsaicin, and TRPM8 also happens to react to menthol, so they’re more precisely called polymodal.

  • Photoreceptors convert radiant energy. We have them in our eyes, and they can be classified into rods, cones, and intrinsically photosensitive retinal ganglion cells. They all have the same mechanism: a photon hits retinal bound to an opsin, causing a cascade of chemical reactions. Each kind of photoreceptor has a different kind of opsin, and each kind responds best to a certain wavelength.

  • Mechanoreceptors convert mechanical energy. They’re abundant in our skin, but they’re also responsible for hearing. The other sensory receptors we’ve discussed rely on chemical reactions to open or close channels, but mechanoreceptors work by deforming these ion channels open or closed.

Because mechanoreceptors work without chemical reactions, their transduction speed is the fastest. The time from introducing the stimulus to the onset of an electrical signal is less than 50 μs for hair cells. In contrast, every other kind of receptor takes tens of milliseconds: the photoreceptor cascade takes 40 ms for cones and 100 ms for rods, TRPV1 takes 50 ms, and olfactory transduction takes 60 ms.

Afferent nerves

If mechanoreceptors transduce orders of magnitude faster than other sensory receptors, why aren't hearing or touch orders of magnitude faster than vision? While human mean reaction times vary by modality, they're all in the same order of magnitude: sound takes 150 ms, touch 155 ms, and light 190 ms. The differences between them are mostly accounted for by transduction speed, but transduction is only a small fraction of the total; most of the time is spent in the rest of the sensory pathway.
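
As a rough sanity check, here is a back-of-the-envelope sketch in Python using the figures above. Treating the transduction latency for touch as comparable to hair cells (since both rely on mechanoreceptors) is my own assumption, not a quoted measurement.

```python
# Rough budget: how much of each reaction time is transduction?
# Reaction times and transduction latencies are the approximate
# figures quoted in the text; touch transduction is assumed to be
# as fast as hair-cell transduction.
reaction_ms = {"sound": 150, "touch": 155, "light": 190}
transduction_ms = {"sound": 0.05, "touch": 0.05, "light": 40}  # cones ~40 ms

for modality in reaction_ms:
    rest = reaction_ms[modality] - transduction_ms[modality]
    share = transduction_ms[modality] / reaction_ms[modality]
    print(f"{modality}: ~{rest:.0f} ms after transduction "
          f"(transduction is {share:.0%} of the total)")

# sound: ~150 ms after transduction (transduction is 0% of the total)
# touch: ~155 ms after transduction (transduction is 0% of the total)
# light: ~150 ms after transduction (transduction is 21% of the total)
```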

The resulting electrical signals from transduction travel via afferent nerves, arriving at the brain. These contrast with efferent nerves exiting the brain, which carry electrical signals representing motor information.

The difference between general and special senses shows up in afferent nerves.5 The afferent nerves from the special senses all connect to cranial nerves, which connect directly to the brain. In contrast, the afferent nerves from general senses connect mostly to spinal nerves, which go through the spinal cord.

A nerve’s conduction velocity depends largely on its diameter and myelination, or how much myelin surrounds it. The auditory pathway goes through the vestibulocochlear nerve CN VIII (the eighth cranial nerve), which is myelinated, has a mean diameter of 3 μm, and a conduction velocity of 20 m/s. In contrast, the visual pathway goes through the optic nerve CN II, which is also myelinated, but has a thinner 1 μm diameter, and thus a slower conduction velocity of 15 m/s. While transduction speed accounts for most of the difference, conduction velocity also contributes.
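
To see how small that contribution is, here is a back-of-the-envelope conduction delay, delay = length / velocity. The conduction velocities are the figures above; the nerve lengths are ballpark assumptions of mine, for illustration only.

```python
# Conduction delay = nerve length / conduction velocity.
# Velocities are from the text; the lengths are rough assumptions.
nerves = {
    "CN VIII (auditory)": (0.025, 20),  # ~2.5 cm at 20 m/s
    "CN II (optic)":      (0.05, 15),   # ~5 cm at 15 m/s
}

for name, (length_m, velocity_m_s) in nerves.items():
    delay_ms = length_m / velocity_m_s * 1000
    print(f"{name}: ~{delay_ms:.1f} ms")

# CN VIII (auditory): ~1.2 ms
# CN II (optic): ~3.3 ms
# A couple of milliseconds either way: real, but small next to the
# ~40 ms difference in transduction.
```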

Sensory pathways

A common symptom of coronary artery disease is pain in the chest wall, shoulder, and the left arm and hand. This is called angina, and it's the textbook example of referred pain, where damage to a visceral organ is misperceived as somatic pain (pain that comes from the skin, muscles, or joints). The common explanation for this has to do with the sensory pathways for pain.

Recall that sensory receptors transduce stimuli to electrical signals, which travel via afferent nerves, which connect to the central nervous system via cranial or spinal nerves. These eventually connect6 to the sensory cortices. Neurons in each sensory cortex process sensory information, then relay it to the rest of the brain.

Hearing, for example, is a pathway we know much about. CN VIII connects to the cochlear nucleus in the middle of the brainstem, which connects to several other spots around the brainstem, which all connect to the inferior colliculus in the midbrain, which connects to the medial geniculate nucleus in the thalamus, which then connects to the auditory cortex.

Balance, vision, and taste follow a similar pattern: their cranial nerves eventually connect to somewhere in the thalamus, which then connects to the corresponding cortex. Smell is unique in its absence of primary connections to the thalamus, as the olfactory bulbs connect directly to the primary olfactory cortex. But smell has other stuff going on; it’s the special sense we understand least.

The spinal cord has two somatosensory pathways: the dorsal column–medial lemniscus pathway, and the anterolateral system. Both eventually connect to the primary somatosensory cortex. The afferent nerves that carry visceral pain and the nerves that carry somatic pain converge on the same neurons in the dorsal horn, so the brain can misattribute where the pain comes from. I think this is a pretty compelling explanation for referred pain, though there's some criticism of this theory.

What does sensory information look like?

Efficient coding

Screens and monitors represent images with a grid of pixels. In contrast, the visual cortex represents properties like edges, orientations, and other patterns, but nothing pixel-like.7 This is not only entirely different, but seemingly more complex. Why?

The translation between sensory information and electrical signals is called neural coding. While the receptor potential may be continuous and graded, afferent nerves carry this information as pulses of action potentials.
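
As a toy illustration of the graded-to-pulses step, here is a simple rate code in Python: a normalized receptor potential sets the probability of firing in each millisecond, so stimulus strength shows up as spike count. The numbers and the model are illustrative assumptions, not a description of any particular neuron.

```python
import random

def spike_train(receptor_potential, max_rate_hz=100, duration_s=1.0, dt_s=0.001):
    """Toy rate code: a graded value in [0, 1] sets the per-step firing
    probability, producing an all-or-nothing train of spikes."""
    p_spike = receptor_potential * max_rate_hz * dt_s
    return [1 if random.random() < p_spike else 0
            for _ in range(int(duration_s / dt_s))]

print(sum(spike_train(0.2)), "spikes in 1 s for a weak stimulus")    # ~20
print(sum(spike_train(0.9)), "spikes in 1 s for a strong stimulus")  # ~90
```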

Different stages in different sensory pathways use different neural codes, but there are some shared properties. Consider, for example, these two extremes:

  • Each possible state of neurons represents a different thing. For even modestly-sized sets of neurons, there'd be an astronomical number of possible states, far more than there are things to represent, so much of the capacity would remain unused.8 Decoding would be expensive, as any supposed downstream cell would have to take the whole set as input. It'd also be quite sensitive to noise, as any misfire would affect the representation.

  • Each neuron represents a different thing. You would have a so-called “grandmother cell” that corresponds to perceiving your grandmother. This is wasteful capacity-wise, as you would need a single neuron for each possible item. Generalizing would be difficult: it’d be hard to learn about what faces are shaped like if each face had a separate neuron.

A reasonable scheme would lie between these. The efficient coding theory states that neural coding encodes as much information as possible, while balancing constraints like9 metabolic cost, robustness against noise, capacity, and ability to generalize. Such a neural code would be optimized for natural stimuli, and more common inputs would take less energy to represent.
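
To make "more common inputs are cheaper" concrete: in information-theoretic terms, the ideal code length for an input is about -log2 of its probability. Here is a tiny worked example, with stimulus probabilities made up purely for illustration.

```python
import math

# Ideal code length (in bits) for a stimulus with probability p is
# roughly -log2(p): frequent stimuli are cheap, rare ones are expensive.
# The probabilities below are made up for illustration.
stimulus_probabilities = {"very common": 0.70, "common": 0.25, "rare": 0.05}

for stimulus, p in stimulus_probabilities.items():
    print(f"{stimulus}: ~{-math.log2(p):.1f} bits")

# very common: ~0.5 bits
# common: ~2.0 bits
# rare: ~4.3 bits
```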

If this sounds like something straight from an information theory book, you'd be right. Efficient coding, combined with information-theoretic methods, has successfully predicted many neural phenomena. Perhaps the most famous example involves a neural network trained to reconstruct natural images, while minimizing the number of active units: the resulting components represented edges and orientations, much like certain neurons in the visual cortex. Similar results hold for hearing and smell.
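
The original experiments are more involved, but the core objective is easy to sketch. Below is a minimal, illustrative Python version using ISTA, a standard proximal-gradient method: reconstruct an input from a dictionary while pushing most activations to zero. The dictionary and input here are random placeholders; in the actual experiment the dictionary is learned from natural image patches, and it is the learned dictionary whose elements end up looking like oriented edges.

```python
import numpy as np

# Sparse coding sketch: find activations `a` so that D @ a reconstructs x
# while most entries of `a` stay at zero. D and x are random stand-ins.
rng = np.random.default_rng(0)
n_pixels, n_units = 64, 128                 # e.g. 8x8 patches, 128 units
D = rng.normal(size=(n_pixels, n_units))
D /= np.linalg.norm(D, axis=0)              # unit-norm dictionary elements
x = rng.normal(size=n_pixels)               # stand-in for an image patch

a = np.zeros(n_units)
step, penalty = 0.05, 1.0
for _ in range(500):
    a = a - step * D.T @ (D @ a - x)        # gradient step on reconstruction
    a = np.sign(a) * np.maximum(np.abs(a) - step * penalty, 0)  # soft-threshold

print("active units:", np.count_nonzero(a), "of", n_units)
print("reconstruction error:", round(float(np.linalg.norm(x - D @ a)), 2))
```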

Adaptation speed

Some sensations fade from our attention quickly, like the feeling of new clothes on your skin, or a room's ambient scent. Others don't, like the feeling of lifting a heavy weight, or the pain of an injury. Why does sensory adaptation happen at such different rates?

Efficient coding predicts sensory adaptation, for the same reason that adaptive data compression works well for streaming data. It's wasteful to keep transmitting an unchanged input. Over time, constant stimuli tend to stop causing sensory neurons to fire.
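
The compression analogy can be made literal. Here is a toy delta encoder in Python (my own illustration, not a model of any specific neuron): it only transmits when the input changes, so a constant stimulus costs almost nothing after the first message.

```python
def delta_encode(samples):
    """Transmit a value only when it differs from the last transmitted one."""
    transmitted, last = [], None
    for t, value in enumerate(samples):
        if value != last:
            transmitted.append((t, value))
            last = value
    return transmitted

constant = [5] * 10                          # an unchanging stimulus
changing = [5, 5, 7, 7, 7, 2, 2, 9, 9, 9]
print(delta_encode(constant))   # [(0, 5)] -- one message, then silence
print(delta_encode(changing))   # [(0, 5), (2, 7), (5, 2), (7, 9)]
```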

Because different stimuli change at different rates, efficient coding also predicts adaptation to happen over different timescales. If a noisy stimulus10 only changed after long intervals, a quickly-adapting coding system would tend to overfit to noise, which would be inefficient.

Adaptation has different mechanisms, depending on the modality and the stage in the sensory pathway.

Inhibition

When your toe hits a hard surface and starts hurting, you might rub the area around the toe, which reduces the pain. Why does this work?

The broader phenomenon at work is sensory inhibition. The kind involved here is feedforward inhibition, where stimuli trigger a principal neuron and an inhibitory neuron. The principal neuron fires, and shortly afterward the inhibitory neuron fires, suppressing the principal neuron. In our toe example, gate control theory asserts that suppression happens at the dorsal horn: rubbing causes feedforward inhibition12 of the pain.
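
Here is a toy simulation of feedforward inhibition in Python (all values arbitrary): the stimulus excites the principal neuron directly, and also drives an inhibitory neuron whose effect arrives one time step later because of the extra synapse. The result is a brief response at stimulus onset that is then suppressed.

```python
# Toy feedforward inhibition: excitation arrives immediately, inhibition
# arrives one step later via the interneuron. All values are arbitrary.
stimulus = [0, 0, 1, 1, 1, 1, 1, 0, 0]

principal = []
for t, excitation in enumerate(stimulus):
    inhibition = stimulus[t - 1] if t > 0 else 0   # delayed by one step
    principal.append(max(excitation - inhibition, 0))

print(principal)
# [0, 0, 1, 0, 0, 0, 0, 0, 0] -- a transient response at onset only
```

This transient, onset-marking response is one way to picture the "sharpened temporal contrast" mentioned at the end of this section.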

Another kind of inhibition is feedback inhibition: the principal neuron fires, which triggers an inhibitory neuron, which suppresses the principal neuron. There’s also lateral inhibition, where an excited neuron inhibits neighboring neurons. Lateral inhibition is often implemented via feedback inhibition, where the inhibitory neuron suppresses the neighbors of the original principal neuron.

Once again, efficient coding predicts lateral inhibition. Suppose a stimulus pressed on several touch receptors, all in a clump. What would the neural coding look like? We could send an action potential from each one, but this’d cost more than sending a single signal from the center, maybe in a way that indicated the range of the stimulus. This is precisely what lateral inhibition does.13
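
A one-dimensional toy version in Python makes this easy to see (the kernel and stimulus values are arbitrary illustrations): each unit subtracts half of what its neighbors receive, so the interior of the pressed clump mostly cancels out and only its edges produce strong output.

```python
# Toy lateral inhibition: each unit's output is its input minus half of
# its neighbors' inputs. The stimulus is a "clump" pressing on receptors.
stimulus = [0, 0, 0, 1, 1, 1, 1, 0, 0, 0]

def laterally_inhibit(signal, neighbor_weight=0.5):
    out = []
    for i, center in enumerate(signal):
        left = signal[i - 1] if i > 0 else 0
        right = signal[i + 1] if i < len(signal) - 1 else 0
        out.append(center - neighbor_weight * (left + right))
    return out

print(laterally_inhibit(stimulus))
# [0, 0, -0.5, 0.5, 0, 0, 0.5, -0.5, 0, 0] -- the edges of the clump stand out
```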

The functions of lateral inhibition are more varied than raw efficiency, however. Lateral inhibition increases visual contrast, which helps with edge perception; it attenuates similar sound frequencies, which helps with echo suppression; it might also be a mechanism for controlling somatosensory attention. Similarly, feedforward inhibition has been thought to sharpen temporal contrast.

Conclusion

All senses share the same function: transmitting information about stimuli to the brain. This information travels from sensory receptors, through afferent nerves, to the central nervous system. The efficient coding of this information leads to effects observable across modalities, like adaptation and inhibition.

We’ve focused on things that happen before sensory information reaches the brain. This information needs to be organized, integrated, and interpreted. These bottom-up processes that assemble higher-order concepts out of lower-level stimuli are only part of the story—the full story will have to wait for some other time.

  1. Proprioceptors sense properties of muscles and joints, so I'd argue that, by our definition, they're interoceptive. However, many sources will treat proprioception as neither interoception nor exteroception, but as a third category.

  2. In order: proprioception, osmoreception and gastric perception, bladder perception and rectal perception, respiroception and cardioception, visceral nociception, thermoception and baroreception.

  3. We know respiroception and cardioception follow similar sensory pathways. The EPIC model is a kind of efficient coding. Adaptation happens with urethral flow and the baroreceptor reflex. Inhibition happens in osmoreception and respiroception.

  4. Most sources I've read using this classification split nociceptors out as a fifth category. But all nociceptors are either chemoreceptors, thermoreceptors, or mechanoreceptors.

  5. Probably not for any deep reason. The special sense organs are all on the head, so it'd make sense that they're innervated by cranial nerves, which are nearer.

  6. It's not clear to me this holds for things like cardioception or respiroception, but it's true for the special senses and the somatosensory system.

  7. There are other first-principles ways to exclude pixel-like representations, like invariance, but efficient coding will come up again later.

  8. The brain has billions of neurons. Even the 2^50 states of 50 neurons outnumber the moments you'd experience in a lifetime.

  9. I'm being intentionally vague about constraints here. There's a neat framework that, depending on the parametrization of these constraints, unifies many different versions of efficient coding.

  10. I've alluded to neuronal noise, but here I'm talking about noise inherent to stimuli themselves. Light, for example, has noise due to photon statistics.

  11. Not all mechanoreceptors adapt this fast: Merkel discs and bulbous corpuscles are much slower, for example.

  12. The contemporary view of gate control theory has more subtleties than this. It's still feedforward inhibition, by our definition: the inhibitory neuron fires due to a stimulus, and not due to the principal neuron, or neighboring neurons.

  13. Rubbing the area around a stubbed toe is not an example of lateral inhibition, because it involves different kinds of sensory receptors. They may be in the same section of skin, but they're not strictly neighboring.
