Feels good, right? Well, you’re not imagining it - research has shown that the senses of touch and hearing are linked, and that the experience of one is likely enhanced by the other. This notion - that our senses complement one another to improve our perception - is called multisensory integration. You may not always realize it, but most things you do depend on it. For instance, understanding a person as they talk requires your brain to combine the sight of their facial movements with the sound of their voice.
So how does it work?
We’re still figuring it out. But a group of scientists recently carried out an experiment with mice to explore how touch and sound might be combined in the brain. Here’s what they did.
The experiment: The researchers took a little mouse paw (is that cute or what) and placed it on a vibrating button. They then measured neural activity in the part of the mouse’s brain known to process touch (called S1). As expected, they observed that when they vibrated the button, neurons in S1 responded, showing that mice could sense the touch of the button. But here’s where it gets interesting. When the researchers vibrated the button again, this time paired with a simultaneous sound, these S1 neurons increased their activity even more than before. This implies that the mice were able to sense the button vibration more clearly when it was coupled with a sound. Based on this, the scientists concluded that neurons in S1 do more than just respond to touch - they also play a role in multisensory integration with hearing.
The Leak: It looks like your sense of touch could be heightened when paired with the right sounds, so now you know why your massage hits different when you can hear a babbling brook in the background. And if you’re a little rodent looking to treat yourself to a spa day, go ahead and click “add to cart” on that pair of noise-cancelling headphones you’ve been eyeing on A-mouse-on.com - we promise it’ll be worth it.
Scientists have successfully restored a fully paralyzed man’s ability to speak. No, you didn’t misread that. Yes, it really is true. In 2003, a man (nicknamed Pancho) was in a terrible car crash that left him paralyzed and unable to speak. After years of hard work, a team of neuroscientists was able to design a device that allows Pancho to vocalize certain sentences using only his brain. They reported their findings in the New England Journal of Medicine last month.
As insane as it sounds, this paper is just the latest in a long string of research on brain-computer interfaces (BCIs for short). The goal of BCIs is to build devices that can directly communicate with a patient’s brain and give them back a function that they may have lost.
How does it work?
In Pancho’s case, the researchers implanted an array of electrodes over the parts of his brain that formerly controlled his speech. The electrodes were used to measure neural signals as he attempted to say certain words. These signals were then transmitted to a computer algorithm (known as a neural network). Through practice, the algorithm learned which signals corresponded to which words. The rest was straightforward - any time the computer detected a signal in Pancho’s brain that it recognized, it played the corresponding word out loud, effectively restoring Pancho’s speech.
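If you’re curious what “learning which signals correspond to which words” looks like in code, here’s a deliberately tiny sketch. The data, feature vectors, and the nearest-centroid classifier below are all our own toy stand-ins - the actual study used real electrode recordings and a far more sophisticated neural network - but the decode-a-signal-into-a-word idea is the same:

```python
import math

def train(examples):
    """Learn one average "signal" (centroid) per word from (signal, word) pairs."""
    sums, counts = {}, {}
    for vec, word in examples:
        if word not in sums:
            sums[word], counts[word] = [0.0] * len(vec), 0
        sums[word] = [s + v for s, v in zip(sums[word], vec)]
        counts[word] += 1
    return {w: [s / counts[w] for s in sums[w]] for w in sums}

def decode(centroids, vec):
    """Return the word whose learned centroid is closest to the new signal."""
    return min(centroids, key=lambda w: math.dist(centroids[w], vec))

# Toy "neural signals": two made-up features per attempted word
examples = [
    ([1.0, 0.1], "hello"), ([0.9, 0.2], "hello"),
    ([0.1, 1.0], "water"), ([0.2, 0.9], "water"),
]
centroids = train(examples)
print(decode(centroids, [0.95, 0.15]))  # → hello
print(decode(centroids, [0.15, 0.95]))  # → water
```

In the real system, the recognized word would then be played out loud - and the 50-word limit mentioned below is just the size of this word-to-signal dictionary.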
Are there limitations?
Unfortunately, yes. For now, the algorithm is only able to recognize a limited set of 50 words. The device also requires a wire to be plugged into a port sticking out of Pancho’s skull (which is not exactly convenient). Nonetheless, the fact that neuroscientists can accomplish this in any capacity AT ALL is a major feat.
The Leak: We may not have flying cars, but 2021 is shaping up to be quite the year for neurotechnological advancements. With the first successful case of a paralyzed individual having his speech restored now in the books, the future of brain-computer interfaces has never looked brighter.
No, we didn’t mix up our cheesy sayings. You’ve heard about it before – whether in team sports, a millennial gym (CrossFit, we’re lookin’ at you), soul-crushing corporations, the harsh realities of graduate school, or even a global pandemic – shared suffering has the strange effect of bringing people together and creating lasting bonds. But how do stressful events actually create a feeling of closeness?
The experiment: College women were brought into a research lab to participate in a pain task (don’t worry, by pain we mean a mild but uncomfortable zap on the hand). The participants sat in pairs and were each hooked up to an electrical stimulator with a board between them, so they could hear but not see one another. Each participant watched a screen that indicated, on each trial, the size of the zap coming for either themselves or their partner. After a couple of seconds, the zap was delivered and the participants rated the intensity they experienced on a 0-10 scale.
Is that it?
No, the scientists also simultaneously measured neural activity for both participants using electroencephalographic (EEG) electrodes placed on their scalps. This allowed them to look at oscillations of neural activity in an area of the brain associated with anticipation of pain. After the experiment, the participants were also asked questions to assess how empathetic they felt towards one another.
What did they find?
Surprisingly, neural oscillations were synchronized between participants during the anticipation of pain, regardless of whether they were expecting a zap for themselves or for their partner. What's more, it turned out that the more social and empathetic pairs had stronger neural synchrony.
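For the curious, here’s roughly what “synchronized oscillations” means in practice. The sketch below uses simulated sine waves and plain Pearson correlation as a stand-in - EEG studies like this one typically use fancier synchrony measures (e.g. phase locking) on real recordings - but the intuition carries over: in-phase brain rhythms score high, out-of-phase ones score near zero:

```python
import math

def correlation(x, y):
    """Pearson correlation - a simple stand-in for EEG synchrony measures."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Two simulated 10 Hz "alpha" rhythms, one second sampled at 250 Hz
t = [i / 250 for i in range(250)]
person_a = [math.sin(2 * math.pi * 10 * s) for s in t]
person_b = [math.sin(2 * math.pi * 10 * s) for s in t]                 # in phase
person_c = [math.sin(2 * math.pi * 10 * s + math.pi / 2) for s in t]   # phase-shifted

print(correlation(person_a, person_b))  # close to 1: "on the same wavelength"
print(correlation(person_a, person_c))  # near 0: same rhythm, out of sync
```

In the study, a score like the first one between two participants’ pain-anticipation rhythms is what counted as strong neural synchrony.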
The Leak: It turns out the expression “being on the same wavelength as someone else” might just be literal, especially when it comes to shared painful experiences. That said, the extent to which this happens does seem to depend on how much each person wants to make friends, which might explain why you don’t find yourself besties with the jacked Mr. Meathead on the bench press (because no one likes cable TV Rob Lowe).
In case you missed it: