
Neuroscience

@neurosciencestuff / neurosciencestuff.tumblr.com

Articles and news from the latest research reports.

MIT researchers have discovered a brain circuit that drives vocalization and ensures that you talk only when you breathe out, and stop talking when you breathe in.

The newly discovered circuit controls two actions that are required for vocalization: narrowing of the larynx and exhaling air from the lungs. The researchers also found that this vocalization circuit is under the command of a brainstem region that regulates the breathing rhythm, which ensures that breathing remains dominant over speech.

“When you need to breathe in, you have to stop vocalization. We found that the neurons that control vocalization receive direct inhibitory input from the breathing rhythm generator,” says Fan Wang, an MIT professor of brain and cognitive sciences, a member of MIT’s McGovern Institute for Brain Research, and the senior author of the study.

Jaehong Park, a Duke University graduate student who is currently a visiting student at MIT, is the lead author of the study, which appeared in Science. Other authors of the paper include MIT technical associates Seonmi Choi and Andrew Harrahill, former MIT research scientist Jun Takatoh, and Duke University researchers Shengli Zhao and Bao-Xia Han.

Vocalization control

Located in the larynx, the vocal cords are two muscular bands that can open and close. When they are mostly closed, or adducted, air exhaled from the lungs generates sound as it passes through the cords.

The MIT team set out to study how the brain controls this vocalization process, using a mouse model. Mice communicate with each other using sounds known as ultrasonic vocalizations (USVs), which they produce using the unique whistling mechanism of exhaling air through a small hole between nearly closed vocal cords.

“We wanted to understand what are the neurons that control the vocal cord adduction, and then how do those neurons interact with the breathing circuit?” Wang says.

To figure that out, the researchers used a technique that allows them to map the synaptic connections between neurons. They knew that vocal cord adduction is controlled by laryngeal motor neurons, so they began by tracing backward to find the neurons that innervate those motor neurons.

This revealed that one major source of input is a group of premotor neurons found in the hindbrain region called the retroambiguus nucleus (RAm). Previous studies have shown that this area is involved in vocalization, but it wasn’t known exactly which part of the RAm was required or how it enabled sound production.

The researchers found that these synaptic-tracing-labeled RAm neurons were strongly activated during USVs. This observation prompted the team to use an activity-dependent method to target these vocalization-specific RAm neurons, termed RAmVOC, and to use chemogenetics and optogenetics to explore what happens when their activity is silenced or stimulated. When the researchers blocked the RAmVOC neurons, the mice were no longer able to produce USVs or any other kind of vocalization. Their vocal cords did not close, and their abdominal muscles did not contract, as they normally do during exhalation for vocalization.

Conversely, when the RAmVOC neurons were activated, the vocal cords closed, the mice exhaled, and USVs were produced. However, if the stimulation lasted two seconds or longer, these USVs would be interrupted by inhalations, suggesting that the process is under control of the same part of the brain that regulates breathing.

“Breathing is a survival need,” Wang says. “Even though these neurons are sufficient to elicit vocalization, they are under the control of breathing, which can override our optogenetic stimulation.”

Rhythm generation

Additional synaptic mapping revealed that neurons in a part of the brainstem called the pre-Bötzinger complex, which acts as a rhythm generator for inhalation, provide direct inhibitory input to the RAmVOC neurons.

“The pre-Bötzinger complex generates inhalation rhythms automatically and continuously, and the inhibitory neurons in that region project to these vocalization premotor neurons and essentially can shut them down,” Wang says.

This ensures that breathing remains dominant over speech production, and that we have to pause to breathe while speaking.

The researchers believe that although human speech production is more complex than mouse vocalization, the circuit they identified in mice plays a conserved role in speech production and breathing in humans.

“Even though the exact mechanism and complexity of vocalization in mice and humans is really different, the fundamental vocalization process, called phonation, which requires vocal cord closure and the exhalation of air, is shared in both the human and the mouse,” Park says.

The researchers now hope to study how other functions such as coughing and swallowing food may be affected by the brain circuits that control breathing and vocalization.

(Image credit: Jose-Luis Olivares, MIT)


Hearing study: each nerve fiber trains on its own

A complex network of nerve fibers and synapses in the brain is responsible for transmission of information. When a nerve cell is stimulated, it generates signals in the form of electrochemical impulses, which propagate along the membrane of long nerve cell projections called axons. How quickly the information is transmitted depends on various factors such as the diameter of the axon. In vertebrates, where the comparatively large brain is enclosed in a compact cranium, another space-saving mechanism plays a major role: myelination. This involves the formation of a biomembrane that wraps around the axon and significantly accelerates the speed of signal transmission. The thicker this myelin sheath, the faster the transmission.

“Even though myelination is an integral part of neural processing in vertebrate brains, its adaptive properties have not yet been comprehensively understood,” says Dr. Conny Kopp-Scheinpflug, a neurobiologist at the LMU Biocenter. She is the principal investigator of a study recently published in the journal Proceedings of the National Academy of Sciences (PNAS), which reveals new insights into the principles of myelination. The researchers investigated how sensory stimulation affects the formation of the myelin layers. “We know that axons which are regularly stimulated have enhanced myelin sheath thickness,” explains Dr. Mihai Stancu, lead author of the paper. Accordingly, regular training improves transmission capability. It was unknown, however, whether this change takes place at the level of individual nerve fibers or whether adaptive myelination also transfers to neighboring, passive axons in a fiber bundle.

To answer this question, the scientists investigated the neural activity of mice. “We focused on the auditory system, because it allows separate activation of the left and right neural circuits,” explains Kopp-Scheinpflug. To this end, the team rendered the lab mice temporarily deaf in one ear by means of an earplug. This way, one ear received stronger acoustic stimulation than the plugged ear for the duration of the experiment. “Surprisingly, all the nerve fiber bundles we investigated in the brain contained axons that carried information from the right ear as well as axons transmitting information from the left ear,” says Stancu. The experimentally induced one-sided deafness allowed the researchers to test their hypothesis.

Their results showed that in the mixed nerve fiber bundles, only the myelin sheaths of the axons belonging to the unplugged, active ear were strengthened. Consequently, the active axons did not transfer adaptive changes in myelination to the other, passive fibers, even when they were located in close proximity. “The principle seems to hold that each axon trains on its own,” observes Kopp-Scheinpflug. “As such, the activity of one input channel cannot compensate for the deficits of another.” The authors conclude that varied sensory experience throughout a person’s lifespan is vitally important. “If you want to remain cognitively fit, you should give your brain comprehensive all-round training.”

Source: lmu.de

Schizophrenia and aging may share a common biological basis

Researchers from the Stanley Center for Psychiatric Research at the Broad Institute of MIT and Harvard, Harvard Medical School, and McLean Hospital have uncovered a strikingly similar suite of changes in gene activity in brain tissue from people with schizophrenia and from older adults. These changes suggest a common biological basis for the cognitive impairment often seen in people with schizophrenia and in the elderly. 

In a study published today in Nature, the team describes how they analyzed gene expression in more than a million individual cells from postmortem brain tissue from 191 people. They found that in individuals with schizophrenia and in older adults without schizophrenia, two brain cell types called astrocytes and neurons reduced their expression of genes that support the junctions between neurons called synapses, compared to healthy or younger people. They also discovered tightly synchronized gene expression changes in the two cell types: when neurons decreased the expression of certain genes related to synapses, astrocytes similarly changed expression of a distinct set of genes that support synapses. 

The team called this coordinated set of changes the Synaptic Neuron and Astrocyte Program (SNAP). Even in healthy, young people, the expression of the SNAP genes always increased or decreased in a coordinated way in their neurons and astrocytes. 

“Science often focuses on what genes each cell type expresses on its own,” said Steve McCarroll, a co-senior author on the study and an institute member at the Broad Institute. “But brain tissue from many people, and machine-learning analyses of those data, helped us recognize a larger system. These cell types are not acting as independent entities, but have really close coordination. The strength of those relationships took our breath away.”

Schizophrenia is well-known for causing hallucinations and delusions, which can be at least partly treated with medications. But it also causes debilitating cognitive decline, which has no effective treatments and is common in aging as well. The new findings suggest that the cognitive changes in both conditions might involve similar cellular and molecular alterations in the brain.

“To detect coordination between astrocytes and neurons in schizophrenia and aging, we needed to study tissue samples from a very large number of individuals,” said Sabina Berretta, a co-senior author of the study, an associate professor at Harvard Medical School, and a researcher in the field of psychiatric disorders. “Our gratitude goes to all donors who chose to donate their brain to research to help others suffering from brain disorders and to whom we’d like to dedicate this work.” 

McCarroll is also director of genomic neurobiology for the Broad’s Stanley Center for Psychiatric Research and a professor at Harvard Medical School. Berretta also directs the Harvard Brain Tissue Resource Center (HBTRC), which provided tissue for the study. Emi Ling, a postdoctoral researcher in McCarroll’s lab, was the study’s first author.

SNAP insights

The brain works in large part because neurons connect with other neurons at synapses, where they pass signals to one another. The brain constantly forms new synapses and prunes old ones. Scientists think new synapses help our brains stay flexible, and studies — including previous efforts by scientists in McCarroll’s lab and international consortia — have shown that many genetic factors linked to schizophrenia involve genes that contribute to the function of synapses. 

In the new study, McCarroll, Berretta, and colleagues used single-nucleus RNA sequencing, which measures gene expression in individual cells, to better understand how the brain naturally varies across individuals. They analyzed 1.2 million cells from 94 people with schizophrenia and 97 without. 

They found that when neurons boosted expression of genes that encode parts of synapses, astrocytes increased the expression of a distinct set of genes involved in synaptic function. These genes, which make up the SNAP program, included many previously identified risk factors for schizophrenia. The team’s analyses indicated that both neurons and astrocytes shape genetic vulnerability for the condition. 
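As a rough illustration of what "coordinated expression across cell types" means computationally, the sketch below scores each donor's neurons and astrocytes on separate synaptic gene sets and then correlates the two scores across donors; a strong positive correlation is the kind of coupling described here. The simulated data, the shared "SNAP level" variable, and the use of a simple Pearson correlation are assumptions for the example, not the study's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_donors = 190  # roughly the number of donors described in the study

# Hypothetical per-donor "SNAP level" (e.g., declining with age) shared by both cell types.
snap_level = rng.normal(size=n_donors)

# Simulated per-donor mean expression of each cell type's synaptic gene set:
# both track the shared SNAP level, plus independent noise.
neuron_synaptic_score = 0.8 * snap_level + rng.normal(scale=0.4, size=n_donors)
astro_synaptic_score = 0.7 * snap_level + rng.normal(scale=0.4, size=n_donors)

# Coordination across donors: do the two scores rise and fall together?
r = np.corrcoef(neuron_synaptic_score, astro_synaptic_score)[0, 1]
print(f"neuron-astrocyte synaptic expression correlation across donors: r = {r:.2f}")
```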

“Science has long known that neurons and synapses are important in risk for schizophrenia, but by framing the question a different way — asking what genes each cell type regulates dynamically — we found that astrocytes too are likely involved,” said Ling.

To their surprise, the researchers also found that SNAP varied greatly even among people without schizophrenia, suggesting that SNAP could be involved in cognitive differences in healthy humans. Much of this variation was explained by age; SNAP declined substantially in many — but not all — older individuals, including both people with and without schizophrenia. 

With a better understanding of SNAP, McCarroll hopes it might become possible to identify life factors that positively influence SNAP, and to develop medicines that stimulate it, as a way to treat the cognitive impairments of schizophrenia or to help people maintain their cognitive flexibility as they age.

In the meantime, McCarroll, Berretta, and their team are working to understand if these changes are present in other conditions such as bipolar disorder and depression. They also aim to uncover the extent to which SNAP appears in other brain areas, and how SNAP affects learning and cognitive flexibility.


Exposure to different kinds of music influences how the brain interprets rhythm

When listening to music, the human brain appears to be biased toward hearing and producing rhythms composed of simple integer ratios — for example, a series of four beats separated by equal time intervals (forming a 1:1:1 ratio).

However, the favored ratios can vary greatly between different societies, according to a large-scale study led by researchers at MIT and the Max Planck Institute for Empirical Aesthetics and carried out in 15 countries. The study included 39 groups of participants, many of whom came from societies whose traditional music contains distinctive patterns of rhythm not found in Western music.

“Our study provides the clearest evidence yet for some degree of universality in music perception and cognition, in the sense that every single group of participants that was tested exhibits biases for integer ratios. It also provides a glimpse of the variation that can occur across cultures, which can be quite substantial,” says Nori Jacoby, the study’s lead author and a former MIT postdoc, who is now a research group leader at the Max Planck Institute for Empirical Aesthetics in Frankfurt, Germany.

The brain’s bias toward simple integer ratios may have evolved as a natural error-correction system that makes it easier to maintain a consistent body of music, which human societies often use to transmit information.

“When people produce music, they often make small mistakes. Our results are consistent with the idea that our mental representation is somewhat robust to those mistakes, but it is robust in a way that pushes us toward our preexisting ideas of the structures that should be found in music,” says Josh McDermott, an associate professor of brain and cognitive sciences at MIT and a member of MIT’s McGovern Institute for Brain Research and Center for Brains, Minds, and Machines.

McDermott is the senior author of the study, which appeared in Nature Human Behaviour. The research team also included scientists from more than two dozen institutions around the world.

A global approach

The new study grew out of a smaller analysis that Jacoby and McDermott published in 2017. In that paper, the researchers compared rhythm perception in groups of listeners from the United States and the Tsimane’, an Indigenous society located in the Bolivian Amazon rainforest.

To measure how people perceive rhythm, the researchers devised a task in which they play a randomly generated series of four beats and then ask the listener to tap back what they heard. The rhythm produced by the listener is then played back, and the listener taps it back again. Over several iterations, the tapped sequences become dominated by the listener’s internal biases, also known as priors.

“The initial stimulus pattern is random, but at each iteration the pattern is pushed by the listener’s biases, such that it tends to converge to a particular point in the space of possible rhythms,” McDermott says. “That can give you a picture of what we call the prior, which is the set of internal implicit expectations for rhythms that people have in their heads.”
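As a rough illustration of how such an iterated tap-back procedure can converge toward a listener's prior, here is a toy simulation. The set of "preferred" rhythms, the pull strength, and the noise level are invented for the example; they are not the study's model or measured values.

```python
import random

# Illustrative "preferred" rhythms, as normalized proportions of the three
# intervals between four beats. These ratios are assumptions for the demo.
PRIOR_MODES = [
    (1/3, 1/3, 1/3),   # 1:1:1
    (1/4, 1/4, 1/2),   # 1:1:2
    (1/4, 3/8, 3/8),   # 2:3:3
]

def normalize(intervals):
    total = sum(intervals)
    return tuple(i / total for i in intervals)

def nearest_mode(rhythm):
    # Pick the preferred rhythm closest (squared error) to the current one.
    return min(PRIOR_MODES, key=lambda m: sum((a - b) ** 2 for a, b in zip(rhythm, m)))

def reproduce(rhythm, pull=0.3, noise=0.02):
    # One tap-back cycle: drift toward the nearest preferred rhythm, plus motor noise.
    target = nearest_mode(rhythm)
    noisy = [max(0.05, r + pull * (t - r) + random.gauss(0, noise))
             for r, t in zip(rhythm, target)]
    return normalize(noisy)

rhythm = normalize([random.uniform(0.2, 1.0) for _ in range(3)])  # random starting stimulus
for _ in range(10):                                               # several iterations
    rhythm = reproduce(rhythm)

print("final interval ratios:", [round(r / min(rhythm), 2) for r in rhythm])
```

Running the sketch repeatedly from different random starts tends to land on one of the listed ratio patterns, which is the sense in which repeated reproduction exposes the underlying prior.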

When the researchers first did this experiment, with American college students as the test subjects, they found that people tended to produce time intervals that are related by simple integer ratios. Furthermore, most of the rhythms they produced, such as those with ratios of 1:1:2 and 2:3:3, are commonly found in Western music.

The researchers then went to Bolivia and asked members of the Tsimane’ society to perform the same task. They found that Tsimane’ also produced rhythms with simple integer ratios, but their preferred ratios were different and appeared to be consistent with those that have been documented in the few existing records of Tsimane’ music.

“At that point, it provided some evidence that there might be very widespread tendencies to favor these small integer ratios, and that there might be some degree of cross-cultural variation. But because we had just looked at this one other culture, it really wasn’t clear how this was going to look at a broader scale,” Jacoby says.

To try to get that broader picture, the MIT team began seeking collaborators around the world who could help them gather data on a more diverse set of populations. They ended up studying listeners from 39 groups, representing 15 countries on five continents — North America, South America, Europe, Africa, and Asia.

“This is really the first study of its kind in the sense that we did the same experiment in all these different places, with people who are on the ground in those locations,” McDermott says. “That hasn’t really been done before at anything close to this scale, and it gave us an opportunity to see the degree of variation that might exist around the world.”

Cultural comparisons

Just as they had in their original 2017 study, the researchers found that in every group they tested, people tended to be biased toward simple integer ratios of rhythm. However, not every group showed the same biases. People from North America and Western Europe, who have likely been exposed to the same kinds of music, were more likely to generate rhythms with the same ratios. By contrast, many groups, for example those in Turkey, Mali, Bulgaria, and Botswana, showed biases for other rhythms.

“There are certain cultures where there are particular rhythms that are prominent in their music, and those end up showing up in the mental representation of rhythm,” Jacoby says.

The researchers believe their findings reveal a mechanism that the brain uses to aid in the perception and production of music.

“When you hear somebody playing something and they have errors in their performance, you’re going to mentally correct for those by mapping them onto where you implicitly think they ought to be,” McDermott says. “If you didn’t have something like this, and you just faithfully represented what you heard, these errors might propagate and make it much harder to maintain a musical system.”

Among the groups that they studied, the researchers took care to include not only college students, who are easy to study in large numbers, but also people living in traditional societies, who are more difficult to reach. Participants from those more traditional groups showed significant differences from college students living in the same countries, and from people who live in those countries but performed the test online.

“What’s very clear from the paper is that if you just look at the results from undergraduate students around the world, you vastly underestimate the diversity that you see otherwise,” Jacoby says. “And the same was true of experiments where we tested groups of people online in Brazil and India, because you’re dealing with people who have internet access and presumably have more exposure to Western music.”

The researchers now hope to run additional studies of different aspects of music perception, taking this global approach.

“If you’re just testing college students around the world or people online, things look a lot more homogenous. I think it’s very important for the field to realize that you actually need to go out into communities and run experiments there, as opposed to taking the low-hanging fruit of running studies with people in a university or on the internet,” McDermott says.

Source: news.mit.edu

The human brain has billions of neurons. Working together, they enable higher-order brain functions such as cognition and complex behaviors. To study these higher-order brain functions, it is important to understand how neural activity is coordinated across various brain regions. Although techniques such as functional magnetic resonance imaging (fMRI) can provide insights into brain activity, they offer limited spatial and temporal resolution. Two-photon microscopy through cranial windows is a powerful tool for producing high-resolution images, but conventional cranial windows are small, making it difficult to study distant brain regions at the same time.

Now, a team of researchers led by the Exploratory Research Center on Life and Living Systems (ExCELLS) and the National Institute for Physiological Sciences (NIPS) has introduced a new method for in vivo brain imaging, enabling large-scale and long-term observation of neuronal structures and activities in awake mice. This method, called the “nanosheet incorporated into light-curable resin” (NIRE) method, uses fluoropolymer nanosheets covered with light-curable resin to create larger cranial windows.

“The NIRE method is superior to previous methods because it produces larger cranial windows than previously possible, extending from the parietal cortex to the cerebellum, utilizing the biocompatible nanosheet and the transparent light-curable resin that changes in form from liquid to solid,” says lead author Taiga Takahashi of the Tokyo University of Science and ExCELLS.

In the NIRE method, light-curable resin is used to fix polyethylene-oxide–coated CYTOP (PEO-CYTOP), a bioinert and transparent nanosheet, onto the brain surface. This creates a “window” that fits tightly onto the brain surface, even the highly curved surface of the cerebellum, and maintains its transparency for a long time with little mechanical stress, allowing researchers to observe multiple brain regions of living mice.

“Additionally, we showed that the combination of PEO-CYTOP nanosheets and light-curable resin enabled the creation of stronger cranial windows with greater transparency for longer periods of time compared with our previous method. As a result, there were few motion artifacts, that is, distortions in the images caused by the movements of awake mice,” says Takahashi.

The cranial windows allowed for high-resolution imaging with sub-micrometer resolution, making them suitable for observing the morphology and activity of fine neural structures.

“Importantly, the NIRE method enables imaging to be performed for a longer period of more than 6 months with minimal impact on transparency. This should make it possible to conduct longer-term research on neuroplasticity at various levels—from the network level to the cellular level—as well as during maturation, learning, and neurodegeneration,” explains corresponding author Tomomi Nemoto at ExCELLS and NIPS.

This study is a significant achievement in the field of neuroimaging because this novel method provides a powerful tool for researchers to investigate neural processes that were previously difficult or impossible to observe. Specifically, the NIRE method’s ability to create large cranial windows with prolonged transparency and fewer motion artifacts should allow for large-scale, long-term, and multi-scale in vivo brain imaging.

“The method holds promise for unraveling the mysteries of neural processes associated with growth and development, learning, and neurological disorders. Potential applications include investigations into neural population coding, neural circuit remodeling, and higher-order brain functions that depend on coordinated activity across widely distributed regions,” says Nemoto.

In sum, the NIRE method provides a platform for investigating neuroplastic changes at various levels over extended periods in animals that are awake and engaged in various behaviors, which presents new opportunities to enhance our understanding of the brain’s complexity and function.

(Image caption: In vivo two-photon imaging through a large cranial window extending from the cerebral cortex to the cerebellum. Credit: ExCELLS/NINS)


Air pollution particles linked to development of Alzheimer’s

Magnetite, a tiny particle found in air pollution, can induce signs and symptoms of Alzheimer’s disease, new research suggests.

Alzheimer’s disease, a type of dementia, leads to memory loss, cognitive decline, and a marked reduction in quality of life. It impacts millions globally and is a leading cause of death in older individuals.

The study, “Neurodegenerative effects of air pollutant particles: Biological mechanisms implicated for early-onset Alzheimer’s disease,” led by Associate Professor Cindy Gunawan and Associate Professor Kristine McGrath from the University of Technology Sydney (UTS), was recently published in Environment International.

The research team, from UTS, UNSW Sydney and the Agency for Science, Technology and Research in Singapore, examined the impact of air pollution on brain health in mice, as well as in human neuronal cells in the lab. 

Their aim was to better understand how exposure to toxic air pollution particles could lead to Alzheimer’s disease. 

“Fewer than 1% of Alzheimer’s cases are inherited, so it is likely that the environment and lifestyle play a key role in the development of the disease,” said Associate Professor Gunawan, from the Australian Institute for Microbiology and Infection (AIMI).

“Previous studies have indicated that people who live in areas with high levels of air pollution are at greater risk of developing Alzheimer’s disease. Magnetite, a magnetic iron oxide compound, has also been found in greater amounts in the brains of people with Alzheimer’s disease. 

"However, this is the first study to look at whether the presence of magnetite particles in the brain can indeed lead to signs of Alzheimer’s,” she said. 

The researchers exposed healthy mice and those genetically predisposed to Alzheimer’s to very fine particles of iron, magnetite, and diesel hydrocarbons over four months. They found that magnetite induced the most consistent Alzheimer’s disease pathologies.

This included the loss of neuronal cells in the hippocampus, an area of the brain crucial for memory, and in the somatosensory cortex, an area that processes sensations from the body. Increased formation of amyloid plaque was seen in mice already predisposed to Alzheimer’s.

The researchers also observed behavioural changes in the mice that were consistent with Alzheimer’s disease, including increased stress and anxiety and short-term memory impairment, the latter particularly in the genetically predisposed mice.

“Magnetite is a quite common air pollutant. It comes from high-temperature combustion processes like vehicle exhaust, wood fires and coal-fired power stations as well as from brake pad friction and engine wear,” said Associate Professor McGrath from the UTS School of Life Sciences.

“When we inhale air pollutants, these particles of magnetite can enter the brain via the lining of the nasal passage, and from the olfactory bulb, a small structure on the bottom of the brain responsible for processing smells, bypassing the blood-brain barrier,” she said.

The researchers found that magnetite induced an immune response in the mice and in the human neuronal cells in the lab. It triggered inflammation and oxidative stress, which in turn led to cell damage. Inflammation and oxidative stress are significant factors known to contribute to dementia.

“The magnetite-induced neurodegeneration is also independent of the disease state, with signs of Alzheimer’s seen in the brains of healthy mice,” said Dr Charlotte Fleming, a co-first author from the UTS School of Life Sciences.

The results will be of interest to health practitioners and policymakers. They suggest that people should take steps to reduce their exposure to air pollution as much as possible, and that methods to improve air quality and reduce the risk of neurodegenerative disease should be considered.

The study has implications for air pollution guidelines. Magnetite particles should be included in recommended air quality safety thresholds, and increased measures to reduce emissions from vehicles and coal-fired power stations are also needed.

Source: uts.edu.au

Research Shows Continued Cocaine Use Disrupts Communication Between Major Brain Networks

A collaborative research endeavor by scientists in the Departments of Radiology, Neurology, and Psychology and Neuroscience at the UNC School of Medicine has demonstrated the deleterious effects of chronic cocaine use on functional networks in the brain.

Their study, titled “Network Connectivity Changes Following Long-Term Cocaine Use and Abstinence,” was highlighted by the editor of the Journal of Neuroscience in “This Week in The Journal.” The findings show that continued cocaine use affects how crucial neural networks communicate with one another in the brain, including the default mode network (DMN), the salience network (SN), and the lateral cortical network (LCN).

“The disrupted communication between the DMN and SN can make it harder to focus, control impulses, or feel motivated without the drug,” said Li-Ming Hsu, PhD, assistant professor of radiology and lead author on the study. “Essentially, these changes can impact how well they respond to everyday situations, making recovery and resisting cravings more challenging.”

Hsu led this project during his postdoctoral tenure at the Center for Animal MRI in the Biomedical Research Imaging Center and the Department of Neurology. The work provides new insights into the brain processes that underlie cocaine addiction and creates opportunities for the development of therapeutic approaches and the identification of an imaging marker for cocaine use disorders.

The brain operates like an orchestra, where each instrumentalist has a special role crucial for creating a coherent piece of music. Specific parts of the brain need to work together to complete a task. The DMN is active during daydreaming and reflection, the SN is crucial for attentiveness, and the LCN, much like a musical conductor, plays a role in decision-making and problem-solving.

The research was motivated by observations from human functional brain imaging studies suggesting chronic cocaine use alters connectivity within and between the major brain networks. Researchers needed a longitudinal animal model to understand the relationship between brain connectivity and the development of cocaine dependence, as well as changes during abstinence.

Researchers employed a rat model to mimic human addiction patterns, allowing the animals to self-administer cocaine via nose pokes. Paired with advanced neuroimaging techniques, this behavioral approach enables a deeper understanding of the brain’s adaptation to prolonged drug use and highlights how addictive substances can alter the functioning of critical brain networks.

Hsu’s research team used functional MRI scans to explore changes in brain network dynamics in rats that self-administered cocaine over a period of 10 days, followed by abstinence. The researchers observed significant alterations in network communication, particularly between the DMN and SN.

These changes were more pronounced with increased cocaine intake over the 10 days of self-administration, suggesting a potential target for reducing cocaine cravings and aiding those in recovery. The changes in these networks’ communication could also serve as useful imaging biomarkers for cocaine addiction.
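For readers unfamiliar with how "network communication" is quantified in such fMRI analyses, the snippet below sketches the basic idea: functional connectivity as the correlation between two networks' average signal time courses, compared before and after a manipulation. The synthetic time series, effect sizes, and use of a plain Pearson correlation are illustrative assumptions, not the study's actual preprocessing or statistics.

```python
import numpy as np

rng = np.random.default_rng(1)

def connectivity(ts_a, ts_b):
    """Functional connectivity as the Pearson correlation of two network time series."""
    return np.corrcoef(ts_a, ts_b)[0, 1]

n_vols = 300  # number of fMRI volumes (time points), illustrative

# Baseline: DMN and SN signals share a strong common fluctuation -> higher connectivity.
shared = rng.normal(size=n_vols)
dmn_pre = shared + 0.5 * rng.normal(size=n_vols)
sn_pre = shared + 0.5 * rng.normal(size=n_vols)

# After prolonged drug exposure (simulated): weaker shared fluctuation -> lower connectivity.
shared_post = rng.normal(size=n_vols)
dmn_post = 0.4 * shared_post + rng.normal(size=n_vols)
sn_post = 0.4 * shared_post + rng.normal(size=n_vols)

print(f"DMN-SN connectivity, baseline:  {connectivity(dmn_pre, sn_pre):.2f}")
print(f"DMN-SN connectivity, post-drug: {connectivity(dmn_post, sn_post):.2f}")
```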

The study also offered novel insights into the anterior insular cortex (AI) and retrosplenial cortex (RSC). The former is responsible for emotional and social processing, whereas the latter supports episodic memory, navigation, and imagining future events. Researchers noted a difference in coactivity between these two regions before and after cocaine intake. This circuit could be a potential target for modulating associated behavioral changes in cocaine use disorders.

“Prior studies have demonstrated functional connectivity changes with cocaine exposure; however, the detailed longitudinal analysis of specific brain network changes, especially between the anterior insular cortex (AI) and retrosplenial cortex (RSC), before and after cocaine self-administration, and following extended abstinence, provides new insights,” said Hsu.


Neurons help flush waste out of brain during sleep

Sleep holds a paradox: its apparent tranquility belies the brain’s bustling activity. The night is still, but the brain is far from dormant. During sleep, brain cells produce bursts of electrical pulses that combine into rhythmic waves – a sign of heightened brain cell function.

But why is the brain active when we are resting?

Slow brain waves are associated with restful, refreshing sleep. And now, scientists at Washington University School of Medicine in St. Louis have found that brain waves help flush waste out of the brain during sleep. Individual nerve cells coordinate to produce rhythmic waves that propel fluid through dense brain tissue, washing the tissue in the process.

“These neurons are miniature pumps. Synchronized neural activity powers fluid flow and removal of debris from the brain,” explained first author Li-Feng Jiang-Xie, PhD, a postdoctoral research associate in the Department of Pathology & Immunology. “If we can build on this process, there is the possibility of delaying or even preventing neurological diseases, including Alzheimer’s and Parkinson’s disease, in which excess waste – such as metabolic waste and junk proteins – accumulate in the brain and lead to neurodegeneration.”

The findings are published Feb. 28 in Nature.

Brain cells orchestrate thoughts, feelings and body movements, and form dynamic networks essential for memory formation and problem-solving. But to perform such energy-demanding tasks, brain cells require fuel. Their consumption of nutrients from the diet creates metabolic waste in the process.

“It is critical that the brain disposes of metabolic waste that can build up and contribute to neurodegenerative diseases,” said Jonathan Kipnis, PhD, the Alan A. and Edith L. Wolff Distinguished Professor of Pathology & Immunology and a BJC Investigator. Kipnis is the senior author on the paper. “We knew that sleep is a time when the brain initiates a cleaning process to flush out waste and toxins it accumulates during wakefulness. But we didn’t know how that happens. These findings might be able to point us toward strategies and potential therapies to speed up the removal of damaging waste and to remove it before it can lead to dire consequences.”

But cleaning the dense brain is no simple task. Cerebrospinal fluid surrounding the brain enters and weaves through intricate cellular webs, collecting toxic waste as it travels. Upon exiting the brain, contaminated fluid must pass through a barrier before spilling into the lymphatic vessels in the dura mater – the outer tissue layer enveloping the brain underneath the skull. But what powers the movement of fluid into, through and out of the brain?

Studying the brains of sleeping mice, the researchers found that neurons drive cleaning efforts by firing electrical signals in a coordinated fashion to generate rhythmic waves in the brain, Jiang-Xie explained. They determined that such waves propel the fluid movement.

The research team silenced specific brain regions so that neurons in those regions didn’t create rhythmic waves. Without these waves, fresh cerebrospinal fluid could not flow through the silenced brain regions and trapped waste couldn’t leave the brain tissue.

“One of the reasons that we sleep is to cleanse the brain,” Kipnis said. “And if we can enhance this cleansing process, perhaps it’s possible to sleep less and remain healthy. Not everyone has the benefit of eight hours of sleep each night, and loss of sleep has an impact on health. Other studies have shown that mice that are genetically wired to sleep less have healthy brains. Could it be because they clean waste from their brains more efficiently? Could we help people living with insomnia by enhancing their brain’s cleaning abilities so they can get by on less sleep?”

Brain wave patterns change throughout sleep cycles. Of note, taller brain waves with larger amplitude move fluid with more force. The researchers are now interested in understanding why neurons fire waves with varying rhythmicity during sleep and which regions of the brain are most vulnerable to waste accumulation.

“We think the brain-cleaning process is similar to washing dishes,” neurobiologist Jiang-Xie explained. “You start, for example, with a large, slow, rhythmic wiping motion to clean soluble wastes splattered across the plate. Then you decrease the range of the motion and increase the speed of these movements to remove particularly sticky food waste on the plate. Despite the varying amplitude and rhythm of your hand movements, the overarching objective remains consistent: to remove different types of waste from dishes. Maybe the brain adjusts its cleaning method depending on the type and amount of waste.”


The brain is typically depicted as a complex web of neurons sending and receiving messages. But neurons only make up half of the human brain. The other half—roughly 85 billion cells—are non-neuronal cells called glia. The most common type of glial cells are astrocytes, which are important for supporting neuronal health and activity. Despite this, most existing laboratory models of the human brain fail to include astrocytes at sufficient levels or at all, which limits the models’ utility for studying brain health and disease.

Now, Salk scientists have created a novel organoid model of the human brain—a three-dimensional collection of cells that mimics features of human tissues—that contains mature, functional astrocytes. With this astrocyte-rich model, researchers will be able to study inflammation and stress in aging and diseases like Alzheimer’s with greater clarity and depth than ever before. Already, the researchers have used the model to reveal a relationship between astrocyte dysfunction and inflammation, as well as a potentially druggable target for disrupting that relationship.

The findings were published in Nature Biotechnology.

“Astrocytes are the most abundant type of glial cell in the brain, yet they have been underrepresented in organoid models of the brain,” says senior author Rusty Gage, professor and Vi and John Adler Chair for Research on Age-Related Neurodegenerative Disease at Salk. “Our model rectifies this deficit, offering a glial-enriched human brain organoid that can be used to explore the many ways that astrocytes are essential for brain function, and how they respond to stress and inflammation in various neurological conditions.”

In the last 10 years, organoids have emerged as a prevalent tool to bridge the gap between cell and human studies. Organoids can mimic human development and organ generation better than other laboratory systems, allowing researchers to study how drugs or diseases affect human cells in a more realistic setting. Brain organoids are typically grown in culture dishes, but their limited capacity to efficiently produce certain brain cells like astrocytes has remained problematic.

Astrocytes develop through the same pathway as neurons, beginning first as a neuronal stem cell until a molecular switch flips and turns the cell’s fate from neuron to astrocyte. To create a brain organoid with abundant astrocyte populations, the team looked for a way to trigger this switch.

To do this, the researchers delivered specific gliogenic compounds to the organoid, looking to see if they would promote astrocyte formation. The team then began running tests to see whether astrocytes had developed and, if they had, how many and to what extent they had matured.

The brain organoids cultured in a dish still lacked the microenvironment and the neuronal structural arrangement of a human brain. To create a more human brain-like environment, researchers transplanted the organoids into mouse models, allowing them to further develop over several months.

"Our transplanted organoid model produced more sophisticated and differentiated astrocyte populations than would have been possible with older models,” says co-first author Lei Zhang, a former postdoctoral researcher in Gage’s lab. “What was really exciting is that we observed order in the organoids. The organization of functional cell groups in the human brain is very difficult to mimic in a laboratory setting, but these astrocytes in our organoid model were doing just that.”

After observing astrocyte subtype development and maturation in the transplanted organoids, the researchers aimed to investigate the role of astrocytes in the process of neuroinflammation. Aging and age-related neurological diseases have strong ties to the immune system and inflammation, and whether astrocytes are also involved in this relationship has long been a question for neuroscientists.

To test this, the researchers introduced a proinflammatory compound into the transplanted organoids and found that a subtype of astrocytes became activated and promoted further proinflammatory pathways. Additionally, they found that a molecule called CD38 was crucial in mediating metabolic and energetic stress in those reactive astrocytes. Knowing CD38 signaling plays this important role suggests that CD38 inhibitors may be able to alleviate the neuroinflammation and related stresses caused by these reactive astrocytes, says Gage.

“We have created a human brain model for research that is more similar to its real-life counterpart than ever before—it has all the major astrocyte subclasses found in the human cortex,” says co-first author Meiyan Wang, a postdoctoral researcher in Gage’s lab. “With this model, we have already found a link between inflammation and astrocyte dysfunction and, in the process, revealed CD38 as a potentially druggable target to disrupt that link.”

Their findings build on another recent model developed in the lab that featured a different glial cell type, called microglia. While this astrocyte-rich model is the most advanced yet, the team is already looking to improve and expand on their organoid model by incorporating additional brain cell types and promoting further cell maturation. In the meantime, they aim to use the sophisticated model to investigate brain function and dysfunction in new detail, with the hopes that their findings will lead to new interventions and therapeutics for neurological conditions like Alzheimer’s disease.

(Image caption: Human astrocytes (green) extending processes that wrap around the host blood vessel (magenta). Credit: Salk Institute)


How ketamine acts fast and slow

New treatments for depression are needed that act rapidly and also have sustained effects. Ketamine accomplishes this, but toxic side effects limit its long-term use. Scientists haven’t understood how ketamine was able to do both, which hindered drug development.

A new Northwestern Medicine study brings that goal one step closer. This work identifies mechanisms that enable ketamine to work rapidly and also have long-term effects. The short-term and longer-term effects both involve newborn neurons. However, the short-term effects depend on the activity of new neurons that had already been born when the drug was taken, while the longer-term effects are due to an increased number of newborn neurons generated in response to the drug.

“This study is exciting, because it lays the groundwork for development of non-toxic treatments that exert antidepressant effects within hours like ketamine but that also have the longer-term sustained effects necessary for the treatment of depression,” said senior study author Dr. John Kessler, professor of neurology at Northwestern University Feinberg School of Medicine. “This is a tremendous advance for the field.”

The study was published in Cellular and Molecular Life Sciences.

Ketamine differs from most antidepressants because it produces antidepressant effects within hours instead of weeks like most other medications. This is enormously helpful for patients, potentially reducing their risk of death and suicide in the short term. But the drug’s toxic side effects limit its longer-term use.

Corresponding study author Dr. Radhika Rawat, a former research fellow in Kessler’s lab and a third-year medical student at Feinberg, had previously discovered that ketamine’s ability to produce a rapid antidepressant effect is the result of stimulating the activity of newborn neurons so that they fire more rapidly, sending more messages to the rest of the brain.

In Rawat’s new study, she investigated two questions: how does the sustained effect of ketamine work, and is it different from the rapid effect? She found the sustained effect is, indeed, different. It works by increasing the number of immature neurons, which show increased activity and firing.

“To make an analogy, think of the young neurons as ‘teenagers’ who are texting their friends. Increasing the number of text messages spreads information rapidly — that is how ketamine acts rapidly. Increasing the number of teenagers also increases the spread of information, but it takes time for them to be born and mature — that is why there are delayed but longer-term effects.”

Rawat also found that the longer-term effects of ketamine occur by acting on the BMP (bone morphogenetic protein) signaling pathway in the hippocampus. The Kessler lab has previously shown that decreased BMP signaling is a common pathway for the action of standard antidepressants. The new study shows this is also true for the sustained effect of ketamine.


New study links placental oxygen levels to fetal brain development

A new study shows that oxygenation levels in the placenta during the last three months of fetal development are an important predictor of cortical growth (development of the cerebral cortex, the outermost layer of the brain) and are likely a predictor of childhood cognition and behaviour.

“Many factors can disrupt healthy brain development in utero, and this study demonstrates the placenta is a crucial mediator between maternal health and fetal brain health,” said Emma Duerden, Canada Research Chair in Neuroscience & Learning Disorders at Western University, Lawson Health Research Institute scientist and senior author of the study.

The connection between placental health and childhood cognition was demonstrated in previous research using ultrasound, but for this study, Duerden, research scientist Emily Nichols and an interdisciplinary team of Western and Lawson researchers used magnetic resonance imaging (MRI), a far superior and more holistic imaging technique. This novel approach to imaging placental growth allows researchers to study neurodevelopmental disorders very early on in life, which could lead to the development of therapies and treatments.

“While ultrasound provides some measure of placental function, it is imprecise and prone to error, so MRI is just a bit more specific and precise,” said Nichols, lead author of the study. “You wouldn’t use MRI necessarily to diagnose placental growth restriction, you would use ultrasound, but MRI gives us a much better way to understand the mechanisms of the placenta and how placental function is affecting the fetal brain.”

The study, published in the high-impact journal JAMA Network Open, was led by Duerden and Nichols and co-authored by researchers from the Faculty of Education, Schulich School of Medicine & Dentistry, Western Engineering and Lawson Health Research Institute.

The placenta, an organ that develops in the uterus during pregnancy, is the main conduit for oxygenation and nutrients to a fetus, and a vital endocrine organ during pregnancy.

“Anything a fetus needs to grow and thrive is mostly delivered through the placenta so if there is anything wrong with the placenta, the fetus might not be receiving the nutrients or the levels of oxygenation it needs to thrive,” said Nichols.

Poor nutrition, smoking, cocaine use, chronic hypertension, anemia, and diabetes may result in fetal growth restriction and may cause problems for the development of the placenta. Fetal growth restriction is relatively common, occurring in about six per cent of all pregnancies and affecting 30 million pregnancies globally each year.

“There can be many issues related to the healthy development of the placenta,” said Duerden. “If it does not develop properly, the fetal brain may not get enough oxygen and nutrients, which may affect childhood cognition and behaviour.”

Impact, affect and change

The study revealed that a healthy placenta in the third trimester particularly impacts the cortex and the prefrontal cortex, regions of the child’s brain that are important for learning and memory.

“An unhealthy placenta can place babies at risk for later life learning difficulties, or even something more serious, like a neurodevelopmental disorder,” said Duerden. “This research can open a lot of doors as we still don’t really understand everything there is to know about the placenta. We are just scratching the surface.”

The study, funded by grants from Brain Canada, Children’s Health Foundation, Canadian Institutes of Health Research, BrainsCAN and the Molly Towell Perinatal Research Foundation, is also an important first step in biomarking the impact of oxygenation levels in the placenta and considering changes for expectant mothers to deal with less-than-ideal placental conditions.

While oxygenation in the placenta in the third trimester predicts fetal cortical growth (development of the outermost layer of the brain – the cerebral cortex), results of the study indicate it may not affect subcortical maturation, or the deep gray and white matter structures of the brain.

Subcortical structures such as the amygdala and basal ganglia, which are responsible for functions such as children’s temperament and motor control, may be more vulnerable to factors affecting the placenta in the second trimester.

“We now have a better understanding of how the placenta affects the cortex. With this basic knowledge, we now have an idea of how these two things are related and we can identify or benchmark healthy levels that lead to brain cortical growth,” said Nichols. “The subcortical regions of the brain appear to be unaffected by placental growth, at least in the healthy samples from our study.”

Duerden, Nichols, and the team scanned pregnant women twice (during their third trimester) for the study at Western’s Translational Imaging Research Facility.

“This is one of the few datasets in the world where there are two scans collected in utero during the third trimester. There are not many groups in the world doing fetal MRI, so it is a super-rich data set that allows us to look at growth over time,” said Duerden. “Western is probably one of the few places where we can do the research because we have the expertise and the facilities to do it.”

(Image caption: A magnetic resonance imaging (MRI) scan of a healthy third-trimester fetus, shown in the head-down position in utero. Credit: Emily Nichols/TIRF image)


Long-term memory and lack of mental images

When people lack visual imagination, this is known as aphantasia. Researchers from the University Hospital Bonn (UKB), the University of Bonn and the German Center for Neurodegenerative Diseases (DZNE) investigated how the lack of mental imagery affects long-term memory. They were able to show that changes in two important brain regions, the hippocampus and the occipital lobe, as well as their interaction, have an influence on the impaired recall of personal memories in aphantasia. The study results, which advance the understanding of autobiographical memory, have now been published online in the journal eLife.

Most of us find it easy to remember personal moments from our own lives. These memories are usually linked to vivid inner images. People who are unable to create mental images, or only very weak ones, are referred to as aphantasics. Previous neuroscientific studies have shown that the hippocampus, in particular, which acts as the brain's buffer during memory formation, supports both autobiographical memory and visual imagination. However, the relationship between the two cognitive functions has not yet been clarified: "Can you remember specific events in your life without generating inner images? We investigated this question and, in collaboration with the Institute of Psychology at the University of Bonn, studied the autobiographical memory of people with and without visual imagination," says corresponding author Dr. Cornelia McCormick from the Department of Neurodegenerative Diseases and Geriatric Psychiatry, who also conducts research at the DZNE and the University of Bonn.

Recall of memories is dependent on the generation of mental images

The Bonn team led by McCormick investigated whether the hippocampus, in particular its connection, or connectivity, to other brain regions, is altered in people with aphantasia, and examined the brain activities and structures associated with deficits in autobiographical memory in aphantasia. The study involved 14 people with aphantasia and 16 control subjects. The extent of aphantasia and the participants' autobiographical memory were initially determined using questionnaires and interviews.

"We found that people with aphantasia have more difficulty recalling memories. Not only do they report fewer details, but their narratives are less vivid and their confidence in their own memory is diminished. This suggests that our ability to remember our personal biography is closely linked to our imagination," says co-first author Merlin Monzel, a doctoral student at the Institute of Psychology at the University of Bonn.

The study participants then recalled autobiographical events while images of their brains were recorded using functional magnetic resonance imaging (fMRI). "This showed that the hippocampus, which plays an important role in recalling vivid, detailed autobiographical memories, is less activated in people with aphantasia," says co-first author and PhD student Pitshaporn Leelaarporn, who works at the UKB and the DZNE. There were also differences in the interaction between the hippocampus and the visual cortex, which is responsible for processing and integrating visual information in the brain and is located in the occipital lobe. "The connectivity between the hippocampus and the visual cortex correlated with imagination in people without aphantasia, whereas there was no correlation in those affected," explains Leelaarporn.

"Overall, we have been able to show that autobiographical memory does not work as well in people who have limited visual imagination as it does in people who can visualize something very easily. These results raise further questions that we are currently investigating," says McCormick. On the one hand, it is now important to find out whether people who are blind from birth and have never been able to build up a repertoire of inner images can remember detailed autobiographical events. On the other hand, the Bonn researchers want to investigate whether this ability can be trained. "It may even be possible to help people who suffer from memory disorders, such as Alzheimer's disease, by offering training in visual imagination instead of the usual memory training," says McCormick. 

Source: uni-bonn.de
Avatar

Sniffing our way to better health

Imagine if we could inhale scents that delay the onset of cancer, inflammation, or neurodegenerative disease. Researchers at the University of California, Riverside, are poised to bring this futuristic technology closer to reality. 

In lab experiments, a team led by Anandasankar Ray, a professor of molecular, cell and systems biology, exposed the fruit fly (Drosophila melanogaster) to diacetyl, a microbial volatile compound released by yeast, and found changes in gene expression in the fly’s antennae in just a few days. In separate experiments, the team found similar gene expression changes in mice and human cells. 

“That exposure to an odorant can directly alter expression of genes, even in tissues that have no odorant receptors, came as a complete surprise,” Ray said. “These molecules are able to get to the cell nucleus through the cell membrane.”

Diacetyl is widely used in food and beverage flavorings. It occurs naturally in a variety of dairy products and is a natural byproduct of fermentation and brewing. While diacetyl is found in beer, wine, Greek yogurt, and many ripening fruits, it is considered unsafe to inhale at high concentrations.

“Our initial discovery was made using diacetyl, as a proof of concept, and this compound may not be the perfect candidate for therapy,” Ray said. “We are already working on identifying other volatiles that lead to changes in gene expression. Our important finding is that some volatile compounds emitted from microbes and food can alter epigenetic states in neurons and other eukaryotic cells. Ours is the first report of common volatiles behaving in this way. It opens an entire field of inquiry. The possibilities are limitless.”

The research, published in eLife, shows that alterations in gene expression and chromatin — the mixture of DNA and proteins that form chromosomes — are possible in an organism even without the organism actively consuming the volatile compound source. The source could even be at some distance from the organism.

“We have shown for the first time that some of these odor molecules to which we are exposed, and which are being absorbed into the cells of our skin, nose, and lungs, probably even reaching the brain through the bloodstream, are fundamentally altering gene expression,” Ray said. “Is this something to be concerned about? How is it affecting our predisposition to certain diseases? How exactly is it affecting the genes we express? These remain unanswered questions.”

Ray’s team found that diacetyl can act as an inhibitor of histone deacetylase, or HDAC, enzymes, and discovered several related volatiles with similar potential. HDAC inhibitors are used as anti-cancer drugs and may also find use in treating inflammatory diseases and neurodegeneration. When HDACs, which are conserved in plants and animals, are inhibited, DNA is wound less compactly in cells, leading to more gene expression.
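
As a side note on how such enzyme inhibition is commonly quantified (not a description of this study's assay), residual HDAC activity is typically measured across a range of inhibitor concentrations and a dose-response curve is fitted to estimate an IC50. The sketch below uses invented numbers purely for illustration.

```python
# Illustrative sketch: quantifying enzyme inhibition the way an HDAC activity assay
# is typically analyzed -- fit a dose-response curve and report an IC50.
# The concentrations and activity values below are made up, not the paper's data.
import numpy as np
from scipy.optimize import curve_fit

def inhibition_curve(conc, ic50, hill):
    """Fractional HDAC activity remaining at a given inhibitor concentration."""
    return 1.0 / (1.0 + (conc / ic50) ** hill)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])          # mM, hypothetical
activity = np.array([0.98, 0.95, 0.85, 0.65, 0.40, 0.20, 0.08])  # fraction of control

params, _ = curve_fit(inhibition_curve, conc, activity, p0=[0.5, 1.0])
ic50, hill = params
print(f"Estimated IC50 ~ {ic50:.2f} mM, Hill slope ~ {hill:.2f}")
```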

“This opens the potential for odorant-based HDAC inhibitors to delay neurodegeneration or memory deficits in diseases,” Ray said.

In the fruit fly, Ray’s team found that exposure to diacetyl volatiles substantially slowed degeneration of photoreceptor cells in a fly model of Huntington’s disease. In transgenic mice, exposure to diacetyl produced gene expression changes in the lungs and brain; genes that are upregulated in cancers such as neuroblastoma showed significantly reduced expression in mice exposed to diacetyl.

In human cell lines, the team found that diacetyl changed acetylation levels, with higher acetylation resulting in higher levels of gene expression. In further testing on human cancer cell lines, exposure to diacetyl prevented the proliferation of neuroblastoma cells.

Apart from human diseases, the research has enormous implications for agriculture. Because HDACs are highly conserved across plants and animals, the volatiles also affect plants.

“Plants appear to have a very strong response to some of these volatiles,” Ray said. “In plants, any process that requires changes in gene expression can now be affected via exposure to this special class of odorants.” 

Ray explained that the volatiles are like tiny drugs that can change levels of gene expression and exploit the plant’s genetic potential for improving growth of roots, leaves, flowers and even responses to abiotic stress like freezing and drought.

“Volatile chemicals can deliver a therapeutic dose to plants and animals, with no need for pills or injections,” he said. “They can simply be breathed in, almost giving a new meaning to scent-based therapy.”

With the help of the Office of Technology Partnerships at UCR, Ray has filed patents for volatiles that can slow down neurodegeneration and cancer and alter plant growth and responses to stress.

Last year, Ray launched a startup, Remote Epigenetics, which has the exclusive license to use these volatiles that alter gene expression. The company is headquartered in the Multipurpose Research Building on the UCR campus. The new agritech startup will focus on developing new tools for agriculture using low-cost volatiles to address several important problems. 

Ray was joined in the research by Sachiko Haga-Yamanaka, Rogelio Nuñez-Flores, Christi Ann Scott, Sarah Perry, Stephanie Turner Chen, Crystal Pontrello, and Meera Goh Nair of UCR.

Ray is also the founder of another startup, Sensorygen, which works on the computational neurobiology of olfaction and taste.

Source: news.ucr.edu
Avatar

Scientists at the University of Zurich have developed an innovative neural cell culture model, shedding light on the intricate mechanisms underlying neurodegeneration. Their research pinpointed a misbehaving protein as a promising therapeutic target in the treatment of amyotrophic lateral sclerosis (ALS) and frontotemporal dementia (FTD).

Neurodegenerative diseases cause some of the neurons in our brains to die, resulting in different symptoms depending on the brain region affected. In amyotrophic lateral sclerosis (ALS), neurons in the motor cortex and spinal cord degenerate, leading to paralysis. In frontotemporal dementia (FTD), on the other hand, neurons located in the parts of the brain involved in cognition, language and personality are affected.

Both ALS and FTD are relentlessly progressive diseases and effective treatments are still lacking. As the population ages, the prevalence of age-related neurodegenerative diseases such as ALS and FTD is expected to increase.

The aberrant accumulation of a protein called TDP-43 in neurons of the central nervous system has been identified as a common factor in the vast majority of ALS cases and about half of FTD cases. Nevertheless, the underlying cellular mechanisms driving neurodegeneration remain largely unknown.

Flexible, durable, reproducible: ideal cell culture model for ALS and FTD research

In their study, first author Marian Hruska-Plochan and corresponding author Magdalini Polymenidou of the Department of Quantitative Biomedicine at the University of Zurich developed a novel neural cell culture model that replicates the aberrant behavior of TDP-43 in neurons. Using this model, they discovered a toxic increase in the protein NPTX2, suggesting it as a potential therapeutic target for ALS and FTD.

To mimic neurodegeneration, Marian Hruska-Plochan developed a new cell culture model called “iNets” (short for “interconnected neuronal networks”), derived from human induced pluripotent stem cells. These cells, which originate from skin cells reprogrammed in the laboratory to a very early, undifferentiated stage, serve as a source for deriving many different desired cell types. iNets are networks of interconnected neurons and their supporting cells that grow in multiple layers in a dish.

The cultures lasted exceptionally long – up to a year – and were easily reproduced. “The robustness of aging iNets allows us to perform experiments that would not have been possible otherwise,” says Hruska-Plochan. “And the flexibility of the model makes it suitable for a wide range of experimental methodologies.” As a case in point, the iNets cell cultures provided the ideal model to investigate the progression from TDP-43 dysfunction to neurodegeneration.

How protein dysfunction leads to neurodegeneration

Employing the iNets model, the researchers identified a toxic accumulation of NPTX2, a protein normally secreted by neurons through synapses, as the missing link between TDP-43 misbehavior and neuronal death. To validate their hypothesis, they examined brain tissue from deceased ALS and FTD patients and indeed found that, also in patients, NPTX2 accumulated in cells containing abnormal TDP-43. This means that the iNets culture model accurately predicted ALS and FTD patient pathology.

In additional experiments in the iNets model, the researchers tested whether NPTX2 could be a target for drug design to treat ALS and FTD. The team engineered a setup in which they lowered the levels of NPTX2 while neurons were suffering from TDP-43 misbehavior. They found that keeping NPTX2 levels low counteracted neurodegeneration in the iNets neurons. Therefore, drugs that reduce the amount of the protein NPTX2 have potential as a therapeutic strategy to halt neurodegeneration in ALS and FTD patients.

Magdalini Polymenidou sees great promise in this discovery: “We still have a long way to go before we can bring this to the patients, but the discovery of NPTX2 gives us a clear shot of developing a therapeutic that acts at the core of the disease,” she said. “In conjunction with two additional targets recently identified by other research teams, it is conceivable that anti-NPTX2 agents could emerge as a key component of combination therapies for ALS and FTD in the future,” she added.

(Image caption: Progressive degeneration in a neuronal network: blue represents healthy neurons, while orange and red represent the protein NPTX2. Yellow shows the toxic aggregation of the protein TDP-43. Digital drawing: Niklas Bargenda)

Avatar

Uncovering Anxiety: Scientists Identify Causative Pathway and Potential Cures

Anxiety-related disorders can have a profound impact on the mental health and quality of life of affected individuals. Understanding the neural circuits and molecular mechanisms that trigger anxiety can aid in the development of effective targeted pharmacological treatments. Delta opioid receptors (DOP), which localize in the regions of the brain associated with emotional regulation, play a key role in the development of anxiety. Several studies have demonstrated the therapeutic effects of DOP agonists (synthetic compounds which selectively bind to DOPs and mimic the effect of the natural binding compound) in a wide range of behavioral disorders. One such selective DOP agonist—KNT-127—has been shown to exert 'anxiolytic' or anxiety-reducing effects in animal models, with minimal side effects. However, its mechanism of action is not clearly understood, thereby limiting its widespread clinical application.

To bridge this gap, Professor Akiyoshi Saitoh, along with Ms. Ayako Kawaminami and team from the Tokyo University of Science, Japan, conducted a series of experiments and behavioral studies in mice. Explaining the rationale behind their work, Prof. Saitoh says, "There are currently no therapeutic drugs mediated by delta opioid receptors (DOPs). DOPs likely exert anti-depressant and anti-anxiety effects through a mechanism of action different from that of existing psychotropic drugs. DOP agonists may, therefore, be useful for treatment-resistant and intractable mental illnesses which do not respond to existing treatments." Their study was published in Neuropsychopharmacology Reports.

The neuronal network projecting from the 'prelimbic cortex' (PL) of the brain to the 'basolateral nucleus of the amygdala' (BLA) region has been implicated in the development of depression and anxiety-like symptoms. The research team has previously shown that KNT-127 inhibits the release of glutamate (a key neurotransmitter) in the PL region. Based on this, they hypothesized that DOP activation by KNT-127 suppresses glutamatergic transmission and attenuates PL-BLA-mediated anxiety-like behavior. To test this hypothesis, they developed an 'optogenetic' mouse model wherein they implanted a light-responsive chip in the PL-BLA region of mice and activated the neural circuit using light stimulation. They then assessed the role of PL-BLA activation in innate and conditioned anxiety-like behavior.

They used the elevated-plus maze (EPM) test, which consists of two open arms and two closed arms on opposite sides of a central open field, to assess behavioral anxiety in the mice. Notably, mice with PL-BLA activation spent less time in the central region and open arms of the maze compared to controls, consistent with innate anxiety-like behavior. Next, the researchers assessed the animals' conditioned fear response by exposing them to foot shocks and placing them in the same shock chamber the following day without delivering any shocks. They recorded the animals' freezing response, which reflects fear. Notably, animals with PL-BLA activation and controls exhibited similar behavior, suggesting that distinct neural pathways control innate anxiety-like behavior and the conditioned fear response.
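
For readers unfamiliar with the assay, the EPM readout usually boils down to the fraction of time an animal spends in the open arms versus the closed arms. The sketch below shows one simple way such scoring can be done from tracked positions; the maze geometry, sampling rate, and example data are assumptions, not details from this study.

```python
# Minimal sketch of how time allocation in the elevated-plus maze can be scored from
# tracked (x, y) positions. Maze layout and tracking rate are assumed for illustration.
import numpy as np

ARM_HALF_WIDTH = 5.0  # cm; assumed half-width of each arm and of the center square

def score_epm(xy):
    """Return the fraction of samples spent in open arms, closed arms, and the center.

    Assumed layout: maze centered at (0, 0), open arms along the x axis,
    closed arms along the y axis.
    """
    x, y = np.abs(xy[:, 0]), np.abs(xy[:, 1])
    in_center = (x < ARM_HALF_WIDTH) & (y < ARM_HALF_WIDTH)
    in_open = (x >= ARM_HALF_WIDTH) & (y < ARM_HALF_WIDTH)
    in_closed = (y >= ARM_HALF_WIDTH) & (x < ARM_HALF_WIDTH)
    return {"open": in_open.mean(), "closed": in_closed.mean(), "center": in_center.mean()}

# Hypothetical 5-minute session tracked at 10 Hz (3000 samples).
rng = np.random.default_rng(1)
arm = rng.choice(["open", "closed", "center"], size=3000, p=[0.2, 0.7, 0.1])
xy = np.zeros((3000, 2))
dist = rng.uniform(ARM_HALF_WIDTH, 30.0, size=3000)
sign = rng.choice([-1.0, 1.0], size=3000)
xy[arm == "open", 0] = (dist * sign)[arm == "open"]
xy[arm == "closed", 1] = (dist * sign)[arm == "closed"]
print(score_epm(xy))  # an anxious animal would show a low "open" fraction
```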

Finally, they examined the effects of KNT-127 treatment on anxiety-like behavior in mice using the EPM test. Remarkably, animals treated with KNT-127 spent a greater percentage of time in the open arms and central field of the maze compared to controls. These findings suggest that KNT-127 reduces anxiety-like behavior induced by the specific activation of the PL-BLA pathway.

Overall, the study reveals the role of the PL-BLA neuronal axis in the regulation of innate anxiety, and its potential function in DOP-mediated anxiolytic effects. Further studies are needed to understand the precise underlying molecular and neuronal mechanisms, for the development of novel therapies targeting DOP in the PL-BLA pathway.

Highlighting the long-term clinical applications of their work, Prof. Saitoh remarks, "The brain neural circuits focused on in this study are conserved in humans, and research on human brain imaging has revealed that the PL-BLA region is overactive in patients with depression and anxiety disorders. We are optimistic that suppressing overactivity in this brain region using DOP-targeted therapies can exert significant anxiolytic effects in humans."

Source: tus.ac.jp
Avatar

Newly discovered brain cells play a key role in right and left turns

Have you ever wondered what happens in the brain when we move to the right or left? Most people don’t; they just do it without thinking about it. But this simple movement is actually controlled by a complex process. 

In a new study, researchers have discovered the missing piece in the complex nerve-network needed for left-right turns. The discovery was made by a research team consisting of Assistant Professor Jared Cregg, Professor Ole Kiehn, and their colleagues from the Department of Neuroscience at the University of Copenhagen. 

In 2020, Ole Kiehn, Jared Cregg and their colleagues identified the 'brain's steering wheel' – a network of neurons in the lower part of the brainstem that commands right and left movements when walking. At the time, though, it was not clear to them how this right-left circuit is controlled by other parts of the brain, such as the basal ganglia.

“We have now discovered a new group of neurons in the brainstem which receives information directly from the basal ganglia and controls the right-left circuit,” Ole Kiehn explains.

Eventually, this discovery may help people suffering from Parkinson’s disease. The study has been published in the scientific journal Nature Neuroscience.

The basal ganglia are located deep within the brain. For many years now, they have been known to play a key role in controlling voluntary movements. 

Years ago, scientists learned that stimulating the basal ganglia can affect right and left movements in mice. They just did not know how.

“When walking, you will shorten the step length of the right leg before making a right-hand turn and the left leg before making a left-hand turn. The newly discovered network of neurons is located in a part of the brainstem known as PnO. They are the ones that receive signals from the basal ganglia and adjust the step length as we make a turn, and which thus determine whether we move to the right or left,” Jared Cregg explains. 
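
A toy way to see why asymmetric step lengths produce a turn is to treat the left and right footfall sequences like the two wheels of a differential drive: the side taking shorter steps advances less, so the heading rotates toward that side over successive strides. The body width, step lengths, and the model itself in the sketch below are illustrative assumptions, not measurements from the study.

```python
# Toy kinematic sketch (an assumption of this write-up, not the study's model): treating
# the two sides like the wheels of a differential drive, the side taking shorter steps
# advances less per stride, so the heading rotates toward that side.
import math

BODY_WIDTH = 0.03  # m; assumed lateral distance between left and right footfalls

def heading_change(left_step, right_step, body_width=BODY_WIDTH):
    """Heading change per stride in radians; positive means a left turn."""
    return (right_step - left_step) / body_width

x, y, heading = 0.0, 0.0, 0.0
for _ in range(10):                        # ten strides with a shortened right step
    left_step, right_step = 0.05, 0.04     # m; right step is 1 cm shorter
    heading += heading_change(left_step, right_step)  # negative here, i.e. a right turn
    stride = (left_step + right_step) / 2.0
    x += stride * math.cos(heading)
    y += stride * math.sin(heading)

print(f"heading after 10 strides: {math.degrees(heading):.1f} deg "
      f"(negative = turned right), position: ({x:.2f}, {y:.2f}) m")
```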

The study therefore provides a key to understanding how these absolutely essential movements are produced by the brain. 

In the new study, the researchers studied the brain of mice, as their brainstem closely resembles the human brainstem. Therefore, the researchers expect to find a similar right-left circuit in the human brain.  

People with Parkinson’s have difficulties making right and left turns 

Parkinson’s disease is caused by a lack of dopamine in the brain. This affects the basal ganglia, and the researchers responsible for the new study believe that this leads to failure to activate the brainstem’s right-left circuit. 

And it makes sense when you look at the symptoms experienced by people with Parkinson’s at a late stage of the disease – they often have difficulties turning when walking. 

In the new study, the researchers studied this in mice with symptoms resembling those of people with Parkinson’s disease. They created a so-called Parkinson’s model by removing dopamine from the brains of mice, giving them motor symptoms similar to those experienced by people suffering from Parkinson’s disease.

“These mice had difficulties turning, but by stimulating the PnO neurons we were able to alleviate the turning difficulties,” Jared Cregg says.

Scientists may eventually be able to develop a similar stimulation approach for humans using Deep Brain Stimulation. At present, though, they are unable to stimulate human brain cells as precisely as in mouse models, where the researchers used advanced optogenetic techniques.

“The neurons in the brainstem are a mess, and electric stimulation, which is the type of stimulation used in human Deep Brain Stimulation, cannot distinguish the cells from one another. However, our knowledge of the brain is constantly growing, and eventually we may be able to start considering focused Deep Brain Stimulation of humans,” Ole Kiehn concludes. 

Source: news.ku.dk
Avatar

How Does the Brain Make Decisions?

Scientists have gained new insights into how neurons in the brain communicate during a decision, and how the connections between neurons may help reinforce a choice.

The study — conducted in mice and led by neuroscientists at Harvard Medical School — is the first to combine structural, functional, and behavioral analyses to explore how neuron-to-neuron connections support decision-making.

“How the brain is organized to help make decisions is a big, fundamental question, and the neural circuitry — how neurons are connected to one another — in brain areas that are important for decision-making isn’t well understood,” said Wei-Chung Allen Lee, associate professor of neurobiology in the Blavatnik Institute at HMS and professor of neurology at Boston Children’s Hospital. Lee is co-senior author on the paper with Christopher Harvey, professor of neurobiology at HMS, and Stefano Panzeri, professor at University Medical Center Hamburg-Eppendorf.

In the research, mice were tasked with choosing which way to go in a maze to find a reward. The researchers found that a mouse’s decision to go left or right activated sequential groups of neurons, culminating in the suppression of neurons linked to the opposite choice.

These specific connections between groups of neurons may help sculpt decisions by shutting down neural pathways for alternative options, Lee said.

Findings appear Feb. 21 in Nature.

A fruitful collaboration is born

It was a chance meeting on a bench outside their building during a fire drill that led Harvey and Lee to realize the complementary nature of their work. On that day, they forged a collaboration that propelled the new work.

The Harvey lab uses mice to study behavioral and functional aspects of decision-making. Typical experiments involve placing a mouse in a virtual reality maze and recording neural activity as it makes decisions. Such experiments have shown that distinct, but intermingled, sets of neurons fire when an animal chooses left versus right.
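
One standard way to show that distinct-but-intermingled populations carry choice information (not necessarily the analysis used in these experiments) is to decode the left/right choice from trial-by-trial firing rates. The sketch below simulates such data and trains a cross-validated logistic-regression decoder; every number in it is made up.

```python
# Illustrative sketch (not the study's analysis): if some neurons are choice-selective,
# left vs. right can be decoded from trial-averaged firing rates. All data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_trials, n_neurons = 200, 50
choice = rng.integers(0, 2, size=n_trials)            # 0 = left, 1 = right

# A random subset of neurons is weakly selective for one choice or the other.
selectivity = rng.normal(0.0, 1.0, size=n_neurons) * (rng.random(n_neurons) < 0.3)
rates = rng.normal(5.0, 1.0, size=(n_trials, n_neurons))
rates += np.outer(choice - 0.5, selectivity)          # shift rates according to choice

decoder = LogisticRegression(max_iter=1000)
acc = cross_val_score(decoder, rates, choice, cv=5).mean()
print(f"cross-validated decoding accuracy: {acc:.2f}")
```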

Lee works in a new field of neuroscience called connectomics, which aims to comprehensively map connections between neurons in the brain. The goal, he said, is to figure out “which neurons are talking to each other, and how neurons are organized into networks.”

By combining their expertise, Harvey and Lee were able to delve deeper into the different types of neurons involved in decision-making and how these neurons are connected.

Choosing a direction

The new study focused on a region of the brain called the posterior parietal cortex — what Lee describes as an “integrative hub” that receives and processes information gathered by multiple senses to help animals make decisions.

“We were interested in understanding how neural dynamics arise in this brain area that is important for navigational decision-making,” Lee said. “We’re looking for rules of connectivity — simple principles that provide a foundation for the brain’s computations as it makes decisions.”

The Harvey lab recorded neural activity as mice ran through a T-shaped maze in virtual reality. A cue presented several seconds beforehand indicated to the mice whether a reward would be in the left or right arm of the T. The Lee lab then used powerful microscopes to map the structural connections between the same neurons recorded during the maze task.

By combining modalities, the researchers distinguished excitatory neurons — those that activate other cells — from inhibitory neurons, which suppress other cells. They found that a specific set of excitatory neurons fired when a mouse decided to turn right, and these “right-turn” neurons activated a set of inhibitory neurons that curbed activity in “left-turn” neurons. The opposite was true when a mouse decided to turn left.

“As the animal is expressing one choice, the wiring of the neuronal circuit may help stabilize that choice by suppressing other choices,” Lee said. “This could be a mechanism that helps an animal maintain a decision and prevents ‘changes of mind’.”
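
The motif described here, in which each choice population recruits inhibition onto its competitor, behaves like a classic winner-take-all network: a small initial bias toward one option grows while the alternative is suppressed. The rate-model sketch below illustrates that idea with arbitrary parameters; it is not the study's model.

```python
# Minimal winner-take-all rate model of cross-inhibition between two choice populations.
# Parameters are arbitrary, chosen only to show that a small bias decides the winner.

def simulate(bias_right=0.1, steps=400, dt=0.01, tau=0.1, w_self=0.5, w_cross=2.0):
    relu = lambda v: max(v, 0.0)
    r_right, r_left = 0.0, 0.0           # firing rates of the two choice populations
    for _ in range(steps):
        # each population excites itself and, via inhibitory neurons, suppresses the other
        drive_right = relu(1.0 + bias_right + w_self * r_right - w_cross * r_left)
        drive_left = relu(1.0 + w_self * r_left - w_cross * r_right)
        r_right += dt / tau * (-r_right + drive_right)
        r_left += dt / tau * (-r_left + drive_left)
    return r_right, r_left

print(simulate(bias_right=0.1))  # right pool settles high, left pool is driven to ~0
```

Flipping the bias to favor the left population reverses the outcome, mirroring the left-turn case described above.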

The findings need to be confirmed in humans, although Lee expects that there is some conservation across species.

The researchers see many directions for future research. One is exploring the connections between neurons involved in decision-making in other brain regions.

“We used these combined experimental techniques to find one rule of connectivity, and now we want to find others,” Lee said.
