Articles Published by Meta News 

(acquired by the Chan Zuckerberg Initiative)

MDMA-assisted Psychotherapy for the Treatment of PTSD

MDMA (3,4-methylenedioxy-methamphetamine) is a methamphetamine derivative commonly known as the club drug “ecstasy.” Party-goers take ecstasy (also known as “E,” “X,” or “Molly”) for its euphoric and empathogenic effects. Due to significant safety concerns, including hyperthermia and dehydration, MDMA is illegal in most countries. However, researchers have been exploring the use of MDMA in clinical research as a potential therapy for post-traumatic stress disorder (PTSD) by taking advantage of its effects on sociability and communication. Many victims of abuse, as well as war veterans, suffer from PTSD, a debilitating anxiety disorder. The symptoms of PTSD include flashbacks of negative past events, as well as hyperarousal and avoidance.

MDMA acts by increasing levels of monoamines in the brain (mostly, serotonin) by blocking their reuptake. The acute effects of MDMA administration include enhanced endurance and energy, greater sociability and extraversion, and a heightened sense of closeness to others. Long-term use, however, can lead to serotonin toxicity, psychiatric problems, and cardiovascular toxicity. Since MDMA is illegal, current formulations of the product sold on the streets are unregulated, making it impossible for users to know exactly what they are taking. Clearly, the use of MDMA in medical research is quite controversial due to the safety concerns associated with the drug.

The Multidisciplinary Association for Psychedelic Studies (MAPS) is a non-profit organization that is pioneering clinical studies to develop MDMA into an FDA-approved prescription drug for the treatment of PTSD. Since MDMA is short-acting (the primary effects usually last approximately 4 hours), MAPS sees the potential for MDMA-assisted psychotherapy in helping PTSD patients come to terms with and reduce the fear associated with past memories. MAPS has designed a series of Phase II and III global clinical trials aimed at demonstrating the efficacy of MDMA for PTSD therapy. If successful, MDMA would become the first psychedelic drug approved by the FDA for clinical use.

In 2011, Mithoefer et al. published results from the first randomized controlled trial of MDMA in patients with PTSD. The study included 23 patients whose ages ranged from 21 to 70 years. Patients were divided into two groups, and each group received two psychotherapy sessions with either concomitant MDMA administration or placebo. Based on standardized symptom scales, MDMA-assisted psychotherapy resulted in clinically and statistically significant improvements in PTSD symptoms. During administration, some patients experienced elevated blood pressure, pulse, and body temperature; however, these measurements returned to baseline by the end of each session. The most common side effects reported included jaw tightness, nausea, feeling cold, dizziness, loss of appetite, and impaired balance, but no serious drug-related adverse events occurred. An additional study published two years later reported similar results (Oehen et al., 2013). Although the results did not reach statistical significance in this study, the safety profile was favorable. These preliminary results are very encouraging and may pave the way for a wider world of medical research on psychedelic drugs.


The Genetic Link Between Creativity and Mental Illness

“There is only one difference between a madman and me. I am not mad.” — Salvador Dali

Sylvia Plath was a tortured writer who famously put her head in the oven and ended her life. Van Gogh was a brilliant painter who is best known for cutting off his ear. Throughout history, many great artists have suffered from mental illness. As such, the association between creativity and mental illness has been hotly debated since the Renaissance. Much of the early work on this association was derived from anecdotes or biographies of artists who suffered from mental disorders.

Some of the first clinical studies in the 1980s evaluated small samples of highly creative individuals and reported that they had significantly higher rates of mental illness. Additionally, studies have found that the first-degree relatives of these individuals were predisposed to creativity as well as to mental illness. Other studies have indicated that there is a much greater risk of suicide among creative individuals who suffer from mental illness.

Although many writers and artists suffer from depression as well as bipolar disorder, much of the research has focused on the relationship between schizophrenia and creativity. Both the symptoms and the neurological substrates of schizophrenia overlap considerably with those of highly creative individuals. The frontal lobe supports logic, decision-making, and personality. Dysfunction of the frontal lobe can lead to delusions, hallucinations, and disorganized thought processes, which are hallmarks of schizophrenia as well as a source of inspiration for the creative mind.

It is well known that genetics play an important role in schizophrenia, as several studies have demonstrated that the risk of mental illness is greater when there is a family history of the disease. Despite these known genetic associations, it has been difficult to pinpoint just how many genes are affected and what that might mean clinically. In 2014, the Schizophrenia Working Group of the Psychiatric Genomics Consortium published the largest molecular genetics study on schizophrenia. Based on this genome-wide association study, they identified 108 loci that met genome-wide significance, 83 of which were novel.

In a more recent study, a research team in Iceland investigated whether genetic variants that are associated with the risk of mental illness were also associated with creativity. They first calculated polygenic risk scores (cumulative genetic risk profiles) and then compared them across a sample of 86,292 Icelandic individuals. They found that higher polygenic risk scores were associated with highly creative individuals, suggesting that there is indeed a shared genetic link between creativity and psychosis.
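To make the term concrete: a polygenic risk score is essentially a weighted sum of an individual's risk-allele counts, with the weights taken from effect sizes estimated in a genome-wide association study. The minimal sketch below illustrates the arithmetic only; the variant IDs and weights are invented, not drawn from the Icelandic study.

```python
# Toy polygenic risk score: sum of (GWAS effect size) x (risk-allele count).
# Variant IDs and effect sizes below are invented for illustration.

gwas_effect_sizes = {   # hypothetical log-odds ratios from a GWAS
    "rs0001": 0.12,
    "rs0002": -0.05,
    "rs0003": 0.08,
}

def polygenic_risk_score(genotype):
    """genotype maps variant ID -> risk-allele count (0, 1, or 2)."""
    return sum(weight * genotype.get(variant, 0)
               for variant, weight in gwas_effect_sizes.items())

# One hypothetical individual's genotype at the three variants:
individual = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
score = polygenic_risk_score(individual)  # 2*0.12 + 1*(-0.05) + 0*0.08 = 0.19
```

Real scores aggregate thousands of variants in exactly this way; comparing score distributions between groups (here, creative professionals versus the general population) is what the study's analysis rests on.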

Despite accumulating evidence, such as the results from this study, the issue will remain controversial because it raises an important and interesting question: if a therapy were ever made available to creative individuals suffering from mental illness, would they take it, or would they rather retain the ability to create the art that so greatly defines them? As Edvard Munch stated, “[My troubles] are part of me and my art. They are indistinguishable from me, and it [treatment] would destroy my art.” There is indeed a thin line between creativity and madness but, for many artists and writers, that line has always been blurred, and they wouldn’t want it any other way.


Inside Out: Neuroscience Fact or Fiction? 

Pixar’s latest animated movie, Inside Out, takes viewers on a journey inside the brain of Riley, an 11-year-old girl. The five main facets of emotion, “Anger,” “Disgust,” “Fear,” “Sadness,” and “Joy,” narrate the ups and downs of her pre-adolescent life. Although the movie provides entertainment, with engaging characters and epic voyages, it also tells a great story of how the brain processes and controls behavior and how emotions evolve and change over time — but does the science back up the story?

Emotional Learning

At the beginning of the movie, “Joy” is the main leader of the bunch, taking reign and keeping the other “negative” emotions in check. As Riley enters her pre-adolescent years and moves from her childhood home in Minnesota, there is a corresponding shift in emotion and “Sadness” begins to take over. A battle between “Joy” and “Sadness” ensues, as “Joy” attempts to regain control.

The amygdala, an almond-shaped structure in the brain, regulates emotional processing. Although many brain regions support the expression of emotion, the amygdala transmits and modulates information across these regions. An imbalance in these circuits can lead to disorders such as depression and anxiety. A common therapeutic technique, Cognitive Behavioral Therapy (CBT), is based on the premise that emotional disorders are driven by cognitive factors that can be controlled by retraining the brain to undo harmful emotional pathways. Inside Out demonstrates how CBT works by showing how these emotional circuits can change over time and how finding a balance among all emotions is key to overall mental health.

Memory Processing

The characters in the movie enjoyed playing back some of Riley’s early memories in a fashion similar to the way some might watch home movies of their family. Each time a distinct experience in Riley’s life was captured, a different-colored “marble” would come rolling into the hub, where it might stay in either short-term memory or be shipped off to long-term memory. The cognitive structure that supports human memory is a bit more complex than this marble shuffling would have you believe.

The hippocampus is a seahorse-shaped structure that is responsible for memory processing. In addition to encoding and storing memories, the hippocampus coordinates the activity of many other brain regions that contribute to memory in different ways. Although the movie depicts the brain storing all of the details associated with each “marble,” allowing for precise playback (or recall), the components that make up a memory are actually distributed across a wide array of connections in the brain. Memory encoding and recall are intricate processes that rely upon a web of networks; a brain full of memory “marbles” would simply not hold!

Cognitive and Personality Development

In addition to showing us how emotion works in the brain, the characters in the movie also introduce us to personality development with Riley’s “personality islands.” As she moves from her childhood home in Minnesota to San Francisco, Riley’s old islands begin to crumble and new ones are built based on her evolving life. This depiction of personality development reflects Albert Bandura’s theory of social learning. Bandura posited that personality develops as a result of one’s social environment; thus, new patterns of behavior are acquired through observing the behavior of others, which eventually leads to the building blocks of personality.

Riley’s move from Minnesota to San Francisco corresponded with her transition from childhood to pre-adolescence. According to Jean Piaget’s theories on cognitive development, Riley was also on the cusp of transitioning from the “concrete operational” to the “formal operational” stage, a crucial time period reflected in her attitudinal changes as she recognizes her own personal situation and begins to rationalize her experiences and needs.

In summary, Inside Out took on a very tall order and managed to provide an entertaining and fairly accurate account of the emotional and cognitive processes that are supported by extremely complex neurological machinery, giving us a glimpse of what goes on inside our own very busy minds every day.


The Cognitive Neuroscience of Music Preference

Music is a ubiquitous part of human culture and plays a key role in the development of one’s unique identity. It helps us connect socially and helps define cultural groups. The powerful effects of music on emotions and behavior have been studied since the fourth century BCE. Over the past two decades, however, the development of functional imaging technology has led to a surge in the field of music cognition.

Neuroimaging studies have revealed that several structures, including the medial thalamus, dorsal cingulate cortex, and retrosplenial cortex, are involved in processing the tonalities of music, while the amygdala and cerebellum are responsible for rhythmic perception and timing. The orbitofrontal cortex and cingulate gyrus are activated when listening to music that induces positive emotions. This strong emotional response is believed to be under the control of the neurotransmitter dopamine (the “reward” chemical), as blocking its activity during listening suppresses the emotional experience.

Since musical preferences are so unique to each individual and yet we all experience a similar emotional response, several studies have looked at the pleasurable effect of listening to music on brain activation levels. Wilkins et al. (2014) found an association between activation of the precuneus, a part of the parietal lobe, and the “default,” or resting-state, network of the brain, which is believed to support self-referential thoughts, empathy, and self-awareness. This study also found differential activation between the hippocampus (which supports learning and memory) and the auditory cortex (which supports sound processing) when listening to a preferred new song compared to an old favorite. Interestingly, these activation patterns were found to be similar across individuals regardless of their specific preferences, which may explain how we all experience similar positive experiences despite our unique preferences that range from Beethoven to Eminem.

This strong emotional response to certain types of music has led many researchers to explore the underlying basis of individual differences in music preferences. Studies have attempted to group music preferences into different genres, such as “urban” and “pop,” in order to understand the cultural, regional, and personality traits that best align with these preferences. Rentfrow et al. (2011) proposed a 5-factor model of music preference based on listeners’ affective reactions to a wide variety of music samples, which consist of mellow, urban, sophisticated, intense, and campestral factors, known as the MUSIC model.

A more recent study utilized the MUSIC model in the context of the empathizing-systemizing (E-S) theory to investigate the link between empathy and musical preferences in over 4000 participants. The E-S theory is based on the assumption that individuals either exhibit more empathetic traits, having greater recognition and sensitivity to the thoughts and emotions of others, or exhibit more systemizing traits, having greater interest in the organization and prediction of behaviors in the world. Empathy levels and personality traits were determined using scores from the Empathy Quotient and NEO Personality Inventory-Revised questionnaires. Based on these analyses, they found that participants with high empathy levels (type E) had strong preferences for mellow music, while participants with high systemizing levels (type S) had strong preferences for intense music. Type E participants preferred R&B/soul, adult contemporary, and soft rock music with low arousal, negative valence, and emotional depth. Personality type S participants preferred punk, heavy metal, and hard rock music with high arousal and aspects of positive valence and cerebral depth.

Does your Spotify or iTunes playlist match with your personality type? Try your own E-S experiment on your personal music collection to put this study to a real-world test!
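As a rough sketch of that suggested experiment, the snippet below tags each track's genre with a MUSIC-style factor along the lines reported in the study (mellow for type E preferences, intense for type S) and reports which factor dominates a playlist. The genre-to-factor mapping is a simplification and the playlist is invented; treat the result as a party trick, not a validated psychological measure.

```python
# Toy E-S playlist experiment: map genres to the "mellow" vs "intense"
# MUSIC factors described above, then see which dominates.
# The mapping below is an illustrative assumption, not the study's instrument.

GENRE_TO_FACTOR = {
    "r&b": "mellow", "soul": "mellow", "adult contemporary": "mellow",
    "soft rock": "mellow",
    "punk": "intense", "heavy metal": "intense", "hard rock": "intense",
}

def dominant_factor(playlist_genres):
    """Count tracks per factor and return the most common one."""
    counts = {}
    for genre in playlist_genres:
        factor = GENRE_TO_FACTOR.get(genre.lower())
        if factor:
            counts[factor] = counts.get(factor, 0) + 1
    return max(counts, key=counts.get) if counts else "unclassified"

my_playlist = ["Heavy Metal", "Punk", "Soft Rock", "Hard Rock"]
print(dominant_factor(my_playlist))  # prints "intense", hinting at type S
```

Unrecognized genres are simply skipped, and an empty or unmatched playlist returns "unclassified" rather than forcing a label.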


New Perspectives on the Placebo Effect

The “placebo effect” has perplexed researchers and clinicians alike for over 200 years. This phenomenon has prompted a great deal of research aimed at understanding all of the factors that contribute to this controversial issue in medicine.

Although the placebo effect has been shown to occur across a wide range of clinical conditions, the magnitude of the effect varies drastically and is susceptible to many different factors. These factors include the specific type of verbal direction given to a patient during drug administration, as well as the patient’s past clinical experience; in addition, the patient’s expectation of the experimental outcome can significantly alter their treatment response. In fact, the desire-expectation model of emotions proposes that, as one’s emotional intensity increases while approaching a goal, an individual is more likely to fulfill his or her own expectations of attaining that goal (the power of “positive thinking”).

Cognitive and emotional factors also play very important roles in determining whether a patient will experience the placebo effect and to what extent. In Parkinson’s disease, patients suffer from motor tremors and other symptoms that are caused by a reduction in dopamine levels. Using positron emission tomographic (PET) scanning, Lidstone et al. (2010) evaluated dopamine release in Parkinson’s patients after administering a placebo. Patients were told that there was a certain probability that the medication (placebo) would be effective in treating their symptoms. Dopamine release tracked the patients’ expectation of treatment efficacy, with the strongest placebo response in the group told there was a 75% chance of receiving active medication.

Many researchers who study the placebo effect have focused on placebo-mediated analgesia. Amanzio & Benedetti (1999) found that placebo-mediated analgesia is opioid-dependent, as naloxone (an opioid antagonist) blocks this effect; however, additional studies have demonstrated that not all instances of placebo-induced analgesia are opioid-dependent. Wager et al. (2004) found that the activity of several brain regions associated with pain, including the thalamus, insula, and anterior cingulate cortex, was reduced in response to placebo-mediated analgesia. In addition, activity in the prefrontal cortex, a region associated with planning and high-level cognition, increased in response to the expectation of pain, suggesting that the placebo effect changes the experience of pain at the neural level.

In a more recent study, Wager’s team looked at placebo-mediated analgesia in the presence and absence of patient expectations. Since it has previously been shown that patients can experience the placebo effect even when they are aware that they are receiving a placebo, Schafer et al. (2015) tested the pain response before and after the patients were made aware of their treatment allocation. They found that placebo-mediated analgesia persisted even after the experimental manipulation was revealed to the patients, suggesting that expectancy alone cannot sufficiently explain placebo-mediated analgesia.

We are still unraveling the enigma of the placebo effect, and more research is certainly needed to properly understand this response, but this mysterious phenomenon also has significant implications for how clinical trials are conducted for new therapies. The placebo effect reinforces the notion of the mind-brain-body relationship and suggests that a single pill alone will never be as effective as taking this entire trio into account for optimal therapeutic outcomes.


"Just Picture It:" A Life Without Mental Imagery

Close your eyes and picture yourself in a peaceful place surrounded by people you love. Or, think back to a relaxing vacation on the beach. Most people picture themselves in a calming environment when they are stressed or trying to fall asleep. Mental imagery is a key part of our daily lives and is a fundamental part of our imagination and our creativity.

Aphantasia is a recently coined term that refers to the lack of a "mind's eye"; in other words, it is the inability to mentally visualize imagery. The term is derived from the Greek word for "imagination." Although this phenomenon was first reported in 1883, it received little attention until recently, when a few cases of patients who reported the inability to mentally visualize imagery were brought to the attention of neurologists, who were puzzled by the condition. These patients had suffered traumatic brain injuries specific to the temporal and parietal lobes (regions of the brain responsible for perception and memory), whereas their occipital lobe (the region of the brain responsible for processing visual information) remained intact.

The recognition of individuals with this condition suggested a neural dissociation between the regions that support visual perception and those that support visual imagery. For example, visual agnosia is a condition characterized by the inability to recognize objects, usually resulting from damage to the occipital lobe or associated regions. Neuroimaging studies have shown that visual imagery deficits can occur independently of visual perception impairments and that the temporal lobe plays an integral part in mentally visualizing imagery. Thus, there appear to be distinct neural networks supporting visual perception and memory (whose impairments include visual agnosia) and visual imagery (whose impairments include aphantasia).

In 2010, Zeman et al. published a case study of patient "MX," who lost the ability to mentally visualize imagery after undergoing coronary angioplasty. Despite this impairment, MX displayed intact visual perception and visual memory. Neuroimaging analyses demonstrated that, during attempted visualization, MX showed reduced activity in posterior regions and increased activity in frontal regions compared to controls. These findings prompted the group to further explore this interesting phenomenon in a larger group of individuals.

In a follow-up study, Zeman et al. (2015) evaluated mental imagery impairments in 21 participants using the Vividness of Visual Imagery Questionnaire (VVIQ). These participants displayed significant impairments on the VVIQ compared to a group of 121 controls, while still experiencing a wide range of involuntary images that occurred as "flashes" or within dreams. Based on the results of this study, the researchers coined the term aphantasia to refer to this condition of reduced or absent voluntary imagery. This topic is certainly one of great interest to visual scientists and cognitive neuroscientists and is in need of further study.

How well can you mentally visualize imagery? Take the Vividness of Visual Imagery Questionnaire here and find out!
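For the curious, scoring a VVIQ-style questionnaire is simple arithmetic: the standard instrument has 16 items rated 1–5, summed to a total between 16 and 80. The sketch below assumes the common convention in which higher ratings mean more vivid imagery (rating direction varies across versions of the questionnaire, so treat this as illustrative rather than the official scoring guide).

```python
# Minimal sketch of scoring a VVIQ-style questionnaire: 16 items,
# each rated 1-5, summed to a 16-80 total. Assumes higher = more vivid,
# which varies by questionnaire version.

def vviq_score(ratings):
    """Validate and sum 16 item ratings (each 1-5); total ranges 16-80."""
    if len(ratings) != 16 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("expected 16 ratings, each between 1 and 5")
    return sum(ratings)

# A hypothetical respondent reporting mostly faint imagery:
ratings = [2, 1, 1, 2, 1, 1, 2, 1, 1, 1, 2, 1, 1, 1, 2, 1]
total = vviq_score(ratings)  # 21: toward the low-imagery end of the scale
```

Under this convention, a respondent answering "no image at all" on every item would score the minimum of 16, the kind of profile the follow-up study's aphantasic participants reported.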


The Life, Work, and Quirks of Oliver Sacks

Oliver Sacks, world-renowned neurologist and writer, died Sunday, August 30 in his NYC home. Sacks is best known for his books and other writings on neurology, in which he presented patient case studies in a humanizing and fascinating manner. He was born into a family of doctors and scientists in London in 1933. Following his training as a physician, he left the UK for the US in the 1960s, where he lived out the rest of his days. He had a discerning eye and a curiosity that led him to study the peculiarities of the human mind and their manifestation as rare and unusual neurological conditions.

Medical Storyteller

While consulting at Beth Abraham Hospital in the Bronx, Sacks observed patients with "sleeping sickness," an atypical type of encephalitis. This experience prompted him to write Awakenings, which inspired a movie of the same name released in 1990. His following works, which included The Man Who Mistook His Wife for a Hat and An Anthropologist on Mars, described patients suffering from other unusual conditions, such as prosopagnosia, phantom limb pain, and Tourette's syndrome.

Personal Quirks

Despite his formal training in medicine, Sacks did not approach neurology in a standard manner. In an interview, Sacks noted, "I am struck by the affinity of anthropology to medicine and the need to see lives and diseases in context and not so isolate them as is often the case." He studied individual patients instead of larger patient populations in order to gain a greater understanding of how these diseases changed the nature of his patients. As such, he would often visit patients at their homes in order to better understand each patient's situation and condition. His colleagues did not always appreciate these quirks in his practice. Sacks noted that, while he was a resident at UCLA, he was "thought of as partly an embarrassment, partly an ornament."


Sacks has inspired doctors, scientists, and writers alike. He possessed the heart of a poet and the mind of a scientist; together, with his medical background, he treated each patient as a whole, with great detail paid to every symptom and behavior. When asked how he would like to be seen by his colleagues, Sacks noted, "I would like to be seen as a sort of explorer, driven by, and trying to share, a strong sense of wonder." I do believe that the global community would heartily agree; Sacks put the wonder into neurology, reminding us that medicine tells a story of humanity that we often forget to listen to.

The Oliver Sacks Foundation is a nonprofit organization devoted to increasing the understanding of the mind through the power of narrative case histories. Learn more about the life and work of Oliver Sacks here.


Trouble Quitting Smoking? You Can Thank Your Insular Cortex

"I've tried quitting so many times but I just can't kick the habit." Sound familiar? Cigarette smoking is a highly addictive habit that is difficult to quit, and many people who successfully quit eventually relapse. Although there are many methods available to help people quit, the addiction to nicotine is very difficult to beat. As with other types of addiction, there is a common belief that strength of character can overcome addictive behaviors; however, mounting evidence points to powerful neural networks that mediate different types of addiction.

Recent studies have suggested that the root of smoking addiction may lie deep within the layers of the brain in a region called the insular cortex (IC). The IC is known to play a role in sensory processing, particularly for taste. This region also supports cognitive processing due to its unique convergence of inputs from the amygdala, dorsomedial nucleus of the thalamus, and the prefrontal cortex. The IC has also been implicated in addiction, as studies have demonstrated that this region is involved in cravings for drugs of abuse, such as cocaine and tobacco. In addition, pre-clinical studies have demonstrated that inactivation of the IC blocks nicotine self-administration behavior in rats. Despite this evidence, the precise role of the IC in human nicotine-seeking behavior (smoking) has remained unclear.

Two recent studies led by Amir Abdolahi investigated nicotine withdrawal and smoking cessation behavior in stroke patients. In the first study, nicotine withdrawal symptoms were evaluated in 156 patients who had suffered an ischemic stroke affecting the IC or other regions of the cortex. All patients reported smoking at least one cigarette per day during the month prior to their stroke and at least 100 cigarettes across their lifetime. The Wisconsin Smoking Withdrawal Scale (WSWS) and Minnesota Nicotine Withdrawal Scale (MNWS) were used to assess nicotine withdrawal symptoms. Compared to patients with stroke damage to other regions of the cortex, patients with IC damage exhibited significantly fewer and less severe symptoms during their hospital stay.

In a follow-up study, Abdolahi and colleagues evaluated smoking cessation behavior in the same cohort of 156 stroke patients included in the previous study. Abstinence from smoking or nicotine use was assessed through patient interviews. At 3 months post-hospitalization, patients with IC damage had higher relative odds of complete abstinence compared to patients with stroke damage to other regions of the cortex. In addition, patients with IC damage took longer to relapse into smoking. Taken together, these findings are quite striking and could point toward a targeted therapy to help people quit smoking.


Researchers Get Closer to Understanding How the Brain Dreams

"To sleep, perchance to dream." — Shakespeare

For centuries people have been fascinated by sleep and by dreams. While many studies have produced important and interesting findings on sleep itself, there has been less research on how the brain dreams.

How Do We Dream?

Several hypotheses have attempted to explain why and how humans (and possibly other animals) dream. These theories, which range from Freud's psychodynamic approach ("wish fulfillment" and "day residue") to Hobson's activation-input-modulation model to Foulkes & Domhoff's neurocognitive theory, have lacked any real data to support them. Although there may be numerous reasons as to why we dream, researchers have also long debated whether dreams are truly unique perceptions conjured up by the brain itself or rather just images produced by neurons firing at random.

The Physiology of Sleep

Sleep consists of four stages broken up into two main categories, non-REM (stages I-III) and REM (stage IV), which are characterized by changes in electrophysiological signals. Dreams normally occur during REM sleep, the stage in which the brain exhibits gamma frequency activity (30–80 Hz) similar to activity in the awake state. These physiological characteristics of REM sleep suggest that the brain may in fact be creating our dreams rather than sitting idly by while they occur. Although EEG recordings and fMRI studies have produced important findings on sleep and dreaming, these data represent an overall sum of brain activity and therefore do not provide detailed information about what occurs at the cellular level.

Taking a Closer Look

A recent study led by Thomas Andrillon measured neuronal activity during sleep and wakefulness in 19 neurosurgical patients with intractable epilepsy. Depth electrodes were implanted within the medial temporal lobe and neocortex, and single-unit activity was recorded during the awake state (with or without visual stimulation) and during REM sleep. Remarkably, the team found that single-unit activity in the medial temporal lobe and cortex was very similar across all three experimental conditions; however, the pattern of activity during REM sleep was most similar to that during visual stimulation.

These results strongly suggest that, during REM sleep, the brain is actively perceiving the visual images conjured by dreaming itself. Therefore, the brain could be playing an active role in dreams. The stories that delight us (or terrify us, in some cases) may not just be due to “random” neural activity after all. Of course, while this is a landmark study in sleep and dream research, it is far too soon to know for sure what really goes on behind the scenes from the time you close your eyes at night to the moment you open them each morning.


Trust Your Gut: Probiotics and the Gut-Brain Axis

For many years, the sole origin of mood disorders has been traced back to dysfunctional neurotransmitter systems in the brain. However, recent evidence has suggested an important role for the gut, specifically gut microbiota, in anxiety and depression. There is bidirectional communication along the gut-brain axis, a circuit composed of the gastrointestinal, central and autonomic, and immune systems, which is largely regulated by microbiota. This circuit can modulate metabolism and energy, as well as inflammation. Thus, a disruption in the balance of the microbiota system in the gut can greatly affect general health and wellness.

An important study published in 2004 first demonstrated a link between microbiota and anxiety and depression. Mice without normal gut microbiota expressed higher levels of ACTH and corticosterone in response to restraint stress; in addition, these mice had lower levels of BDNF expression in key brain regions that regulate anxiety and depression. These results were partially reversed by administration of probiotics. Although these data were collected in pre-clinical models, they suggested a potential role for probiotics in patients with depression. In addition, approximately 30% of patients with major depressive disorder (MDD) also suffer from irritable bowel syndrome (IBS), and IBS patients have dysfunctional gut microbiota, which is related to their condition.

Based on these correlative findings, treatment with probiotics has been proposed as a potential adjunctive therapy for patients with MDD. A randomized controlled trial led by Laura Steenbergen and her colleagues evaluated the effects of probiotics administration on cognitive reactivity in a population of healthy participants. Forty non-smoking young adults with no psychiatric or neurological disorders and no personal or family history of depression or migraine were randomly assigned to receive either probiotics or placebo over a period of four weeks (n=20 in each group). The probiotics supplement contained the strains Bifidobacterium bifidum W23, Bifidobacterium lactis W52, Lactobacillus acidophilus W37, Lactobacillus brevis W63, Lactobacillus casei W56, Lactobacillus salivarius W24, and Lactococcus lactis (W19 and W58). Before and after the four-week trial, participants were screened with questionnaires that assessed changes in cognitive reactivity. The LEIDS-r questionnaire was used to evaluate feelings of aggression, suicidality, coping, control, risk aversion, and rumination, while the Beck's inventories were used to assess clinically valid indicators of anxiety or depression.

After four weeks of supplementation, participants taking probiotics exhibited significantly reduced levels of self-reported aggression, rumination, and overall cognitive reactivity compared to those taking placebo; however, there was no effect on Beck's scores for anxiety or depression. This study is the first human trial to evaluate the effect of probiotic supplementation on cognitive reactivity. Although these findings are preliminary and the study was performed on healthy participants rather than a clinical population, the results are quite encouraging after only a short period of probiotic supplementation. For now, keep eating that yogurt and stay tuned for future studies!


Serendipity in Drug Discovery: Teaching Old Drugs a New Trick

The discovery and approval of new therapies usually starts with the identification of a promising compound and positive pre-clinical studies, which are then followed by rigorous testing in human clinical trials. In some cases, however, drug discovery follows a very different path. The history of medicine is punctuated by cases of serendipity, in which scientists stumble upon an unintended therapeutic benefit for one disease while studying another, or make a discovery entirely by way of a mistake or mishap in the lab. The most famous example is Alexander Fleming's discovery of penicillin, which came about when mold contaminated culture plates in his famously untidy lab.

According to recent analyses, the cost of developing a drug in 2014 was 2.6 billion USD, a 2.5-fold increase over the previous decade. In addition, the process of drug approval can take a decade or longer. Therefore, serendipitous drug discovery based on finding new uses for old drugs can save money and bring new therapies to patients sooner. One recent example is a study published in Cancer Cell, in which a research team led by Ksenya Shchors designed an elegant series of experiments to evaluate the effects of two common medications on glioma progression.

Glioma is a form of brain cancer that does not respond well to treatment. Long-term treatment with the tricyclic antidepressant imipramine has been shown to reduce the incidence of glioma by inducing autophagy, a cellular self-digestion process that can lead to cell death. In this study, the effect of imipramine was first evaluated in a p53-deficient model of glioma, as p53 deficiency promotes glioma development. Imipramine treatment significantly increased the overall survival of the treated mice. Additional analyses revealed that the underlying mechanism was indeed autophagy and not apoptosis, as previous research had suggested.

Based on these results, the researchers went further and attempted to enhance the therapeutic effects of imipramine by screening additional drugs for their effects on cell survival. They identified ticlopidine, an anti-platelet drug, which produced synergistic effects on cell survival when given alongside imipramine. Further in vitro and in vivo analyses of the combined treatment revealed that its effect on cell survival was a result of the two drugs' modulation of the autophagy regulatory circuitry. These findings were replicated in several other glioma models, suggesting that the results transfer to different types of induced glioma. Finally, the last experiment in this series determined that imipramine and ticlopidine regulate autophagy by increasing levels of cAMP, an important signaling molecule that is known to modulate gliomagenesis.

This series of well-designed experiments indicates that two FDA-approved drugs may potentially be re-purposed in combination as an effective new treatment for glioma. Of course, these studies represent the pre-clinical phase, and this combination treatment will be subjected to the usual rigors of human clinical trials. However, this inspiring work highlights the importance of serendipity in drug development and how mistakes in the lab, paired with creative thinking, have led to some of the most important discoveries in medicine.


Using Biomarkers to Determine Cardiovascular Disease Outcomes

The last few decades have seen a surge in the identification of risk factors for a wide range of diseases that can be used to predict, prevent, or treat illness earlier than had previously been possible. Oncology in particular has greatly benefited from the use of biomarkers to identify specific types of cancer. Cardiovascular disease, however, has lagged behind in this age of innovation due to its complex pathology and the multiple biological pathways involved in its pathogenesis.

Based on analyses from the CDC in 2011, 5-10% of all emergency department admissions were due to reports of chest pain or heart disease; however, not all of these patients later experience myocardial infarction or other cardiac events. Therefore, it is crucial to avoid unnecessary hospitalization and accelerate the diagnostic process by more accurately screening patients at the time of hospital admission.

Troponin is a regulatory complex in muscle tissue that plays an important role in the calcium-mediated regulation of skeletal and cardiac muscle contraction. Troponin I (cTnI) is a subunit that binds to actin and blocks its interactions with myosin. Elevated levels of cTnI are indicative of cardiac injury. Therefore, cTnI has been identified as a cardiac biomarker that is an independent risk factor for myocardial infarction. Increased levels of cTnI have also been associated with various other cardiovascular disorders. Several studies have developed and tested protocols for screening patients at admission and have demonstrated that cTnI screening could identify up to 40% of patients who are at a low risk of acute coronary syndrome.

A recent study published in The Lancet screened troponin levels in patients admitted to the hospital with a presentation of acute coronary syndrome. This prospective study included over 6,000 patients admitted to two different hospitals in Scotland. All patients were first screened for acute coronary syndrome and then troponin levels were measured at the time of presentation and repeated 6 and 12 hours later. Troponin levels were then correlated with outcome, including myocardial infarction or cardiac arrest at 30 days.

Analyses revealed that very low troponin levels (<5 ng/L) were a significant negative predictor of cardiac outcome events. The effect persisted when the cohort was stratified by age, cardiovascular risk factors, previous cardiovascular disease, or the presence of myocardial ischemia, and was similar in men and women. Overall, based on this cohort, the researchers found that two-thirds of the patients were at very low risk of an event based on their troponin levels.
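The triage logic implied by these findings can be sketched as a simple threshold rule. The 5 ng/L cut-off comes from the study; everything else below is an illustrative sketch, not a clinical algorithm, since real protocols combine troponin with ECG findings, symptoms, and serial measurements.

```python
# Illustrative triage rule based on the study's 5 ng/L threshold.
# NOT a clinical tool: real pathways use serial troponin at 6 and 12 hours,
# ECG changes, and clinical presentation alongside the admission value.

LOW_RISK_THRESHOLD_NG_L = 5.0

def triage(troponin_ng_l: float) -> str:
    """Classify a troponin I measurement taken at presentation."""
    if troponin_ng_l < LOW_RISK_THRESHOLD_NG_L:
        return "low risk"      # candidate for an early-discharge pathway
    return "further workup"    # serial troponin, monitoring, specialist review

print(triage(3.2))   # low risk
print(triage(12.0))  # further workup
```

Applied at the time of admission, a rule of this shape is what would allow the two-thirds of patients with very low troponin to avoid unnecessary hospitalization.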

These findings represent a greater push towards routinely using biomarkers to screen for patients who are at low risk of experiencing a cardiac event. This screening process could spare patients unnecessary hospitalization and could save hospitals money.


The Effects of Birth Order on Personality Traits

Middle child syndrome? Burdened with responsibility as the oldest child? Blissfully spoiled as the youngest? Have you ever blamed a person's behavior on their birth order? Well, you are not alone. Since the late nineteenth century, psychologists have been investigating the link between birth order and personality traits and have yet to come to a solid consensus on the issue.

A Brief History on Birth Order and Personality Theories

Following Francis Galton's initial studies in 1874 on birth order and intelligence, Alfred Adler in the 1930s emphasized the importance of “psychological” birth order (the role a child adopts relative to others) over “ordinal” birth order (the actual order of birth). Frank Sulloway approached birth order from an evolutionary perspective. In 1996, he published his theory, which established many tenets of the commonly known birth order-based personality traits. For example, he suggested that first-born children are more achievement-oriented and assertive, whereas later-born children are more rebellious and more willing to take risks.

Following these theories, many empirical studies have been carried out in order to verify these assertions; however, these studies have often found disparate results, as personality differences between siblings are certainly dependent upon both their unique experiences in the world and their shared experiences at home. Therefore, this topic has remained hotly debated in the literature.

New Findings on Birth Order and Personality Traits

A recent study published in PNAS led by Julia Rohrer tackled this very issue. This group used data from 3 large international panels from the United States, Great Britain, and Germany in order to evaluate the relationship between birth order and the Big Five personality traits. The Big Five is a standardized taxonomy of main personality traits that has been well tested in the personality psychology literature. These traits include extraversion, emotional stability, agreeableness, conscientiousness, and openness to experience. This study included two types of analyses, within-family and between-family designs, in order to increase the statistical power.

Across all individuals evaluated, intellect was found to be significantly associated with birth order, such that first-born children have a greater chance of having a higher intellect than later-born children; these results were consistent across both within-family and between-family analyses. The authors also found a decline of roughly 1.5 IQ points with each increase in birth-order position (e.g., from first-born to second-born child). In contrast, no significant link was found between birth order and any of the Big Five personality traits.

This study attempted to overcome the limitations of previous studies by drawing on several large data sets (combined N > 20,000) with independent personality assessments, as well as by using both within-family and between-family designs. The authors also controlled for confounding variables, including the effect of age on personality and intelligence. The results are highly consistent across all three data sets and clearly indicate that personality is not, in fact, affected by birth order.

So, based on these findings, the next time you get in a fight with a sibling, don't blame it on birth order!


Bidirectional Cortical Entrainment: A Rhythmic Tale of Brain Plasticity

Neuroplasticity remains an exciting and complex topic in systems neuroscience. The brain has the capability to change over time based on our unique experiences and can partially repair itself when damaged. Changes in the brain can be measured on an electrophysiological level, as cortical oscillatory activity represents a sum of neuronal activity. By decoding this electrical output, researchers can make inferences about different frequencies associated with different types of cognition and behavior. These studies have provided powerful information on how the brain responds to changes both internally and externally over time.

A new study published in PNAS used magnetoencephalography (MEG) recordings to determine whether cortical oscillatory rhythms entrain, or synchronize, to music. The delta-theta phase (low-frequency oscillation) entrainment hypothesis proposes that certain cortical frequency ranges predict salient rhythmic inputs from the environment, such as speech. Musical training has been shown to alter several regions of the brain associated with auditory perception. Therefore, both musicians and non-musicians were included in this study in order to determine whether these structural changes were correlated with changes in cortical oscillatory entrainment.

During the experiment, cortical activity was measured using MEG while participants listened to short clips of classical piano music and were asked to identify pitch distortions. Not surprisingly, the musicians detected far more pitch distortions than the non-musicians. Musicians were also better at entraining to higher frequencies; the behavioral difference was therefore attributed to differences in cortical entrainment between the two groups. Further analyses revealed that these differences in cortical entrainment were indeed correlated with the number of years of musical training a participant had received. The authors also reported that musical training enhanced beta activity, a frequency band associated with predicting the temporal structure of inputs.

This study is the first to report cortical phase entrainment in the perception of music. More importantly, this study adds to the growing literature on neural plasticity and how the brain changes based on experience. The reshaped brain responds by updating its calculations in order to best predict input from this new environment. The bidirectionality of internal and external inputs explains how the brain remains so uniquely in tune to such a fast-paced and changing environment.

So next time you plug into Spotify or iTunes, think of all of the calculations your brain is performing to process and store the temporal structure of those auditory memories while you effortlessly enjoy those beats. What an amazing little machine!


Don't Get Too Excited: How Inhibitory Networks in the Brain Shape Memory

The brain is made up of a complex web of circuits controlled by chemical and electrical signals that allow different parts of the brain to communicate. The brain uses these signals to orchestrate highly sophisticated behaviors, from high-level thinking and emotional processing to fine motor control, among many others. While many of these chemical and electrical messages are “excitatory,” meaning that they give the green light for the brain to perform an action, others are “inhibitory” (red light). In a broad sense, the opposite effects of caffeine and alcohol offer a simple example of this system of checks and balances: caffeine is a stimulant that activates brain function, while alcohol is a depressant that inhibits it. In order for cells in the brain to perform optimally, they must receive precise excitatory and inhibitory inputs in a time-dependent manner.

Take, for example, the hippocampus, a brain region that supports memory function. Two of the most important chemical messengers that coordinate activity within the hippocampus are glutamate and GABA. Glutamate mediates excitatory signals and GABA mediates inhibitory signals; both types of signals together enable the precise encoding of relevant memory content and allow for the retrieval of unique, non-overlapping memory events. The hippocampus sends and receives long-range projections from surrounding cortical regions that support memory processing. The entorhinal cortex is a region that is strongly connected to several subregions within the hippocampus. The excitatory connections between the hippocampus and entorhinal cortex have been well-studied but the role of the inhibitory connections had remained largely unknown until recently.

A new study published on January 7 in Science presents an elegant series of experiments performed by Jayeeta Basu and colleagues at Columbia University Medical Center. They first used viral vector infusions to label long-range inhibitory projections (LRIPs) from the lateral entorhinal cortex to inhibitory interneurons in the CA1 subregion of the hippocampus. These LRIPs proved to be quite dense, far more so than the connections from the medial entorhinal cortex, a region that has been shown to be involved in spatial memory.

They then explored the functional role of this circuit in non-spatial memory. They found that suppressing the circuit produced memory deficits in mice performing non-spatial fear and object memory tasks. The research team went on to explore the electrophysiological properties of this inhibitory circuit. In vivo imaging experiments revealed that LRIPs act as a disinhibitory gating mechanism operating at different time scales. This disinhibition may allow for the enhancement of other memory signals delivered to the CA1 region. The authors therefore concluded that this circuit may be important for the fine-tuning of episodic memory events within the hippocampal circuitry itself.

The brain is a beautiful, puzzling electro-chemical circuit, and its intricacies are still being revealed to neuroscientists every day across the world. This study highlights the importance of both inhibitory and excitatory signaling in the brain, peeling away another layer of this highly complex system.


Treating Duchenne Muscular Dystrophy With Gene Therapy

The history of genetic engineering began in the 1980s with the publication of two landmark studies by Jon Gordon and Frank Ruddle. Their preliminary experiments in developmental biology laid the framework for what would become a new field of genome editing. A decade later, scientists across the world were “knocking out” or “knocking in” genes in transgenic mice in order to determine their functional roles. With the completion of the Human Genome Project in 2003, scientists finally had a comprehensive blueprint of the human body. And since then, advances in genetic engineering have come fast and furious.

The most exciting recent breakthrough in genetic engineering has been the development of the clustered regularly interspaced short palindromic repeats (CRISPR)-associated protein 9 (Cas9) system. CRISPR loci and Cas proteins are present in approximately 90% of archaeal and 50% of bacterial genomes, where the system arose as a mechanism to protect these microbes from viruses. When a guide RNA (gRNA) and the Cas9 endonuclease are co-expressed, a targeted DNA sequence can be modified. The CRISPR/Cas9 system provides a high degree of specificity and is simple to use. Based on studies over the past two years, it has been proposed as a gene-editing tool that could target several viruses that cause human disease, including human papillomaviruses, hepatitis B, Epstein-Barr, and HIV-1.

A recent study published in Science by researchers at Duke University used a model of Duchenne muscular dystrophy (DMD) to investigate whether genetic editing can restore function in this disease. DMD is a genetic disorder that affects approximately 1 in every 5,000 male births. Patients experience muscle deterioration early in life, and complications of the disorder result in a mean age of death of 19 years. DMD results from deletions in the dystrophin gene, which cause an absence of or defect in the protein. Although the genetic basis of the disorder was discovered over 20 years ago, the prognosis for these patients remains dismal.

Christopher Nelson and his colleagues at Duke University used AAV serotype 8 as a vector to deliver the CRISPR/Cas9 system to cardiac and skeletal muscle in an mdx model of DMD. This particular model has a nonsense mutation in exon 23 that disrupts production of the protein dystrophin. The CRISPR/Cas9 system used in this experiment was designed to excise exon 23 in order to restore dystrophin expression and improve muscle function. Molecular analyses confirmed successful removal of exon 23, as well as a subsequent increase in dystrophin expression levels. Furthermore, behavioral assessments showed significant improvements in muscle function and strength in the treated mice.
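Conceptually, the excision works because removing the exon that carries the premature stop codon lets translation run to the end of the message. A toy illustration in Python follows; the sequences and the three-exon "gene" are invented for illustration (real dystrophin has 79 exons, and a real excision must also preserve the reading frame, which the exon lengths below do by being multiples of three):

```python
# Toy model of exon excision: exon 2 below carries a premature stop codon (TAA).
# Sequences are invented for illustration only.

STOP_CODONS = {"TAA", "TAG", "TGA"}

def translate_codons(mrna: str) -> int:
    """Return the number of codons translated before hitting a stop codon."""
    count = 0
    for i in range(0, len(mrna) - 2, 3):
        if mrna[i:i + 3] in STOP_CODONS:
            break
        count += 1
    return count

exons = ["ATGGCT", "GGTTAA", "GCCGAT"]  # exon 2 ("GGTTAA") has a nonsense mutation

full_length = translate_codons("".join(exons))                   # truncated at the stop
edited      = translate_codons("".join(exons[:1] + exons[2:]))   # exon 2 excised

print(full_length, edited)  # 3 4
```

Skipping the mutated exon yields a longer (here, complete) translation product, which is the logic behind restoring a shortened but functional dystrophin.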

This landmark study is the first to demonstrate the potential of gene therapy to treat patients with DMD. Although these findings are pre-clinical, they illustrate the enormous possibilities of the CRISPR/Cas9 system in such a simple therapeutic design. 2016 is surely off to an incredible start in the world of genetic editing!

If you would like to learn more about Duchenne Muscular Dystrophy, please visit the Muscular Dystrophy Association site.