Toxoplasma’s Dark Side: The Link Between Parasite and Suicide

We human beings are very attached to our brains. We’re proud of them – of their size and their complexity. We think our brains set us apart, make us special. We scare our children with tales of monsters that eat brains, and we obsessively study how our own brains work. So, of course, we are downright offended that a simple, single-celled organism can manipulate our favorite organ, influencing the way we think and act.

Toxoplasma gondii is arguably the most interesting parasite on the planet. In the guts of cats, this single-celled protozoan lives and breeds, producing egg-like cells which pass with the cat’s bowel movements. These find their way into other animals that come in contact with cat crap. Once in this new host, the parasite changes and migrates, eventually settling as cysts in various tissues, including the host’s brain, where the real fun begins. Toxoplasma can only continue its life cycle and end up a happy adult if it can find its way into a cat’s gut, and the fastest way into a cat’s gut, of course, is to be eaten by a cat. Incredibly, the parasite has evolved to help ensure that this occurs. For example, Toxoplasma infection alters rat behavior with surgical precision, making rats lose their fear of (and even become sexually aroused by!) the smell of cats by hijacking neurochemical pathways in their brains.

Of course, rats aren’t the only animals that Toxoplasma ends up in. Around one-third of the people on Earth carry these parasites in their heads. Since Toxoplasma has no trouble affecting rats, whose brains are similar in many ways to our own, scientists wonder how much the parasite affects the big, complex brains we love so much. For over a decade, researchers have investigated how this single-celled creature affects the way we think, finding that indeed, Toxoplasma alters our behavior and may even play a role in cultural differences between nations.

The idea that this tiny protozoan parasite can influence our minds is old news. Some of the greatest science writers of our time have waxed poetic about how it sneaks its way into our brains and affects our personalities. Overall, though, the side effects of infection are thought to be minor and relatively harmless. Recently, however, evidence has been mounting that suggests the psychological consequences of infection are much darker than we once thought.

In 2003, E. Fuller Torrey of the Stanley Medical Research Institute in Bethesda, Maryland, and his colleagues noted a link between Toxoplasma and schizophrenia – specifically, that women with high levels of the parasite were more likely to give birth to schizophrenics-to-be. The hypothesis is that while Toxoplasma has only minor effects in most infected people, in some the changes are much more pronounced. The idea has gained traction – a later paper found, for example, that anti-psychotics worked just as well as parasite-killing drugs in restoring normal behaviors in infected rats, affirming the similarities between psychological disorders and Toxoplasma infection.

Continuing to work with psychiatric patients, scientists later discovered a link between suicide and parasite infection – but, of course, that link was in people who already had mental illness. Similarly, a study found that countries with high Toxoplasma infection rates also had high suicide rates, but the connection between the two was weak, and there was no direct evidence that the women who committed suicide were infected.

What scientists really wanted to understand was whether Toxoplasma affects people with no prior disposition to psychological problems. They were in luck: in Denmark, serum antibody levels for Toxoplasma gondii had been measured in the children of over 45,000 women as part of a neonatal screening study designed to better understand how the parasite is transmitted from mother to child. Since children do not form their own antibodies until three months after birth, these antibody levels reflect the mothers’ immune responses. The scientists were thus able to passively screen women not only for infection status but also for degree of infection, as high antibody levels are indicative of worse infections. They were then able to use the Danish Cause of Death Register, the Danish National Hospital Register and the Danish Psychiatric Central Research Register to investigate the correlation between infection and self-directed violence, including suicide.

The results were clear. Women with Toxoplasma infections were 54% more likely to attempt suicide – and twice as likely to succeed. In particular, these women were more likely to attempt violent suicides (using a knife or gun, for example, instead of overdosing on pills). But even more disturbing: suicide attempt risk was positively correlated with the level of infection. Those with the highest levels of antibodies were 91% more likely to attempt suicide than uninfected women. The connection between parasite and suicide held even for women who had no history of mental illness: among them, infected women were 56% more likely to commit self-directed violence.
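To see what a figure like “54% more likely” means in practice, here is a minimal sketch of a relative risk calculation. The counts are invented purely for illustration – they are not the study’s data, and the actual paper used regression models that adjust for confounders.

```python
# Relative risk: the rate of an outcome in an exposed group divided
# by the rate in an unexposed group. All counts are hypothetical.

def relative_risk(exposed_cases, exposed_total,
                  unexposed_cases, unexposed_total):
    """Return the risk ratio between an exposed and an unexposed group."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Made-up example: 77 suicide attempts per 10,000 infected women
# versus 50 per 10,000 uninfected women.
rr = relative_risk(77, 10_000, 50, 10_000)
print(f"Relative risk: {rr:.2f}")  # 1.54 -> "54% more likely"
```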

While these results might seem frightening, they make sense when you think about how Toxoplasma is known to affect our personalities. In 2006, researchers linked Toxoplasma infection to neuroticism in both men and women. Neuroticism – as defined by psychology – is “an enduring tendency to experience negative emotional states,” including depression, guilt and insecurity. The link between neuroticism and suicide is well established, so if the parasite does make people more neurotic, it’s not surprising that it influences rates of self-violence.

How does a parasite affect how we think? The authors suggest that our immune system may actually be to blame. When we are infected with a parasite like Toxoplasma gondii, our immune system goes on the offensive, producing a group of molecules called cytokines that activate various immune cell types. The trouble is, recent research has connected high levels of cytokines to depression and violent suicide attempts. The exact mechanism by which cytokines cause depression and other mental illnesses is poorly understood, but we do know they are able to cross the blood-brain barrier and alter levels of neurotransmitters like serotonin and dopamine in the brain.

But the authors caution that even with the evidence, correlation is not causation. “Is the suicide attempt a direct effect of the parasite on the function of the brain or an exaggerated immune response induced by the parasite affecting the brain? We do not know,” said Teodor T. Postolache, the senior author and an associate professor of psychiatry and director of the Mood and Anxiety Program at the University of Maryland School of Medicine, in a press release. “We can’t say with certainty that T. gondii caused the women to try to kill themselves.”

“In fact, we have not excluded reverse causality as there might be risk factors for suicidal behavior that also make people more susceptible to infection with T. gondii,” Postolache explained. But given the strong link between the two, there is real potential for therapeutic intervention. “If we can identify a causal relationship, we may be able to predict those at increased risk for attempting suicide and find ways to intervene and offer treatment.” The next step will be for scientists to confirm whether, and how, these parasites cause negative thoughts. Not only could such research help target at-risk individuals, it may help scientists understand the dark neurological pathways leading to depression and suicide that this sinister protozoan has tapped into. Even more disconcerting, scientists predict that Toxoplasma prevalence is on the rise, due both to changes in how we live and to climate change. The increase and spread of this parasitic puppeteer cannot be good for the mental health of generations to come.

 

Citation: Pedersen, M.G., Mortensen, P.B., Norgaard-Pedersen, B., & Postolache, T.T. (2012). Toxoplasma gondii Infection and Self-directed Violence in Mothers. Archives of General Psychiatry. DOI: 10.1001/archgenpsychiatry.2012.668

Photos: Toxoplasma gondii parasites in rat ascitic fluid from the CDC’s Public Health Image Library; Brain MRI Scan in Patient with Toxoplasma Encephalitis from the University of Washington’s HIV Web Study

Using cannabinoids to overcome fear in the brain

We all know exactly what fear feels like. Without our consent, our hearts begin to beat a little faster. The hairs on the back of our neck prickle. Our palms sweat through clenched fingers. Fear is so much more than an emotion; it is a whole body experience. But it doesn’t start that way – it starts in our brains.

When we sense danger or threat, a signal is sent to an almond-shaped structure deep in our brains called the amygdala, which is responsible for alerting the rest of the body to prepare for fight or flight. Once the threat is removed, the signal relaxes, and so do we. But for those who suffer from anxiety disorders, it takes much longer for the brain to sound the all clear.

“We’ve learned a fair amount of the circuitry that’s involved in generating the initial fear response. We really know relatively little about the circuitry that’s involved in turning it off,” explains neuroscientist Richard Davidson. Now, in a study published this week in Molecular Psychiatry, researchers from Duke University have found that marijuana-like compounds and the enzyme that degrades them may be the key to understanding fear’s off switch.

Previous research has connected endocannabinoids – naturally-produced compounds that are structurally similar to the active ingredients in marijuana – to the fear response. In mice, for example, brain-wide deletion of cannabinoid receptor type 1 results in a loss of ability to regulate fear. The brain’s cannabinoid system, though, has a wide variety of important functions, so clinicians are hesitant to use it as a target for potential anxiety therapies.

Researchers have found, though, that one particular cannabinoid – anandamide – seems to play a large role in modulating fear responses. Since anandamide levels in the brain are regulated by a single enzyme, fatty acid amide hydrolase (FAAH), researchers wondered if they could boost anandamide levels by turning off this one enzyme without too many negative side effects.

To test this, the team used a highly selective FAAH inhibitor, AM3506, to knock down the enzyme’s activity in fear-prone mice (a commonly used model for studying anxiety disorders). The drug not only helped the mice recover from fear faster; the research team was also able to trace its effects specifically to the amygdala. When they looked for unwanted side effects, including altered appetite and depression, they found no evidence of them. This is good news for potential clinical uses of AM3506, though the researchers were careful to note that testing across a broad range of assays will be needed to ensure the drug is safe and effective in people.

While their results were promising, they didn’t answer the real question the Duke team was asking: whether regulating FAAH in people could help treat anxiety disorders.

As luck would have it, in 2009 the same team discovered that some people carry a common variant of the FAAH gene that affects how well the enzyme functions. Using fMRI scans, the researchers compared how people with each version of the gene reacted to threatening images. As predicted, those with lower FAAH activity levels – like the mice that received the inhibitor – were able to overcome their fear more quickly. A survey of 1,000 New Zealanders further supported these results by showing that those with the low-functioning variant were more level-headed and calm in the face of stress.

Taken together, these results strongly suggest that regulating the activity of FAAH may be a novel way to treat difficult anxiety-based disorders such as post-traumatic stress disorder (PTSD). “What is most compelling is our ability to translate first from mice to human neurobiology and then all the way out to human behavior,” said Ahmad Hariri, co-author of the study and a neurobiologist at the Duke Institute for Genome Sciences & Policy. “That kind of translation is going to define the future of psychiatry and neuroscience.”

Reference: Gunduz-Cinar, O., et al. (2012). Convergent translational evidence of a role for anandamide in amygdala-mediated fear extinction, threat processing and stress-reactivity. Molecular Psychiatry. DOI: 10.1038/mp.2012.72

Image by Victor Bezrukov c/o Wikimedia Commons

The Nose Knows: Telling Age Based On Scent

Our sense of smell is often overlooked. After all, our 20 million smell receptors pale in comparison to the 220 million found in the nose of man’s best friend. We don’t take credit for our ability to distinguish thousands of different smells, even in minute quantities, or for how our brains can form strong and lasting memories of scents from a very young age. Yet of all our senses, smell is the first to develop, and before we can feel or see, our noses are hardwired into the brain’s limbic system and amygdala, the parts of the brain where emotions are generated and emotional memories are stored. Since smell detects chemicals in the air, it can warn us of dangers at a distance, even when our eyes and ears are unable to detect them. But more importantly, because it relies on chemicals, smell is one of the most honest ways animals communicate. Human odors, from the smell of tears to underarms, have been shown to affect how we think, feel and act, and although we don’t often realize it, our noses play a key role in how we recognize and communicate with one another.

So perhaps it shouldn’t be surprising that scientists have discovered we can distinguish a person’s age by scent alone. In a study published today in PLoS ONE, scientists document how our noses are able to distinguish older people from middle-aged and younger ones – not by the scent of mothballs or denture cream, but by the smell of their body odor.

While we tend to think of B.O. as a reason to wear deodorant, the chemical complexity of Eau de Self is remarkable. Our personal scents convey biological and social information, and are thought to play a role in who we like, how we recognize others, and even how we tell men from women. Babies know their mother’s smell shortly after birth, and at a young age we can distinguish between family members and non-family based on their scent. Research has even suggested that we really do have ‘chemistry’ with the people we like, as we can smell immune system differences that might factor into attractiveness, and we might even be able to smell differences in personalities. Yet there is a lot we don’t understand about how our olfactory system works and exactly what information we obtain through our sense of smell. Given that other animals have been shown to distinguish older individuals by smell, Susanna Mitro and her colleagues from Swarthmore College and the Karolinska Institute in Sweden wondered if smells help us determine a person’s age.

To find out, the research team placed pads under the armpits of people in three age groups: young (20–30 years old); middle-aged (45–55 years old); and old-age (75–95 years old). Young research participants were then asked to discriminate between age categories in side-by-side comparisons and to group the smells according to age, as well as to rate perceptual properties like how pleasant or strong each smell was. The researchers were startled to find that participants rated the oldest group’s smell as the most pleasant and least intense. “This was surprising given the popular conception of old age odor as disagreeable,” said co-author Johan Lundström. “However, it is possible that other sources of body odors, such as skin or breath, may have different qualities.” Participants also rated the smell of women to be far more pleasant than the smell of men. But what the researchers really wanted to know was whether people could tell the difference between age groups in head-to-head comparisons – and they could.

While the participants were unable to distinguish between young and middle-aged scents, the smell of old age was distinctive. “These data suggest that, akin to other animals, humans are able to discriminate old individuals from younger individuals based on body odor,” write the authors in their conclusion. “The modest effects suggest a limited impact on our everyday interactions but does support previous reports of a unique ‘old person odor’.”

Scientists aren’t entirely sure what makes the smell of older adults different from that of younger people. Studies have found that certain chemicals are present in different levels in older body odor, suggesting that these compounds may serve as biomarkers for old age, but the relevance of these chemicals to age determination has yet to be tested explicitly, and it is unknown whether these chemicals can be detected well by our noses. It’s also unclear whether the ability to distinguish age changes over time. This study focused on young participants as the odor-sniffers, yet it’s possible that the ability to tell age is age-dependent. Older people may lose this ability, or people may be more able to tell ages that are strikingly different from their own. It’s also unclear if the ability to distinguish age has any evolutionary relevance, or if it is simply a byproduct of our more-acute-than-we-think sense of smell.

The researchers also noted that their study was careful to keep the body odors pure, so it is unclear how the scents of hygiene products might affect the results. Does wearing Old Spice actually make a guy seem older, for example? Or does a flowery perfume enhance the youthfulness of a woman? While this study did not explore these questions directly, it has created a foundation for future studies, which may lead to a better understanding not only of our innate ability to determine age by smell, but also of how our hygiene routines and product choices affect how we are perceived by others. I can’t wait to see scientists build off these results!

 

Reference: Mitro, S., Gordon, A.R., Olsson, M.J., & Lundström, J.N. (2012). The Smell of Age: Perception and Discrimination of Body Odors of Different Ages. PLoS ONE. DOI: 10.1371/journal.pone.0038110

Photo of Nose to Nose by XtremeCamera user Lover1969

The Benefits of Thanks

Today is Thanksgiving – a day to relax, take a step back, and honestly express gratitude.

Gratitude. By definition, it is the state of being grateful or thankful. It is universally seen as a positive human attribute. You can hear how highly gratitude is thought of over and over again in sayings from all over the world:

A thankful heart is not only the greatest virtue, but the parent of all the other virtues. – Roman saying

The truly rich are those who enjoy what they have. – Yiddish proverb

If you’re not thankful then you’re a wizard. – African proverb

Perhaps the merits of gratitude have been praised for centuries in so many cultures for good reason. Studies have shown that expressing gratitude is connected to a wide variety of positive outcomes.

Gratitude may help us deal with stress, for example. Over the past decade, evidence has been mounting to show that gratitude mitigates the negative consequences of traumatic events. Studies have found that soldiers who score higher on dispositional gratitude are less likely to develop post-traumatic stress disorder (PTSD). Similarly, another study found an inverse correlation between gratitude and PTSD symptom levels in college women who experienced trauma.

Of course, the benefits of gratitude extend far beyond serious traumatic events. Simply expressing gratitude has positive effects on our daily lives. In one study, Kent State researchers had students write one letter every two weeks, with the simple ground rules that each letter had to be positively expressive, require some insight and reflection, be nontrivial, and contain a high level of appreciation or gratitude. After each letter, students completed a survey to gauge their moods, satisfaction with life, and feelings of gratitude and happiness – all of which increased with each letter. The more they wrote, the happier they were.

Similar results have been found in a number of other studies. Middle school students who counted their blessings reported enhanced gratitude, optimism and life satisfaction, and decreased negative feelings. In adults, keeping a gratitude journal led to overall happier thoughts. Not only were the journal keepers in a better mood, they were also more likely to report having helped someone with a personal problem or offered emotional support to another, suggesting that the positive effects of gratitude expand outward.

Given the benefits of expressing thanks, I decided to do so myself. Yesterday, I sat on my couch with a pad of paper and my favorite purple sharpie pen and wrote out all the little things in my life that I am thankful for. It turned out to be quite a long list – about 8 pages long, actually. So rather than bore you with the whole thing, I’ve included a few of my favorite highlights:

  • I am thankful for all of the people in my life who have made me smile. If you have ever been my friend, then you have surely made me smile a lot, so I am extra grateful for you.
  • I am thankful for my family, who helped me become the woman I am today.
  • I am thankful for Bora, for the term ‘BlogFather’ is more fitting than he knows, and for the rest of my eccentric but lovable blogging family.
  • I am thankful for cheese. Yes, cheese. Cheese, bacon, sushi, curry, pasta and Dippin’ Dots. My six basic food groups.
  • I’m thankful to the natural world for providing beautiful, intricate and complex puzzles that I, as a scientist, am lucky enough to study.
  • I am thankful for my liver, for without its championship team of Alcohol Dehydrogenase and Aldehyde Dehydrogenase, I never could have become the marine scientist I am today. (#drunksci)
  • I am thankful for the people and things that inspire me to do what I love.
  • I am thankful for my wonderful roommate, who always knows whether a situation requires a bottle of wine, a hug, or a patient ear.
  • I am thankful to her cat, Yoshi, for finally forgiving me for making him do the cat dance.
  • I am thankful to the people who remind me that the greatest thing you can do for yourself is to love someone else fully.
  • I am thankful for little surprises.
  • And last but not least, thank you, for it is you, my readers, that make blogging worthwhile.

Enjoy your day of thanks, and don’t forget to express to the people you love just how thankful you are to have them. Happy Thanksgiving!

Image c/o holidays.kaboose.com

Your average, everyday zombie

“The purpose of man’s life…is to become an abject zombie who serves a purpose he does not know, for reasons he is not to question.” – Ayn Rand

Of all the cryptic, creepy and cruel creatures that emerge each Halloween, few captivate our imaginations like the living dead. Sure, they look dreadful, smell bad, and have the conversational skills of a well-adjusted slug, but despite their bad manners and constant lust for our brains, zombies have clawed their way into our hearts. It just wouldn’t be Halloween without them.

Like any good monster, the legend of the zombie is based in truth. According to Voodoo traditions, powerful spiritual men called Bokors have the power to kill a man and then raise him from the dead, turning him into a zombi, a mindless slave to the Bokor who created him. As outsiders began to look into the myths, they discovered that these sinister sorcerers use a chemical cocktail, including the deadly poison tetrodotoxin, to mimic the physiological signs of death. This zombi juice doesn’t just slow the victim’s heart rate – the toxic mix has lasting effects on the human brain including memory loss and delirium, making them ideal slaves. Once the target is declared dead and buried, the Bokor can dig up the grave and claim his zombi.

While Bokors have been tinkering with brain chemistry to make the perfect zombie for hundreds of years, there are others that have been perfecting the art of zombification for much longer. Parasites are the Victor Frankensteins of the natural world. These mini-neuroscientists turn other creatures into mindless slaves, serving only to further the master’s selfish goals. They do that voodoo better than we do, and they have for eons.

In all of the cases I’m about to present, once a host is infected, it is one of the living dead. It has no hope of recovery, no chance for redemption. It will perform its acts as needed by its parasitic Bokor, and only once its body has served its function will the zombie be released into the sweet arms of eternity.

The zombie, it stings

While they might lack the lust for brains, parasitic wasps are the masters of neurological zombification. The emerald cockroach wasp, for example, turns its cockroach host into an ill-fated nanny. With carefully placed stings which inject a venom cocktail into the roach’s brain, the wasp puts the roach into a zombie-like state where it happily follows its attacker to a dark chamber underground. The wasp then lays her eggs in the complacent roach and seals it in its tomb. Soon enough, the eggs hatch, eat, pupate, and emerge while the cockroach sits and waits for its body to be consumed.

But perhaps it is the parasitic wasp Cotesia glomerata that most deserves the title of master. Females lay their eggs in unsuspecting caterpillars. When the larvae hatch, they literally eat their host from the inside out. But even more incredible is the caterpillar’s reaction to such abuse: the hapless host transforms into an undead bodyguard, protecting the young wasps even after they have gorged themselves on the caterpillar’s internal organs. The mostly-eaten victim will even spin a silk web over the pupated wasps which just feasted on its flesh – a final act of devotion to its Bokors before it dies. What chemical spell the larvae cast on their host to instill such loyalty remains a mystery.

Fly me to the grave

They might look like fruit flies, but Phorid flies are more than just a harmless pest – at least to the ants which serve as host for the fly’s larvae. Phorid flies are the ideal Hollywood zombie parasite, complete with a hunger for neurological tissue. Gruesome as they might be, you gotta give the flies points for style.

Image: Phorid fly larvae hatching from an ant’s head, from National Geographic

First, the momma fly injects her eggs into the body cavity of the ant host. Once they hatch, the larvae make their way to the ant’s head, where they feed on hemolymph (ant blood), muscle, and nervous tissue (aka brraaaaaiiiiiinnnnssss). The larval flies will spend weeks inside the head of their host, controlling its behavior while snacking on its brains until they have completely emptied the ant’s head. The ant, meanwhile, wanders around in a zombie-like state. When the young fly decides it is ready to pupate, it releases an enzyme which decapitates the ant by dissolving the membrane which attaches its head to its body. The larva pupates in the ant’s disembodied head for about two more weeks, then emerges as a full-grown adult, ready to mate and find its own zombie host.

Other flies turn wasps into murderous zombie queens. Xenos vesparum larvae wait patiently until a wasp lands close by. When it senses its victim, the larva leaps onto the wasp, burrows through its exoskeleton into its abdomen, and begins feasting on the wasp’s blood. As it grows, the larva starts to mess with the wasp’s mind. The normally social wasp withdraws from its colony and starts to act recklessly. Then, in early summer, the infected wasp simply leaves its colony behind and, as if under some spell, travels to a meeting place. Soon, other parasitized wasps arrive, and the parasites begin to mate. The male flies emerge from their hosts and seek out the females, which remain inside their zombie wasps, granting the males access to the only body part needed for reproduction.

The wasps which were lucky enough to be infected by male flies die. Those infected with females, though, are still under the control of their parasites. They begin to act like wasp queens, gathering food and gaining weight. They then travel to sites where queens gather in the fall, and spend the winter in the lap of luxury among other wasp royalty. When the seasons change and the wasps emerge, the zombie queens go back to their colonies, carrying their deadly load of fly larvae. Everywhere they travel, larvae are left behind, waiting for the next unsuspecting wasp to land.

Rosemary’s baby barnacle

Most parasites are really, really small. But not all zombifying parasites are itsy-bitsy. Take, for example, the Rhizocephalans – the parasitic barnacles.

Yes, I did just say parasitic barnacle, although they don’t look much like barnacles as adults. The adult parasites look like a sac where a female crab would have eggs. It’s classified as a barnacle, however, due to its larval forms, specifically the cypris larvae, which neatly place it in the class Cirripedia.

The barnacle’s strategy is simple: find a crab. Stop her from shedding so she can’t get rid of you. Tap into her nervous and circulatory systems and get comfortable. Grow. Sterilize her, so nothing competes for your living space. As you get too big to fit inside, pretend to be an egg sac. Have her bathe you in water, take gentle care of you, and feed you nutrients because she thinks she’s pregnant. She’ll even help you spread your larvae like they’re her own eggs by fanning them into the water.

However, there is an obvious problem. Not every crab is female, and thus predisposed to tending its eggs. So what does the parasite do when it accidentally infects a male crab instead? Well, it’s simple, really. It makes it act like a woman.

That’s right – the parasite turns male crabs into transvestites.

By altering the hormones in the male crab’s body, the parasite causes him to act like a female. His abdomen flattens and widens, and he starts taking on the behavioral traits of a pregnant female. The male goes through the same process of nurturing and caring for its wee little parasite babies that a female would. The big boy will continue to love, adore and feed his little parasitic baby until he dies, just like a girl crab would.

Swimming with the fishes

Spinochordodes tellinii, also known as a hairworm, is a nematomorph that lives as a free-living aquatic adult, eking out an existence in the mud. Its young, though, thrive on the flesh of crickets. Problem is, crickets don’t always hang around wet places, and if the worm were to leave its host somewhere high and dry, it would die. What better way to ensure that it ends up where it wants to be than to drive its unwitting cricket host to suicide by drowning?

How exactly the worm controls the cricket’s brain is unknown, but theories suggest that the parasite has developed a way to mimic natural chemical signals in the bug’s brain. Analysis of the compounds in infested brains reveals heightened levels of neurotransmitters and chemicals responsible for movement and orientation, particularly with relation to gravity. These proteins are similar to the ones produced by the insect, but do not occur in it naturally, suggesting that the parasite is able to produce and excrete its own chemical signals to screw with the cricket’s mind. The end result, much to the parasite’s joy, is that the cricket seeks out water and hops right in. More often than not, the act is fatal – crickets are terrible swimmers, and most who leap into the cool depths drown. This assisted suicide gives the parasite the chance to break free and wiggle its way easily down to the mud, where it continues its life cycle.

More than a fluke

Trematodes, or flukes, are known for their complex life cycles, which involve multiple hosts. Often, switching from host to host is facilitated by predatory interactions. Dicrocoelium dendriticum, a trematode which lives in the livers of sheep, is no exception. The trouble is, its intermediate host is an ant, and sheep aren’t generally known for their hunger for ants. So how does a parasitic fluke get from an ant to a sheep?

Well, actually, it starts with a snail. The snail accidentally eats the fluke’s eggs while going along its merry snail way, and the parasite hatches and develops in the snail’s gonads. Eventually, the fluke is excreted in the snail’s slime, which is conveniently eaten by an ant. This is where things get weird. Once infected, the ant continues about its business as normal – for the most part. By day the ant acts like an ant. But as the sun goes down, the parasite takes over. Every night, the zombie ant will leave its colony behind and search for a blade of grass. When it finds one, it climbs to the top, bites down, and waits until sunrise. Night after night, the ant will dutifully wait atop its blade until it gets accidentally eaten by a grazing sheep, thus completing its enslaver’s life cycle.

Who said zombies aren’t fungis?

Even microscopic fungi get in on the zombifying trend. The fungus Ophiocordyceps unilateralis is yet another zombifying parasite targeting ants (they really get the short end of the zombie stick). It’s all well and good to target ants, but a bit of a dilemma for the fungus arose from its choice of host: the fungus needs the perfect combination of light, humidity and temperature to survive, and those conditions don’t exist on the ground. Ants generally live on the ground, and thus shouldn’t be good housing for Ophiocordyceps. Not to worry, though – the fungus has engineered a mind-altering solution.

Ants infected with Ophiocordyceps unilateralis find themselves compelled to amble far from their homes, climb trees or sprouts, and end up on the bottom side of a leaf where they clamp down their jaws and wait while the fungus eats them from the inside out. What’s incredible is the precision with which these zombied ants choose their final resting place: they all choose to die on the vein on the bottom of a north-facing leaf approximately 25 cm above the ground in an area with 94 to 95 percent humidity and between 20 and 30 degrees Celsius – perfect conditions for the fungus to grow. After a few weeks, spores fall from the now fully eaten corpse to infect other ants and continue the fungus’ life cycle.

Time – and brain chemistry – heal all wounds

I know I’m not physically hurt. Though it feels like I’ve been kicked in the stomach with steel-toed boots, my abdomen isn’t bruised. Spiking cortisol levels are causing my muscles to tense and diverting blood away from my gut, leading to this twisting, gnawing agony that I cannot stop thinking about. I can’t stop crying. I can’t move. I just stare at the ceiling, wondering when, if ever, this pain is going to go away.

It doesn’t matter that my injuries are emotional. The term heartache isn’t a metaphor: emotional wounds literally hurt. The exact same parts of the brain that light up when we’re in physical pain go haywire when we experience rejection. As far as our neurons are concerned, emotional distress is physical trauma.

Evolutionary biologists would say that it’s not surprising that our emotions have hijacked the pain system. As social creatures, mammals are dependent from birth upon others. We must forge and maintain relationships to survive and pass on our genes. Pain is a strong motivator; it is the primary way our bodies tell us that something is wrong and needs to be fixed. Our intense aversion to pain causes us to instantly change behavior to ensure we don’t hurt anymore. Since maintaining social bonds is crucial to mammalian survival, experiencing pain when those bonds are threatened is an adaptive way to avoid the potential dangers of being alone.

Of course, being able to evolutionarily rationalize this feeling doesn’t make it go away.

I lie flattened, like the weight of his words has literally crushed me. I need to do something, anything, to lessen this ache. The thought crosses my mind to self-medicate, but I quickly decide against it. Mild analgesics like ibuprofen would be useless, as they act peripherally, targeting the pain nerves which send signals to the brain. In this case, it is my brain that is causing the pain. I would have to take something different, like an opioid, which depresses the central nervous system and thus inhibits the brain’s ability to feel. Tempting as that might be, painkillers are an easy – and dangerous – way out. No, I need to deal with this some other way.

Slowly, I sit up and grab the guitar at the foot of my bed.

Where music comes from, or even why we like and create music, is still a mystery. What we do know is that it has a powerful effect on our brains. Music evokes strong emotions and changes how we perceive the world around us. Simply listening to music causes the release of dopamine, a neurotransmitter linked to the brain’s reward system and feelings of happiness. But even more impressive is its effect on pain. Multiple studies have shown that listening to music alters our perception of painful stimuli and strengthens feelings of control. People are able to tolerate pain for longer periods of time when listening to music, and will even rate the severity of the sensation as lower, suggesting that something so simple as a melody has a direct effect on our neural pathways.

So, too, does self expression. Expressive writing about traumatic, stressful or emotional events is more than just a way to let out emotion – college students told to write about their most upsetting moments, for example, were found to be in remarkably better health four months later than their counterparts who wrote on frivolous topics. These positive results of self-expression are amplified when the product is shared with others. While negative emotions may have commandeered our pain response, art has tapped into the neurochemical pathways of happiness and healing.

So, I begin to write. At first, it is just a jumble of chords and words, haphazardly strung together. But, slowly, I edit and rewrite, weaving my emotions into lyrics. I play it over and over, honing the phrasing, perfecting the sound. Eventually, it begins to resemble a song:

    (lyrics)

The rush of dopamine loosens the knot in my stomach ever so slightly. For now, the agony is dulled. Still, I can’t help but think that I’m never going to really feel better – that the memory of this moment will be seared into my brain, and a mental scar will always be there, torturing me with this intense feeling of loss.

Scientifically, I know I’m wrong. As I close my eyes, I am comforted by the thought that the human brain, though capable of processing and storing ridiculous amounts of information, is flawed. The permanence of memory is an illusion. My memory of this moment will weaken over time. It will be altered by future experiences, until what I envision when I try to recall it will be only a faint reflection of what I actually feel. Eventually, this pain won’t overwhelm me, and I will finally be able to let go.

A Moral Gene?

If our moral psychology is a Darwinian adaptation, what does that say about human nature? About social policy, which always presupposes something about human nature? About morality itself?

Steven Pinker

Morality is often considered to be the domain of philosophers, not biologists. But scientists have long wondered what role our genomes play in directing our moral compass. Today, a paper was published in the open access journal PLoS ONE which found that moral decision making is influenced by different forms of a single gene.

Picture yourself standing at branching train tracks with an unstoppable train barreling towards you. On one side, an evil villain has tied five people; on the other, he has tied only one. You’ve got the switch in your hands that chooses which track the train goes down. Do you feel it’s morally acceptable to choose to kill the one instead of the five?

The scenario above is an example of foreseen harm. When such harm is unintentional, like in the train situation, most people are willing to go with Spock and say that the needs of the many outweigh the needs of the few. But previous research has found that people taking a particular group of antidepressants called selective serotonin reuptake inhibitors (SSRIs) were different – overall, they were less willing to say that killing the one person is morally justified, even if it’s unavoidable.

Serotonin is a chemical released at the junctions between nerves as a part of signaling in the brain. Since lower levels of serotonin are linked to sadness and depression, it is thought that by preventing the reuptake of serotonin, SSRIs fake higher overall serotonin levels and thus boost happy feelings.

But the connection between SSRIs and morality got Abigail Marsh and her colleagues from Georgetown University and the National Institutes of Health thinking. They knew that natural variation in serotonin reuptake ability exists in the population because of alterations in the promoter of the serotonin transporter gene. People with the long form of the promoter (L) have normal levels of reuptake, while those with a truncated version (S) have reduced serotonin reuptake, similar to people taking an SSRI. The researchers wondered if this natural variation influences moral decision making in the same way that treatment with an SSRI does.

So, they took 65 healthy volunteers and tested their genes to see which versions of the promoter they had. Overall, 22 had two copies of the long form of the gene (LL), 30 had one of each (SL), and 13 had two copies of the short form (SS). They then asked these individuals to rate the overall morality of a variety of scenarios, including ones like the one above, where one person is unintentionally harmed to save five others.
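To make the group comparison concrete, here is a toy sketch of the kind of analysis such a design invites. The ratings below are invented for illustration only – they are not the study’s data, and the published analysis was more sophisticated than a simple one-way ANOVA.

```python
# Toy comparison of moral-acceptability ratings across 5-HTTLPR
# genotype groups (LL, SL, SS). All numbers are invented.
from scipy import stats

# Hypothetical mean ratings of foreseen-harm scenarios, one value
# per participant (higher = harming one to save five is more acceptable).
ll_ratings = [5.8, 6.1, 5.5, 6.0, 5.9, 6.2]
sl_ratings = [5.1, 4.8, 5.2, 4.9, 5.0, 5.3]
ss_ratings = [4.1, 3.9, 4.2, 4.0, 3.8, 4.3]

# One-way ANOVA: do the three genotype groups differ in how
# acceptable they find foreseen harm?
f_stat, p_value = stats.f_oneway(ll_ratings, sl_ratings, ss_ratings)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```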

The results were clear: although the three groups showed no differences when presented with morally neutral scenarios or those where harm is intentionally caused to an individual, there were significant differences between the groups when it came to scenarios of foreseen harm. Those with the long form of the promoter were much more willing to approve of harming one person to protect five – they felt that doing so was the better moral choice.

Those with the short form of the gene, however, felt that harming the one was morally neutral.

“I think this study is useful in helping to point out that maybe the way people arrive at their moral intuitions is just different for different people, in ways that are very deeply rooted,” says Marsh, the lead author, in a press release. Indeed, moral decision making may be as deeply rooted as it can be – that is, in our genomes.

Of course, as the quote from Steven Pinker at the beginning alluded, this kind of result leads to bigger questions. How has natural selection shaped what we think is right and wrong? How much of our moral code is influenced by our genes? And what does this say about the nature of morality itself?

Research: Marsh, A., Crowe, S., Yu, H., Gorodetsky, E., Goldman, D., & Blair, R. (2011). Serotonin Transporter Genotype (5-HTTLPR) Predicts Utilitarian Moral Judgments. PLoS ONE, 6(10). DOI: 10.1371/journal.pone.0025148