Dr. Christie Wilcox is a science writer based in the greater Seattle area. Her bylines include National Geographic, Popular Science, and Quanta. Her debut book, Venomous, was released in August 2016 (Scientific American/FSG Books). To learn more about her life and work, check out her webpage or follow her on Twitter, Google+, or Facebook.
There was simply no way I could have predicted that a groggy conversation over a cup of over-sugared coffee would be directly responsible for making me cry on a crowded plane as I headed back home to Hawaii.
Unlike just about every press person at this year’s AAAS meeting, I wasn’t looking for something to write about. My interest and presence at this conference of the largest general scientific society in the world were somewhat philanthropic. Way back in May of last year, I was asked by Linda Cendes to be on a AAAS panel organized by Cornelia Dean to talk about the importance of social media in science. At their suggestion, I pitched a workshop as well, both of which made it into the program. I was at AAAS as a resource, to help convince scientists of the importance of social media and to help answer their questions about emerging media technologies. At a conference boasting over 8,000 attendees and over 1,000 press registrants, it was pretty much dumb luck that Dan Drollette Jr. and I ended up in conversation in the press coffee room.
I was somewhere over a whitewashed South Dakota when I popped open my laptop and stared blankly at the screen, trying desperately to distill my experience from the past three weeks into a post.
I left Hawaii on January 29th and headed to North Carolina for ScienceOnline 2013, a conference and unconference that feels more like a family reunion. I have been attending ScienceOnline for four years now, yet every year I approach the conference with a certain amount of apprehension. ScienceOnline isn’t like any other conference I go to; it’s a tight community of innovators, idea generators, influencers and instigators. In short, it can be kind of intimidating, even for seasoned veterans like me. This might sound ironic considering that this year I gave a session on Impostor Syndrome, but there’s a certain doubt that creeps into the back of my mind when I’m surrounded by these brilliant people — like I’m a C student who has snuck into a Mensa meeting to steal the free food. Given how many people get turned away from attending, I feel like I need to earn my place at the conference. Walking into a room full of people like Maryn McKenna, David Dobbs, Carl Zimmer, Karyn Traphagen, and Liz Neeley is a surreal and slightly terrifying experience. I imagine it’s the same feeling a hopeful musician would have if they walked into a bar and found Jimi Hendrix, Beethoven, and Elvis Presley chatting over beers while Michael Jackson is blasting from the jukebox. Part of me questions whether I belong in such esteemed company — but mostly, mostly, I feel this deep, burning drive to make sure I do.
For those who are familiar with this blog from Scientific American, well hello again! Good to see you! I might be on a new site, but rest assured, I’ll still be serving up the same raw science you’ve come to expect from me.
If you’re a n00b: Welcome!
My name is Christie, and I am a bit of an amalgam. I am one part marine biologist, pursuing my PhD in Cell and Molecular Biology at the University of Hawaii, where I study the protein toxins in the world’s most venomous fish. I’m one part social media specialist, encouraging my scientific colleagues to get themselves out there and teaching them how to do so. But here, at Science Sushi, I am a giddy schoolgirl, uncontrollably blurting out all of the neat things that make me squeal with excitement.
A year and a half ago, the decision to pack up shop at ScienceBlogs and begin blogging at Scientific American was an easy one. The inimitable Bora Zivkovic had assembled a blogging dream team, a group of people I respected and admired and couldn’t wait to call networkmates. Under Bora’s nurturing oversight, we all have flourished, and the SciAm blog network has become the most diverse and prolific science blogging network around. It is a supportive, successful, and safe place, full of people I am proud to call colleagues and friends, though they feel more like family. Here, I have had the freedom to write about everything from GMOs to robot babies to fake poop. I’ve played with tone and style, screaming at the top of my lungs one moment and gushing like a fangirl the next. Through it all, I have grown as a writer and person, and truly found my voice.
Like most sailors, though, I am not content to stay in port. I itch for the adventure of uncharted waters. I couldn’t have asked for a better safe harbor than SciAm, and I will always be indebted to everyone here, especially Bora for taking a chance on this young, spirited blogger. But, the time has come for me to cast off. It is with a mix of sadness, excitement, and just the right amount of fear that I wave a fond farewell to SciAm and announce that, as of today, Science Sushi is setting sail for Discover.
I am simply bubbling with excitement that I will be blogging alongside the likes of Razib Khan, Corey S. Powell, Keith Kloor, and the rest of the extraordinary crew over there. I’ve always been in awe of the high-caliber writers Discover has hosted over the years, and can hardly believe that I am becoming one of them.
So, if you would like, update bookmarks and feeds accordingly. I know that some of you may not make the journey, but I sincerely hope you will join me over at Discover. If you like what I’ve done here, I’d really appreciate your help in spreading the word about this move. Whether it’s a tweet, a blog post, +1, a facebook like or a stumble—whatever you are comfortable with—every bit helps! Comments will be closed here at Scientific American, but all of my old posts are coming with me, so you can continue the conversation over at Discover when the blog launches (soon, I promise!).
All aboard, and anchors aweigh!
The title quote has been attributed to both Grace Hopper and William Shedd, so I don’t know who really said it. Image taken by me of the R/V Hi’ialakai in port at Midway Atoll.
It seems like every time a male Republican tries to talk about women, he somehow says something stupid and misogynistic. Last year, Missouri candidate Todd Akin was torn apart for his negligent comment that, when a woman is raped, she needn’t worry about pregnancy because “the female body has ways to try to shut that whole thing down.” Akin was vilified by his own party and lost the election. But then, just when we think Republicans might have learned their lesson, former OB/GYN and current congressman from Georgia Phil Gingrey makes an even more careless blunder, calling Akin’s remark “partially right.” He cites his experience as a doctor trying to aid struggling couples, and how he told them: “Just relax. Drink a glass of wine. And don’t be so tense and uptight, because all that adrenaline can cause you not to ovulate.”
I don’t care that Gingrey has since come out with a statement saying he did not mean to say he agrees with or supports Akin’s comment. That’s exactly what he did. Citing his authority as a physician, Gingrey made the dangerous and erroneous claim that rape victims are less likely to get pregnant. When you reference science using your education to create an air of authority, you don’t have the luxury of saying ‘oops’. You get no excuses.
The errors in Akin and Gingrey’s comments are beyond the issue of whether there is such a thing as so-called “legitimate” rape. Even if we overlook the fact that they made the misogynistic assertion that women frequently lie about being raped, they are, in fact, just plain wrong when it comes to the science. Make no mistake: Gingrey and Akin’s assertion that rape is less likely to result in pregnancy isn’t correct—it’s not even kind of correct. Rape victims are not less likely to ovulate or get pregnant. They aren’t even equally likely to. They are more likely to.
If you listen carefully to Gingrey’s argument, he makes the assertion that the stress a woman experiences when trying and failing to get pregnant is the same as that experienced by a rape victim. He equates high overall stress—being anxious and nervous throughout a woman’s menstrual cycle—with the acute, immediate stress induced by a violent, demeaning act. This argument is fundamentally flawed. These are not the same—not even close.
As a doctor, Gingrey should be aware of the difference. Stressed individuals like couples desperately seeking a child suffer from high overall levels of serum stress hormones like cortisol, which has consequences throughout the body. I assume that Gingrey was referring to the fact that, in women, periods of prolonged physical or emotional stress can lead to reproductive difficulties. Athletes, for example, will often fail to ovulate, or have irregular menstrual cycles—a condition known as exercise-related female reproductive dysfunction—which is thought to be the result of higher overall stress hormone levels from their exhausting regimen and decreased levels of estrogen. So, his advice to struggling couples has merit; relaxing and reducing day-to-day stress may indeed help promote healthy natural cycles, making pregnancy more likely (though his suggestion of drinking alcohol to reduce this stress is way off base).
That said, the stress response elicited by rape isn’t chronic; it’s acute. Rape victims aren’t more emotionally or physically stressed than their peers before their attack occurs. No, what a rape victim suffers is an immediate, strong stress response, more akin to how an animal might react when attacked by a predator.
The difference between the effects of chronic stress and acute stress on reproduction cannot be overstated. During chronic stress, serum cortisol levels are elevated repeatedly and for long periods of time, but during acute stress, serum cortisol levels only rise briefly. “There appears to be little impact of short-term increases in cortisol concentrations,” explain scientists in a review of the effects of stress on reproduction. “Many short-term stresses fail to affect reproduction and there are reports of stimulatory effects.” By ‘stimulatory effects’, they mean that acute stress actually seems to have the exact opposite effect of chronic stress: it induces ovulation instead of preventing it.
This makes sense, really, when you consider that Gingrey’s comment that ‘adrenaline prevents ovulation’ is, simply put, flat-out, 100%, dead wrong. I have no idea what he was thinking when he made that statement, because decades of scientific research show the opposite. While high levels of chronic stress may prevent proper development and release of eggs, not only does adrenaline itself not prevent ovulation, it’s required for it. Adrenaline injection even forces ovulation in fish, rabbits, chickens, and rats.
From a biological standpoint, ovulation is caused by the interplay of a number of different hormones. When a woman is ready to ovulate, specialized cells in the brain are stimulated, causing the release of GnRH (gonadotrophin releasing hormone), which further triggers the release of two other hormones—LH (luteinizing hormone) and FSH (follicle stimulating hormone)—into the bloodstream. These, together, induce ovulation. The surge of LH and FSH is so well documented that scientists and doctors can use serum levels of these hormones to track ovulation in animals as well as in humans.
Lots of animals ovulate when acutely stressed. In sheep, ewes spontaneously ovulate when they are stressed by transport or medical treatments. Similarly, female fish exposed to an acutely stressful event were 9 times more likely to ovulate within 72 hours. Acute stress can cause surges in FSH and LH that could induce ovulation in species ranging from rats to monkeys and even people, especially if a female is near ovulation to begin with. Not only does stress cause LH levels to rise, the worse the stress, the more they rise. This led scientists to conclude that women “may be induced to ovulate at any point of the menstrual cycle…if exposed to an appropriate acute stressor.”
I don’t want to scare you: none of what I just told you means rape guarantees ovulation. The duration of elevated stress hormones, timing relative to a woman’s natural cycle, intensity of the stress response and even just innate physiological variability between women can all affect how a woman’s reproductive system responds to stressful events. Furthermore, not all rape victims react the same way (emotionally or physically) to what happened to them, and these differences will affect how their bodies respond. But contrary to Akin and Gingrey’s assertions, science suggests that rape victims have every reason to worry about pregnancy.
Given the connection between acute stress and ovulation, it shouldn’t come as a shock that the odds of getting pregnant from rape are significantly higher than from consensual sex. Read that sentence again. I really want this to soak in. You are more likely to get pregnant if you are raped.
The science that stands behind that statement is strong. For a while, scientists debated whether it was true. You can’t just compare the pregnancy rates from raped and unraped women because there are many factors involved, including birth control use, number of intercourse events, and the rate of rape reporting if conception doesn’t occur. But recent studies have taken these confounding variables into account, and the result is crystal clear. Based on real data on pregnancy in the United States, they estimate that the percentage of pregnancies resulting from forced intercourse is around 8%—significantly higher than the 3% that result from single episodes of consensual sex.
Gingrey is just wrong on all counts, and so is Akin. There is no evidence to support the role of adrenaline-mediated prevention of ovulation due to rape. There is no science to support their insinuations that, somehow, rape victims are less likely to get pregnant. Their statements directly contradict reproductive science, and serve only to demean women who have already undergone a terrible atrocity. There is simply no excuse for such blatant ignorance and thinly-veiled misogyny, especially coming from the mouth of someone claiming to ‘know about these things.’
Republican war on women, by Eclectablog
Here’s a tip for the GOP and Republicans in general: stop citing biology to defend your misogynistic positions. At least stop claiming things to be true without a cursory look at the literature. It’s not hard to look these things up, boys, and you have a team of assistants to do such things for you. When you flap your lips without even the slightest clue as to what the science actually says on the subject, you look stupid at best. I’d say stop talking in general, but I think it’s good that the general public sees your positions for what they really are. On second thought, ignore my advice: keep on trucking. The baseless, unscientific lies that you tell will only serve to strengthen the people who run against you.
Though childish songs make crude jokes, there’s nothing funny about diarrhea. Aside from the painful, twisting feeling in your guts, there’s just something psychologically upsetting about losing control of your bowels. It’s embarrassing. It’s disgusting. And we’ve all been there.
Micrograph of C. difficile bacteria from a stool sample culture
But for many, diarrhea is more than a shameful stain to be washed away in an impromptu laundry load; in the US alone, more than 500,000 people suffer and 15,000 die every year from uncontrollable diarrhea caused by infection with Clostridium difficile. These rod-shaped bacteria are commonly found in the environment and even in our bodies, but have lately become a major concern in hospitals, where antibiotics leave patients without the natural flora that protect their bodies. When C. difficile populations grow unchecked, toxins produced by the bacteria cause inflammation and cell death, producing the explosive symptoms that, if not controlled, can lead to severe dehydration, kidney failure, holes in the intestines and death. Patients already weakened by other illnesses are particularly at risk of succumbing. Epidemics of C. difficile have become such a serious problem that the infection now rivals the superbug MRSA as one of the top emerging disease threats.
A diagram of how a stool transplant is performed
With more and more strains of C. difficile becoming resistant to antibiotics, doctors have had to find creative ways to treat the infection. One of the most promising (if revolting) treatments tested in recent years is called fecal bacteriotherapy, or ‘stool transplant’, which involves taking donor poop from a healthy person and inserting it into the gut of an infected one as a form of probiotics, seeking to replace the protective flora. I wish I was kidding. The procedure involves collecting crap from a close relative, screening it for other microbial pathogens, mixing it with saline or milk until it reaches the right consistency, and then placing it directly in the patient’s digestive system through a tube inserted either up the anus or down through the mouth. Though it might sound gross, the dose of healthy, helpful bacteria has been shown to be very effective.
Of course, most patients aren’t exactly thrilled with the suggestion of shoving someone else’s bowel movements up their rectums or down their throats. Now, University of Guelph researchers have developed a more sanitary way of achieving the same results: synthetic poop.
The researchers created fake feces, aptly named RePOOPulate, after careful examination of bacterial colonies grown from the stool of healthy volunteers. Once the right ratio of species was determined, 33 different bacteria were grown in a robotic intestine simulator affectionately called Robo-gut to create a ‘super-probiotic’ stool substitute. According to the scientists, the bacterial mixture is much more palatable than what it mimics, and smells significantly better. Two patients treated with RePOOPulate showed marked improvement after three days, remaining C. difficile-free months after treatment. Tests of their intestinal flora showed that the fake crap successfully introduced beneficial bacteria to the patients’ guts.
It’s hard to say exactly how effective the new treatment is off of such a small test, but the results are very promising. Because scientists can control the bacterial mixture in RePOOPulate, there is less risk of introducing potentially harmful bacteria than with regular stool transplants, and the treatment can be tweaked to meet the needs of different patients. This proof-of-concept paper opens the doors for future testing. In time, RePOOPulate may prove a safe and effective treatment for C. difficile infection, as well as other gut diseases caused by the imbalance of beneficial bacteria like inflammatory bowel disease.
Citation: Petrof E.O., Gloor G.B., Vanner S.J., Weese S.J., Carter D., Daigneault M.C., Brown E.M., Schroeter K. & Allen-Vercoe E. (2013). Stool substitute transplant therapy for the eradication of Clostridium difficile infection: ‘RePOOPulating’ the gut, Microbiome, 1 (1) 3. DOI: 10.1186/2049-2618-1-3
Take a moment to look at yourself in the mirror. I want you to really examine your features—the curves, lines and shapes that make up your face. How broad is your chin? Narrow, or wide? How big is your mouth in comparison? Or your nose? Do you have strong, prominent eyebrows? How close are they together?
Just look at this face. Do you trust me? A new study in PLoS ONE says you might, even though you don't know why.
Or, more simply, what color are your eyes?
In a study published today in PLoS ONE, researchers from Charles University in the Czech Republic had 238 participants rate the faces of 80 students for trustworthiness, attractiveness, and dominance. Not surprisingly, they found that the three measures correlated well with each other, with faces rating high on one scale rating high on the other two. Female faces were generally rated more trustworthy than male ones. But that wasn’t all. A much more peculiar correlation was discovered as they looked at the data: brown-eyed faces were deemed more trustworthy than blue-eyed ones.
It didn’t matter if the judge was male or female, blue-eyed or brown-eyed. Even accounting for attractiveness and dominance, the result was the same: brown-eyed people’s faces were rated more trustworthy. There was some evidence of in-group bias, with blue-eyed female faces receiving lower ratings from brown-eyed women than from blue or green-eyed ones, but this difference didn’t drive the phenomenon. All the participants, no matter what eye color they had or how good-looking they thought the face was, agreed that brown-eyed people just appear more reliable.
The real question is why? Is there a cultural bias towards brown eyes? Or does eye color really correlate somehow with personality traits like accountability and honesty? Does eye color really matter that much?
To find out, the scientists used computer manipulation to take the same faces and change their eye colors. Without altering any trait other than the hue of the iris, the researchers swapped the eye colors of the test faces from blue to brown and vice versa. This time, no effect was found: despite the strange correlation in the original photos, the manipulated eye color didn’t affect a face’s trustworthiness rating. So it isn’t the eye color itself that really matters—something else about brown-eyed faces makes them seem more dependable.
To get at what’s really going on, the researchers took the faces and analyzed their shape. They looked at the distances between 72 facial landmarks, creating a grid-like representation of each face. For men, the answer was clear: differences in face shape explained the appeal of brown eyes.
Shape changes associated with eye color and perceived trustworthiness, from the grid-based facial shape analysis done by the researchers. Note the similarities between the shapes of brown-eyed faces and trustworthy ones.
“Brown-eyed individuals tend to be perceived as more trustworthy than blue-eyed ones,” explain the authors. “But it is not brown eyes that cause this perception. It is the facial morphology linked to brown eyes.”
Brown-eyed men, on average, have bigger mouths, broader chins, bigger noses, and more prominent eyebrows positioned closer to each other, while their blue-eyed brethren are characterized by more angular and prominent lower faces, longer chins, narrower mouths with downward pointing corners, smaller eyes, and more distant eyebrows. The differences associated with trustworthiness are also how our faces naturally express happiness—an upturned mouth, for example—which may explain why we trust people who innately have these traits.
Although the trend was the same for female faces, researchers didn’t find the same correlation between trustworthiness and face shape in women. This result is puzzling, but female faces were overall much less variable than male faces, so it’s possible the statistical analyses used to test for correlation were hampered by this. Or, it’s possible that something else is in play when it comes to the trustworthiness of female faces. The researchers hope that further research can shed light on this conundrum.
Given the importance of trust in human interactions, from friendships to business partnerships or even romance, these findings pose some interesting evolutionary questions. Why would certain face shapes seem more dangerous? Why would blue-eyed face shapes persist, even when they are not deemed as trustworthy? Are our behaviors linked to our bodies in ways we have yet to understand? There are no easy answers. Face shape and other morphological traits are shaped partially by genetics and partially by environmental factors like hormone levels in the womb during development. In seeking to understand how we perceive trust, we can learn more about the interplay between physiology and behavior as well as our own evolutionary history.
Citation: Kleisner K., Priplatova L., Frost P. & Flegr J. (2013). Trustworthy-Looking Face Meets Brown Eyes., PLoS ONE, 8 (1) e53285. DOI: 10.1371/journal.pone.0053285
“Oh, beauty is a beguiling call to death, and I’m addicted to the sweet pitch of its siren.” – Johnny Quid, RocknRolla
Female of the emerald cockroach wasp Ampulex compressa manipulating an American cockroach, Periplaneta americana, which has been made docile by wasp venom and will serve as food for the wasp larva. Image courtesy of Gudrun Herzner.
Glinting in shimmering shades of blue and green, the emerald cockroach wasp is surely a thing of beauty, but its gleaming exterior masks its cruel nature. The emerald cockroach wasp is one of nature’s most impressive neurochemists. At its core, it is a parasite. The female wasp lays her eggs on a cockroach host, and when they hatch, the larvae eat the creature from the inside out. You’d think the cockroach would be opposed to this idea, but instead the insect patiently awaits its fate while the larvae mature. Cockroaches are much larger than even a full grown wasp, and certainly could put up a fight, but that is where the wasp’s ingenious manipulation of neurochemistry comes in. When she encounters a potential host, the female cockroach wasp first stings the cockroach in its abdomen, temporarily paralyzing its front legs and allowing the wasp to perch precisely on its head. She then stings the roach again, this time delivering venom directly into a part of the roach’s brain called the sub-esophageal ganglia. This doesn’t kill the roach. Instead, it puts the roach in a zombie-like trance. The roach is less fearful and loses the will to flee. It allows the wasp to lead it by its antennae, like a dog on a leash, to the wasp’s burrow, where the roach will play the martyr for the wasp’s unborn children. Even though the roach is fully capable of locomotion during the week to month that passes from when the wasp stings its brain until the hungry brood finishes eating it alive, the zombified insect doesn’t move. Emerald cockroach wasps have elevated neural manipulation to an art form to create perfect living incubators.
But, though the roach has been rendered harmless, the wasp-to-be is threatened by other organisms. Humans aren’t the only species that have to worry about their food spoiling—so do emerald cockroach wasps. Cockroaches truly are dirty creatures, and their insides are home to a suite of bacteria that can harm the wasp’s vulnerable larvae. One of these potential threats is Serratia marcescens, a vile sort of Gram-negative bacterium found in cockroach bodies. It’s the same bacterium responsible for a number of human urinary tract infections and the weird pink stains that form in our toilets and showers. In insects, its effects are much more deadly. The bacteria possess a suite of protein-degrading enzymes that cut apart fragile larval cells. The larvae aren’t entirely defenseless, though—as a new study published today in the Proceedings of the National Academy of Sciences reveals, larval wasps sterilize their food by secreting antimicrobial compounds.
For many parasitic wasps, microorganisms are a serious concern. Studies on another wasp, Microplitis croceipes, found that contamination with Serratia marcescens can lead to a 25% reduction in successful parasite emergence, and even the young that do survive can be infected. When adults are exposed to the bacteria, almost 80% die. The emerald cockroach wasp must defend against this mortal enemy, or pay the ultimate price.
But how do you protect yourself against bacteria that live inside your food? Well, you do what we do to foods that house potentially harmful pathogens: you sterilize them.
Oral droplets secreted by wasp larvae, figure 1 from the paper
Gudrun Herzner and a team from the University of Regensburg in Germany noticed that larval wasps secrete droplets from their mouths that they disperse around before they feed on their cockroach meal. They suspected these secretions kill off potentially deadly bacteria, allowing the larvae to eat in peace. The researchers tested the antimicrobial activity of the oral secretions to see if they were right.
When added to bacterial cultures from the cockroach, the droplets killed off a wide variety of bacteria, including the potentially deadly Serratia marcescens. But the team wanted to know more: exactly what in the droplets killed off bacteria? So, the researchers isolated the secretions and ran them through gas chromatography–mass spectrometry to determine the nature of the substances in them. They found nine compounds previously unknown from the wasps or the cockroaches. In particular, the secretions contained a large percentage of two compounds, a kind of mellein called (R)-(-)-mellein, and micromolide, a natural product that may hold the key to treating drug-resistant tuberculosis. Both compounds showed broad-spectrum antibacterial activity, and the combination of the two was particularly effective.
As a final test, they prepared extracts from parasitized and unparasitized cockroaches and looked for these compounds. The same antimicrobial mixture could be extracted from the parasitized cockroaches, but not from unparasitized ones. Thus, the scientists were confident that the wasp larvae produce and use these compounds to sterilize their food from the inside out.
“We found clear evidence that A. compressa larvae are capable of coping with antagonistic microbes inside their P. americana hosts by using a mixture of antimicrobials present in their oral secretion,” write the authors. While both compounds used by the larvae have been found in other animals, never before has the combination been discovered, in insects or otherwise.
The broad-spectrum nature of these antimicrobials may be key to the wasps’ success. “Food hygiene may be of vital importance, especially to the vulnerable early developmental stages of insects,” explain the authors. “The range of microbes that A. compressa larvae may encounter during their development in their hosts is unpredictable and may encompass all different kinds of microbes, such as various Gram-positive and Gram-negative bacteria, mycobacteria, viruses, yeasts, and filamentous fungi.” It would be vital, then, that the antimicrobial compounds produced by the wasps are able to fend off a wide variety of potential contaminants. “The secretion of a blend of antimicrobials with broad-spectrum activity seems to represent an essential frontline defense strategy.”
These beguiling wasps have not only mastered neurochemistry, they have aced microbiology to become proficient parasites. Already, this tiny wasp has given us great insights into brains through the study of its particularly effective zombification strategy. Now, it is shedding light on another field of science. This study is one of the first to suggest that larval parasites possess the ability to protect themselves against microbial pathogens, but the authors suspect many species of insects may have similar strategies. These small creatures may prove a vital new resource for natural products to fight against human diseases. Who knows what other pharmaceutical secrets are being kept by insects like the emerald cockroach wasp, and what ailments we might be able to treat with their chemical arsenal?
Citation: Herzner G., Schlecht A., Dollhofer V., Parzefall C., Harrar K., Kreuzer A., Pilsl L. & Ruther J. (2013). Larvae of the parasitoid wasp Ampulex compressa sanitize their host, the American cockroach, with a blend of antimicrobials. PNAS Early Edition, DOI: 10.1073/pnas.1213384110
There’s a lot to be said for smarts—at least we humans, with some of the biggest brains in relation to our bodies in the animal kingdom, certainly seem to think so. The size of animal brains is extravagantly well-studied, as scientists have long sought to understand why our ancestors developed such complex and energetically costly neural circuitry.
One of the most interesting evolutionary hypotheses about brain size is the expensive tissue hypothesis. Back in the early 1990s, scientists were looking to explain how brain size evolves. Brains are exceedingly useful organs; more brain cells allow for more behavioral flexibility, better control of larger bodies, and, of course, intelligence. But if bigger brains were always better, every animal would have them. Thus, scientists reasoned, there must be a downside. The hypothesis suggests that while brains are great and all, their extreme energetic cost limits their size and tempers their growth. When it comes to humans, for example, though our brains make up only 2% of our body mass, they consume a whopping 20% of our energy. And you have to wonder: with all that energy being used by our brains, what body parts have paid the price? The hypothesis suggested our guts took the hit, but that intelligence made for more efficient foraging and hunting, thus overcoming the obstacle. This makes sense, but despite over a century of research on the evolution of brain size, there is still controversy, largely stemming from the fact that evidence for the expensive tissue hypothesis is based entirely on between-species comparisons and correlations, with no empirical tests.
With beauty and brains, Poecilia reticulata help give insights into cognitive evolution
A unique study published this month in Current Biology has taken a new approach to examining this age-old question. Rather than comparing big-brained species to their smaller-brained relatives, researchers exploited the natural variation in brain size within guppies (Poecilia reticulata). Guppies, as it turns out, aren’t as dumb as they look. They’re able to learn, and show a rudimentary ability to count. Researchers from Uppsala University in Sweden were able to use these numerical abilities to test whether brain size affects intelligence in these simple fish.
Fig. 2 from the paper; Large-brained females outperform small-brained females in a numerical learning task. Depicted are the mean and standard error values for the number of times, out of eight tests, that an individual chose the correct option of either two or four objects in females and males selected for large and small brain size.
First, the team selected for larger and smaller brains from the natural variation in guppies. They successfully created smarty-pants guppies that had brains about 9% larger than their counterparts through artificial selection. Then, they put them to the test. While the males seemed to gain no benefits from possessing larger noggins, the females with bigger brains were significantly better at the task.
But what was really remarkable was the cost of these larger brains. Gut size was 20% smaller in large-brained males and 8% smaller in large-brained females. The shrunken digestive system seemed to have serious reproductive consequences, as the smarter fish produced 19% fewer offspring in their first clutch, even though they started breeding at the same age as their dumber counterparts. And, the authors noted, this was in an idealized tank setting with plenty of food—what about in the wild, where resources are harder to come by? How much of a cost does a reduced gut have when meals aren’t guaranteed?
“Because cognitive abilities are important to facilitate behaviors such as finding food, avoiding predation, and obtaining a mate, individuals with increased cognitive abilities are likely to have higher reproductive success in the wild,” explain the authors. These benefits, though, don’t come cheap. “Our demonstration of a reduction in gut size and offspring number in the experimental populations selected for larger relative brain size provides compelling experimental evidence for the cost of increased brain size.”
There are still many questions to be answered. For example, the authors aren’t entirely sure why females were the only ones to show cognitive improvement with larger brains. They suggest that, perhaps, the researchers’ measure of intelligence (the numerical task presented to the guppies) may be geared toward female behaviors. “In the guppy, females are more active and innovative while foraging,” they explain. “Because females feed more, they may thus have had more time to associate the cue with food in our experimental design.”
The clear trade-off between brains and guts, though, is an important finding. By providing empirical evidence for the physiological costs of brains, this study provides the first direct support for the expensive tissue hypothesis, and can give us insights into how our own big brains evolved. One of the prevailing hypotheses for our own brain growth is that the incorporation of more animal products into our diets, through hunting or cooking or however, allowed us to obtain more energy from less food, thus offsetting the cost of a reduced gut. The less food we needed to eat for the same amount of energy, the more our brains could grow even if our guts suffered for it. The debate, however, is far from over. Comparative analyses in primates don’t support a gut-brain tradeoff, and there are certainly plenty of other hypotheses as to how and why we developed our massive lobes, and what prices our bodies paid for them.
Kotrschal A., Rogell B., Bundsen A., Svensson B., Zajitschek S., Brännström I., Immler S., Maklakov A. & Kolm N. (2013). Artificial Selection on Relative Brain Size in the Guppy Reveals Costs and Benefits of Evolving a Larger Brain, Current Biology, DOI: 10.1016/j.cub.2012.11.058
Tonight, we usher in a brand new year and say farewell to 2012. The first full year here at Scientific American Blogs. The year of the Higgs boson. The year Curiosity landed on Mars. The year the world was ending, but didn’t.
It’s been a good year here at Science Sushi. In the past year…
I’m thankful for the wonderful year that I have had here at Scientific American, and am excited to start 2013 on such a high note. Thank you to all of you who read this blog: let’s keep this bio-nerdy party going all through 2013!