Brain’s reaction to the taste of beer helps explain why it’s hard to stop at one

Beer gets into our heads, even before the alcohol has time to kick in.
Image credit: 123RF Stock Photo

I remember quite vividly the first time I tried beer — I almost spit it out. It was bitter, bubbly and generally bad, and I didn’t get why everyone seemed so enamored with it. Yet I, like so many people in the world, continued to drink it. Have you ever wondered why we, as a species, consume alcoholic beverages even though they taste terrible at first?

A new study suggests that despite the bitter taste, the chemicals in beer trigger the brain’s reward system. This pleasurable effect might just explain why we’re so willing to keep drinking past the first sip — until intoxication takes over, and we’ll drink just about anything. But more importantly, this new research, published today in the journal Neuropsychopharmacology, may explain why some people can drink casually while others slip into alcoholism.

‘Mystery meat’ takes on a whole new meaning

100% beef… if horses count as beef.

In case you didn’t hear, the big news in the food industry this week is the fact that — *gasp* — horsemeat has been detected in Burger King burgers and Ikea’s Swedish meatballs. Noses worldwide are turning up in disgust at the use of such crude ingredients in ground beef products.*

There’s no doubt that a good part of the fuss is that, for some of the Western world, horsemeat is taboo. Many people have an immediate, visceral reaction to the notion of eating horse, just like Americans generally react strongly to the idea of eating dogs. While our preferences are culturally rooted, the recent labeling exposures don’t just offend our palates. As consumers, we rely on retailers and restaurants to give us accurate information about which foods we are buying — whether it be to avoid allergies, follow religious preferences, choose more sustainable options, or count calories. Now, DNA barcoding is exposing just how often we are duped.

Gingrey is a bad doctor, says science

"I’ve delivered lots of babies, and I know about these things" — apparently, Phil, you don't. Photo by CQ Roll Call.

It seems like every time a male Republican tries to talk about women, he somehow says something stupid and misogynistic. Last year, Missouri candidate Todd Akin was torn apart for his negligent comment that, when a woman is raped, she needn’t worry about pregnancy because “the female body has ways to try to shut that whole thing down.” Akin was vilified by his own party and lost the election. But then, just when we think Republicans might have learned their lesson, former OB/GYN and current congressman from Georgia Phil Gingrey makes an even more careless blunder, calling Akin’s remark “partially right.” He cites his experience as a doctor trying to aid struggling couples, and how he told them: “Just relax. Drink a glass of wine. And don’t be so tense and uptight, because all that adrenaline can cause you not to ovulate.”

I don’t care that Gingrey has since come out with a statement saying he did not mean to say he agrees with or supports Akin’s comment. That’s exactly what he did. Citing his authority as a physician, Gingrey made the dangerous and erroneous claim that rape victims are less likely to get pregnant. When you invoke science and your medical training to create an air of authority, you don’t have the luxury of saying ‘oops’. You get no excuses.

The errors in Akin and Gingrey’s comments go beyond the issue of whether there is such a thing as so-called “legitimate” rape. Even if we overlook their misogynistic assertion that women frequently lie about being raped, they are, in fact, just plain wrong when it comes to the science. Make no mistake: Gingrey and Akin’s assertion that rape is less likely to result in pregnancy isn’t correct—it’s not even kind of correct. Rape victims are not less likely to ovulate or get pregnant. They aren’t even equally likely to. They are more likely to.

Listen carefully to Gingrey’s argument and you’ll notice he asserts that the stress a woman experiences when trying and failing to get pregnant is the same as that experienced by a rape victim. He equates high overall stress—being anxious and nervous throughout a woman’s menstrual cycle—with the acute, immediate stress induced by a violent, demeaning act. This argument is fundamentally flawed. These are not the same—not even close.

As a doctor, Gingrey should be aware of the difference. Chronically stressed individuals, like couples desperately seeking a child, suffer from high overall levels of serum stress hormones like cortisol, which has consequences throughout the body. I assume that Gingrey was referring to the fact that, in women, periods of prolonged physical or emotional stress can lead to reproductive difficulties. Athletes, for example, will often fail to ovulate, or have irregular menstrual cycles—a condition known as exercise-related female reproductive dysfunction—which is thought to be the result of higher overall stress hormone levels from their exhausting regimen and decreased levels of estrogen. So his advice to struggling couples has merit; relaxing and reducing day-to-day stress may indeed help promote healthy natural cycles, making pregnancy more likely (though his suggestion of drinking alcohol to reduce this stress is way off base).

That said, the stress response elicited by rape isn’t chronic; it’s acute. Rape victims aren’t more emotionally or physically stressed than their peers before their attack occurs. No, what a rape victim suffers is an immediate, strong stress response, more akin to how an animal might react when attacked by a predator.

The difference between the effects of chronic stress and acute stress on reproduction cannot be overstated. During chronic stress, serum cortisol levels are elevated repeatedly and for long periods of time, but during acute stress, serum cortisol levels rise only briefly. “There appears to be little impact of short-term increases in cortisol concentrations,” explain scientists in a review of the effects of stress on reproduction. “Many short-term stresses fail to affect reproduction and there are reports of stimulatory effects.” By ‘stimulatory effects’, they mean that acute stress actually seems to have the exact opposite effect of chronic stress: it induces ovulation instead of preventing it.

This makes sense, really, when you consider that Gingrey’s comment that ‘adrenaline prevents ovulation’ is, simply put, flat-out, 100%, dead wrong. I have no idea what he was thinking when he made that statement, because decades of scientific research show the opposite. While high levels of chronic stress may prevent proper development and release of eggs, not only does adrenaline itself not prevent ovulation, it’s required for it. Adrenaline injection even forces ovulation in fish, rabbits, chickens, and rats.

From a biological standpoint, ovulation is caused by the interplay of a number of different hormones. When a woman is ready to ovulate, specialized cells in the brain are stimulated, causing the release of GnRH (gonadotrophin-releasing hormone), which in turn triggers the release of two other hormones—LH (luteinizing hormone) and FSH (follicle-stimulating hormone)—into the bloodstream. Together, these induce ovulation. The surge of LH and FSH is so well documented that scientists and doctors can use serum levels of these hormones to track ovulation in animals as well as in humans.

Lots of animals ovulate when acutely stressed. Ewes spontaneously ovulate when they are stressed by transport or medical treatments. Similarly, female fish exposed to an acutely stressful event were nine times more likely to ovulate within 72 hours. Acute stress can cause surges in FSH and LH that could induce ovulation in species ranging from rats to monkeys and even people, especially if a female is near ovulation to begin with. Not only does stress cause LH levels to rise, the worse the stress, the more they rise. This led scientists to conclude that women “may be induced to ovulate at any point of the menstrual cycle…if exposed to an appropriate acute stressor.”

It’s likely that stress-induced ovulation is mediated by increases in stress hormones. While long-term rises in cortisol can prevent ovulation, artificially increasing stress hormones by injecting animals with cortisol kicks the ovulation mechanism into gear early, increasing LH release. Adrenaline plays a key role, too. If you block adrenaline synthesis or deplete adrenaline, the surge in LH required for ovulation is suppressed, and ovulation is prevented.

I don’t want to scare you: none of what I just told you means rape guarantees ovulation. The duration of elevated stress hormones, the timing relative to a woman’s natural cycle, the intensity of the stress response, and even innate physiological variability between women can all affect how a woman’s reproductive system responds to stressful events. Furthermore, not all rape victims react the same way (emotionally or physically) to what happened to them, and these differences will affect how their bodies respond. But contrary to Akin and Gingrey’s assertions, science suggests that rape victims have every reason to worry about pregnancy.

Given the connection between acute stress and ovulation, it shouldn’t come as a shock that the odds of getting pregnant from rape are significantly higher than from consensual sex. Read that sentence again. I really want this to soak in. You are more likely to get pregnant if you are raped.

The science that stands behind that statement is strong. For a while, scientists debated whether it was true. You can’t just compare the pregnancy rates of raped and unraped women, because there are many factors involved, including birth control use, the number of intercourse events, and the fact that rapes often go unreported when conception doesn’t occur. But recent studies have taken these confounding variables into account, and the result is crystal clear. Based on real data on pregnancy in the United States, they estimate that a single incident of forced intercourse leads to pregnancy around 8% of the time—significantly higher than the 3% of single episodes of consensual sex that do.

Gingrey is just wrong on all counts, and so is Akin. There is no evidence that adrenaline prevents ovulation after rape. There is no science to support their insinuations that, somehow, rape victims are less likely to get pregnant. Their statements directly contradict reproductive science, and serve only to demean women who have already undergone a terrible atrocity. There is simply no excuse for such blatant ignorance and thinly veiled misogyny, especially coming from the mouth of someone claiming to ‘know about these things.’

Republican war on women, by Eclectablog

Here’s a tip for the GOP and Republicans in general: stop citing biology to defend your misogynistic positions. At the very least, stop claiming things to be true without even a cursory look at the literature. It’s not hard to look these things up, boys, and you have teams of assistants to do such things for you. When you flap your lips without the slightest clue as to what the science actually says on the subject, you look stupid at best. I’d say stop talking altogether, but I think it’s good that the general public sees your positions for what they really are. On second thought, ignore my advice: keep on trucking. The baseless, unscientific lies you tell will only serve to strengthen the people who run against you.

Fake Feces To Treat Deadly Disease: Scientists Find They Can Just Make Sh*t Up

Though childish songs make crude jokes, there’s nothing funny about diarrhea. Aside from the painful, twisting feeling in your guts, there’s just something psychologically upsetting about losing control of your bowels. It’s embarrassing. It’s disgusting. And we’ve all been there.

Micrograph of C. difficile bacteria from a stool sample culture

But for many, diarrhea is more than a shameful stain to be washed away in an impromptu laundry load; in the US alone, more than 500,000 people suffer and 15,000 die every year from uncontrollable diarrhea caused by infection with Clostridium difficile. These rod-shaped bacteria are commonly found in the environment and even in our bodies, but have lately become a major concern in hospitals, where antibiotics leave patients without the natural gut flora that protect them. When C. difficile populations grow unchecked, toxins produced by the bacteria cause inflammation and cell death, leading to the explosive symptoms that, if not controlled, can lead to severe dehydration, kidney failure, holes in the intestines and death. Patients already weakened by other illnesses are particularly at risk of succumbing. Epidemics of C. difficile have become such a serious problem that the infection now rivals the superbug MRSA as one of the top emerging disease threats.

A diagram of how a stool transplant is performed

With more and more strains of C. difficile becoming resistant to antibiotics, doctors have had to find creative ways to treat the infection. One of the most promising (if revolting) treatments tested in recent years is called fecal bacteriotherapy, or ‘stool transplant’: taking donor poop from a healthy person and inserting it into the gut of an infected patient as a form of probiotic, seeking to replace the protective flora. I wish I was kidding. The procedure involves collecting crap from a close relative, screening it for other microbial pathogens, mixing it with saline or milk until it reaches the right consistency, and then placing it directly in the patient’s digestive system through a tube inserted either up the anus or down the throat. Though it might sound gross, the dose of healthy, helpful bacteria has been shown to be very effective.

Of course, most patients aren’t exactly thrilled with the suggestion of shoving someone else’s bowel movements up their rectums or down their throats. Now, University of Guelph researchers have developed a more sanitary way of achieving the same results: synthetic poop.

The researchers created fake feces, aptly named RePOOPulate, after careful examination of bacterial colonies grown from the stool of healthy volunteers. Once the right ratio of species was determined, 33 different bacteria were grown in a robotic intestine simulator affectionately called Robo-gut to create a ‘super-probiotic’ stool substitute. According to the scientists, the bacterial mixture is much more palatable than what it mimics, and smells significantly better. Two patients treated with RePOOPulate showed marked improvement after three days, remaining C. difficile-free months after treatment. Tests of their intestinal flora showed that the fake crap successfully introduced beneficial bacteria to the patients’ guts.

It’s hard to say exactly how effective the new treatment is based on such a small test, but the results are very promising. Because scientists can control the bacterial mixture in RePOOPulate, there is less risk of introducing potentially harmful bacteria than with regular stool transplants, and the treatment can be tweaked to meet the needs of different patients. This proof-of-concept paper opens the door for future testing. In time, RePOOPulate may prove a safe and effective treatment for C. difficile infection, as well as other gut diseases caused by an imbalance of beneficial bacteria, like inflammatory bowel disease.

Citation: Petrof E.O., Gloor G.B., Vanner S.J., Weese S.J., Carter D., Daigneault M.C., Brown E.M., Schroeter K. & Allen-Vercoe E. (2013). Stool substitute transplant therapy for the eradication of Clostridium difficile infection: ‘RePOOPulating’ the gut, Microbiome, 1 (1) 3. DOI:

Image credits: Clostridium bacteria from the CDC’s Public Health Image Library; diagram of colonoscope from hfsimaging / 123RF Stock Photo

Parasitic Wasps Master Microbiology In Addition To Neurochemistry

“Oh, beauty is a beguiling call to death, and I’m addicted to the sweet pitch of its siren.” – Johnny Quid, RocknRolla

Female of the Emerald cockroach wasp Ampulex compressa manipulating an American cockroach, Periplaneta americana, which has been made docile by wasp venom and will serve as food for the wasp larva. Image courtesy of Gudrun Herzner.

Glinting in shimmering shades of blue and green, the emerald cockroach wasp is surely a thing of beauty, but its glittering exterior masks a cruel nature. The emerald cockroach wasp is one of nature’s most impressive neurochemists. At its core, it is a parasite. The female wasp lays her eggs on a cockroach host, and when they hatch, the larvae eat the creature from the inside out. You’d think the cockroach would be opposed to this idea, but instead the insect patiently awaits its fate while the larvae mature. Cockroaches are much larger than even a full-grown wasp, and could certainly put up a fight, but that is where the wasp’s ingenious manipulation of neurochemistry comes in. When she encounters a potential host, the female wasp first stings the cockroach in its abdomen, temporarily paralyzing its front legs and allowing the wasp to perch precisely on its head. She then stings the roach again, this time delivering venom directly into a part of the roach’s brain called the sub-esophageal ganglion. This doesn’t kill the roach. Instead, it puts the roach in a zombie-like trance. The roach is less fearful and loses the will to flee. It allows the wasp to lead it by its antennae, like a dog on a leash, to the wasp’s burrow, where the roach will play the martyr for the wasp’s unborn children. Even though the roach is fully capable of locomotion during the week to month that passes from when the wasp stings its brain until the hungry brood finishes eating it alive, the zombified insect doesn’t move. Emerald cockroach wasps have elevated neural manipulation to an art form to create perfect living incubators.

But, though the roach has been rendered harmless, the wasp-to-be is threatened by other organisms. Humans aren’t the only species that have to worry about their food spoiling—so do emerald cockroach wasps. Cockroaches truly are dirty creatures, and their insides are home to a suite of bacteria that can harm the wasp’s vulnerable larvae. One of these potential threats is Serratia marcescens, a vile sort of Gram-negative bacterium found in cockroach bodies. It’s the same bacterium responsible for a number of human urinary tract infections and the weird pink stains that form in our toilets and showers. In insects, its effects are much more deadly. The bacteria possess a suite of protein-degrading enzymes that cut apart fragile larval cells. The larvae aren’t entirely defenseless, though—as a new study published today in the Proceedings of the National Academy of Sciences reveals, larval wasps sterilize their food by secreting antimicrobial compounds.

For many parasitic wasps, microorganisms are a serious concern. Studies on another wasp, Microplitis croceipes, found that contamination with Serratia marcescens can lead to a 25% reduction in successful parasite emergence, and even the young that do survive can be infected. When adults are exposed to the bacteria, almost 80% die. The emerald cockroach wasp must defend against this mortal enemy or pay the ultimate price.

But how do you protect yourself against bacteria that live inside your food? Well, you do what we do to foods that house potentially harmful pathogens: you sterilize them.

Oral droplets secreted by wasp larvae, figure 1 from the paper

Gudrun Herzner and a team from the University of Regensburg in Germany noticed that larval wasps secrete droplets from their mouths that they disperse around before they feed on their cockroach meal. They suspected these secretions kill off potentially deadly bacteria, allowing the larvae to eat in peace. The researchers tested the antimicrobial activity of the oral secretions to see if they were right.

When added to bacterial cultures from the cockroach, the droplets killed off a wide variety of bacteria, including the potentially deadly Serratia marcescens. But the team wanted to know more: what exactly in the droplets kills the bacteria? So the researchers isolated the secretions and ran them through gas chromatography–mass spectrometry to determine the nature of the substances in them. They found nine compounds previously unknown from the wasps or the cockroaches. In particular, the secretions contained a large percentage of two compounds: (R)-(-)-mellein, and micromolide, a natural product that may hold the key to treating drug-resistant tuberculosis. Both compounds showed broad-spectrum antibacterial activity, and the combination of the two was particularly effective.

As a final test, they made extracts from parasitized and unparasitized cockroaches and looked for these compounds. The same antimicrobial mixture could be extracted from the parasitized cockroaches, but not from unparasitized ones. Thus, the scientists were confident that the wasp larvae produce and use these compounds to sterilize their food from the inside out.

“We found clear evidence that A. compressa larvae are capable of coping with antagonistic microbes inside their P. americana hosts by using a mixture of antimicrobials present in their oral secretion,” write the authors. While both compounds used by the larvae have been found in other animals, never before has the combination been discovered, in insects or otherwise.

The broad-spectrum nature of these antimicrobials may be key to the wasps’ success. “Food hygiene may be of vital importance, especially to the vulnerable early developmental stages of insects,” explain the authors. “The range of microbes that A. compressa larvae may encounter during their development in their hosts is unpredictable and may encompass all different kinds of microbes, such as various Gram-positive and Gram-negative bacteria, mycobacteria, viruses, yeasts, and filamentous fungi.” It would be vital, then, that the antimicrobial compounds produced by the wasps are able to fend off a wide variety of potential contaminants. “The secretion of a blend of antimicrobials with broad-spectrum activity seems to represent an essential frontline defense strategy.”

These beguiling wasps have not only mastered neurochemistry, they have aced microbiology to become proficient parasites. Already, this tiny wasp has given us great insights into the brain through the study of its particularly effective zombification strategy. Now, it is shedding light on another field of science. This study is one of the first to suggest that larval parasites can protect themselves against microbial pathogens, but the authors suspect many species of insects may have similar strategies. These small creatures may prove a vital new resource for natural products to fight human diseases. Who knows what other pharmaceutical secrets are being kept by insects like the emerald cockroach wasp, and what ailments we might be able to treat with their chemical arsenal?

Citation: Herzner G., Schlecht A., Dollhofer V., Parzefall C., Harrar K., Kreuzer A., Pilsl L. & Ruther J. (2013). Larvae of the parasitoid wasp Ampulex compressa sanitize their host, the American cockroach, with a blend of antimicrobials. PNAS Early Edition, DOI:

Don’t Pee On It: Zinc Emerges As New Jellyfish Sting Treatment

Biochemist and venom expert Angel Yanagihara

Nothing can turn a fun day at the beach into a nightmare faster than a jellyfish sting, as Angel Yanagihara, a researcher at the University of Hawaii, learned firsthand when she was swimming off Kaimana Beach in 1997. She had never heard of the nastiest group of jellyfish, the cubozoans (better known as box jellies), until she was badly stung. “I made it back to the beach in excruciating pain but then lost consciousness,” she recounts. “I was bedridden for days in great pain and none of the approaches I tried brought any relief.” While the encounter was devastating, as a biochemist, Yanagihara was intrigued. She began scouring the scientific literature to find out what caused her traumatic ordeal, only to discover that no one knew what was in box jelly venom. She has been studying the animals ever since.

When human flesh brushes up against a jellyfish tentacle, the tiny stinging cells jellies carry, called cnidocytes, can discharge their painful venom in as little as 700 nanoseconds. During the summer months, Australian waters are home to an abundance of the deadliest jellyfish in the world, the box jelly Chironex fleckeri, which has been known to kill a person in less than five minutes.

The deadly but beautiful Chironex fleckeri

Chironex even looks scary, with a bell that can be as large as a basketball and tentacles up to ten feet long carrying millions upon millions of stinging cells. But Chironex didn’t earn its title as the deadliest jellyfish in the world based on looks. Anyone who has come in contact with Chironex knows its fearsome reputation is justified, as even mild stings are excruciating. Yet despite decades of research, exactly how Chironex and other jellies deal their sometimes-fatal blows has remained a mystery.

“For over 60 years researchers have sought to understand the horrifying speed and potency of the venom of the Australian box jellyfish, arguably the most venomous animal in the world,” explains Yanagihara. It’s not that scientists have been unable to isolate any toxins. Yanagihara’s initial work discovered pore-forming toxins called porins in a related species, Carybdea alata, capable of tearing holes in blood cells, and since then scientists have found similar porins in every jellyfish species they’ve looked at. The conundrum is that severe sting victims don’t suffer from profound destruction of red blood cells, seemingly ruling out the porins as the cause of fatal stings. But if it’s not the porins, what in jellyfish venom is to blame? How does it act so quickly, leading to such sudden cardiovascular collapse? And is there anything we can do to slow or stop its deadly activity?

Now, in a new paper published today in PLOS ONE, Yanagihara and her colleagues from the University of Hawaii have revealed the key mechanism by which Chironex venom—and, specifically, the overlooked porins—quickly dismantles the cardiovascular system. Armed with that physiology, the team was able to find a safe treatment that could be used to improve survival in sting victims.

Yanagihara collecting Hawaiian box jellies

It’s virtually impossible to find a good treatment for a toxin without first knowing what it is. So the first step for Yanagihara was to isolate jellyfish venom and figure out if the porins were the real culprits. For species like snakes and spiders, this is a straightforward process, as they possess glands with large volumes of venom. But you can’t just have a jelly bite a container or pull venom from a cnidocyte with a syringe; minuscule amounts of venom components are stored in microscopic compartments amongst a plethora of other body parts—deadly needles in haystacks of cellular debris. Yanagihara spent years troubleshooting and perfecting a new method of venom extraction to yield potent venom free of as much junk as possible. With this in hand, she was able to develop assays to visualize how the venom acts both on red blood cells and in live animal models.

Ring-shaped pores in the wall of a red blood cell from jelly porins

Yanagihara used electron microscopy to visualize the venom’s effects on blood cells and, as suspected, found that venom porins create holes that lead to cell rupture. But as previous clinical research had shown, the cells bursting wasn’t the real issue; Yanagihara found that instead, for several minutes before they break apart, red blood cells leak potassium. Animal models confirmed that this sudden spike of potassium in the bloodstream, termed hyperkalemia, is what leads to rapid changes in heart rate and function and, ultimately, the cardiovascular collapse that causes death by jellyfish. With the physiology of stings revealed, Yanagihara could finally start the laborious task of finding a way to stop the venom in its tracks.

Jellies aren’t the only animals that create porins. “The structural motif of the cubozoan porin reminded me of the bacterial porins,” said Yanagihara. “I scoured that literature to look for inhibitors of the self-assembly of those pore-forming toxins and discovered studies from the 1940s, and even as far back as the 1890s, citing zinc ion as useful in the inhibition of bacterial-driven lytic reactions.” Yanagihara tested over 100 compounds to see if they inhibited jellyfish venom, and found that one of the safest—zinc gluconate—worked well.

Scientists aren’t 100% sure how zinc compounds inhibit the porins, but they believe the zinc disrupts the binding domains necessary for the proteins to assemble into pores. In in vitro models, Yanagihara found that a low dose of zinc gluconate completely prevented the venom’s blood cell-bursting effect. She then tested the compound in animal models and found it worked better than the commercially available antivenom for box jelly stings, keeping mice alive more than twice as long as the antivenom did.

Yanagihara preparing the topical treatment for Nyad

Yanagihara is now investigating how to turn the compound into a treatment for human stings. She has created a topical version, which was tested on extreme swimmer Diana Nyad when she attempted to swim a record-breaking 103 miles from Cuba to Florida. Nyad attempted the swim in 2011, but was stung so severely by box jellies on the first day in the water that she had to stop after 29 hours. This time, armed with Yanagihara’s treatment, she swam three nights through dense jelly aggregations, and kept at the swim for a total of 51 hours covering over 60 miles of the trek before she gave in. Nyad reported that the treatment blunted the stings to her exposed lips and face enough to keep her going.

Zinc gluconate isn’t a cure-all; it won’t stop all of the excruciating pain associated with severe stings, and victims are still at risk of going into shock and cardiovascular failure. But Yanagihara is hopeful that treatment with zinc gluconate might prolong survival in severe sting victims long enough to get them to medical professionals who can save their lives, and it may provide welcome relief to mild sting victims. Currently, vinegar is used to treat jellyfish stings, though it only prevents unfired cnidocytes from contributing to the sting and doesn’t act on the venom itself. Hot water can provide temporary relief, but Yanagihara is hopeful that zinc treatment will prove a more viable solution.

In Australia and other areas of the Pacific where Chironex roams, such a solution is desperately needed. Despite Australia being home to ten of the world’s deadliest snakes, there have been more deaths there due to jellies than snakebites since 1980, and it’s estimated that 20-50 people die annually from box jelly stings in the Philippines. Worse, in recent years scientists have reported expanding ranges for many jellyfish species due to changing ocean currents, over-fishing of other species, and warmer ocean temperatures from climate change. The National Science Foundation estimates that jellies cost hundreds of millions or even billions of dollars in medical damages, fisheries effects and tourism losses. This breakthrough treatment is more important than ever, and will hopefully bring relief to the thousands of people stung worldwide every year.

Citation: Yanagihara AA, Shohet RV (2012) Cubozoan Venom-Induced Cardiovascular Collapse Is Caused by Hyperkalemia and Prevented by Zinc Gluconate in Mice. PLoS ONE 7(12): e51368. DOI:

Photo credits: Angel Yanagihara taken by Laura Aguon; Chironex fleckeri by Dr. Robert Harwick; Box jelly collecting by Keoki Stender; Pores in RBC from the paper itself; Yanagihara prepping treatment by Christi Barli (all images provided by Angel Yanagihara)

Are lower pesticide residues a good reason to buy organic? Probably not.

A lot of organic supporters are up in arms about the recent Stanford study that found no nutritional benefit to organic foods. Stanford missed the point, they say—it’s not about what organic foods have in them, it’s what they don’t. After all, avoidance of pesticide residues is the #1 reason why people buy organic foods.

Yes, conventional foods have more synthetic pesticide residues than organic ones, on average. And yes, pesticides are dangerous chemicals. But does the science support paying significantly more for organic foods just to avoid synthetic pesticides? No.

A Pesticide Is A Pesticide

 

I’m not saying that pesticides, herbicides, and insect repellants aren’t toxic. I certainly wouldn’t recommend drinking cocktails laced with insect-repelling chemicals, for without a doubt, they can be bad for you. Pesticide exposure has been linked to all kinds of diseases and conditions, from neurodegenerative diseases like Parkinson’s to cancer. What we do know, though, is that natural isn’t synonymous with harmless. As a 2003 review of food safety concluded, “what should be made clear to consumers is that ‘organic’ does not equal ‘safe’.”

I’ve said it before and I’ll say it again: there is nothing safe about the chemicals used in organic agriculture. Period. This shouldn’t be that shocking – after all, a pesticide is a pesticide. “Virtually all chemicals can be shown to be dangerous at high doses,” explain scientists, “and this includes the thousands of natural chemicals that are consumed every day in food but most particularly in fruit and vegetables.”

There’s a reason we have an abundance of natural pesticides: plants and animals produce tens of thousands of chemicals to try to deter insects and herbivores from eating them. Most of these haven’t been tested for their toxic potential, as the Reduced Risk Program of the US Environmental Protection Agency (EPA) applies to synthetic pesticides only. As more research is done into their toxicity, however, we find they are just as bad as synthetic pesticides, sometimes worse. Many natural pesticides have been found to be potential – or serious – health risks, including those used commonly in organic farming.

In head-to-head comparisons, natural pesticides don’t fare any better than synthetic ones. When I compared the organic chemicals copper sulfate and pyrethrum to the top synthetics, chlorpyrifos and chlorothalonil, I found that not only were the organic ones more acutely toxic, studies have found that they are more chronically toxic as well, with higher negative impacts on non-target species. My results match other scientific comparisons. In their recommendations to Parliament in 1999, the Committee on European Communities noted that copper sulfate, in particular, was far more dangerous than the synthetic alternative. A review of their findings can be seen in the table on the right (from a recent review paper). Similarly, head-to-head comparisons have found that organic pesticides aren’t better for the environment, either.

Organic pesticides pose the same health risks as non-organic ones. No matter what anyone tells you, organic pesticides don’t just disappear. Rotenone is notorious for its lack of degradation, and copper sticks around for a long, long time. Studies have shown that copper sulfate, pyrethrins, and rotenone all can be detected on plants after harvest—for copper sulfate and rotenone, those levels exceeded safe limits. One study found such significant rotenone residues in olives and olive oil as to warrant “serious doubts…about the safety and healthiness of oils extracted from drupes treated with rotenone.” Just like with certain synthetic pesticides, organic pesticide exposure has health implications—a study in Texas found that rotenone exposure correlated with a significantly higher risk of Parkinson’s disease. The increased risk due to rotenone was five times higher than the risk posed by the synthetic alternative, chlorpyrifos. Similarly, the FDA has known for a while that chronic exposure to copper sulfate can lead to anemia and liver disease.

So why do we keep hearing that organic foods have fewer pesticide residues? Well, because they have lower levels of synthetic pesticide residues. Most of our data on pesticide residues in food comes from surveys like the USDA’s Pesticide Data Program (PDP). But while the PDP has been looking at the residues of over 300 pesticides in foods for decades, rotenone and copper sulfate aren’t among the usual pesticides tested for—perhaps because fast, reliable methods for detecting several organic pesticides were only developed recently. And since there isn’t any public data on pesticide use on organic farms (like there is for conventional ones), we’re left guessing what levels of organic pesticides are on and in organic foods.

So, if you’re going to worry about pesticides, worry about all of them, organic and synthetic. But, really, should you worry at all?

You Are What You Eat? Maybe Not.

 

We know, quite assuredly, that conventionally produced foods do contain higher levels of synthetic chemicals. But do these residues matter?

While study after study can find pesticide residues on foods, they are almost always well below safety standards. Almost all pesticides detected on foods by the USDA and independent scientific studies are at levels below 1% of the Acceptable Daily Intake (ADI) set by government regulators. This level isn’t random – the ADI is based on animal exposure studies in a wide variety of species. First, scientists give animals different amounts of pesticides on a daily basis throughout their lifetimes and monitor those animals for toxic effects. Through this, they determine the highest dose at which no adverse effects can be found. The ADI is then typically set 100 times lower than that level. So a typical human exposure at 1% of the ADI is 10,000 times lower than doses shown to be safe in animal models.
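To make that arithmetic concrete, here is a minimal sketch in Python; the no-effect level (NOAEL) used below is purely hypothetical, chosen only to show how the safety margins stack up:

```python
# Hypothetical example: relating a measured residue to the animal no-effect dose.
# The NOAEL value below is invented purely for illustration.

noael_mg_per_kg = 10.0   # highest dose with no observed adverse effect (hypothetical)
safety_factor = 100      # typical margin regulators apply to the NOAEL
adi_mg_per_kg = noael_mg_per_kg / safety_factor  # Acceptable Daily Intake

measured_exposure = 0.01 * adi_mg_per_kg         # a residue at 1% of the ADI

margin = noael_mg_per_kg / measured_exposure
print(f"ADI: {adi_mg_per_kg} mg per kg bodyweight per day")
print(f"Exposure at 1% of the ADI: {measured_exposure} mg per kg per day")
print(f"That exposure sits {margin:,.0f}x below the animal no-effect dose")
```

Run it and the margin comes out to 10,000x, which is exactly the gap described in the paragraph above.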

Systematic reviews of dietary pesticide exposure all come to the same conclusion: that typical dietary exposure to pesticide residues in foods poses minimal risks to humans. As the book Health Benefits of Organic Food explains, “while there is some evidence that consuming organic produce will lead to lower exposure of pesticides compared to the consumption of conventional produce, there is no evidence of effect at contemporary concentrations.” Or, as a recent review states, “from a practical standpoint, the marginal benefits of reducing human exposure to pesticides in the diet through increased consumption of organic produce appear to be insignificant.”

Reviews of the negative health effects of pesticides find that dangerous exposure levels don’t come from food. Instead, non-dietary routes account for the vast majority of toxin exposures, in particular the use of pesticides around the home and workplace. A review of the worldwide disease burden caused by chemicals found that 70% can be attributed to air pollution, with acute poisonings and occupational exposures coming in second and third. Similarly, studies have found that indoor air concentrations of pesticides, not the amount on foodstuffs, correlate strongly with the amount of residues found in pregnant women (and even then, there was no strong correlation between exposure and health effects). Other studies have found that exposures to toxic pyrethroids come primarily from the environment: children on organic diets routinely had pyrethroids in their systems, and the organic group actually had higher levels of several pyrethroid metabolites than the conventional one. In other words, you have more to fear from your home than from your food.

Your home probably contains more pesticides than you ever imagined. Plastics and paints often contain fungicides to prevent mold—fungi that, by the way, can kill you. Your walls, carpets and floors also contain pesticides. Cleaning products and disinfectants contain pesticides and fungicides so they can do their job. Ever used an exterminator to get rid of mice, termites, fleas or cockroaches? That stuff can linger for months. Step outside your house, and just about everything you touch has come in contact with a pesticide. Insecticides are used in processing, manufacturing, and packaging, not to mention that even grocery stores use pesticides to keep insects and rodents at bay. These chemicals are all around you, every day, fighting off the pests that destroy our buildings and our food. It’s not surprising that most pesticide exposure doesn’t come from your food.

That said, some studies have found a link between diet and exposure to specific pesticides, particularly synthetic organophosphorus pesticides. Lu et al. found that switching children from a conventional diet to an entirely organic one dropped urinary metabolites of malathion and chlorpyrifos to nondetectable levels in a matter of days. But it’s important to note that even the levels they detected during the conventional diet were three orders of magnitude lower than the levels needed in animal experiments to cause neurodevelopmental or other adverse health effects.

While it might seem that decreasing exposure to pesticides in any way could only be good for you, toxicologists would beg to differ. Contrary to what you might think, lower exposure isn’t necessarily better. This is what’s known as hormesis, or a hormetic dose-response curve. There is evidence that exposure to most chemicals at doses significantly below danger thresholds, even pesticides, is beneficial when compared to no exposure at all. Why? Perhaps because it kick-starts our immune system. Or perhaps because pesticides activate beneficial biological pathways. For most chemicals, we simply don’t know. What we do know is that data collected from 5,000 dose-response measurements (abstracted from over 20,000 studies) found that low doses of many supposedly toxic chemicals, metals, pesticides and fungicides either reduced cancer rates below controls or increased longevity or growth in a variety of animals. So while high acute and chronic exposures are bad, the levels we see in food that are well below danger thresholds may even be good for us. This isn’t as surprising as you might think—just look at most pharmaceuticals. People take low doses of aspirin daily to improve their heart health, but at high chronic doses, it can cause anything from vomiting to seizures and even death. Similarly, a glass of red wine every day might be good for you. But ten glasses a day? Definitely not.
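For readers who like to see the shape of the curve, here is a toy Python sketch of a J-shaped (hormetic) dose-response; the function and its coefficients are invented purely for illustration and do not model any real chemical:

```python
import math

def toy_hormetic_response(dose):
    """Toy J-shaped dose-response: risk relative to an unexposed control (1.0)."""
    benefit = 0.4 * dose * math.exp(-dose / 2.0)  # small doses nudge risk down
    harm = 0.05 * dose ** 2                       # large doses push risk up fast
    return 1.0 - benefit + harm

for dose in [0, 0.5, 1, 2, 5, 10]:
    print(f"dose {dose:>4}: relative risk {toy_hormetic_response(dose):.2f}")
# Low doses dip below 1.0 (better than control); high doses climb well above it.
```

The exact numbers are meaningless; the point is the shape, with a dip below the control baseline at low doses and a steep rise at high ones.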

No Need To Fear

 

To date, there is no scientific evidence that eating an organic diet leads to better health.

What of all those studies I just mentioned linking pesticides to disorders? Well, exactly none of them looked at dietary pesticide intake and health in people. Instead, they involve people with high occupational exposure (like farmers who spray pesticides) or household exposure (from gardening, etc.). Judging the safety of dietary pesticide intake by high exposures is like judging the health impacts of red wine based on alcoholics. A systematic review of the literature found that only three studies to date have looked at clinical outcomes of eating organic – and none found any difference between an organic and conventional diet. My question is: if organic foods are so much healthier, why aren’t there any studies showing that people on an organic diet are healthier than people eating conventionally grown produce?

More to the point, if conventional pesticide residues on food (and not other, high exposure routes) are leading to rampant disease, we should be able to find evidence of the connection in longitudinal epidemiological studies—but we don’t. The epidemiological evidence for the danger of pesticide residues simply isn’t there.

If dietary exposure to pesticides were a significant factor in cancer rates, we would expect to see that people who eat more conventionally grown fruits and vegetables have higher rates of cancer. But instead, we see the opposite. People who eat more fruits and vegetables have significantly lower incidences of cancers, and those who eat the most are half as likely to develop cancer as those who eat the least. While high doses of pesticides over time have been linked to cancer in lab animals and in vitro studies, “epidemiological studies do not support the idea that synthetic pesticide residues are important for human cancer.” Even exposure to the persistent and villainized pesticide DDT has not been consistently linked to cancer. As a recent review of the literature summarized, “no hard evidence currently exists that toxic hazards such as pesticides have had a major impact on total cancer incidence and mortality, and this is especially true for diet-related exposures.”

The closest we can get to studying the effects of dietary pesticide exposure on health are studies looking at farmers. However, farmers in general have high occupational pesticide exposures, so it’s impossible to tease apart occupational versus dietary exposure. Even so, in this high-risk group, studies simply don’t find health differences between organic and conventional farmers. A UK study found that conventional farmers were just as healthy as organic ones, though the organic ones were happier. Similarly, while high levels of pesticides cause reproductive disorders in test-tube studies, a comparison of sperm quality from organic and conventional farmers was unable to connect dietary intake of over 40 different pesticides to any kind of reproductive impairment. Instead, the two groups showed no statistical difference in their sperm quality.

In a review of the evidence for choosing organic food, Christine Williams said it simply: “There are virtually no studies of any size that have evaluated the effects of organic v. conventionally-grown foods.” Thus, she explains, “conclusions cannot be drawn regarding potentially beneficial or adverse nutritional consequences, to the consumer, of increased consumption of organic food.”

“There is currently no evidence to support or refute claims that organic food is safer and thus, healthier, than conventional food, or vice versa. Assertions of such kind are inappropriate and not justified,” explain scientists. Neither organic nor conventional food is dangerous to eat, they say, and the constant attention to safety is unwarranted. Worse, it does more harm than good. The scientists chastise the media and industry alike for scaremongering tactics, saying that “the selective and partial presentation of evidence serves no useful purpose and does not promote public health. Rather, it raises fears about unsafe food.”

Furthermore, the focus on pesticides is misleading, as pesticide residues are the lowest food hazard when it comes to human health (as the figure from the paper on the right shows). They conclude that as far as the scientific evidence is concerned, “it seems that other factors, if any, rather than safety aspects speak in favor of organic food.”

If you don’t want to listen to those people or me, listen to the toxicologists, who study this stuff for a living. When probed about the risks that different toxins pose, over 85% rejected the notion that organic or “natural” products are safer than others. They felt that smoking, sun exposure and mercury were of much higher concern than pesticides. Over 90% agreed that the media does a terrible job of reporting about toxic substances, mostly by overstating the risks. They came down hard on non-governmental organizations, too, for overstating risk.

What’s in a Name?

There’s good reason we can’t detect differences between organic and conventional diets: the labels don’t mean that much. Sure, organic farms have to follow a certain set of USDA guidelines, but farm to farm variability is huge for both conventional and organic practices. As a review of organic practices concluded: “variation within organic and conventional farming systems is likely as large as differences between the two systems.”

The false dichotomy between conventional and organic isn’t just misleading, it’s dangerous. Our constant attention to natural versus synthetic only causes fear and distrust, when in actuality, our food has never been safer. Eating fewer fruits and vegetables out of fear of pesticides, or because of the high price of organics, does far more harm to our health than any of the pesticide residues on our food.

Let me be clear about one thing: I’m all for reducing pesticide use. But we can’t forget that pesticides are used for a reason, too. We have been reaping the rewards of pesticide use for decades. Higher yields due to less crop destruction. Safer food because of reduced fungal and bacterial contamination. Lower prices as a result of increased supply and longer shelf life. Protection from pests that carry deadly diseases. Invasive species control, saving billions of dollars in damages—and the list goes on. Yes, we need to manage the way we use pesticides, scrutinize the chemicals involved and monitor their effects to ensure safety, and Big Ag (conventional and organic) needs to be kept in check. But without a doubt, our lives have been vastly improved by the chemicals we so quickly villainize.

If we want to achieve the balance between sustainability, production outputs, and health benefits, we have to stop focusing on brand names. Instead of emphasizing labels, we need to look at different farming practices and the chemicals involved and judge them independently of whether they fall under organic standards.

In the meantime, buy fresh, locally farmed produce, whether it’s organic or not; if you can talk to the farmers, you’ll know exactly what is and isn’t on your food. Wash it well, and you’ll get rid of most of whatever pesticides are on there, organic or synthetic. And eat lots and lots of fruits and vegetables—if there is anything that will improve your health, it’s that.

Before you say otherwise and get mad at me for mentioning it, rotenone is currently a USDA-approved organic pesticide. It was temporarily banned, but reapproved in 2010. Before the ban, it was the most commonly used organic pesticide, and now—well, without public data on pesticide use on organic farms, we have no idea how much it is being used today.

Food picture from FreeFoto.Com

Scientists play a large role in bad medical reporting

If you read the headlines, medical scientists are amazing. It seems every day, they discover a new cure for cancer or the genetic basis of some prominent disease. With all the cures, keys, breakthroughs and discoveries, it’s a wonder anyone still gets sick.

Of course, readers soon learn the truth: a lot of science reporting is sensationalized nonsense. Hyping science is a vicious cycle. Scientists work hard, get results, and publish. Press officers try to publicize these results, then journalists build off the press releases, and before you know it, your grandmother is wearing a tin foil hat. This is predictably followed by angry scientists and science writers swatting the noses of the “churnalists” with rolled-up newspapers for their bad reporting. People like Ed Yong and I feel forced to don our latex gloves and clean up the crap left on the carpet, all the while sternly saying “Bad, journalist. BAD!”

But are journalists, as a whole, really that bad at their jobs? No, actually, says a new paper published today in PLoS Medicine. It’s not all the writers’ fault: when researchers examined the language used in press releases and in the studies themselves, they found that scientists and their press offices were largely to blame.

A team of French scientists led by Isabelle Boutron from the Université Paris Descartes sought to get to the bottom of why medical news is so over-spun. They examined the language in clinical trials along with their associated press releases and news reports for spin—defined as specific reporting strategies emphasizing the beneficial effect of the experimental treatment—to see exactly where the hype comes from.

As expected, they found that the media’s portrayal of results was often sensationalistic. More than half of the news items they examined contained spin. But while the researchers found a lot of over-reporting, they concluded that most of it was “probably related to the presence of ‘spin’ in conclusions of the scientific article’s abstract.”

It turned out that 47% of the press releases contained spin. Even more importantly, 40% of the study abstracts or conclusions did, too. When the study itself didn’t contain spin to begin with, only 17% of the news items were sensationalistic, and of those, three-quarters got their hype from the press release.

In the journal articles themselves, the authors spun their own results in a variety of ways. Most failed to acknowledge that their results were not significant, or chose to focus their abstracts and conclusions on smaller, significant findings instead of the overall non-significant ones; some contained outright inappropriate interpretations of their data.

The press releases often built off the spin in the studies. Of the press releases that contained spin, 93% were from studies that had spin in their abstracts. In fact, spin present in the study was the only significant factor associated with spin in the press release. A whopping 31% of press releases misinterpreted the scientists’ findings, with the vast majority overstating the benefits of the treatment being tested.

It’s not news that press releases are skewed. Previous research found that most press releases left out important caveats on safety or applicability of the research, and many flat out exaggerated the importance of results. “Our study adds to these results showing that ‘spin’ in press releases and the news is related to the presence of ‘spin’ in the published article,” say the authors. In other words: the root of the problem lies in how we write up research results in the first place.

The authors were sure to note that while their results are striking, their study has limitations. They ended up with only 41 trials paired with press releases and news articles—a small sample size with which to examine the whole of medical news reporting. They also focused solely on randomized controlled trials, a small subset of all medical research. Still, they feel that their results require further investigation, and that the burden of ensuring scientific rigor in reporting falls on the peer review system. “Reviewers and editors of published articles have an important role to play in the dissemination of research findings and should be particularly aware of the need to ensure that the conclusions reported are an appropriate reflection of the trial findings and do not overinterpret or misinterpret the results.”

All of this is not to say journalists are entirely innocent. Good journalism requires that you look beyond the press release to get at the heart of the study, and great science journalists know to take anything that comes out of a press office with a grain of salt. They read the study itself, and talk to not only the scientists who wrote the study but also other scientists in the field to really understand the importance of the research involved. Churnalism is definitely a problem that needs to be addressed alongside concerns of scientist bias and hyped press releases. Researchers, press officers and journalists all need to take responsibility for accurate and informative science communication.

Citation: Yavchitz A, Boutron I, Bafeta A, Marroun I, Charles P, et al. (2012) Misrepresentation of Randomized Controlled Trials in Press Releases and News Coverage: A Cohort Study. PLoS Med 9(9): e1001308. DOI: 10.1371/journal.pmed.1001308

Is Climate Change To Blame For This Year’s West Nile Outbreak?

According to the Centers for Disease Control, there have been over 1100 reported cases of West Nile virus disease in the US this year, including 42 deaths. If these numbers seem high, they are – in fact, it’s the highest number of reported cases since West Nile was first detected in the US in 1999, and West Nile season has just begun. Given that the peak of West Nile epidemics generally occurs in mid-August, and it takes a few weeks for people to fall ill, the CDC expects that number to rise dramatically. But why now?

Though the CDC doesn’t have an official answer to that question, the director of the CDC’s Vector-Borne Infectious Disease Division said that ‘unusually warm weather’ may be to blame. So far, 2012 is the hottest year on record in the United States, according to the National Climatic Data Center, with record-breaking temperatures and drought the national norm. It’s likely no coincidence that some of the states hit hardest by West Nile are also feeling the brunt of the heat. More than half of all cases have been reported from Texas alone, where the scorching heat has left only 12% of the state drought-free. Fifteen heat records were broken in Texas just last week, on August 13th.

The heat waves, droughts and other extreme weather events are direct effects of climate change, say leading scientists. As NASA researcher James Hansen explained in a recent Washington Post editorial, “our analysis shows that, for the extreme hot weather of the recent past, there is virtually no explanation other than climate change.” He says that the European heat wave of 2003, the Russian heat wave of 2010 and the catastrophic droughts in Texas and Oklahoma last year are all repercussions of climate change. Confidently, he adds that “once the data are gathered in a few weeks’ time, it’s likely that the same will be true for the extremely hot summer the United States is suffering through right now.”

The fact that the worst US West Nile epidemic in history happens to be occurring during what will likely prove to be the hottest summer on record doesn’t surprise epidemiologists. They have been predicting the effects of climate change on West Nile for over a decade. If they’re right, the US is only headed for worse epidemics.

What Is West Nile Virus?

To understand the connection between climate change and disease, you first have to understand West Nile. First discovered in Uganda in 1937, West Nile is what epidemiologists call a zoonotic disease: one that is transmitted to humans from animals, not from other humans. The virus mainly infects birds, which are its true hosts. We humans (as well as livestock and other animals) are accidental casualties of a bird-mosquito disease. West Nile travels via mosquitos that pick up virus particles when they bite infected birds. These particles lodge in the mosquito’s salivary glands and are transmitted into the next host when the mosquito feeds. Humans and other mammals don’t carry enough virus in their blood for mosquitos to pick up the infection from them, which is why we are considered “dead end hosts”.

As anyone who hangs out outdoors is aware, mosquito populations pick up in the summer, when environmental conditions are just right for feeding and breeding. So, too, does West Nile. Though it sounds scary, West Nile is usually a fairly mild infection. Around 80% of people infected show no symptoms at all. Most of the remaining 20% will come down with fever, headache, body aches and nausea, which can last for a few days or several weeks, much like a bout of the flu. Of course, like any illness, West Nile has the potential to be very serious. Less than 1% of infections lead to West Nile meningitis or encephalitis, in which the virus invades the membranes surrounding the brain and spinal cord, or the brain itself. Severe symptoms include high fever, headache, neck stiffness, a decreased level of consciousness (sometimes approaching near-coma), tremors, convulsions, weakness, numbness and paralysis. West Nile meningitis or encephalitis can last several weeks, and the neurological effects can be permanent. As with the flu, the elderly and anyone with a compromised immune system are at higher risk of severe symptoms and death.

Currently there are no vaccines or antivirals with which to prevent or treat West Nile virus in humans. The best offense in the case of West Nile is a good defense: don’t get bitten by mosquitos in the first place. Wear protective clothing, use nets and screens, and apply insect repellent whenever you are in an area where people have gotten sick. The CDC keeps updated maps of where cases have occurred, though if you’re unsure whether your area is affected, caution is better than regret. Of course, if you look at that map, only one state has no West Nile activity: congrats, Vermont, on so far eluding the epidemic.

Turning Up The Heat

Higher temperatures bolster the chances of infection on many fronts, starting at the source: the mosquito. Studies have found that mosquitos pick up the virus more readily at higher temperatures. Warmer weather also increases the likelihood of transmission, so the hotter it is outside, the more likely a mosquito that bites an infected bird will end up carrying the virus, and the more likely it will pass the virus along to an unwitting human host. In the United States, epicenters of transmission have been closely linked to above-average summer temperatures. In particular, the strain of West Nile found in the US spreads better during heat waves, and the virus’ westward spread was correlated with unseasonable warmth. High temperatures are also to blame for the virus jumping from one species of mosquito to a much more urban-loving one, leading to outbreaks across the US.

Though you might think the droughts that accompany heat waves would slow mosquitos down, the opposite turns out to be true. That’s because the main mosquito now involved in West Nile transmission, the city-loving Culex pipiens, actually thrives in drought conditions. C. pipiens tends to breed underground, in the water that sits in city drains. During a drought, these pools become rich in the organic material C. pipiens needs to survive, whereas rainfall flushes the drains and dilutes the nutrients in the standing water. Drought also takes a toll on C. pipiens’ predators, like frogs and dragonflies, and where there are fewer predators, there are more mosquitos. To add to the problem, drought tends to make birds cluster around the remaining water sources, making them easy pickings for hungry mosquitos and upping transmission rates.

The Real Inconvenient Truth

I can’t sum it up any better than Paul Epstein did in his 2001 review of climate change and West Nile:

“We have good evidence that the conditions that amplify the life cycle of the disease are mild winters coupled with prolonged droughts and heat waves—the long-term extreme weather phenomena associated with climate change.”

While some politicians are hesitant to accept the scientific consensus on climate change, West Nile is more than happy to reap the rewards of our poor environmental choices. It’s not the only disease benefiting from our flagrant CO2 emissions; epidemiologists predict that many vector-borne diseases, including deadly ones like malaria and dengue fever, will increase in incidence worldwide as global temperatures rise.

While the CDC is hesitant to blame this year’s West Nile outbreak on climate change directly, the science is clear: record-breaking West Nile incidence is strongly linked to global climate patterns and the direct effects of carbon dioxide emissions. Climate change isn’t just going to screw with the environment; it will continue to have devastating public health implications. In addition to better mosquito control and virus surveillance, we need to focus our efforts on reducing and reversing climate change if we want to protect our health and our well-being.

Image credits: Disease collage by Visual Mozart / purchased from ImageZoo; Transmission cycle image from the CDC website; mosquito danger image from petrafler / 123RF Stock Photo

Stressed Men Like Bigger Butts

Weight is a big issue in America. More than half of Americans are unhappy with their weight, spending $33 billion on dieting and weight-loss programs and products every year. This obsession starts younger than we’d like to admit, as 80 percent of 10-year-old girls will say they are on a diet. But whether you count the millions dissatisfied with their looks, the percentage trying to lose weight, or the billions wasted on pills and fad diets, the message is the same: being the ideal weight matters, and it matters a lot.

But what is the ideal weight? Doctors say somewhere between a BMI of 20 and 25. Looking at runway models, you’d think it was just this side of starving, as the stick figures that grace our catwalks have an average BMI of only 16. Ask the average man and… well, actually, that will depend on a number of things, including his mood.
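For those who don’t have the formula handy, BMI is simply weight divided by height squared (kg/m²). Here’s a minimal sketch of what those numbers work out to for a hypothetical 1.75 m (about 5′9″) person; the height is my own illustrative choice, not a figure from any study:

```python
# Rough illustration of what different BMI values mean in body weight.
# The 1.75 m height is a hypothetical example, not from the research discussed.
def weight_for_bmi(bmi: float, height_m: float) -> float:
    """Return the body weight (kg) that corresponds to a given BMI at a given height."""
    return bmi * height_m ** 2

height = 1.75  # meters, purely illustrative
for bmi in (16, 20, 25):
    print(f"BMI {bmi} at {height} m is about {weight_for_bmi(bmi, height):.0f} kg")
# BMI 16 -> ~49 kg, BMI 20 -> ~61 kg, BMI 25 -> ~77 kg
```

In other words, for that 1.75 m frame, the gap between the catwalk average and the doctors’ range is roughly 12 to 28 kilograms.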

A number of factors affect what weight a guy prefers a woman to be, and evolution is to blame. For a long time, scientists have believed that attractiveness is really just our way of interpreting how good a person will be as a mate, starting with their genes. “Good-genes theory posits that human judgments of physical attractiveness, particularly in mating contexts, have evolved to respond in part to heritable cues associated with health,” explain Jason Weeden and John Sabini in their scientific review of the topic. As the theory goes, the better someone’s genetic makeup, the more symmetrical and ideal their body becomes.

But being a good potential mate isn’t dictated by our DNA alone. Current health status, the ability to provide for young, and other variable factors also play a role in how fit a person is as a potential husband or wife. A woman can have all the good genes in the world, for example, but if she’s starving, she won’t have the fat reserves to survive pregnancy, let alone feed a child. So it makes sense that in times of hardship, men would prefer women better equipped to handle scarcity – and by better equipped, I mean with fat reserves.

“A primary function of adipose tissue is the storage of calories, which in turn suggests that body fat is a reliable predictor of food availability,” explain co-authors Viren Swami and Martin J. Tovée in their PLoS ONE paper released today. “In situations marked by resource uncertainty, therefore, individuals should come to idealise heavier individuals.”

But do times of hardship actually shift body size preferences? Science to date has supported this hypothesis, as hungrier and poorer men prefer larger women. But what Swami and Tovée wanted to know was whether the stress had to be related to food scarcity. What about other kinds of stress? Does stress in general shift preferences, or only hardship?

So, the team took college men and had half of them perform a stressful task, unrelated to food or money, that raised their cortisol levels. They then asked both the stressed and unstressed men to look at images of women, varying in body size from underweight to obese, and rate their attractiveness. Finally, they recorded the participants’ own weight, height, and hunger status as control variables.
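To make the design concrete, here is a minimal sketch of the kind of between-groups comparison it calls for. This is not the authors’ actual analysis (which also accounted for the control variables above), and every number below is a made-up placeholder:

```python
# Sketch of a between-groups comparison for this kind of design.
# NOT the authors' actual analysis; all values are made-up placeholders,
# and the real study also controlled for weight, height and hunger.
import numpy as np
from scipy import stats

# Hypothetical "most attractive BMI" chosen by each participant
stressed = np.array([24.1, 23.5, 25.0, 22.8, 24.6])  # placeholder values
relaxed = np.array([22.0, 21.4, 22.9, 21.8, 22.3])   # placeholder values

# Independent-samples t-test: do the two groups' preferred body sizes differ?
t_stat, p_value = stats.ttest_ind(stressed, relaxed)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A significant difference in the stressed group’s direction would match the pattern the authors report, though the published analysis works from attractiveness ratings across the full range of body sizes rather than a single preferred value per participant.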

The results were clear. The stressed-out guys preferred a larger body size than their relaxed counterparts – but that was not all. “Men experiencing stress not only perceive a heavier female body size as maximally attractive, but also more positively perceive heavier female body sizes and have a wider range of body sizes considered physically attractive,” explain the authors.

The wider range of preference was notably one-sided. “This difference was driven by the shift in the experimental group’s upper limit of attractive female bodies,” the authors write. “While there was no significant difference in the lower end of the range, the experimental group appear to have shifted the maximum cut-off for attractive bodies at higher BMIs, which resulted in their wider attractiveness range.”

Why did the stressed-out guys prefer weightier women? Because, evolutionarily, more weight meant a better chance of surviving tough times. “In contexts marked by prolonged stress as a result of resource deprivation, individuals may idealise larger body sizes because such body types are associated with better ability to handle environmental threat.” These results are consistent with cross-cultural studies on attractiveness, which have found that ideal body size varies with socioeconomic status and resource scarcity. In other words, our evolutionary past helps explain why different cultures throughout the world have very different ideals when it comes to beauty.

Nowadays, of course, body weight and the ability to survive have largely been uncoupled. Unlike our ancestors, Americans generally don’t worry about having the fat reserves to chase down their next meal. Modern medical technologies and an abundance of high-calorie foods have made surviving and reproducing much easier. But this evolutionary leftover does raise some interesting questions about modern life, too. What are the full implications of an economic depression, for example? I wonder if cutting taxes affects what size girls end up with modeling contracts, or if the association goes both ways, and girls on a diet become less picky. More research will have to determine whether stressed women prefer larger men, too, or how chronic rather than acute stress affects attractiveness ratings.

Citation: Swami V, Tovée MJ (2012) The Impact of Psychological Stress on Men’s Judgements of Female Body Size. PLoS ONE 7(8): e42593. DOI: 10.1371/journal.pone.0042593

Tape measure image c/o Fybrid Stock Photos