It is generally accepted as gospel truth among climate scientists and science writers that the world must immediately and drastically reduce carbon emissions in order to prevent apocalyptic climate change. Though RCS's editorial stance toward apocalyptic climate change is one of skepticism -- largely because doomsday prophets, be they the scientific or religious type, have always been wrong -- we freely admit that a catastrophic outcome is a possibility and radical measures may be necessary. (At this time, however, we believe that the best policy is the gradual lowering of carbon emissions through the implementation of a carbon tax.)
Whatever combination of climate solutions the world decides to implement, a new analysis in Environmental Science & Technology reminds us that all policies bear costs and unintended consequences. In the case of greenhouse gas reductions, the unintended consequence may be an increased risk of global hunger.
While climate change itself will likely lower crop yields in the long run (and hence increase the risk of hunger), ironically, the very act of responding to climate change could also increase the risk of hunger. The authors demonstrated this by combining a crop model with an economic model to predict how climate change and climate policy conspire to affect factors such as crop yields and the cost of food, energy, and land. Using these outputs, the authors compared the risk of hunger in two different scenarios: (1) Business as Usual (BaU), in which no climate policy is enacted, and (2) Stringent Mitigation, in which drastic measures are implemented to reduce greenhouse gas emissions.
Even if no climate change occurs, a rather rosy assessment, the authors predict that 90 million people will be at risk of hunger by the year 2050. (Data not shown.) Assuming that climate change occurs and the world continues with Business as Usual, about 2 million additional people will be at risk of hunger. (See bar labeled "World" and "BaU.") This would be entirely due to a decrease in crop yields (green color). However, in the Stringent Mitigation scenario, in which crop yields decrease only slightly, about 14 million additional people will be at risk of hunger. (See bar labeled "World" and "Mitigation.") What is going on?
Climate change policies aimed at reducing emissions may focus on growing more crops for biofuel. That would decrease the available food supply (thus, increasing food prices), as well as increase competition for farm land (which would further increase prices). Higher food prices will increase hunger, and this increase is represented by the blue color.
The biggest impact on hunger comes from mitigation costs (red bar). Replacing fossil fuels with more expensive energy sources or implementing pricey technological fixes such as carbon capture and storage will increase electricity prices. This, in turn, lowers real wages, the effects of which disproportionately impact developing countries. Those in poverty will be forced to choose between eating and keeping the lights on.
Despite their gloomy forecast, the authors conclude that some type of climate change mitigation is still necessary. Yes, this may put more people at risk of hunger compared to inaction, but doing nothing also has its own set of negative consequences, including sea level rise, ocean acidification, increasing the likelihood of extreme weather, and damaging ecosystems. In other words, we're damned if we do, and damned if we don't.
Source: Tomoko Hasegawa, Shinichiro Fujimori, Yonghee Shin, Akemi Tanaka, Kiyoshi Takahashi, and Toshihiko Masui. "Consequence of Climate Mitigation on the Risk of Hunger." Environ. Sci. Technol. 49 (12): 7245–7253. Publication Date (Web): May 18, 2015. DOI: 10.1021/es5051748
The deadliest tsunami in world history struck Southeast Asia on Boxing Day 2004, following a behemoth 9.1-magnitude earthquake. Several years later, in March 2011, another tsunami hit Japan, again following a massive quake, this one of magnitude 9.0. It is no surprise, then, that geophysicist Gerard Fryer considers earthquakes to be the most common cause of tsunamis. But they are not the only cause. Landslides are the second most common cause; landslide-generated waves have struck Lake Geneva and Doggerland, a now-submerged region of land in the North Sea that once connected Britain to mainland Europe.
Relatively minor causes of tsunamis include volcanic eruptions and meteor strikes. Surprisingly, rare weather phenomena can trigger a tsunami, and new research concludes that it was an atmospheric perturbation that caused a series of meteorological tsunamis ("meteotsunamis") that struck locations all over Europe in late June 2014. The oddest thing about the meteotsunamis is that they occurred during nice weather.
The figure above depicts three factors that conspired to cause the meteotsunamis: (1) Warm, dry air coming in from Africa at a height of ~5,000 feet (see first column); (2) A strong jetstream at ~16,000 feet blowing from the southwest (see second column); and (3) Atmospheric instability (see blue areas in third column). As this weather system moved eastward, so did the occurrence of meteotsunamis (see circles in third column).
The meteotsunamis, some as high as 3 meters (10 feet), hit coastal areas all over Europe. The authors propose that ocean waves generated by air pressure changes and amplified by resonance were to blame. They also suggest that tsunami warning systems should monitor not only earthquakes, but also freak weather conditions. That appears to be a very sensible idea.
Source: Jadranka Šepić, Ivica Vilibić, Alexander B. Rabinovich & Sebastian Monserrat. "Widespread tsunami-like waves of 23-27 June in the Mediterranean and Black Seas generated by high-altitude atmospheric forcing." Scientific Reports 5, Article number: 11682. Published: 29-June-2015. doi:10.1038/srep11682
"The ant colony is a heavily guarded, nearly impenetrable fortress rich with bountiful resources. Intruders attempting to infiltrate the ant society are immediately discovered via chemical cues, overtaken and dismantled. Nothing gets by, except for the few highly specialized, that have evolved the necessary chemical, morphological and behavioural tools to hack the complex recognition and communication system of the ants. Flying under the ant radar represents a huge boon that not only grants free access to the bounty of the colony—including the ants themselves—but further provides a safe and well-protected harbor to develop and live."
University Roma Tre's Professor Andrea Di Giulio and his team of co-authors set the scene remarkably well in their paper recently published to PLoS ONE. The focus of that paper? A group of conniving beetles that somehow manages to infiltrate ant colonies and parasitize the ants without eliciting any retaliation whatsoever. Ladies and gentlemen, meet the ant nest beetle.
There are more than 800 species of ant nest beetle (Paussus), and though almost all of them bear little resemblance to ants, they all manage to live out their lives in ant colonies, feeding on ant eggs, larvae, and even adults by piercing their mandibles into the abdomen of their unsuspecting victims and sucking out the nutritious innards. You might think that nefarious actions like these would provoke an angry response, but the sly beetles have evolved two clever traits that allow them to evade detection.
First, the beetles secrete chemicals that mimic those produced by ants, allowing them to blend in. Second, as Di Giulio and his team just discovered, the beetles produce distinct acoustic signals via organs on their bodies that imitate the signals of workers, soldiers, and even the queen herself!
"The use of highly sophisticated communication systems is the key attribute that enables ants to act as a superorganism, thereby facilitating their dominance of terrestrial ecosystems," the researchers write. By essentially hacking into these networks, ant nest beetles are able to elicit incredible control over their unwitting hosts.
To uncover ant beetles' deception, the researchers placed both ants and beetles into tiny sound chambers with super sensitive microphones and recorded the distinct acoustic signals that they emitted. They found that the beetles produced at least three different signals, matching those produced by each of the different ant castes: worker, soldier, and queen. The researchers then played distinct ant signals, white noise, silence, and the recorded beetle sounds for small groups of ants enclosed in chambers and observed their behaviors. While the control sounds -- white noise and silence -- elicited few behaviors, the beetle sounds elicited antennation at similar or even higher rates than the ant recordings. This is remarkable, as antennation -- basically touching something with the antennae -- is widely regarded as a welcoming, friendly behavior, similar to a handshake.
Even more fascinating, the ant nest beetle sounds and the queen sounds were the only signals to induce guarding behavior, "a posture similar to that adopted when [ants] attend queens or objects of great value to their society," the researchers write.
"Our data suggest that, by mimicking the stridulations (sounds) of the queen, Paussus is able to dupe the workers of its host and to be treated as royalty."
Source: Di Giulio A, Maurizi E, Barbero F, Sala M, Fattorini S, Balletto E, et al. (2015) The Pied Piper: A Parasitic Beetle’s Melodies Modulate Ant Behaviours. PLoS ONE 10(7): e0130541. doi:10.1371/journal.pone.0130541
Hang two pendulum clocks on the same wall, and over time, something strange will happen: the two clocks will tick in synchrony.
Renowned Dutch scientist Christiaan Huygens, the inventor of the pendulum clock, originally noticed this eerie phenomenon back in 1665. Laid up in his bed with an illness, the 36-year-old Huygens found himself entranced by two clocks contained within the same case. No matter how their pendulums were set into motion, within roughly thirty minutes, they would always end up swinging so that when one pendulum reached the apex of its swing the other would reach its apex in the opposite direction. He eventually reasoned that the swings of each pendulum were causing "imperceptible movements" in the beam connecting the two clocks, bringing them into synchrony.
When Huygens shared his observations with the prestigious Royal Society of London, its members were mostly unimpressed. In fact, they took Huygens' observations as a sign that his clocks were fickle, and not as accurate as they were billed to be.
Experiments carried out nearly 350 years later showed that Huygens was mostly correct in his explanation. When Georgia Tech physicists recreated his clock apparatus, they found that the pendulums exerted minuscule forces on each other through the connecting beam, eventually nudging both into synchrony. The video below shows an approximation of the effect with metronomes.
The flimsy apparatus shown above was designed to exaggerate the effect. So what happens if two pendulum clocks are connected to a sturdier wall? Turns out, they'll still synchronize, but likely on account of a more subtle form of energy: sound.
Scientists Henrique Oliveira and Luís Melo developed a mathematical model for how two pendulum clocks could affect each other by transmitting a pulse of sound energy once per cycle. They then attached two pendulum clocks to a sturdy aluminum base in close proximity and tested their model by observing the clocks' behavior. As it turned out, the clocks did indeed move into periodic synchrony in a way that remarkably matched Oliveira and Melo's predictions.
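The gist of this kind of synchronization can be illustrated with a toy model. The sketch below is a rough approximation, not Oliveira and Melo's actual equations: it treats each clock as a phase oscillator with a slightly different natural period and lets each weakly nudge the other toward anti-phase, the configuration Huygens originally observed. Even this crude coupling locks the phase difference near pi.

```python
import math

def phase_difference(T1=1.000, T2=1.002, eps=0.05, t_end=500.0, dt=0.001):
    """Euler-integrate two weakly coupled phase oscillators and return
    the final phase difference, wrapped to [0, 2*pi)."""
    w1, w2 = 2 * math.pi / T1, 2 * math.pi / T2  # natural angular frequencies
    phi1, phi2 = 0.3, 0.0                        # start nearly in phase
    for _ in range(int(t_end / dt)):
        # each clock's "tick" weakly nudges the other toward anti-phase
        dphi1 = w1 + eps * math.sin(phi2 - phi1 - math.pi)
        dphi2 = w2 + eps * math.sin(phi1 - phi2 - math.pi)
        phi1 += dphi1 * dt
        phi2 += dphi2 * dt
    return (phi1 - phi2) % (2 * math.pi)

diff = phase_difference()
print(round(diff, 2))  # locks close to pi, i.e., anti-phase
```

If the coupling `eps` is too small relative to the mismatch in natural periods, the clocks never lock, which is consistent with the general picture that synchrony requires the coupling (beam forces, sound pulses) to overcome the frequency difference.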
The scientists also noticed that external noises, from doors closing to the sound of an elevator, could easily unsettle the system. So if you want to try this at home, keep your voice down!
Source: Oliveira, H. M. and Melo, L. V. Huygens synchronization of two clocks. Sci. Rep. 5, 11548; doi: 10.1038/srep11548 (2015).
Release the golden retrievers! A literature review recently published to the journal Frontiers in Psychology finds that animal-assisted therapy helps alleviate depression, post-traumatic stress disorder (PTSD) symptoms, and anxiety.
For all the coverage that therapy dogs receive in the media, scientific research exploring their actual effects is surprisingly scarce. The research team, led by Purdue University assistant professor Marguerite O’Haire, pored over the scientific literature and found just ten studies that empirically measured the effects of animal therapy on psychological symptoms. Of those studies, five focused on dogs, three focused on horses, and two used a variety of farm animals.
Studies varied widely in their design. Some focused on veterans, others on kids. Some lasted just one week, others went on for three months. Most used surveys to gauge the effects of therapy. But though the studies' designs differed, they all showed overwhelmingly positive results. Depression symptoms fell between 19 and 72 percent. Anxiety was reduced 21 to 65 percent. PTSD symptoms fell 13 to 80 percent.
While dogs were easy to work with and consistently improved patients' symptoms, therapy horses actually produced the most remarkable results.
"One study reported a 63% reduction in problem behaviors among 30 children and adolescents," O'Haire found.
She also uncovered a case study in which a veteran of the War in Iraq suffering from PTSD cultivated a relationship with a horse over twelve weeks. At the end of the study, he reported a 180 percent increase in satisfaction with quality of life, as well as vastly improved sleep that lasted well after the treatment's conclusion.
Though the published research supports animal therapy, O'Haire cautions against using it as a sole treatment.
"Given the preliminary nature of the data, we conclude that at present animal-assisted therapy shows promise as a complementary technique, but should not be enlisted as the first line of primary treatment for trauma."
She recommends that future studies recruit larger numbers of subjects, utilize control groups, strive to ascertain whether effects are short or long-term, and find ways to measure outcomes without surveys, such as tracking devices that monitor sleep and arousal. Dose reductions in drugs used to control psychological symptoms would also be a measurable outcome.
PTSD, anxiety, and depression are notoriously difficult to manage, so it's nice to see that a well-trained pooch can provide tangible relief.
Source: O'Haire ME, Guérin NA, and Kirkham AC (2015). "Animal-Assisted Intervention for trauma: a systematic literature review." Front. Psychol. 6:1121. doi: 10.3389/fpsyg.2015.01121
In many ways, plankton rule the oceans. A typical liter of seawater doesn't contain any fish, but it holds over 1,000,000 of these microscopic marine organisms! Plankton come in two varieties: zooplankton and phytoplankton. The latter are generally recognized as the foundation for the ocean ecosystem, harnessing the energy from sunlight to convert inorganic carbon into nutritive biomass. For all their selfless work, phytoplankton are rewarded by getting eaten by zooplankton, which then are consumed by filter feeders, which then are consumed by predatory fish. Such is life in the ocean.
But phytoplankton affect far more than just ocean life. According to a new study published in Science Advances, phytoplankton hugely contribute to cloud formation, and, in turn, to how much sunlight the Earth absorbs.
Armed with NASA's Terra satellite, researchers primarily based out of the University of Washington and Pacific Northwest National Laboratory fixed their gaze on the Southern Ocean, which comprises the southernmost waters of the Pacific, Atlantic, and Indian oceans. There, they examined the relationship between the density of water droplets in clouds over the region and natural aerosols released into the air by phytoplankton.
Aerosols are solid or liquid particulates suspended in air. They are key to cloud formation because water droplets can condense around them.
The researchers found that phytoplankton release large amounts of aerosols in the form of organic matter and sulfates, and that, on average, these molecules boost cloud formation by 60% over the course of a year! Moreover, when solar radiation is strongest during the summer, cloud formation may even double.
Denser clouds reflect more sunlight back into space, keeping the Earth cooler. The researchers estimate that plankton reduce the amount of solar energy absorbed by the Southern Ocean by roughly 4 watts per square meter over the course of a year, a small but noticeable amount.
Scientists previously revealed plankton's possible role in cloud formation back in 2004. A NASA-funded team found that when the sun was too strong, phytoplankton produced a sulfur-based compound that would get broken down by bacteria and filter into the air, potentially to seed clouds above and provide some measure of shade.
Source: D. T. McCoy, S. M. Burrows, R. Wood, D. P. Grosvenor, S. M. Elliott, P.-L. Ma, P. J. Rasch, D. L. Hartmann, Natural aerosols explain seasonal and spatial patterns of Southern Ocean cloud albedo. Sci. Adv. 1, e1500157 (2015).
(Images: NASA via AP, Daniel McCoy / University of Washington)
Americans think fruits and veggies suck. That, essentially, is the conclusion of a new CDC report which finds that the average American simply does not eat enough of the healthy stuff. And it's not like the federal dietary recommendations are particularly burdensome; merely 1.5-2 cups of fruit and 2-3 cups of vegetables per day are considered adequate for a healthy diet. Despite this, the vast majority of Americans fall short, preferring instead to lick donuts like Ariana Grande.
The CDC analyzed survey data from across all 50 states and DC to determine Americans' fruit and veggie intake. (See charts below, adapted from CDC data.)
As shown above, only 13.1% and 8.9% of Americans eat enough fruits and vegetables, respectively. Additionally, two lessons are immediately visible in the data. First, people who live in the South eat the fewest fruits and veggies. (The nine states with the lowest fruit consumption are all southern states.) Second, Californians eat the most fruits and veggies. But they have no reason to brag: only 17.7% and 13% of Californians eat enough fruits and vegetables, respectively.
There has been much talk of so-called "food deserts" -- places with few supermarkets selling fresh produce. These food deserts have been blamed, in part, for the American obesity epidemic. (Food deserts are a particular problem in the U.S. South.) However, this new data from the CDC throws into question just how much of an effect food deserts have on Americans' dietary choices. Even states with fantastic access to fresh food, such as New York and California, still exhibit pathetically low rates of fruit and vegetable intake.
The reality, it seems, is that most Americans prefer to eat an unhealthy diet, even if other choices are available. As the cliche goes, you can lead a horse to water, but you can't make it drink.
Source: Centers for Disease Control and Prevention. "Adults Meeting Fruit and Vegetable Intake Recommendations — United States, 2013." MMWR 64 (26): 709-713. July 10, 2015.
For more than fifty years, the Food and Drug Administration has gathered data on pesticide exposure via the food that we buy. Inspectors periodically purchase thousands of food items from grocery stores all across the United States, prepare each food for consumption, and then finally examine each item for traces of 300 different pesticides.
Armed with the most recent data, collected between 2004 and 2005 on 2,240 food items, Dr. Carl K. Winter, a food scientist at UC-Davis, estimated Americans' daily exposure to pesticide residues from food. He then compared the exposure for each pesticide to its known chronic reference dose, an estimate of the amount of a chemical an individual could be exposed to every day without any appreciable risk of harm over his or her lifetime. These doses are extremely conservative, often set a hundredfold below levels of concern to ensure consumer safety.
The FDA examination detected residues from 77 different pesticides in the collection of 2,240 food items. Winter's analysis showed that exposure to every single one of them was well below their respective chronic reference doses (RfD).
"Exposures to 3 pesticides exceeded 1% of the chronic RfD values while exposures between 0.1 and 1 % of chronic RfD values were noted for 14 pesticides," Winter reported. "Another 19 pesticides demonstrated exposures between 0.01 and 0.1% of the chronic RfD values, while exposures for the other 41 pesticides were below 0.01% of the chronic RfD."
The highest exposure relative to its RfD was that of the insecticide methamidophos, which reached 16%. Methamidophos is commonly used on potatoes and rice from Latin America. It is no longer used in the United States.
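Winter's comparison boils down to simple arithmetic: divide the estimated daily exposure by the pesticide's chronic RfD and express the result as a percentage. A minimal sketch, using made-up numbers rather than the study's actual exposure estimates:

```python
def percent_of_rfd(exposure, rfd):
    """Express a daily dietary exposure as a percentage of the chronic
    reference dose (RfD). Both arguments in mg per kg body weight per day."""
    return 100.0 * exposure / rfd

# Hypothetical figures for illustration only: an exposure of
# 0.000016 mg/kg/day measured against an RfD of 0.0001 mg/kg/day.
print(round(percent_of_rfd(0.000016, 0.0001), 1))  # -> 16.0
```

Since the RfD itself already carries a large built-in safety margin, an exposure at even 16% of the RfD remains far below doses associated with harm.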
Interestingly, two insecticides that have been banned for decades -- DDT and dieldrin -- were present in foods at 1.3% and 2.0% of their RfDs respectively.
"Their presence in food results from low environmental degradation and uptake from contaminated soil by plants," Winter explained.
A big drawback of the study is that the data is more than a decade old. More recent statistics from the Environmental Protection Agency, although unfortunately only up-to-date through 2007, show that pesticide use has steadily declined in the United States, so there's no indication that Americans' exposure to pesticides through food would have increased.
Overall, the study's findings are good news for the safety of American consumers.
"Consumers should be encouraged to eat fruits, vegetables, and grains and should not fear the low levels of pesticide residues found in such foods," Winter writes.
Source: Winter, Carl K. "Chronic dietary exposure to pesticide residues in the United States." International Journal of Food Contamination 2:11. Published: 10-July-2015. doi:10.1186/s40550-015-0018-y
In every single country on the planet, women live longer than men. In response to this unpleasant fact, men are fond of replying, "That's because we have to put up with women." Humorous though it may be, that's not the actual reason women live longer than men. In fact, it wasn't until the beginning of the 20th century that the "mortality gap" between men and women became so striking.
To investigate the underlying reason for the gender gap in life expectancy, a team of researchers examined mortality data for people born between 1800 and 1935 in 13 developed countries. Using this data, they were able to determine changes in the male-female mortality ratio, as well as determine when and why women began to outlive men. (See figure.)
In the figure above, each birth cohort is represented by a single colored line. For example, people born between 1800 and 1819 are represented by 20 different lines, all colored black; people born between 1920 and 1935 are represented by 16 different lines, all colored red. The chart plots age on the X-axis (i.e., "age at time of death") against the male-female mortality rate ratio on the Y-axis.
The figure shows that men's relative mortality worsens for later birth cohorts. Compare the mortality rates at age 60, for instance. The mortality rate ratio for people born between 1800 and 1839 (black and gray lines) hovers roughly around 1.2; that means that about 120 men died for every 100 women who died at age 60. Just a few decades later, a dramatic shift occurs: the male-female mortality rate ratio for people born between 1880 and 1899 (green lines) skyrockets to 1.6, meaning that 160 men died for every 100 women who died at age 60. Then it goes from bad to worse. For the 1920-1935 birth cohort, the ratio is a shocking 2.1 at age 60, meaning that 210 men died for every 100 women.
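Reading the ratio is straightforward arithmetic. A toy example with made-up death rates (not the study's actual data) shows how a point on the Y-axis is computed:

```python
# Hypothetical age-60 death counts per 100,000 person-years, one figure
# for each sex. These numbers are invented for illustration.
male_deaths, female_deaths = 2100, 1000

male_rate = male_deaths / 100_000     # male mortality rate at age 60
female_rate = female_deaths / 100_000 # female mortality rate at age 60

# The male-female mortality rate ratio plotted on the figure's Y-axis:
ratio = male_rate / female_rate
print(round(ratio, 2))  # -> 2.1, i.e., 210 male deaths per 100 female deaths
```

Because both rates share the same denominator, the ratio depends only on the relative number of deaths, which is why a ratio of 2.1 can be read directly as "210 men per 100 women."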
A simplified version of the above graph is shown below. Note that the male-female mortality ratio increases across all age groups and generally gets worse over time.
Why is this the case? The authors' analysis suggests two major factors: The first is smoking, which is more common among men. (With smoking factored out, the pattern of an increasing male-female mortality ratio still persists but to a lesser extent, as shown above.) The second is cardiovascular disease, a condition to which men seem to be more vulnerable than women. This may be due to gender differences in diet, lifestyle, and even genetics. Indeed, the researchers found that cardiovascular disease was the major factor causing excess deaths among men as compared to women.
Of course, stress is one of the risk factors for cardiovascular disease, so perhaps men can still blame women, after all.
Source: Hiram Beltrán-Sánchez, Caleb E. Finch, and Eileen M. Crimmins. "Twentieth century surge of excess adult male mortality." PNAS. Published online before print: 6-July-2015. doi: 10.1073/pnas.1421942112
One of the most common parasites infecting honeybees is sexually transmitted, a new study published to Scientific Reports finds.
Nosema is a unicellular fungus that causes nosemosis, the most widespread disease of honeybees. Diseased bees are often afflicted with dysentery, disjointed wings, and an absent sting reflex, among many other symptoms.
The most common way Nosema is passed is via spore-ridden fecal matter. Bees swallow the spores, which make their way to the insects' guts and germinate. But it turns out that spores can also get into the semen of male bees, and when these bees copulate with the queen, she can also become infected.
Researchers primarily based out of the University of Leeds collected sexually mature male bees from 39 colonies infested by Nosema. They then harvested the insects' semen (very delicately, as one might imagine) and inseminated a group of queens. One out of every five of the queens developed nosemosis.
Luckily for the colony, infected queens do not pass Nosema onto their young. None of the 400 eggs laid by queens in the experiment carried the parasite. However, unluckily for parasite-ridden queens, their days are usually numbered once they take on the parasite. An infected queen's ovaries quickly degenerate, severely reducing her egg-laying capacity. Sensing the queen's infertility, workers then set about rearing replacement queens. When one is ready to take the throne, workers encircle the old queen and sting her to death.
"The results provide the first quantitative evidence of a sexually transmitted disease (STD) in social insects," the researchers said of the study.
STDs have been found in insects before, though, unlike vertebrate STDs, which are commonly caused by bacteria or viruses, insect STDs are usually caused by parasites -- mites, nematodes, fungi, and protists.
Source: Roberts, K. E. et al. The cost of promiscuity: sexual transmission of Nosema microsporidian parasites in polyandrous honey bees. Sci. Rep. 5, 10982; doi: 10.1038/srep10982 (2015)