RealClearScience Journal Club

Science Figures Interpreted and Analyzed by RealClearScience

Transplants: First Facial, Now Anorectal

Alex B. Berezow - September 22, 2014

"Defecation is a major activity of daily living." Thus begins a new paper in the journal Scientific Reports, and truer words have rarely been written.

Most humans who are physically unable to drop a deuce, perhaps from cancer or an anatomical anomaly, must undergo a colostomy. In this medical procedure, the large intestine is diverted via a surgically created opening (called a stoma) to an external bag that collects feces, which must be emptied regularly. Many people who wear such devices suffer from psychological problems and a lower quality of life. An alternative -- any alternative -- is preferable.

Japanese researchers have turned to anorectal transplants. In their estimation, this approach has the potential to surpass traditional reconstructive surgical techniques, which do not properly restore anal function. In order to develop the technique, the authors practiced on four healthy beagles, performing the very first successful anorectal autotransplant in dogs. (In an autotransplant, the transplanted tissue comes from the same animal.)

The researchers removed the existing anorectal region, replaced it with a graft formed from other nearby tissues, and reconnected the blood vessels and nerves. The dogs were then fed a liquid diet. The transplant in Dog #1 was unsuccessful, and it was euthanized the next day. Dogs #2 and #3 survived for three and four days, respectively, before being euthanized.

Dog #4 survived nine days, and the researchers injected it with a dye (indocyanine green) in order to monitor and collect data on blood flow to the tissue graft, a procedure known as angiography. (See figure.)

As shown above, figure (i) depicts the anorectal region, while figures (j) and (k) depict the anorectal region before and after the angiography began. The colored dots represent areas in which the authors were interested in collecting blood flow data. (Red = positive control; Blue = negative control; Yellow = anal canal; Green = perianal skin.) Blood flow, as illustrated in figure (l), was considered satisfactory and compared favorably to the dog that never underwent surgery.

Likely because the dogs were only fed a liquid diet, they did not defecate after the surgery. So, the authors cannot definitively determine if the graft functioned properly. However, demonstrating normal blood flow post-surgery is a positive first step.

Now, you might be wondering, Why did they practice this technique in dogs? Good question. Other researchers have successfully attempted similar procedures in rats and pigs. However, the authors claim that these animal models are unsatisfactory because they do not exhibit the same bowel control that dogs and humans do. In other words, man and man's best friend can poop (or not) on command. Thus, it is best to perfect the technique in dogs.

How about non-human primates? They are problematic because, as the authors write, "they are not usually trained with good lavatory manners." Indeed.

Source: Jun Araki et al. "Anorectal autotransplantation in a canine model: the first successful report in the short term with the non-laparotomy approach." Scientific Reports 4, Article number: 6312. Published: 10-Sept-2014. doi:10.1038/srep06312


Artificial Sweeteners Aiding the Obesity Epidemic?

Ross Pomeroy - September 17, 2014

A new study published in the journal Nature has found that zero-calorie artificial sweeteners promote glucose intolerance in mice and humans.

Glucose intolerance is marked by higher-than-normal levels of blood sugar. It's a known precursor to diabetes and a risk factor for cardiovascular disease.

Scientists at the Weizmann Institute of Science in Israel spearheaded the research using a step-by-step approach. First, the researchers added one of three artificial sweeteners -- sucralose, aspartame, or saccharin -- to the drinking water of three groups of mice. Three other control groups of mice were given plain water or water with added sucrose or glucose. Eleven weeks later, the mice consuming artificial sweeteners exhibited marked glucose intolerance compared to the control groups.

Since most artificial sweeteners aren't broken down by the digestive system, the research team hypothesized that the glucose intolerance was mediated by changes to gut bacteria. To test this, they induced glucose intolerance in both lean and obese mice with the artificial sweetener saccharin, then gave both groups antibiotics to kill off gut bacteria. After the antibiotic regimen, both groups of mice saw their glucose response and blood sugar levels return to normal.

For further confirmation that the gut flora was regulating glucose intolerance, the team transplanted gut bacteria from mice fed saccharin to control mice. The control mice exhibited signs of glucose intolerance six days later.

To see whether their results in mice would carry over to humans, the researchers conducted two experiments. First, they examined a group of 381 individuals without diabetes and queried them about their intake of artificial sweeteners. They found that artificial sweetener intake was linked to various metabolic parameters, including increased weight, a larger waist-to-hip ratio, and higher blood sugar levels. Second, the researchers fed seven subjects the FDA’s maximum acceptable daily intake of saccharin -- equal to roughly 40 cans of diet soda per day -- for seven days and observed what happened. Four of the individuals developed significantly poorer glycemic responses. The other three subjects also showed worse responses; however, the differences did not reach statistical significance.

The authors of the study feel their results merit a serious "reassessment" of massive artificial sweetener use. But others aren't so sure.

Naveed Sattar, a Professor of Metabolic Medicine at the University of Glasgow, said that the study's findings regarding mice are interesting, but noted that we should be skeptical of extending them to humans.

“The findings of this study do not prove that sweeteners pose any real risk to humans.  If there are any risks, we need well controlled studies in humans to find them.”

Sir Stephen O'Rahilly, a Professor of Clinical Biochemistry and Medicine at the University of Cambridge, agreed with Sattar.

"The authors report an association between artificially sweetened beverages and markers of diabetes in 381 people; however, a recent study involving more than a third of a million people showed no association between consumption of artificially sweetened drinks and the development of diabetes.  Suez et al also report adverse effects on glucose levels after 7 days of saccharin ingestion. However these experiments were undertaken in only seven people, so must be deemed preliminary."

According to the Harvard School of Public Health, "The health benefits of artificial sweeteners are inconclusive." Some studies show that they are healthy replacements for sugar, while others have shown no net benefits.

While the current study presents intriguing findings in mice, its human components lack rigor. So don't feel compelled to ditch diet soda just yet.

In general, it's probably best to avoid habitually consuming copious amounts of sugar or artificial sweeteners.

Source: Jotham Suez et al. "Artificial sweeteners induce glucose intolerance by altering the gut microbiota." Nature. 18 Sept. 2014. doi:10.1038/nature13793

In U.S., Hispanics Live Longer than Others

Alex B. Berezow - September 15, 2014

Today (Monday, Sept. 15) is the beginning of National Hispanic Heritage Month. Undoubtedly, one of the topics that will be discussed in the media over the next 30 days is healthcare in the Latino community. Last year, for instance, CNN ran an article that discussed how Hispanics were less likely to seek out treatment for mental health issues, possibly because of a stigma that exists in the community.

Other sources offer more dire information. In an article titled "The Truth Behind Latino Health Disparities," George Washington University says that Hispanics suffer from obesity, HIV, and other conditions due to "less education, higher rates of poverty, unhealthy living conditions and environmental hazards." The U.S. government apparently believes that ethnic health disparities are a big enough issue to warrant their own bureaucracy: The Office of Minority Health.

Given the information available, one would predict that Hispanics have a shorter life expectancy than whites in America. But, that's not true. The latest CDC data (PDF, Table 8) reveals that the life expectancy at birth for Hispanics in the U.S. is the highest of all ethnic groups examined:

What is going on? How can Hispanics suffer from more health problems than whites, yet somehow live longer? There are at least two, non-mutually exclusive explanations:

1) Immigration, both legal and illegal. Notably, the life expectancy data shown above is at birth. Quite likely, Hispanic immigrants are poorer and less healthy than Hispanics born and raised in the U.S. Furthermore, health studies often do not report immigration status, almost certainly for political reasons. (More on that below.) Yet, this matters. A person living illegally in the U.S. may be less likely to seek medical assistance out of fear of being deported.

2) Hispanics suffer from a lower quality of life. Quality of life is difficult to measure, but that doesn't mean epidemiologists haven't tried. CDC data shows that Hispanics are likelier to report poorer overall health (e.g., more unhealthy days) than non-Hispanics (PDF, Table 1, p. 34). But once again, the data isn't broken down by immigration status.

Thus, it would be highly advisable for the CDC to investigate how immigration status affects ethnic health disparities. But it probably will not.

In our current political environment, politicians (particularly those on the Left) tend to interpret all disparities among minorities as being the result of bias or discrimination. It is politically convenient, but the life expectancy at birth data for Hispanics suggests something other than social injustice. While "discrimination" is an easy answer, the trouble with easy answers is that they often come at the expense of correct ones.

Source: Centers for Disease Control and Prevention. "QuickStats: Life Expectancy at Birth, by Sex and Race/Ethnicity — United States, 2011." MMWR 63 (35): 776. (Sept. 5, 2014.)


Why Aren't More Women Feminists?

Ross Pomeroy - September 12, 2014

It's been called the "Feminist Paradox": Feminism's aim is to improve the lives of all women, yet only about a third of women in the United States identify as feminists. That disparity is even starker when you consider that surveys show three-quarters of women to be concerned about women's rights.

Why do most women eschew the feminist label? Perhaps it's because we don't like to categorize ourselves socially or politically. After all, far more Americans identify as independent than align themselves with a political party. It could also be that feminism is not clearly defined. There's liberal feminism, radical feminism, Marxist feminism, ecofeminism, womanism, and a boatload of other brands, all espousing diverse ideologies and expressing varying levels of activism. The hodgepodge might lead women to remain agnostic. Another reason is the less-than-flattering image of feminists offered up in the popular media, which often depicts them as extremist, unattractive man-haters.

Entering the controversial fray with a new eyebrow-raising explanation -- and even some empirical data to back it up -- is a team of psychologists from Umeå University in Sweden. They hypothesize that the activists who get the headlines and shape feminist attitudes are "generally more physiologically and psychologically masculinized than is typical for women." Basically, there may be "biological differences between women in general and the activist women who formulate the feminist agenda."

"We propose the feminists-as-masculinized-females theory... as a partial explanation for the feminist paradox," they write in a research article published Tuesday to the journal Frontiers in Psychology.

The empirical data mentioned above originated from a feminist conference in Sweden. The researchers visited the gathering and offered candy in exchange for participation. Subjects answered questions designed to measure social dominance and had their hands imaged with a high-resolution scanner so that the researchers could measure the ratio of the length of the index finger to that of the ring finger, the most widely used index of prenatal testosterone exposure. Men have a lower ratio (meaning the index finger is shorter than the ring finger) while women have a higher ratio (the index and ring fingers are more similar in length). Participation in the study was anonymous and no demographic data was collected. The researchers did not directly ask the subjects whether they were feminists, thinking that it would deter participation.

Twenty-five women -- approximately 35% of attendees at the conference -- took part in the survey. The researchers found their index-ring finger ratio to be markedly more masculine than the average for Swedish women. (For the statistics lovers out there, the p values were <0.000001 for the right hand and 0.00016 for the left hand. That's remarkably significant.) The difference indicates that the women were exposed to higher levels of testosterone during development, engendering more masculine characteristics.
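
For readers curious how a comparison like this is usually run, here is a minimal sketch, with made-up ratio values and an assumed reference mean; it is not the authors' actual analysis, just a one-sample t-test of a small sample against a population average.

```python
# A minimal sketch (not the authors' analysis) of how a digit-ratio comparison
# like this is typically run: a one-sample t-test of measured 2D:4D ratios
# against a published population mean. Every number below is made up.
import numpy as np
from scipy import stats

# Hypothetical right-hand 2D:4D ratios for 25 participants
sample_ratios = np.array([
    0.94, 0.95, 0.93, 0.96, 0.94, 0.95, 0.92, 0.95,
    0.93, 0.94, 0.96, 0.93, 0.95, 0.94, 0.92, 0.95,
    0.94, 0.93, 0.96, 0.94, 0.95, 0.93, 0.94, 0.95, 0.93,
])

population_mean = 0.97  # assumed reference mean for Swedish women (illustrative)

t_stat, p_value = stats.ttest_1samp(sample_ratios, population_mean)
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")
# A tiny p-value says the sample mean differs reliably from the reference,
# but with n = 25 the generalization to a wider population remains limited.
```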

The biggest limitation of the study is the small sample size. One simply cannot draw sweeping conclusions based on 25 individuals, a fact the authors admit.

"The target population studied here is not necessarily representative for anyone who sympathizes with feminism or self-identifies as a feminist. As our data pertain to feminist activists, we cannot and do not bring them to bear on women in general."

Source: Guy Madison, Ulrika Aasa, John Wallert and Michael A. Woodley. "Feminist activist women are masculinized in terms of digit-ratio and social dominance: a possible explanation for the feminist paradox." Front. Psychol., 09 September 2014. doi:10.3389/fpsyg.2014.01011


Plants May Be Able to Grow in Martian Soil

Ross Pomeroy - September 8, 2014

If we ever want to colonize the Moon or Mars, we're going to need to eat. But an act as easy on Earth as heading to a Wendy's drive-thru is decidedly more complicated in an extraterrestrial environment. Neither locale has an atmosphere hospitable to growing food, and any long-term human presence cannot be reliably sustained on a lifeline of deliveries from Earth. The simple fact is that we'll have to find a way to grow food on Mars or the Moon. To that end, a new, preliminary study published in PLoS ONE bears some good news.

Researchers from the Netherlands planted fourteen different plant species in simulated Martian and lunar soils. They found that plants were able to germinate and grow without the addition of any nutrients.

For the experiment, the researchers obtained the same simulated Martian and lunar soils used by NASA. Control soil from Earth, taken from a nutrient-poor area near the Rhine River, was also used. Soils were placed in individual pots, and seeds from species like field mustard, red fescue, common vetch, tomato, rye, and carrot were planted individually in each pot. In all, there were 840 pots: 3 soils x 14 plant species x 20 replicates each. All of the pots were kept in the same greenhouse under identical conditions: roughly 60 degrees Fahrenheit with approximately 16 hours of light per day. Only demineralized water was provided.
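
To make the factorial design concrete, here is a minimal sketch of the layout; it is illustrative only, and the species names beyond those listed in the article are placeholders.

```python
# A minimal sketch of the factorial layout described above (illustrative only,
# not the researchers' own bookkeeping): 3 soils x 14 species x 20 replicate
# pots = 840 pots. Species beyond those named in the text are placeholders.
from itertools import product

soils = ["Mars simulant", "Moon simulant", "Earth (Rhine River)"]
species = ["field mustard", "red fescue", "common vetch", "tomato", "rye",
           "carrot", "garden cress"] + [f"placeholder_species_{i}" for i in range(7)]
replicates = range(1, 21)

pots = [{"soil": s, "species": p, "replicate": r}
        for s, p, r in product(soils, species, replicates)]

print(len(pots))  # 840
```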

The research team concluded the experiment after fifty days. Roughly 20% of the plants in lunar soil, 50% of the plants in Earth soil, and 65% of the plants in Martian soil were still alive. Crop species were the most successful overall: 80% of the tomato, rye, carrot, and garden cress plants were alive in Martian soil at the end of the study. Many also produced leaves, and a few even grew flowers.

Lunar soil apparently lagged behind the other two soils because of its low acidity and inability to hold water.

The results support an intriguing possibility. Instead of bringing along soil from Earth and maintaining crops within botanical chambers, astronauts could simply erect a dome over an area of the Martian landscape and sow crops within that artificial environment, potentially needing only to apply water and fertilizer.

However, heavy metals are more prevalent in extraterrestrial soils, and they may kill off plants before they can mature, or render the food they produce dangerous for human consumption.

Overall, there are still many, many questions that need to be answered, the researchers say.

"More research is needed about the representativeness of the simulants, water holding capacity and other physical characteristics of the soils, whether our results extend to growing plants in full soil, the availability of reactive nitrogen on Mars and moon combined with the addition of nutrients and creating a balanced nutrient availability, and the influence of gravity, light and other conditions."

Source: Wamelink GWW, Frissel JY, Krijnen WHJ, Verwoert MR, Goedhart PW (2014) Can Plants Grow on Mars and the Moon: A Growth Experiment on Mars and Moon Soil Simulants. PLoS ONE 9(8): e103138. doi:10.1371/journal.pone.0103138

One Thing Short and Tall Men Have in Common

Alex B. Berezow - September 8, 2014

There aren't very many things that short and tall men have in common. Tall men make more money, have a greater choice in women, and are likelier to be elected president than their vertically challenged brethren. For all the talk of "white privilege," maybe it is time for our culture to ponder the implications of "tall privilege." That's because, as a general rule, short guys have received the short end of the societal stick. (No pun intended.)

But, in at least one biological aspect, short and tall men have something in common: a less-than-ideal immune response.

A team of European researchers examined how 130 young Latvian men (aged 19-30) and 65 young Latvian women (aged 18-24) responded to a hepatitis B vaccine. Specifically, they measured how much anti-hepatitis B antibody each person produced. The results were reported in the journal Scientific Reports. (The graph below shows the data for men.)

As shown in the graph, the male antibody response was strongest when a man was 185 cm tall (roughly 6' 1"), but was weaker for both shorter and taller men. No such relationship existed in women.

It is not entirely clear why this should be the case. The authors hypothesize that a larger body is costly in terms of resources, and thus, tall men have fewer resources to dedicate to maintaining a robust immune response. That is an interesting idea, but it does not explain why short men have a weaker immune response, and it also does not account for the lack of a relationship between height and immune response in women.

The study is further limited by its sample selection. It is possible, due to some genetic quirk, that the discovery applies only to people of Latvian or eastern European descent. The team also examined only young people, and their sample of women was half the size of their sample of men. It is possible that a larger study of women would yield a statistically significant result. Finally, the authors examined only a tiny sliver of the immune response, i.e., the production of anti-hepatitis B antibodies. A more holistic approach -- for instance, one that examines each person's susceptibility to infectious disease -- is desirable, but would likely require a large epidemiological study.

Though the team's observation is certainly interesting, a lot more work needs to be done to establish a definitive link between human height and immune response.

Source: Indrikis A. Krams, Ilona Skrinda, Sanita Kecko, Fhionna R. Moore, Tatjana Krama, Ants Kaasik, Laila Meija, Vilnis Lietuvietis & Markus J. Rantala. "Body height affects the strength of immune response in young men, but not young women." Scientific Reports 4, Article number: 6223. Published: 28-August-2014. doi:10.1038/srep06223

(Photo: Tall people via Shutterstock)

Conscious Brain-to-Brain Communication Achieved

Ross Pomeroy - September 4, 2014

Futurists and technologists envision a future where talking and typing are obsolete forms of communication. Instead, we'll communicate brain-to-brain.

An international team of roboticists and neurologists has just demonstrated that this is genuinely possible. They report their findings in the journal PLoS ONE.

In the study, one subject in India donned an electroencephalography (EEG) headpiece. EEG uses non-invasive electrodes to record brain activity. At the same time, 5,000 miles away in France, another subject was hooked up to a transcranial magnetic stimulation (TMS) device with his eyes covered by a blindfold. Both set-ups were connected via the Internet.

Next, the subject in India sent the words "hola" and "ciao" -- translated into binary -- to the subject in France. If the subject in India envisioned moving his hands, a 1 was sent to the subject in France, who would see a light appear in his peripheral vision. If the subject in India envisioned moving his feet, a 0 was sent, and the subject in France would see a light in a different location. Slowly but surely, the binary numbers were sent, received, and translated. The total error rate was only 11% the first time around, and when the experiment was repeated, it was just 4%.
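
As a rough illustration of this scheme -- not the study's actual software -- here is a minimal sketch that encodes a word as bits, maps each bit to a hand or foot motor-imagery symbol, and decodes the bits back into text.

```python
# A minimal sketch of the encoding scheme described above (not the study's
# actual software): each word becomes a bit string, each bit is "sent" as an
# imagined hand movement (1) or foot movement (0), and the receiver turns the
# bits back into text. The 8-bits-per-character encoding and the error-free
# channel are simplifying assumptions.
def word_to_bits(word: str) -> str:
    return "".join(f"{ord(ch):08b}" for ch in word)

def bits_to_word(bits: str) -> str:
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

def transmit(bits: str) -> str:
    # Stand-in for the EEG -> Internet -> TMS link: each bit maps to a
    # motor-imagery symbol on the sender's side and a flash location on the
    # receiver's side.
    print(" ".join("hands (1)" if b == "1" else "feet (0)" for b in bits))
    return bits  # assume no transmission errors in this sketch

for word in ("hola", "ciao"):
    received = transmit(word_to_bits(word))
    print(word, "->", bits_to_word(received))
```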

The current experiment is a far cry from anything out of science fiction, but it is a solid proof-of-concept.

"Although certainly limited in nature, these initial results suggest new research directions, including the non-invasive direct transmission of emotions and feelings or the possibility of sense synthesis in humans," the researchers say.

"We anticipate that computers in the not-so-distant future will interact directly with the human brain in a fluent manner, supporting both computer- and brain-to-brain communication routinely."

Source: Grau C, Ginhoux R, Riera A, Nguyen TL, Chauvat H, et al. (2014) Conscious Brain-to-Brain Communication in Humans Using Non-Invasive Technologies. PLoS ONE 9(8): e105225. doi:10.1371/journal.pone.0105225

Will Propane-Making Bacteria Revolutionize Energy?

Alex B. Berezow - September 2, 2014

Fire up the grill!

Propane is the fossil fuel of red-blooded Americans. What poolside or tailgating experience would be complete without firing up the gas grill and torching some meat? (I know, I know... there are charcoal devotees out there.) Even metropolitan mass transit systems are getting in on the excitement: fleets of buses run on "LPG" (liquefied petroleum gas), a mixture of propane and butane.

Currently, propane is extracted from natural gas or crude oil. But, in the long run, this is neither a sustainable nor an environmentally friendly practice. Burning propane extracted from the earth is also not carbon-neutral, though it is better than combusting oil or coal. Thus, researchers are looking for ways to produce renewable "fossil fuels" through the use of alternative technologies, such as synthetic biology. Last year, for instance, scientists engineered E. coli to churn out a biofuel that resembled gasoline.

A team of researchers led by Patrik Jones of the University of Turku in Finland, however, believes it has invented a better method. Instead of gasoline, the team focused on propane. Why? Because propane has the distinct advantage of being easily converted from gas to liquid (and back). Thus, as the bacteria release gaseous propane, the authors argue, it should be easy to capture and store it as a liquid. Additionally, because propane does not build up in the culture medium, it will not inhibit the growth of the bacteria.

To create their tiny fuel manufacturers, the authors had to design a propane biosynthetic pathway that does not exist in E. coli. To do so, they borrowed genes from multiple bacteria: Bacteroides fragilis, Mycobacterium marinum, Bacillus subtilis, and Prochlorococcus marinus. (The pathway they created is shown below. Black letters denote chemical compounds, while red letters denote enzymes and proteins.)

With this pathway in place, the genetically engineered E. coli can be fed glucose (sugar) and will crank out propane. The next step is scaling up for large-scale industrial production. The current platform is inadequate for that, but some preliminary data suggests reason for optimism. The team would also like to transfer the pathway into a photosynthetic microbe, which would essentially convert the energy of sunlight into propane.

There are three big lessons from this research:

First, it is time for anti-GMO activists to grow up and accept biotechnology. The future of everything from food and medicine to renewable energy and high-tech materials will rely on it.

Second, the potential of synthetic biology (recently discussed in the journal Nature Reviews Molecular Cell Biology) may be limited merely by our imagination. Already, synthetic biology has been used by bioengineers to create genetic logic gates -- a useful technology for biosensors and biological computers -- and by systems biologists to further decipher the complexity of the cell.

Finally, the benefit of basic research is unforeseeable. Most likely, the scientists who figured out how these obscure metabolic pathways worked never became famous, nor did they expect their research to change the world. And yet it just might.

Source: Pauli Kallio, Andras Pasztor, Kati Thiel, M. Kalim Akhtar, & Patrik R. Jones. "An engineered pathway for the biosynthesis of renewable propane." Nature Communications. Published 2-Sep-2014. DOI: 10.1038/ncomms5731

Can a Scientist's Writing Style Reveal Fraud?

Ross Pomeroy - August 28, 2014

Diederik Stapel was one of the most celebrated and published scientists in the field of social psychology. But in 2011, it was revealed that much of his career was built on fraud. He had tweaked or completely fabricated data in no fewer than 55 of his 125 published papers! Stapel lost his title as a respected professor and became known for what he truly is: an academic "con man."

Three years after Stapel's deception was revealed, two Cornell researchers, David Markowitz and Jeffrey Hancock, have put his misconduct to good use. In an analysis published in PLoS ONE, the pair compared the linguistics of 24 of Stapel's confirmed fraudulent papers with that of 25 of his genuine ones, seeking to find out whether Stapel's writing differed between his honest and deceptive efforts. When a scientist hands you lots of lemons, why not make lemonade?

The results of Markowitz and Hancock's study are fascinating.

"The analysis revealed that Stapel’s fraudulent papers contained linguistic changes in science-related discourse dimensions, including more terms pertaining to methods, investigation, and certainty than his genuine papers. His writing style also matched patterns in other deceptive language, including fewer adjectives in fraudulent publications relative to genuine publications," the authors write.

Stapel tended to fortify his methods section with extra description and employ words like ‘‘profoundly,’’ ‘‘extremely,’’ and ‘‘considerably’’ to make his results sound more convincing and dramatic. At the same time, he also used fewer terms that might downplay significance, such as "less," "somewhat," and "merely."

"Stapel presumably attempted to emphasize the novelty and strength of his findings, which ended up being 'too good to be true.'"

Using their findings, Markowitz and Hancock put together a simple model for detecting fraud, and, when tested on Stapel's papers, found it to be 71.4% accurate. Not bad, but still nowhere near good enough to be feasible in other contexts. Moreover, the writing patterns found to indicate fraud in Stapel's papers may not apply to other scientists. It would be very interesting to see if they do -- something to examine in a future study!
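
As a toy illustration of the kind of word-frequency features such a model might rely on -- not Markowitz and Hancock's actual method or word lists -- here is a minimal sketch that counts amplifying and hedging terms per 100 words in two made-up snippets.

```python
# A toy sketch (not Markowitz and Hancock's actual model or word lists) of the
# kind of word-frequency features described above: amplifying and hedging
# terms per 100 words. The word lists and example snippets are illustrative.
import re

AMPLIFIERS = {"profoundly", "extremely", "considerably", "vastly", "remarkably"}
HEDGES = {"less", "somewhat", "merely", "perhaps", "possibly"}

def features(text: str) -> dict:
    words = re.findall(r"[a-z']+", text.lower())
    n = max(len(words), 1)
    return {
        "amplifiers_per_100": 100 * sum(w in AMPLIFIERS for w in words) / n,
        "hedges_per_100": 100 * sum(w in HEDGES for w in words) / n,
    }

genuine = "The effect was somewhat smaller than expected and is perhaps limited."
suspect = "The manipulation profoundly and considerably increased the effect."

print("genuine:", features(genuine))
print("suspect:", features(suspect))
# In the published analysis, features along these lines fed a simple model
# that was about 71% accurate at separating Stapel's fraudulent papers from
# his genuine ones.
```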

Source: Markowitz DM, Hancock JT (2014) Linguistic Traces of a Scientific Fraud: The Case of Diederik Stapel. PLoS ONE 9(8): e105937. doi:10.1371/journal.pone.0105937

First Observation of Death Valley's Sliding Rocks

Alex B. Berezow - August 27, 2014

A dry lake in Death Valley, called Racetrack Playa, is home to the famous "sailing stones." These large rocks, some of which weigh up to 700 pounds, leave behind long trails in the dirt, indicating that something -- or someone -- has been moving them. (See photo above.) But how?

Conspiracy theorists and others with active imaginations have implicated aliens (of course), powerful magnetic fields, or just plain old magic as the culprit behind the mysterious phenomenon. More serious speculators suggested dust devils or a combination of rain and strong wind. These explanations, however, are wrong.

Last year, Live Science (and several other outlets) reported that the mystery was solved. Researcher Ralph Lorenz had discovered that a rock with a ring of ice around it is buoyant (PDF). When placed in a shallow pool of water, Dr. Lorenz found that he could move the rock simply by blowing on it. Additionally, if there is sand at the bottom of the shallow pool, the rock scrapes out a visible path. But, as it turns out, this explanation isn't quite right, either.

Now, in a new paper published in PLoS ONE, Dr. Lorenz and a team of researchers led by Richard Norris and Brian Jackson have definitively demonstrated how the rocks move. They placed 15 GPS-equipped limestone rocks on Racetrack Playa. They also installed a weather station nearby, as well as several cameras. Then, the researchers monitored the rocks from late November 2013 to early February 2014.

The team found that their rocks moved on two different days, December 4 and December 20. The most impressive rock weighed in at 36.6 pounds and made a trail 215 feet long, but its speed topped out at a sluggish 0.22 mph. (One of their rocks is pictured below.)

The rocks sat in a shallow pool of water that froze on many nights. The next morning, when the temperature rose, the large sheet of ice would crack and pools of water would form. Large chunks of ice could then move freely, dragging the rocks along with them as the wind blew. Contrary to the earlier hypothesis, the rocks did not become buoyant. Instead, the ice sheets -- which were only a few millimeters thick, but several meters across -- could exert a force large enough to move the rocks.

Not only did the authors collect data on their GPS-equipped rocks, but their cameras also captured several indigenous rocks gliding about.

The authors note that it would be difficult to watch the rocks in action in person, due to their glacial pace. Also, the rocks move only on rare occasions, i.e., when a shallow pool of water is present along with freezing conditions and wind. Complicating matters for would-be enthusiasts is the fact that the ice would block anyone's view of the trail, which is formed in the mud below. Only after the water is blown away by the wind do the trails become visible.

Still, people who enjoy the sport of curling also may be inclined toward rock-watching. Now that the conditions under which the rocks can move are known, avid rock-watchers should plan a winter camping trip to Death Valley. Just be sure to pack some extra patience.

Source: Norris RD, Norris JM, Lorenz RD, Ray J, Jackson B (2014) Sliding Rocks on Racetrack Playa, Death Valley National Park: First Observation of Rocks in Motion. PLoS ONE 9(8): e105948. doi:10.1371/journal.pone.0105948

(Photo: Death Valley's "sailing stones" via PLoS ONE)
