The Great Condom Hypocrisy

If your Baby Boomer parent nags you for the umpteenth time about practicing safe sex with a haughty tone of moral superiority, reply with a simple statistic: the rate of gonorrhea was roughly five times higher during the 1970s than it is now. You know, mom or dad, condoms were around back then.

Society is convinced that trouble-making teens need all the instruction when it comes to sex, but in fact, adults may need more tutoring.

In school, kids are now rightfully bombarded with information about contraception, abstinence, and practicing safe sex, and the barrage seems to be working. From 2000 to 2010, the teen birth rate dropped precipitously across all ethnicities. Over that same period, rates of gonorrhea fell slightly or remained steady, perhaps because 8 out of 10 sexually active boys and 7 out of 10 girls say they used a condom during their last sexual experience. That rate could be as low as 59%, but even so, it still easily trumps that of older Americans.

Yep, the lowest rate of condom use is among people aged 45 and over. Okay, you might say, but that's because more adults are in committed relationships. True, but ninety-one percent of men older than 50 admitted to not using condoms for sex with a date or casual acquaintance, researchers from Indiana University found. Numbers like that undoubtedly contributed to this intriguing statistic: From 2000 to 2010, rates of sexually transmitted diseases such as chlamydia, gonorrhea and syphilis doubled for people in their 50s, 60s and 70s.

Teens still contract far more sexually transmitted diseases than adults. Adolescents ages 15-24 account for nearly half of the 19 million new cases of STDs each year, the CDC reported in July. But -- and this is just my guess -- that number probably says more about the prevalence of sexual activity by age group than it does about safe sex practices. After all, sexual activity declines rapidly beginning in middle age, most prominently due to physical difficulties.

Teens, it seems, are far more inclined to use condoms than adults are. Therein lies one of the greatest sex hypocrisies. Of course, that doesn't mean that sexual education should stop. Far from it! More than 400,000 teen girls aged 15–19 years gave birth in 2009. That's far too high!

But it does mean that the parents of teen students might want to attend a few sex education classes of their own, especially before adopting a "holier than thou" attitude.

(Image: AP)

String Theory: Now Circling the Drain

The largest physics experiment ever built is now testing the nature of reality. String theory, supersymmetry and other theories beyond the Standard Model are under scrutiny. More than 10,000 people have been involved. Total cost is nearing $10 billion. This, of course, is the Large Hadron Collider (LHC), which helped discover the Higgs boson.

Simultaneously, the ACME experiment, run by a team of fewer than 50 people and built for a few million dollars (in a much, much smaller space), has produced a more precise test of these advanced theories. This experiment hinges on an extremely painstaking and precise method to picture the shape and size of electrons.

Electron electric dipole moment (EDM) is brutal physics jargon. It's a direct measurement of the internal structure of an electron. If an electron is an absolutely perfect sphere of negative charge with an infinitely small radius, the EDM is 0. If it's actually an extremely tiny speck made of even tinier specks of electric charge sitting next to one another, the EDM is some tiny value greater than 0.
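In the language of electrostatics (my gloss, not the article's), the dipole moment of a charge distribution rho(r) is

```latex
\mathbf{d} = \int \mathbf{r}\,\rho(\mathbf{r})\,d^3r
```

which vanishes for a perfectly spherically symmetric distribution and is nonzero whenever the charge is lopsided.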

String theory and similar theories require many unknown physics measurements. One of these is the magnitude of the electron electric dipole moment that this experiment probes. Each type of theory requires an EDM within a certain range of values to work.

What we know so far is that every experiment yet built has measured EDM = 0. (Not precisely zero, but zero within a margin of error, which establishes a tiny upper bound on its possible size.) Is the electron really like a tiny black hole with infinite density filling infinitely small space? Answering this question sheds light on the bigger one as well: Just how plausible are theories beyond the Standard Model?

Measurements taken over several decades have gradually reduced the possible size of the EDM. This year, the ACME collaboration published the best measurement ever made of the EDM: smaller than 8.7 x 10^-29 e·cm. (Free copy here.) It's an unimaginably small number. (Imagine the size of an ant compared to the size of the universe; this number is 100 times smaller than that ratio.)

What this tiny value does is severely limit the likelihood of several particular varieties of theory stretching beyond the Standard Model.

In the figure, d_e is the EDM, and the blue, red, green and purple areas are various theories beyond the Standard Model. Every theory to the left of the red line is now ruled out! Large portions of the imaginary world of String Theory (e.g., versions of SUSY or "supersymmetry" shown above) are being ruled impossible by a machine 1/1000th the size and cost of the LHC. Previously, String Theory was on life support. Now, it might be circling the drain.

A more powerful upgrade of the ACME experiment coming on-line in the next few years may be able to push the elimination line about two more powers of ten to the right, refuting more theories.

Most importantly, finding funding for future physics projects as big as the LHC is going to be difficult. Manageably small experiments like ACME, relying on clever and precise methods, may displace those relying on pure size and power.

Sugar Is Not Toxic

Defending sugar is not something David Katz thought he would ever find himself doing. In his two-decade-long career in public health as an associate professor at Yale University and as the director of the Yale Prevention Research Center, Katz has frequently warned about the dangers of excess sugar consumption. But now, he finds himself having to repeatedly debunk the extremist "sugar is toxic" message that has crowded out the less inflammatory evidence-based message of "just eat less sugar."

Writing last year in the Huffington Post, Katz was very clear.

"Sugar, in general, is not poison," he said. "Breast milk contains sugar. The human bloodstream contains sugar, at all times, and the moment it doesn't, we die."

To label sugar "toxic" is misleading. It implies that the sweet substance is dangerous at any dose. But of course, it's the dose that makes the poison. A single can of Pepsi, a veritable bastion of sugar, is a delicious complement to a slice of pizza. To die from an acute toxic overdose of sugar, the average adult male would have to drink roughly 58 of those cans in rapid succession. That doesn't leave much room for the slice of pizza.
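That 58-can estimate is simple arithmetic; here's a back-of-envelope sketch, assuming an oral LD50 for sucrose of roughly 30 g/kg (extrapolated from rodent data) and about 41 g of sugar per can -- both figures are my assumptions, not the article's:

```python
# Assumed figures (not from the article): rodent oral LD50 of sucrose
# extrapolated to humans, and the sugar content of a 12-oz can of Pepsi.
ld50_g_per_kg = 30        # approximate lethal dose of sucrose, grams per kg
body_mass_kg = 80         # assumed average adult male
sugar_per_can_g = 41      # grams of sugar per can

lethal_dose_g = ld50_g_per_kg * body_mass_kg       # 2400 g of sugar
cans_needed = lethal_dose_g / sugar_per_can_g
print(round(cans_needed))  # roughly 58-59 cans in rapid succession
```

Tweak the assumed body mass or LD50 and the answer shifts, but it stays in the dozens of cans, which is the point.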

Polonium, on the other hand, now that's a poison. As little as 800 nanograms -- an amount so small you could barely make it out on the palm of your hand -- is enough to kill the average adult male.

I make this comparison not to trivialize the health drawbacks of sugar, only to demonstrate that sugar is obviously not a poison.

The American Heart Association and the World Health Organization recommend that women consume no more than 24 grams of "added sugar" (basically, sugar not found in fruits or non-sweetened milk) each day. For men, that number is 37 grams. Currently, conservative estimates indicate that Americans consume roughly twice the recommended amounts. Much of that sugar comes from nutrient-deficient soft drinks, luxurious desserts, processed food, or candy. Eating too much of any of that stuff increases the risk of fatty liver disease, heart disease, diabetes, and being overweight.

Here's what we know: Eating sugar in excess, as many Americans currently do, is unhealthy. But to take that statement any further down the provocative road is simply not in accordance with the facts. For example, the American Diabetes Association lists the statement, "Eating too much sugar causes diabetes" as one of the biggest diabetes myths.

"The answer is not so simple," they qualify. "Type 1 diabetes is caused by genetics and unknown factors that trigger the onset of the disease; type 2 diabetes is caused by genetics and lifestyle factors."

But what about the scary studies out there showing that high-fructose corn syrup -- predominantly composed of two types of sugar, fructose and glucose -- engenders deleterious effects in rodents? Well, as Katz explains, the scientists orchestrating those experiments fed the animal subjects an amount of sugar equivalent to far, far more than the average human consumes.

"The levels of fructose intake invoked to produce end-organ damage in provocative articles do not occur under real-world conditions. Pushed to comparable extremes of dosing, articles about oxygen would reach far grimmer conclusions, concluding the compound is not just toxic, but uniformly lethal over a span of mere days."

Moreover, as Scientific American's Ferris Jabr points out, rodents are not humans.

"Studies that have traced fructose’s fantastic voyage through the human body suggest that the liver converts as much as 50 percent of fructose into glucose, around 30 percent of fructose into lactate and less than one percent into fats. In contrast, mice and rats turn more than 50 percent of fructose into fats, so experiments with these animals would exaggerate the significance of fructose’s proposed detriments for humans, especially clogged arteries, fatty livers and insulin resistance."

On the other hand, systematic reviews that have examined fructose intake in humans have found no specific effect of the widely used sugar on blood pressure or body weight.

For a quick source of bodily fuel, nothing tops sugar. That's the primary reason sugary sports drinks like Gatorade have been consistently shown to enhance athletic performance. And the intermittent ice cream cone, perhaps with a friend, family member, or significant other, is not just acceptable, but healthy.

In short, sugar is a substance meant to be used strategically and enjoyed occasionally, not avoided at all costs. Don't worry, parents, a little sugar binge tomorrow is not as scarily detrimental to your kids' health as many articles on the Internet would have you believe.

(Image: Shutterstock)

The Math Behind the 'Shoe Size-Age' Trick

The "shoe magic" trick is making its way around the Internet. (See above.) It's a nifty little math trick, and it does indeed work (most of the time). I will first use my own data to demonstrate.

I wear a size 10 shoe.

10 x 5 = 50.

50 + 50 = 100

100 x 20 = 2000

2000 + 1014 = 3014

I was born in 1982.

3014 - 1982 = 1032

It works! Amazing! Indeed, my shoe size is 10, and I am 32 years old.

How does this work? It requires a little bit of algebra to understand. Let's call shoe size "s" and your birth year "y." Now, let's do the trick again, using these letters instead of numbers.

Multiply s by 5: 5s

Add 50: 5s + 50

Multiply by 20: 20(5s + 50) = 100s + 1000

Add 1014: 100s + 1000 + 1014 = 100s + 2014

Subtract birth year: 100s + 2014 - y

Do you see why it works? No matter what your shoe size is, it will always form the leading digits of the answer. If your shoe size is 12, then 100s = 1200. If you are European and your shoe size is 36, then 100s = 3600.

The age part should be obvious. 2014 (the current year) - y (your birth year) will give your age.

But this trick does not always work. If you were born in, say, December 1982, then this trick would incorrectly conclude that you were 32 years old. In fact, you would be only 31. The trick also does not work if you are 100 years old or older. If you were born in 1914 and wear a size 10 shoe, the trick would conclude that your shoe size is 11 and that you were 0 years old.
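The whole trick, including its failure mode, fits in a few lines of Python (a sketch; the hard-coded 1014 only works in 2014):

```python
def shoe_trick(shoe_size, birth_year):
    """The 'shoe magic' trick, which reduces to 100*s + (2014 - birth_year)."""
    n = shoe_size * 5   # multiply shoe size by 5
    n = n + 50          # add 50
    n = n * 20          # multiply by 20
    n = n + 1014        # add 1014
    return n - birth_year  # subtract birth year

print(shoe_trick(10, 1982))  # 1032: shoe size 10, age 32
print(shoe_trick(10, 1914))  # 1100: reads as "shoe size 11, age 0"
```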

So, if you want to impress people with a math trick this Halloween, just be sure their birthday is sometime before November... and don't show it to any centenarians!

A Psychiatric Evaluation of Michael Myers

For psychiatry residents taking the REDRUM Psychopathology course at the Robert Wood Johnson Medical School at Rutgers University, weekly homework assignments are a tad more macabre than you might expect. Instead of reading Hales, Gabbard, or Blazer, they're watching Bates, Krueger, and Myers, as in Michael Myers of cinema serial-killer fame.

REDRUM stands for Reviewing [Mental] Disease with a Rudimentary Understanding of the Macabre. Anyone who's seen The Shining knows it better as "MURDER" spelled in reverse. In the REDRUM course, Professor Anthony Tobia harnesses Hollywood to teach his students about mental disorders.

At first, this might seem like a bad idea. Movie portrayals of mental illness are often wildly inaccurate. Thanks to One Flew Over the Cuckoo's Nest, lots of people think electroconvulsive "shock" therapy is a barbaric, ineffective, and antiquated treatment. In fact, it's not. Many other flicks portray psychiatric doctors and asylum directors as uncaring or even malevolent. But Tobia takes great care "not to perpetuate the stereotypes of mental illness often portrayed in cinema."

"Residents are directed not to take the movies at face value," Tobia says. "Instead, we focus on an abstract and symbolic understanding of the plot summary or aspects of character analysis that allow psychiatry residents to discuss major teaching points germane to a full spectrum of adult mental illnesses."

Residents learn, for example, that demonic possession, as seen in The Exorcist, is not actually caused by demons. Instead, it's woefully misdiagnosed mania, Tourette’s syndrome, conversion disorder, histrionic personality, or dissociative identity disorder. It might even be a confluence of some or all of these conditions.

Slasher films offer some of the best educational material. The legend of Jason Voorhees from Friday the 13th is a symbolic outcome of the direct effects of fetal alcohol syndrome. Michael Myers losing the ability to speak after murdering his sister showcases conversion disorder, in which symptoms of acute or chronic stress manifest physically. Myers also likely suffers from voyeurism and autism. If you're looking for a metaphorical example of narcolepsy, look no further than A Nightmare on Elm Street, in which nightmares involving maniacal killer Freddy Krueger intrude into wakefulness. Krueger himself likely has pedophilia, psychopathy, and "mother issues."

Zombies -- modern society's favorite monsters -- also offer an invaluable teaching opportunity when viewed through the lens of neurology, says Dr. Susan Hatters-Friedman, a forensic psychiatrist at Auckland University.

"Schlozman suggests the ataxic gait and imbalance in Romero’s zombies could be explained by a cerebellar lesion. Constant hunger could be explained by abnormalities in their ventral medial hypothalamus not receiving or being able to interpret signals. Lack of problem-solving and executive functioning capacity may be explained by prefrontal cortex damage, and their characteristic lack of fear and constant anger by amygdala alterations."

Tis the season for horror flicks. Make it an educational one!

Sources: Tobia A, et al. The Horror!: A Creative Framework for Teaching Psychopathology Via Metaphorical Analyses of Horror Films. Academic Psychiatry. March 2013, Volume 37, Issue 2, pp. 131-136.

Friedman SH, Forcen FE, Shand JP. Horror films and psychiatry. Australas Psychiatry. 2014 Oct;22(5):447-9. doi: 10.1177/103985621454308

(Image: Shutterstock)

Iran's Evil Imprisonment of My Colleague

Omid Kokabee is a physicist and PhD student who sat in front of me in class. He worked on a project much like mine, in the same field, on the same degree, and we even started at the same time. That was four years ago. I'm still here. He went home to Iran to visit his family in January of 2011 and never came back.

The cruel and evil authoritarian states of the world sometimes seem remote. We read about them in news reports. They murder and ruin the lives of people we never see or know. We know how terrible their crimes are, but it's all at a distance. This situation is as personal as a punch in the face.

Kokabee disappeared into thin air on his way back to Texas. Investigation by his American colleagues revealed that he was arrested at an airport in Iran. He was thrown into a prison in Tehran and left for over a year without charges.

Bewildered letters penned by Kokabee from his cell described the regime's coercion of him and his family. He was not allowed a lawyer and was openly threatened by the judge before charges were brought. As time went on, Kokabee kept sharp by teaching English, Spanish, and physics to his fellow prisoners.

In the summer of 2012 he was convicted in Iran's kangaroo Revolutionary Court of communicating with a hostile government and receiving illegal earnings. These sham charges were brought by the government hoping to force Omid into collaborating with the regime's floundering physics efforts.

Nature reported that 10-15 other people were tried simultaneously, and all pleaded guilty to the surely fictional crime of collaborating with Israel's Mossad intelligence agency. Kokabee refused to speak.

For his defiance of the Iranian government's demands, Kokabee was sentenced to rot for a decade or more in jail. An extra sentence was tacked on for teaching those prison classes.

Omid was not a close acquaintance. But the fact that the man who worked in the lab next door to mine every day disappeared into prison and never came back is jarring and incredible. I work closely with another Iranian. He no longer goes home to see his family.

It's not just this one case. Scientists, students, writers and peaceful protesters are held in prison by Iran and other repressive and authoritarian governments. It's a fate that reminds us that brutal and corrupt rulers still control parts of our world. What can we do? Collective pressure can bring hope: Iran's supreme court may take up an appeal of Kokabee's case.

It may not be popular for scientists to say this, but I will: Evil is real. We must fight to ensure it does not triumph.

(AP photo)

Another Big Psychology Theory Fails to Replicate

The bakery at most grocery stores is a minefield. Cakes to the left, muffins to the right, pastries dead ahead, and cookies... cookies everywhere. If you escape without making a purchase, congratulations, you have tenacious self-control. Or you were just lucky. But though your wallet and waistline won't take a hit, according to a leading psychological theory, your willpower will.

Originally put forth back in 1998 by Florida State psychologist Roy Baumeister, the notion of ego depletion states that self-control is a limited resource. Like a muscle, it can fatigue with use, and needs time to recharge. According to the theory, saying "no" to sweets in the grocery store will leave you temporarily vulnerable to subsequent temptations.

In the sixteen years since its inception, ego depletion has been tested and validated in a variety of situations. Psychologists emphasize its role in many arenas, such as dieting, athletics, and consumer behavior. Some even propose that willpower can be trained and strengthened via repeated use, again, just like a muscle.

But critics aren't so sure. They note that much of the research has been done on young, WEIRD (Western, Educated, and from Industrialized, Rich, and Democratic countries) college students, and thus may not carry over to the general population. They also suggest that the effect could benefit from publication bias, the tendency to publish only flashy or positive results. It is in this light that a team of psychologists recently attempted to replicate the ego depletion effect using the two most frequently used measures of self-control in scientific research. Their results were just published in PLoS ONE.

Four groups of participants took part in the study: two diverse groups from the general population with an average age in the mid-forties, and two more groups of young adults with an average age around 20. Each group was assigned one task: either a grip challenge, in which subjects had to hold a grip machine at 70% of their maximum grip strength for as long as possible, or a Stroop test, in which subjects viewed color words (like 'green,' 'yellow,' or 'blue') that appeared one at a time on a computer in a mismatched font color (for example, 'red' shown in blue font) and had to press the key corresponding to the font color.
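The Stroop procedure is simple enough to sketch in code. This is a minimal, hypothetical trial generator of my own, not the researchers' actual software:

```python
import random

COLORS = ["red", "green", "blue", "yellow"]

def stroop_trial(rng):
    """Generate one incongruent Stroop trial: a color word paired with a
    mismatched font color. The correct response is the FONT color,
    not the word itself."""
    word = rng.choice(COLORS)
    font = rng.choice([c for c in COLORS if c != word])
    return word, font

rng = random.Random(42)
word, font = stroop_trial(rng)
print(f"Screen shows the word '{word}' in {font} ink -- press the key for '{font}'")
```

Suppressing the automatic urge to read the word is what makes the task a workout for self-control.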

The subjects completed each of their respective tasks twice. In between the repetitions, some subjects engaged in an activity meant to diminish their self-control while others performed an easy control activity. The researchers then compared the scores of the subjects who performed the control activity with those who performed the experimental one.

The subjects who had performed the task meant to diminish self-control should have performed significantly worse on their assigned tasks compared to the control groups the second time around. But they did not.

"There was no evidence for significant depletion effects in any of these four studies," the researchers found.

The researchers don't believe their results are in error. The tasks they used were identical to those most frequently employed in past research, and they also used similar sample sizes. In fact, the researchers suggest their study design may be stronger, because, unlike many previous studies which found evidence for ego depletion, they used a control group and recruited subjects from a broader population.

Despite the failure to replicate, the researchers don't believe their study is enough to invalidate ego depletion altogether, just that the effect may be more limited than has previously been theorized.

Ego depletion is not the first big idea in psychology to face replication problems. A key theory, priming, took a notable hit last year.

Some social psychologists, like Harvard University's Jason Mitchell, have fired back at detractors, suggesting that they are impugning the integrity of their colleagues, overstating the problems of publication bias, and are likely producing "false negative" results.

(Images: AP, Nevit Dilmen)

Source: Xu X, Demos KE, Leahey TM, Hart CN, Trautvetter J, et al. (2014) Failure to Replicate Depletion of Self-Control. PLoS ONE 9(10): e109950. doi:10.1371/journal.pone.0109950

The Healthiest Diet 'Proven' by Science

It's actually happened.

After decades of research filled with millions of meals eaten by hundreds of thousands of subjects, the verdict is in. Science is now ready to proclaim the healthiest way to eat: one diet to rule them all.

So which is it? Atkins, perhaps? Or Paleo? Low-Carb? Low-Fat? South Beach? Raw? Fruitarian? Vegan?

The answer, my friends, is none of the above. But it could also be all of the above. That's because the healthiest diet isn't a specific diet at all. It's the absence of a diet.

This is not a sudden, world-changing, mind-altering finding. It is not well suited to a blaring news headline. It is not share fodder on social media. What it is, however, is a realization that surfaced gradually and methodically: Science will never conclusively prove that a single diet is the best diet.

Author Matt Fitzgerald summarized the finding, or rather, the lack thereof, in his new book Diet Cults:

"Science has not identified the healthiest way to eat. In fact, it has come as close as possible (because you can't prove a negative) to confirming that there is no such thing as the healthiest diet. To the contrary, science has established quite definitively that humans are able to thrive equally well on a variety of diets. Adaptability is the hallmark of man as eater. For us, many diets are good while none is perfect."

Further support for this notion comes from a simple glance back at the history of our species. Mankind has populated almost every corner of the earth, and in every diverse situation, humans were able to survive, even thrive, on whatever food their homes had to offer.

Even more convincing evidence has been found by observing those who have lived the longest. The University of California-Irvine's 90+ Study has tracked thousands of Americans who've made it to age 90 and beyond, yielding an unprecedented wealth of information about their lifestyle habits. For lead investigators Claudia Kawas and Maria Corrada, the most surprising finding was that most participants didn't seem too concerned with their health. Generally, the 90-year-olds said they didn't keep to a restrictive diet. Nor did they abstain from alcohol; quite the opposite, actually! The researchers found that up to two drinks a day -- no matter the type -- was associated with a 10-15% reduced risk of death. They also discovered other things that might disturb ardent dieters: vitamin supplements did not affect lifespan in any way, and being a little overweight starting in middle age positively affected longevity.

But what if you're already overweight and want to shed some pounds? In that case, pick whatever diet works for you, because they all can work. What matters the most for weight loss is finding a solution that you can adhere to. That much was elucidated in a review recently published in the Journal of the American Medical Association. Scientists reviewed a multitude of randomized trials on popular diets and, lo and behold, found that all the diets helped subjects shed pounds, with minimal differences in weight loss between each diet.

Just like there is no one true religion, there is no one true diet. So why do so many dieters believe that there is?

"The short answer is that people believe what they want to believe," Fitzgerald wrote in Diet Cults. "The complete answer is that people want to believe that a certain way of eating is the best way because it gives them a sense of identity and a feeling of belonging. It's the work of that old, no-saying human impulse to eat according to the rules of a special group, which is often much stronger than the reasoning faculties."

"It feels good to believe in something."

(Image: AP)

How to Make Scientists Publish the Truth

John Ioannidis is (in)famous in the scientific community. Using straightforward logic and statistics, he convincingly demonstrated that most published research articles are wrong. This is not because scientists are liars and crooks, but because studies often do not have large enough sample sizes or are testing unlikely hypotheses. Ioannidis's revelation sent a shock wave through the biomedical community. Partially in response to his findings, biomedical scientists began to embrace reforms in scientific publishing, such as using more open access journals and publishing replications and negative data.
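The logic behind that claim can be captured in a few lines. Here is my own simplification, not code from the paper: the chance that a statistically significant finding is actually true depends on statistical power, the significance threshold, and the prior odds that the hypothesis was true in the first place.

```python
def ppv(prior_odds, power=0.8, alpha=0.05):
    """Positive predictive value: the probability that a 'significant'
    finding reflects a true effect, following the logic of Ioannidis.
    prior_odds: ratio of true to false hypotheses being tested (R)."""
    return (power * prior_odds) / (power * prior_odds + alpha)

# Well-powered test of a plausible hypothesis (R = 1): most findings hold up.
print(ppv(1.0))             # ~0.94
# Underpowered (power = 0.2) test of a long shot (R = 1:10):
print(ppv(0.1, power=0.2))  # ~0.29 -- most "findings" are false
```

Small samples shrink power, and unlikely hypotheses shrink the prior odds; either one alone can drag the predictive value below a coin flip.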

Now, in a new paper in PLoS Medicine, Dr. Ioannidis proposes additional reforms. Some of the more interesting ones include:

Registration of studies. Clinical trials already do this. (See ClinicalTrials.gov.) This would allow researchers to monitor ongoing studies. Others have proposed that all registered studies should be accepted for publication upon their completion, regardless of the outcome of the experiment. This would eliminate "publication bias," the phenomenon in which only "sexy" results are published and negative (or uninteresting) results are ignored.

Adoption of better statistical methods. It is not a big secret that biologists are bad at math. (Well, except for John Ioannidis.) Papers with a lot of mathematical equations are avoided by biologists. The heavy mathematical lifting required in some biomedical fields, such as epidemiology and genomics, is outsourced to biostatisticians. Dr. Ioannidis suggests more stringent thresholds for statistical significance. That is certainly necessary, but there also should be a requirement for all biomedical PhD students to take courses in biostatistics.

Improvements in peer review. Though Dr. Ioannidis does not offer specific details, one group is strongly advocating post-publication peer review. F1000Research publishes papers along with their expert reviews, which are not allowed to be anonymous.

Consideration of stakeholder interests and modification of incentives. Members of the scientific community place different values on research. For example, professors want research that is publishable, while industry wants research that is profitable. Dr. Ioannidis identifies four interests that need to be considered: publishability, fundability, translatability, and profitability. Furthermore, he proposes a radical change in incentives, such as eliminating academic ranks (e.g., tenure).

Scientific publications are on the cusp of a dramatic shift. Thanks to the efforts of Dr. Ioannidis and others like him, the monolithic biomedical establishment is beginning to embrace change. If only the rest of academia were so reflective and self-critical.

Source: Ioannidis JPA (2014). "How to Make More Published Research True." PLoS Med 11(10): e1001747. doi:10.1371/journal.pmed.1001747

Hilariously Stupid Science Questions: Not Another One... Yes Another One!

Back in April, we made a subtle threat: to continue "sharing hilariously stupid science questions until they stop being funny." Granted, it could just be because we have a poor sense of humor, but we're still laughing. And so, today, we make good on that threat.

This will be the fifth time we've shared questions like these. For some of the past lists, click here, here, here, or here. But not here.

It takes some serious forethought to evade logic so skillfully. And so, we tip our hats once again to the brilliant jokesters over at Reddit who thought up these jocular, reason-defying queries.

If Catholics only have mass on Sundays, do they cease to exist the rest of the week?

How can I access my Daylight Savings account?

Why are red-handed people more genetically predisposed to crime?

If 200,000 people die every year from drowning and 200,000 people have already drowned this year, does that mean I can breathe under water?

I just bought a Prius. At what point do I develop a sense of superiority, and will I still be able to eat gluten?

Can we achieve higher education by building taller schools?

If the body replaces all of its cells every 7 years, shouldn't we release all inmates after 7 years, as they're not the same person anymore?

How come some mountains look like presidents?

Since humans share 50% of their DNA with bananas, can scientists merge two bananas to create a human?

Why do meteorites always land in craters?

If electricity comes from electrons, does morality come from morons?

Is the Islamic State solid, liquid or gas?

via Reddit

(Image: Secret Ingredient via Shutterstock)

Meet the 'Terror Birds'

Often, when explaining how birds are related to dinosaurs, people compare a picture of a Tyrannosaurus rex with that of a chicken. It's an apt juxtaposition, but it would be more elucidating with another animal wedged in between. I nominate Kelenken.

Standing almost 10 feet tall with a beak 18 inches long, Kelenken was a member of the extinct phorusrhacid family, otherwise known as the "terror birds."

By most accounts, the birds earned their ominous nickname. The vast majority of scientists believe that the "terror birds" were carnivores, and brutal ones at that. Kelenken and its stockier brothers Brontornis and Titanis were something like avian jackhammers, likely using their massive, bone-reinforced beaks to peck their prey to death. Since the birds mustered a top speed of around 27 miles per hour, the pummeling could have been preceded by a chase. It seems more likely, however, that the feathered predators ambushed their prey from forest brush or long savannah grass. Though it's enticing to picture Kelenken taking down and devouring animals the size of deer or antelope, its regular fare probably fell in the size range of modern day dogs or cats.

Fortunately for the mental health of our children, these big birds predated Sesame Street by a couple million years. Living between 62 and 2 million years ago, the "terror birds" primarily populated South America back when it was still an island, though a few fossils have been unearthed in Europe, prompting paleontologists to scratch their heads and draw arrows on maps.

Though dominant on their South American island paradise, the "terror birds" would eventually meet their downfall when faced with competition from hardy North American animal rivals. When the Panama land bridge formed between South and North America some 2.5 million years ago, it was the beginning of the end for the "terror birds." The best explanation for their demise is that competition from prehistoric dogs and saber-toothed cats drove them to eventual extinction.

(Images: Shutterstock, FunkMonk, Degrange et al.)

Why Laser Fusion Power Is So Difficult

Is the National Ignition Facility's goal of generating practical electrical power realistic? How long would it take? Having learned the history lesson on inertial confinement fusion and its many acronyms, we know how we got here. Now we can cover the fun stuff: will we ever get it to work as a power plant?

Two distinct types of challenges face ICF "laser fusion" as a commercial power source. Engineering problems involve transforming established scientific principles into working machines. But first, the science has to establish what will work. That is the question of what is possible and, if so, how to attempt it.

When we imagine an Apollo project or a Manhattan project for nuclear fusion, we frame the problem somewhat incorrectly. The basic science of liquid-propellant rocket engines and multi-stage rocket platforms had already been figured out well before Apollo began. American Robert Goddard, in parallel with German and Russian rocket scientists, had worked it out decades before. At the end of WWII, the US claimed V2 rockets and their designers from Germany; we already had a working small-scale, liquid-fueled rocket capable of reaching the edge of space. Improving and supersizing rockets enough to blast people into space was, by this point, primarily an (admittedly enormous) engineering problem. A huge infusion of money and brilliant brainpower accomplished it in a matter of a decade.

So too with the Manhattan Project. The scientific concept, that a sufficient mass of fissile material could attain a runaway chain reaction, was already accepted. Einstein had already supplied the matter-to-energy conversion factor. Brilliant physicist Enrico Fermi built the world's first nuclear reactor under the bleachers at the University of Chicago football field, demonstrating the energy output from a neutron chain reaction in fissile material. The Manhattan Project was all about finding an engineering method to produce enough uranium and plutonium to build a handful of bombs, and to design a bomb mechanism capable of assembling the necessary critical mass instantly and perfectly. This was a tremendously hard problem on its own, even with most of the science previously resolved.

NIF is in a more difficult position. We don't know the science well enough to reach the engineering phase.

Scientific unknowns abound in laser fusion. First, our models of the appropriate conditions needed for useful extraction of fusion energy are not tested by experiment. Currently, we think that we need about 10 megajoules (MJ) of laser energy (i.e., about 138,000 professional tennis serves or a pickup truck hitting a wall at 223 mph) to get net power out of NIF. The facility's record-breaking laser produces only about 1.8 MJ.
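The article's everyday equivalents for 10 MJ can be sanity-checked with basic kinetic energy. The masses and speeds below are my assumptions (a roughly 2,000 kg pickup truck and a 57 g tennis ball served at about 113 mph), chosen as plausible values; they are not from the article.

```python
# Rough sanity check of the 10 MJ comparisons via KE = 1/2 * m * v^2.
MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def kinetic_energy_j(mass_kg, speed_mph):
    v = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * v ** 2

truck_j = kinetic_energy_j(2000, 223)   # assumed ~2-tonne pickup at 223 mph
serve_j = kinetic_energy_j(0.057, 113)  # assumed 57 g ball, ~113 mph serve

print(f"truck: {truck_j / 1e6:.1f} MJ")           # ~9.9 MJ, close to 10 MJ
print(f"serves in 10 MJ: {10e6 / serve_j:,.0f}")  # ~138,000
```

Both equivalents come out consistent with the 10 MJ figure under these assumptions.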

Furthermore, 10 MJ from the pellet is just the energy break-even point. To see practically useful gain instead of eking out almost no power, we think we need a 10-fold increase in power: a 100 MJ laser. Then figure that most heat is lost when being converted to electricity. So perhaps we need another factor of 10: a 1000 MJ (1 gigajoule) laser. Most historical predictions of the energy required for energy gain with ICF have been vast underestimates, too.
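The chain of factor-of-ten estimates above multiplies out as follows. This is just a sketch of the article's arithmetic, not an official projection:

```python
# Rough scaling chain from pellet break-even to a power-plant-class laser.
breakeven_mj = 10     # estimated laser energy for pellet energy break-even
useful_gain = 10      # margin needed for practically useful net output
thermal_losses = 10   # allowance for heat-to-electricity conversion losses

required_mj = breakeven_mj * useful_gain * thermal_losses
print(f"{required_mj} MJ = {required_mj / 1000} GJ")  # 1000 MJ = 1.0 GJ
```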

The trouble is that designing entirely new lasers is more of a science question. Entirely new laser gain materials such as specialized doped glass that costs thousands of dollars per inch need to be invented. New concepts for generating bursts of photons and controlling their conversion from one frequency to another must be found. The field of optics needs time to develop and understand ideas of how to make more energetic laser pulses. Perhaps better fuel design could lessen the need for laser advances.

The moment of the pellet implosion is another deep science question. At laser impact, a chaotic ball of x-ray photons, nuclear ions, gamma rays and energy is created, which is extremely hard to analyze. An entire field, hydrodynamics, studies these messy situations. We've been working on hydrodynamics for centuries, but the models we use do a poor job of predicting what happens at pressures and temperatures as extreme as these. The poor projections of these models are part of why ICF remains so far behind its goals.

Say we invent a laser 1000 times more powerful, design new pellets, and new methods to perfectly crush them. We're still not done. Engineering challenges will be an enormous obstacle. Essentially, we'll need a Manhattan Project for fusion.

Currently, NIF fires at a maximum rate of roughly once per hour. For commercial energy production, blasting a pellet roughly ten times per second is required. This means we'll need to figure out a way to run the entire experiment 36,000 times more often and also 36,000 times more quickly. Engineering and science are at play here. Powering up the laser to fire this quickly is a problem for both scientists and engineers: it needs to charge up with power 600 times faster than the current model.
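The 36,000x figure follows directly from comparing the two firing rates, one shot per hour today versus ten per second for a plant:

```python
# Repetition-rate gap between NIF today and a commercial plant.
shots_per_hour_now = 1          # NIF: roughly one shot per hour
shots_per_second_goal = 10      # commercial target: ten pellets per second

shots_per_hour_goal = shots_per_second_goal * 3600
speedup = shots_per_hour_goal / shots_per_hour_now
print(speedup)  # 36000.0
```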

More traditional engineering challenges are also manifold. A method is needed to collect the heat from the target chamber with reasonable efficiency. The fusion energy is currently not collected at all. Then we need a system capable of entering the chamber, removing the remnants of the blasted target, feeding in a fresh pellet, resealing the chamber, and triggering the laser. This needs to happen in 100 ms or less, while also not leaking significant heat out of the chamber.

The fuel pellet, too, will take some engineering. Roughly 100,000 pellets will have to be blasted every day at every plant. This means we'll need to produce fuel pellets by the millions, and at low enough cost not to ruin the economics of the machine. Large and stable sources of deuterium and tritium will also need to be established.

These are just the foreseen challenges. Entirely unforeseen complications will almost certainly arise. While it is foolish to say that we will never see consistent electrical production from this method, it will be an enormous struggle. The fight is utterly impossible to win within a decade. A more realistic expectation is probably 100 years for a commercially viable powerplant.

That's a shocking number. However, it says more about the difficulty of the project than about the quality of the scientists involved and their work. For fusion power to be successful, we must plan for the long, long haul.

The author would like to gratefully acknowledge discussion with former NIF director Mike Campbell for insight into NIF and ICF energy projects.

(AP photo)

Ebola, Marburg, and a Real Life Cave of Death

The expansive mouth of Kenya's Kitum Cave can appear out of nowhere. Mount Elgon National Park's jagged landscape, coupled with a dense cover of lush, green plant life, conceals the cave surprisingly well from human wanderers until they're right on top of it. Animals, however, have no problem finding Kitum. Fruit bats, insects, and especially elephants are frequent loiterers. The large pachyderms shelter within the 700-foot system, and also seek out a surprising snack. Using their hardened ivory tusks, the elephants dislodge stones from the cave wall and grind them to bits with their teeth, sucking up the stores of salt within.

Kitum Cave is, in fact, a petrified rain forest. Seven million years ago, the volcanic Mount Elgon erupted and buried the surrounding rain forest in ash. Kitum is deep within what was once a molten sea. Mineralized logs jut out from the cave walls. So do all sorts of bones, from animals like crocodiles, hippos, and elephants. It's eerily beautiful.

In 1980, a 56-year-old Frenchman living in Kenya entered Kitum cave and may have been entranced by that beauty. After a few hours of exploration, he left spellbound and in awe. But something evil left with him.

Seven days later, the headache started. Fever and nearly nonstop vomiting arrived three days after that. Then the Frenchman's face became droopy and lifeless, and the skin turned yellow. His whole personality changed. Friends helped him board a plane so he could go to a hospital in Nairobi. Onboard, he continued vomiting, now heaving up a black liquid. He filled up a couple of barf bags. His nose started bleeding... and it wouldn't stop.

When the plane landed, he stumbled off in a stupor, finding the nearest taxi and mumbling the words "Nairobi... Hospital" to the driver.

The taxi driver was kind enough to help the Frenchman into the hospital when they arrived, and made it very plain to the attending nurse that this man was in dire need of help. Assurances were made that a doctor would see him promptly.

It was already too late.

While sitting in the waiting room, the Frenchman suddenly leaned over, vomited up an immense quantity of blood, then fell unconscious. Blood began to seep from every orifice and creep along the floor. The evil entity that originated in Kitum Cave had destroyed its host. Now, it was leaving its mangled, corporeal home in search of a fresh one.

The entity that infected the Frenchman was Marburg (seen right). A filovirus like Ebola, Marburg produces symptoms identical to those of its more notorious sibling. Genetic differences distinguish the two viruses. To date, there have been 467 documented cases of Marburg, resulting in 377 deaths. That's a kill rate of 80.7%. (For reference, the mortality rate of the current Ebola epidemic is around 50%.)

The death of the Frenchman, whom science journalist Richard Preston dubbed "Charles Monet" in his 1994 book The Hot Zone, was a landmark case for the scientific study of Marburg. Not only was it the first time that Marburg had surfaced since 1975, when it infected three people in South Africa, but it also allowed investigators to try to nail down the source of the infection. Since Monet was a bit of a loner, the investigation proved difficult. Seven years later, however, Marburg reared its gruesome head again, infecting and killing a young Danish boy in Africa. When it was discovered that the boy had visited Kitum Cave with his family 11 days before his death, investigators decided they had to pay the site a visit.

In the spring of 1988, a joint U.S. and Kenyan operation led by Gene Johnson of the United States Army Medical Research Institute of Infectious Diseases began. Decked out in full biohazard gear, the team members scoured the cave, capturing bats, birds, and insects, sampling bat guano and elephant feces, and scraping slime off walls. At first, Johnson was sure he'd come to the home of one of nature's most ancient and proficient killers. But when all was said and done, he left empty-handed. If Marburg lurked in Kitum Cave, it had vanished without a trace.

"It must have been a bitter disappointment for Gene Johnson," Preston wrote in The Hot Zone, "so disheartening that he was never able to bring himself to publish an account of the expedition and its findings."

Two decades later, a cave in neighboring Uganda would divulge some of Marburg's secrets. In the wake of an outbreak among miners working at Kitaka Cave, scientists detected the virus in local Egyptian fruit bats, the same species of bat that occasionally shacks up in Kitum. The researchers estimated that 5% of the 100,000-animal colony at Kitaka was infected. It was the first time that hard data implicated bats as a major natural reservoir of the virus.

Scientists still aren't sure precisely how the Ebola strain currently terrorizing Western Africa made the leap into humans. Ebola and its sibling Marburg are as mysterious as they are insidious.

(Images: Richard Preston, Wilson Disease)

Getting Cancer at San Francisco Airport

Last week, on my way back to Seattle, I came across a peculiar sign posted next to Gate 62 at San Francisco Airport. (See photo above.)

I refer not to the maximum occupancy sign, which is itself a bit strange considering there are no well-defined boundaries for Gate 62, but the one next to it: "This area contains chemicals known to the State of California to cause cancer and birth defects or other reproductive harm."

So, I looked around. There was a sandwich shop, a few trash cans, and several tired passengers in the waiting area. No cataclysmic carcinogens there. The carpet looked clean. Was the sign referring to carpet cleaner? Or was it referring to the jet bridge, where you might get a brief whiff of jet fuel? The absurdly ominous and vague sign left the threat entirely to your imagination.

Who put the sign there? The people of the State of California. Back in 1986, they passed Proposition 65, an attempt by environmental do-gooders to create a carcinogen-free utopia. In accordance with the law, the governor must publish a list of "known" carcinogens or chemicals capable of causing birth defects. The 23-page-long list (PDF) includes such terrifying molecules as aspirin, alcohol, ganciclovir (an antiviral), metronidazole (an antibiotic), nickel, rifampin (another antibiotic), testosterone, tetracycline (a very common antibiotic), and wood dust. The fact that testosterone made the list is particularly problematic, since every human being alive produces testosterone. According to California, we're all doomed.

However, properly informed citizens know that the dose makes the poison. Carcinogens are everywhere. Many of them are natural compounds. But the vast majority of them are not at concentrations high enough to worry about.

The foolishness in California is the inevitable consequence of a chemophobic society that wields the evil twins of regulation and litigation as weapons in a fruitless effort to achieve the impossible: a life completely free of any risk whatsoever. And while it's easy to point and laugh at California, the truth is the vast majority of Americans favor another equally absurd policy: The labeling of GMOs.

But, just like Proposition 65, the outcome of any GMO labeling law is entirely predictable: Because most food at the grocery store (up to 75% of it) contains at least one genetically modified ingredient, the GMO warning label will appear everywhere. And warning labels that appear everywhere are meaningless and absurd, just like the airport cancer sign.

Yet there are two other pernicious effects of a proliferation of warning labels: First, people start ignoring them. That is not a good thing. Some products, such as cigarettes and drain cleaners, really are toxic and dangerous. Will people ignore warning labels on all products? Second, warning labels encourage manufacturers to seek alternatives to alleged "carcinogens." Unfortunately, the alternatives are often less studied and potentially more carcinogenic.

The irony, of course, is that in their struggle to make people safer, the world's chemophobes may be achieving the exact opposite.

(Photo: Alex Berezow)

New Fusion Reactor Cheaper than Coal? BS.

Inventor: "I've just created my most perfect work: a new type of paper airplane."

Funding agency: "Wow great but what does it do?"

Inventor: "Oh, right now it's useless, but soon I'll just scale up the concept and we'll have a cheaper space shuttle!"

Shame on the University of Washington for hyping its research with this exact logic. Touting the "great potential" of a new cheaper-than-coal fusion plant, they see reality and choose to look the other way. Or maybe they're just incredibly naive.

To quote the press release: "Perhaps the biggest roadblock to adopting fusion energy is that the economics haven't been penciled out". BS! Let's talk about the real obstacle to fusion power. The report itself actually leads us to the culprit.

Among many statements ranging from meaningless to wrong, this is the one that dodges the heart of the matter: "They [researchers] have designed a concept for a fusion reactor that, when scaled up to the size of a large electrical power plant, would rival costs for a new coal-fired plant with similar electrical output." (Emphasis added.)

What is keeping fusion energy from reaching market? The fact we don't understand how to do it well yet. It's impossible with current science. Why? Precisely because we don't know how to scale it up. We can make little demonstration fusion reactors like the one in this report, but expanding them to become large enough to produce useful power eludes our grasp. The entire problem of current fusion devices is scaling them up.

The basic idea in this proposal is a magnetically confined plasma device that uses a geometrical plasma configuration called a spheromak. It's a magnetic field bottle built to trap plasma inside. It's a sibling of the tokamak, the best known and best working of the current fusion devices. The idea is quite old, dating to the 1950s. Several of these machines were built in the 1970s and 1980s, notably by the lead investigator of this work.

The press report claims that the spheromak device is simpler than the tokamak. This is only true to a point. There are fewer external magnets in such a device. Enormous electromagnets require very high electrical current, so the spheromak needs less power. However, part of the magnetic field confinement of the plasma is performed by the magnetic field produced within the plasma itself. (Traveling charged ions produce magnetic fields calculable with the laws of electrodynamics.) While this idea sounds simpler, it's actually more difficult in many ways: you have fewer magnets for external control, and the internal plasma configuration is actually much more complicated.

This very difficulty is why the world's biggest and best fusion projects are tokamaks and not spheromaks. The extra control magnets and simpler plasma configuration inside the machines have allowed them to be scaled up to larger sizes much more easily.

So not only is scaling up the entire problem with magnetic fusion; this device is probably much more difficult to scale up than even current tokamaks, which have not yet been economically scaled up and may not be for several decades.

RCS enthusiastically supports fusion research and increased funding for fusion projects. However, we do not tolerate misleading information being reported.

(AP photo)

Tropical Cave Art Alters Origins of Creativity

The tropical karst landscape of Southern Sulawesi in Indonesia is dotted with sinkholes and caves, forming, in many places, a vast, underground world. You can thank soluble rocks like limestone, dolomite, and gypsum for that. The caves afforded our ancient ancestors cover from the torrential rains that define the island chain's climate. The walls inside also granted them easels to express their creativity.

Archaeologists have known about the wondrous rock art in the Maros–Pangkep caves of Southern Sulawesi for over half a century. What they didn't know was how old the artworks were. A team primarily based out of the University of Wollongong in Australia has just found out, and the answer has altered the timeline and geography of human creativity.

Dr. Anthony Dosseto and his colleagues dated the art to roughly 40,000 years ago. Among the dozen drawings analyzed are a hand stencil that -- at a minimum age of 39,900 years old -- is now the oldest known in the world, and a drawing of a half-deer, half-pig-looking animal called a babirusa, which may be the earliest figurative depiction in the world.

Dating cave art is easier said than done. Often, due to weathering, the pigments in the paintings themselves don't contain enough carbon for typical dating methods. Contamination can also lead to inaccuracies. So instead, scientists may rely on clues in the immediate vicinity. For example, they might date nearby artifacts or remains and apply those ages to the artwork. Occasionally, scientists will bring out their magnifying glasses and attempt to sleuth out a painting's age by examining its depictions. A drawing of a mammoth, for example, would lend a rough estimation because we already know when mammoths existed.

To gauge the paintings' ages in this case, the researchers dated mineral deposits called speleothems that had grown on top of the drawings. When formed, the crystal-looking objects contain small amounts of uranium, which slowly decays to thorium. Since the scientists already knew the rate of decay, they were able to extrapolate and determine the rough age of the paintings.
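The logic of uranium-thorium dating can be sketched with a deliberately simplified model: assume the speleothem formed with no initial thorium and with its uranium isotopes in equilibrium, so the thorium-230/uranium-234 activity ratio grows as 1 - exp(-λt). The real study corrects for complications this sketch ignores; the half-life value is the textbook figure for thorium-230.

```python
import math

# Simplified U-Th clock: measured [230Th/234U] activity ratio = 1 - exp(-lam*t),
# assuming zero initial thorium (an idealization of the actual method).
TH230_HALF_LIFE_YR = 75_584                      # half-life of thorium-230
DECAY_CONST = math.log(2) / TH230_HALF_LIFE_YR   # lam, per year

def age_years(th_u_activity_ratio):
    """Years since the mineral layer formed, from the measured ratio."""
    return -math.log(1 - th_u_activity_ratio) / DECAY_CONST

# Under these assumptions, a ratio of ~0.31 implies roughly 40,000 years:
print(round(age_years(0.31)))
```

The dated layers sit on top of the pigment, so ages obtained this way are minimum ages for the paintings beneath.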

For a long time, human creativity was thought to have been born in Europe in a sort of prehistoric Renaissance. Only from there did it truly flourish. A great many cave paintings dating to more than 30,000 years ago have been discovered in France and Spain, including the amazingly preserved and transcendent artwork in Chauvet Cave and the oldest known artwork in the world, a simple red "disk," found in El Castillo Cave. (The latter may actually have been painted by Neanderthals!) The current finding may require anthropologists to rethink that Eurocentric narrative.

"I think this suggests that modern humans had this creativity, this artistic expression, with them when they spread out of Africa," Chris Stringer, a paleontologist at the Natural History Museum in the United Kingdom told Nature.

“Europeans can’t exclusively claim to be the first to develop an abstract mind anymore. They need to share this, at least, with the early inhabitants of Indonesia,” Dr. Dosseto said.

The discovery was published October 8th in the journal Nature.

(Images: Kinez Riza)

Source: Dosseto et al. Pleistocene cave art from Sulawesi, Indonesia. Nature 514, 223–227 (09 October 2014) doi:10.1038/nature13422

Will Laser Fusion Power Work? Part I

Crushing tiny capsules of matter into oblivion...

Turning matter into energy via nuclear fusion with massive lasers is a dramatic and beautiful way to create energy. Problem is, it's not working very well. In fact, it's kind of a mess. What happened, and why is it failing? Will laser inertial confinement fusion ever become a viable energy source?

The Past and Present is NIF

Recently, I called the world's biggest and best laser fusion experiment a failure. There is no doubt that the National Ignition Facility (NIF) failed to meet its stated goal of reaching fusion "ignition." Misleading press releases attempted to breathe life into this stone-cold failure: their best shot produced 100 times less energy than was fired in. Worse, the real efficiency as measured from the electrical input was just one unit of power out for every 28,000 in. It's not pretty, but there is more depth to the story. Examining the history and current status of NIF as the premier laser fusion project in the world tells us where things stand.

Simulating the heart of the atom bomb

"Laser fusion" is more properly called inertial confinement fusion (ICF). The 1970s and 80s saw a series of big laser programs run at US national labs. These facilities were essentially miniature, primitive NIFs. They tested the preliminary feasibility of ICF: using a massive laser pulse to crush a pellet made of hydrogen atoms with extra neutrons until the atoms fuse. Most of the fused mass forms a helium atom, but some is directly converted from matter to energy. The NOVA laser at Lawrence Livermore National Laboratory and other projects hinted at the requirements to perform ICF on an energy efficient scale. It was clear, despite early over optimism, that vastly more powerful and advanced efforts would be needed to come vaguely close to a viable power source.

Still-secret tests, carried out under the Nevada desert in the 1980s, crushed deuterium (H + neutron) pellets with nuclear explosions to explore how much energy was needed to produce fusion. Whatever the results were, they were so encouraging that laser fusion began to look much more promising. Physicists decided that a laser 25-250 times more energetic than NOVA could achieve success in a lab, without the A-bombs. However, the billion-dollar price tag and as-yet undeveloped technology required to build such a project made it a very hard sell.

Things changed completely in 1992, when the U.S. stopped all live tests of nuclear weapons. Suddenly, the country faced the prospect of designing, building and maintaining weapons that had never been tested, and running weapons programs with personnel who had never conducted any experimental nuclear test work. That's clearly a scary proposition, even with the goal of reducing nuclear proliferation. The entire concept of non-proliferation rests completely upon the expected ability of the weapons to work. Weapons that don't deliver upon their threat upset the balance of power.

It was clear that facilities to test and simulate conditions found in nuclear fission and fusion reactions were necessary if we were going to design and maintain weapons and train future weapons program technicians. A massive laser system like the one proposed for NIF was seen, correctly, as the only facility capable of producing the heat and pressure conditions to simulate a nuclear or thermonuclear blast. Abruptly, NIF became much more feasible.

Something for Everyone

The real trick with getting NIF started was that it was pitched from several angles simultaneously. To the Department of Defense and more conservative politicians, the project was sold as a testing bed for weapons technology and designer training. To the DOE, liberals, the press and the public, the fusion power aspect was highlighted.

Winning funding to build the facility was a brilliant piece of negotiation and bipartisanship: the Department of Defense and the weapons program supported a facility that would pursue peaceful civilian energy generation, and the anti-weapons lobby and the press supported a facility that would at its core be focused on nuclear weapons. NIF weathered changing political and public sentiments and construction setbacks because it won such broad political, scientific, defense and press support.

Unfortunately, several aspects of the fusion program were botched from the beginning. The project leadership lacked experimental experience. The designers placed too much confidence in numerical models extrapolating from previous experiments into unknown parameter space. These turned out to be wildly optimistic in projecting the capabilities of NIF. Management failed to listen to criticism.

The marketing of the facility as a fusion testbed, combined with the mistakes made in pursuing that goal, laid the groundwork for embarrassment. The majority of public, press and scientific support (and even the name "Ignition Facility") was based on the possibility of producing a fusion breakthrough. To the world at large, it is a fusion experiment.

So, when the fusion results came back late and wildly underwhelming it suddenly looked as though the entire project was a bust. But this isn't entirely true.

Is NIF a success despite never approaching fusion ignition?

How is NIF doing as a platform for weapons testing and research? That's classified! Due to the sensitive nature of nuclear weapons data and testing, it's very unclear. It is easy to believe that data gathered at extremely high temperatures and pressures not available anywhere else on the planet is invaluable to nuclear weapon design. Designers must extrapolate out from available data to the unknown conditions they anticipate; having data that is closer to real conditions can only help tremendously.

I have heard from sources familiar with the field that there have been some major weapons results from NIF. Unfortunately, we members of the public aren't privy to them, so we have to take the vague word of those involved.

Another important consequence of NIF: we now know far more about how to build a successful laser fusion facility in the future. In the same way that tests shed light on nuclear explosions, they also illuminate the correct conditions for a laser fusion facility. For instance, it's now clear that a still larger laser is needed to make any realistic simulacrum of a power plant. We are also learning how to design the fuel, the crushing system and the laser pulses experimentally instead of relying upon speculative simulation.

This brings us back to the big question: can we make laser fusion work? NIF is revealing clues about what it will take to reach practical inertial confinement fusion. In part II, I'll explain this in detail.

-

The author gratefully acknowledges discussion with former NIF director Mike Campbell for insight into the NIF facility and ICF research.

The Graying of Obama's Hair: A Scientific Analysis

On April 28th, 2012, President Barack Obama donned a tuxedo for the annual White House Correspondent's Dinner, and standing in front of a ballroom brimming with journalists, celebrities, and politicians, he made a bold prediction.

"Four years from now, I will look like this," he said, as a photo of the suave, white-haired Morgan Freeman appeared on the screen behind him.

President Obama wasn't, of course, insinuating that he would somehow morph into the Academy Award winning actor, but that his hair, which began dark and vivacious at the beginning of his tenure in the Oval Office, would be gray and listless by the time he left.

A recent study, however, finds otherwise.

Researchers perused hundreds of photos of President Obama taken indoors with comparable lighting, then used Photoshop to focus in on his hair. What they were left with was a collection of 68 photos of Obama's hair -- one for every month of his presidency. They then measured the median gray value of each.

Though the two researchers found a clear graying trend over the course of Obama's presidency -- on average, his hair color grew roughly 0.452% closer to matching Morgan Freeman's each month -- they noted that his hair almost certainly will not achieve the whiteness of Morgan Freeman's by May 2016, four years after he made his prediction.

"If we extrapolate this trendline to the 89th month of his presidency, we estimate that his hair will only be about 61.7% similar to Morgan Freeman’s."

However, the researchers added, there is a coin flip's chance that on at least one day, President Obama's hair color may seem to match Morgan Freeman's. Hair color can vary depending on the time of the year. Sunlight destroys melanin, the pigment in hair follicles, bleaching hair a blondish-white. If Obama spends a lot of time in Hawaii or playing golf during his two lame duck years in office, those odds may go up. Our perception of hair color can also change depending upon how short the hair in question is and the ambient lighting.

One thing that probably won't affect Obama's hair color is anxiety. Contrary to popular belief, it is not proven that stress rapidly grays hair; genes most strongly influence when and how fast that happens. Cells in the skin called melanocytes produce the pigment melanin that colors hair, and there's no direct evidence that stress limits their productivity or reduces their lifespans.

The current study was published in the brand spanking new Proceedings of the Natural Institute of Science (PNIS), an open-access journal that publishes satirical, Onion-style articles and genuine research on humorous topics. Other topics tackled in the journal so far include an assessment of how often men think of chicken wings and an analysis of how costly it would be to raise a child like Calvin from the comic strip Calvin and Hobbes.

PNIS editor Matt Michael started the journal because he saw a lack of quality in the cacophony of scientific publishing.

"In 2014, it is estimated that there are now 29,147 active scientific journals, publishing roughly 700,000 papers every year, contributing to the estimated 55 million papers to ever have been published. Yet all of them suck," he joked in the journal's introductory editorial, before predicting that PNIS will crush leading journals Nature and Science in two years.

Michael's actual goals with PNIS are quite worthwhile:

We feel (and we are not alone in this) that science has a problem with effective communication. We believe that part of this problem stems from the view that science is an exclusive club that only a few can participate (and, thus, lacks transparency) and that current scientific studies are beyond the understanding of most people.

To change this view, PNIS uses satire, humor and its open publication policy to demonstrate that: 1) people use scientific concepts (most especially the scientific method) every day, frequently without even realizing it, 2) scientific discoveries are not limited to scientists (much like how playing sports is not limited to professional athletes), 3) most scientific concepts are easy to understand and 4) scientists are all too willing to laugh at themselves.

Source: K. Hernandez and C. Drexler (2014) "Yes We Canities! A quantitative analysis of the graying of Barack Obama's hair." Proceedings of the Natural Institute of Science | Volume 1 | HARD 3

(Images: AP, PNIS)

Women's Farts Smell Worse, and Five More Facts You Need to Know About Flatulence

Flatulence is a fact of life. Americans collectively break wind to the smelly tune of up to 6.3 billion times each day. That's a lot of hot air. For such a ubiquitous activity, it's amazing how taboo it is. Face palms and pinched noses mark the passing of gas in most social settings. Science, however, has no ingrained distaste for flatulence. Here are six facts we've learned about farting.
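That 6.3 billion figure squares with a quick back-of-envelope check, assuming a U.S. population of roughly 315 million and the commonly cited upper bound of about 20 passages per person per day (both round-number assumptions, not figures from the article):

```python
population = 315_000_000    # rough U.S. population (assumption)
per_person_per_day = 20     # commonly cited daily upper bound (assumption)
print(population * per_person_per_day)  # -> 6300000000
```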

1. There are three main fart smells. Hydrogen sulfide produces the signature "rotten eggs" note, methanethiol produces hints of "decomposing vegetables," and dimethyl sulfide adds a hint of "sweetness."

2. The average fart is roughly 100 milliliters in volume and lasts approximately two seconds. More interesting than the statistic itself is how it was calculated. Basically, it involved subjects farting into specially designed, airtight, gas-collecting underwear.

3. There's a way to make your farts (mostly) odorless. Marketed as the only "internal deodorant," the over-the-counter drug Devrom, with its active ingredient bismuth subgallate, eliminates almost 100% of the odor caused by sulfur gases, the primary contributors to smelly farts. Bismuth is an interesting metal -- it's extremely dense yet surprisingly nontoxic. The only known side effect of taking bismuth subgallate is a harmless darkening of stools or the tongue, which the user's friends and family would undoubtedly describe as "well worth it."

4. Women's farts smell worse. In studies conducted by eminent flatulence researcher Michael Levitt, women's farts consistently sported significantly greater concentrations of hydrogen sulfide. Odor judges have confirmed that -- at similar volumes -- this translates to a noticeably worse odor compared to men's farts.

5. Red meat kicks up a stink. Sulfur compounds contribute the most to flatus malodor, but compounds called thiols also royally reek. Methanethiol is one of the worst. Found naturally in blood, and thus in red meat, it can be released during digestion and eventually off-gassed via the anus.

6. Holding in your farts won't kill you, but it won't be comfortable either. As Tara from D-News explained, "When we hold farts in, the gas retreats back into our body and gets absorbed into the intestinal walls where it eventually mixes in with our blood. At best, that can cause bloating, abdominal pain, and constipation but if you do it repeatedly it can lead to a distended bowel."

(Image: Shutterstock)

Primary source: Gulp: Adventures on the Alimentary Canal, Mary Roach, 2013

The Rudest Space Cloud in the Known Universe

In 1999, astronomers controlling the Hubble Space Telescope zoomed in on a section of the Carina Nebula, approximately 8,000 light years away, and snapped the picture shown above. The image depicts a dense cloud of gas roughly two light years in length that has broken off the greater nebula. The cloud is "striking," NASA noted in 2003, "because its clear definition stimulates the human imagination."

"It could be perceived as a superhero flying through a cloud, arm up, with a saved person in tow below."

That's not what I see...

While NASA deserves respect for keeping things G-rated, that cloud can only be accurately described as the grandest middle-finger salute in the known universe; a giant, cosmic "%@#$ you."

A perceptive redditor spotted the vulgar resemblance two weeks ago.

The rude stellar object is a molecular cloud of gas and dust, dense enough and large enough to permit molecules to form. In the image above, the bright red specks shining to the left are young stars that formed inside it.

What I will term the "%@#$ you" cloud is part of the vastly larger Carina Nebula (seen above), which spans over 300 light years. The nebula was discovered by French astronomer Nicolas Louis de Lacaille in 1751. Larger and brighter than the more famous Orion Nebula, the Carina Nebula is home to Eta Carinae, the best-studied luminous hypergiant star. Eta Carinae is 100-150 times as massive as our sun and four million times brighter! Because the star is so huge, it flirts with the Eddington limit, the point where a star's radiation is powerful enough to overcome the gravity that keeps it together. In other words, the star is precariously walking the line of exploding as a supernova or even a hypernova. Astronomers actually expect this to occur within the next million years.
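Those mass and brightness figures are consistent with the textbook Eddington luminosity, roughly 3.2 × 10^4 solar luminosities per solar mass for electron-scattering opacity. A quick sanity check, where the coefficient is the standard textbook value and is not taken from the article:

```python
# Eddington luminosity sketch: L_Edd ~ 3.2e4 * (M / M_sun) solar
# luminosities, using the standard electron-scattering coefficient
# for ionized hydrogen.
for mass in (100, 150):             # Eta Carinae's quoted mass range
    l_edd = 3.2e4 * mass            # in units of the sun's luminosity
    print(f"{mass} solar masses -> L_Edd ~ {l_edd:.1e} L_sun")
```

At 100-150 solar masses this gives roughly 3.2 to 4.8 million solar luminosities, bracketing the "four million times brighter" figure above.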

Should that happen, we here on Earth will, in all likelihood, be just fine. The "%@#$ you" cloud, however, may suffer a brute form of stellar censorship.

(Images: Hubble Heritage Team (STScI/AURA), N. Walborn (STScI) & R. Barbá (La Plata Obs.), NASA & ESO)