How to Kill a Scientist

Arguably, it is more difficult to be a scientist today than ever before. Faculty positions are few and far between. A mere 20% of federal grant proposals are actually funded, and the biomedical professors lucky enough to score an R01, the granddaddy of NIH grants, are usually not awarded their first until the ripe old age of 42. In short, there are too many PhDs and not enough jobs and money to support them.

As if that weren't bad enough, now scientists must also avoid being killed -- metaphorically speaking. The power of the internet has fanned the flames of ideology, allowing activists to unite in an unprecedented assault upon our nation's scientists in the form of harassment, intimidation, and character assassination.

The strategy involves the Freedom of Information Act, a law meant to facilitate government transparency, which can easily be abused by troublemakers intent on digging through a scientist's emails in the hope of finding an embarrassing comment or a statement that can be taken out of context. The tactic was famously deployed by climate change skeptics in an attempt to discredit Michael Mann, and it has been perfected by anti-GMO activists bent on destroying an entire generation of biotechnology scientists. According to an editorial in the latest issue of Nature Biotechnology, 40 scientists have been targeted by "the activist organization US Right to Know (USRTK), bankrolled largely by a $47,500 donation from the Organic Consumers Association." (The Genetic Literacy Project claims the figure is at least $114,500.)

Kevin Folta, a plant scientist at the University of Florida, has become Public Enemy #1 for the crazed and implacable anti-GMO movement. An outspoken defender of GMOs, he has become the face of the biotech resistance. For this bravery, he has paid a dear price: His reputation has been dragged through the mud, despite the fact that he has never done anything unethical, let alone illegal. The aforementioned editorial summarized the situation eloquently:

The tragedy is the harassment that [Folta] and his family have experienced in recent weeks will cause many potential researcher/communicators to duck back under the parapet.

This is how demagogues and anti-science zealots succeed: they extract a high cost for free speech; they coerce the informed into silence; they create hostile environments that threaten vibrant rare species with extinction.

So, what can be done about it? The editorial suggests more funding for science communication and that the major journal publishers engage in more public outreach. These are a good start, but they do not go far enough. The solution will require at least three major changes.

First, FOIA must be amended. Unless there is evidence to suggest that a scientist has engaged in wrongdoing, there is no reason to allow a group of activists to blatantly violate his or her privacy by plowing through archives of email. (Alison Van Eenennaam, one of the targeted scientists, wrote on Science 2.0 that "there is something deeply intrusive about a third party requesting years' worth of email correspondence.")

Second, politicians from both sides of the aisle should publicly defend the integrity of the scientific community. Nothing can change the tone of public dialogue -- and bring much-needed attention to a pressing issue -- like a well-placed comment from the President of the United States.

Third, media outlets like Mother Jones, a fetid cesspool of anti-science propaganda, ought to be publicly shamed by science journalists and the scientific community for giving a prominent voice to pseudoscientific malcontents and their anti-corporate conspiracy theories.

As a society, we must choose to protect our scientists from harassment. They deserve a peaceful atmosphere in which they can safely do their important work. If we do not, we risk not only further damaging an already fragile and demoralized scientific community but also America's standing as the world's leading innovator.

Source: "Standing up for science." Nat Biotech 33: 1009. Published online: 08-Oct-2015. doi: 10.1038/nbt.3384

(AP photo)

Disproved Discoveries That Won Nobel Prizes

Over a million papers are published in scientific journals each year, and as Stanford University professor John Ioannidis wrote in a now legendary paper published in PLoS Medicine in 2005, most of their findings are false. Whether due to researcher error, insufficient data, poor methods, or the numerous biases present in people and pervasive in the ways research is conducted, a lot of scientific claims end up being incorrect.

So it should come as little surprise that Nobel Prize-winning discoveries are not immune to being wrong. Though marbleized in prestige, a number of them have been either disproved or lionized under mistaken pretenses.

PERHAPS THE MOST clear-cut example hearkens all the way back to 1926, when Johannes Fibiger won the Nobel Prize in Medicine "for his discovery of the Spiroptera carcinoma." In layman's terms, he found a tiny parasitic worm that causes cancer. Subsequent research conducted in the decades following his receipt of the award would show that though the worm definitely existed, its cancer-causing abilities were entirely nonexistent. So where did Fibiger go wrong?

Though widely respected and considered to be a careful and cautious researcher, Fibiger fell victim to improper controls and inadequate technology. To elucidate his hypothesized connection between parasites and gastric cancer in rodents, he fed mice and rats cockroaches infested with parasitic worms and observed what he thought were tumors growing inside the rodents' stomachs. Later studies would show that they were not tumors but lesions likely caused by vitamin A deficiency, which resulted from a poor diet.

It's hard to fault Fibiger or the Nobel Committee too much for this blunder. At the time, cancer was much, much more of a mystery than it is today, and Fibiger worked tirelessly to solve it, exploring all sorts of hypotheses, not just those involving parasites.

Analyzing Fibiger's story in a 1992 issue of the Annals of Internal Medicine, Tamar Lasky and Paul D. Stolley were kind in their remembrance:

"We now know that gastric cancer is not caused by Spiroptera carcinoma, and the purported "discovery" of such a relation hardly seems worth a historical footnote, never mind a Nobel Prize. At the same time, it is quite touching to read the speech given by the Nobel Committee on presenting Fibiger with his award. They considered his work to be a beacon of light in the effort of science to seek the truth. Perhaps his work did serve to inspire other scientists to conduct more research and to persist along the path of human knowledge...

...Fibiger's story is worth recounting not only because it teaches us about pitfalls in scientific research and reasoning, but also because it may provide perverse solace for those of us who will never receive the Nobel Prize (but, of course, deserve it)."

THOUGH MOST STUDENTS of science wouldn't recall Johannes Fibiger, they would be well acquainted with Enrico Fermi. Credited with the creation of the first nuclear reactor in Chicago, Fermi etched his name into the history books of quantum theory, nuclear and particle physics, and statistical mechanics. He also won a Nobel Prize sort of by mistake.

Fermi won the 1938 Nobel Prize in Physics "for his demonstrations of the existence of new radioactive elements produced by neutron irradiation, and for his related discovery of nuclear reactions brought about by slow neutrons." The catch, of course, was that he did not demonstrate the existence of new elements. When Fermi bombarded uranium atoms with slow-moving neutrons and observed a process called beta decay, he thought he had, and he even named the new elements he supposedly saw ausonium and hesperium. What he had actually, unknowingly, accomplished was nuclear fission! The uranium atoms had split into lighter elements!

Considering that this was a big discovery, one that would eventually earn German scientist Otto Hahn the 1944 Nobel Prize in Chemistry, one can't really begrudge Fermi his Nobel Prize. Moreover, once he realized his mistake, he admitted it. Radioactive elements beyond uranium were actually created in 1940, beginning with element 93, neptunium.

THE FIRST TIME a Nobel Prize was awarded jointly was in 1906. The prize in Physiology or Medicine went to both Camillo Golgi (pictured) and Santiago Ramón y Cajal "in recognition of their work on the structure of the nervous system". The decision was controversial, as the two men were bitter adversaries, both endorsing competing views about the structure of the nervous system. Golgi thought that the nervous system was a single continuous network, while Cajal proposed that the nervous system was composed of individually-acting, linked nerve cells, or neurons as he called them. Though many members of the Nobel Committee considered Cajal's work to be superior to Golgi's and likely the correct interpretation of how the nervous system functions, they ended up electing the diplomatic path and awarding the prize to both men.

Golgi, however, was decidedly undiplomatic during his Nobel lecture. In it, he directly attacked Cajal's theories, taking sophisticated swipes at his colleague throughout the presentation. Time would prove Cajal correct, however, and Golgi wrong.

THERE IS NOTHING wrong with being wrong, of course. The history of science is filled with incorrect ideas, far more than correct ones, in fact. The pursuit of knowledge follows a darkened path riddled with dead ends. That's why it's okay that science's top prize has occasionally been awarded for false claims.

(Images: AP, Wikimedia Commons)

Four Facts About Gun Violence That Will Alarm You, Surprise You, and Make You Think

The tragically large amount of gun violence in the United States grants researchers a plethora of data to dig into. As a small community in Oregon recovers from yet another mass shooting, already the 294th in the U.S. this year, let's review a few of scientists' findings about mass shootings and gun violence.

1. Mass shootings may be "contagious." Earlier this year, researchers at Arizona State University examined whether mass killings involving firearms beget more mass shootings. They hypothesized that media coverage of these tragic events might trigger at-risk individuals to carry out killings of their own. Applying a contagion model to numerous data sets charting such events, they found that each mass killing involving a firearm may incite at least 0.3 new incidents. Put another way, roughly every three mass shootings lead to another being committed.
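The 0.3 figure lends itself to a simple geometric-series calculation. This is a hedged back-of-the-envelope sketch of that arithmetic, not the study's actual contagion model; the reproduction number R = 0.3 is the only input taken from the text.

```python
# Toy branching-process arithmetic for the contagion estimate.
# Assumption: each mass shooting "excites" an average of R = 0.3
# follow-on incidents (the figure quoted in the study).

R = 0.3  # expected number of new incidents triggered per event

# One "spontaneous" event ultimately accounts for
# 1 + R + R^2 + ... = 1 / (1 - R) events in expectation.
total_per_seed = 1 / (1 - R)

print(f"Each spontaneous event grows to ~{total_per_seed:.2f} events in total")
print(f"Roughly every {1 / R:.1f} shootings trigger one more")
```

The second line recovers the article's phrasing: at R = 0.3, it takes about three shootings (1/0.3 ≈ 3.3) to incite one additional incident.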

2. The U.S. firmly leads the developed world in firearm deaths. In a 2011 study examining WHO data on firearm deaths in 23 populous, high-income countries, researchers found that 80% of all firearm deaths occurred in the United States. Women and children were particularly affected. "86% of women killed by firearms were US women, and 87% of all children aged 0 to 14 killed by firearms were US children," the researchers reported. The data was from 2003, however, so it requires updating.

3. Survivors of mass shootings are prone to post-traumatic stress disorder (PTSD). Around lunchtime on October 16, 1991, George Hennard crashed his pickup truck through the front of a Luby's Cafeteria in Killeen, Texas and opened fire on the patrons inside. Twenty-three died in the attack, the third-deadliest shooting in U.S. history.

One month after the horrific shooting, psychologists interviewed 136 survivors of the massacre. Twenty percent of the men and thirty-six percent of the women met the criteria for PTSD, though most had no history whatsoever of psychiatric illness. Similar rates of PTSD were found among survivors of the Virginia Tech shooting.

PTSD, an anxiety disorder characterized by nightmarish flashbacks, hyperarousal, and constant stress, is typically seen in combat veterans subject to repeated traumatizing events during warfare, but it also can be triggered by single events, such as mass shootings.

4. Strict gun control laws worked wonders in Australia. In the wake of one of the deadliest shootings ever carried out by a single person, the Port Arthur massacre, Australia's government passed bipartisan gun control legislation that involved buying back 600,000 semi-automatic shotguns and rifles, prohibiting private sales, and requiring that all weapons be registered, among other provisions. The apparent effects of the reforms, as revealed by two large observational studies conducted over a decade later, were astounding. The firearm homicide rate fell by 59 percent and the firearm suicide rate fell by 65 percent. Moreover, while there were 13 mass shootings in the 18 years before the legislation was passed, there have been only two since.

(Image: AP)

Animal Rights Activists Behave Like Stalkers

Every person and organization needs a reality check. It is healthy and vital to seriously consider the opinions of those with whom we disagree. For individuals, this prevents arrogance and promotes humility; for organizations, it prevents corruption and groupthink and promotes transparency. Unsurprisingly, when a viable and intellectually robust "loyal opposition" is absent, bad things can happen.

Take my city, Seattle, for example. Here, politics comes in two flavors: The left and the far left. There are few, if any, moderates or conservatives in positions of authority. Any person with even a hint of moderation or conservatism in their background is labeled a Republican (which, in this city, takes on the connotation of an ethnic slur) and purged from public office. There is effectively no loyal opposition in Seattle to keep the left in check.

Within this toxic milieu, and as a direct result of this ideological imbalance, no left-wing politician or political cause is considered too extreme to be socially acceptable. That explains why, on the first day of May each year, left-wing anarchists feel free to attack police officers and destroy property. It explains how nearly 51% of the electorate put a self-described socialist (actually a Trotskyite communist), Kshama Sawant, on the city council. And it explains why the cultish Lyndon LaRouche movement has a contingent of followers.

Sadly, such left-wing extremism is not confined to the political arena; it has contaminated science, as well. Though Seattle is one of the world's leading cities for biomedical and biotechnological research, a substantial proportion of the population is anti-GMO, anti-vaccine, and pro-alternative medicine. Organic, gluten-free food and herbal supplements are the norm here, even though this pricey lifestyle runs contrary to modern science. Dedication to environmentalism trumps human safety, even during "Snowpocalypse" 2008, when then-Mayor Greg Nickels refused to salt the roads in order to protect Puget Sound (a salt body of water).

Such extreme thinking has reared its ugly head yet again. On Friday of last week, an animal rights group called "No New Animal Lab" protested near my alma mater, the University of Washington. When I was there working on my PhD in microbiology, occasionally we would receive notices from the administration warning that activists were in the neighborhood. We were told to avoid them.

That was good advice. Animal rights activists and their environmentalist allies have a bad track record in Seattle. In 2001, the Earth Liberation Front, an ecoterrorist group, firebombed a facility at the University of Washington. The same group was also responsible for burning people's homes.

Thankfully, No New Animal Lab has not resorted to those tactics. At least not yet. Instead, they have chosen to behave like creepy stalker ex-boyfriends, screaming over the phone at the executives of Skanska (the firm contracted to build a new animal lab at the University of Washington) and protesting outside their homes. Similar tactics with the intention to intimidate and harass were used against University of Washington faculty. According to the Seattle Times, Skanska successfully obtained a protection order against the group, preventing some of its members from protesting outside their executives' homes.

The court's decision was wise. Other similar incidents in California turned violent. It may be only a matter of time before like-minded groups in Seattle once again resort to violence.

The irony of the animal rights movement, of course, is that every single person on the planet has benefited from animal research. If you have ever been to a hospital, clinic, or Walgreens, then you have directly reaped the benefits of animal research. If you have ever taken medication, you have benefited from animal research. Scientists do their best to minimize animal suffering and, when possible, to replace animal studies with in vitro or in silico research.

Frustratingly, in Seattle -- a city that prides itself on being one of the most educated in America -- such nuance and logic falls on deaf ears. This is not simply a result of its left-wing ideology (because right-wing ideologies produce their own noxious messes), but more a result of the siloed thinking inherent in ideologically pure organizations. The penalty of political polarization is that each side is controlled by its more extreme elements.

One wonders if things might be different if more places had a loyal opposition willing to bring the fringes back from the edge.

(AP photo)

You Probably Don't Need to Treat a Fever

Sometimes, the human body runs hotter than normal. Whether due to infection or some other condition, the hypothalamus increases the body's temperature above the normal range, which is between 97.6 °F and 99.5 °F. Blood vessels near the skin constrict to retain heat, and shivering may begin as the muscles generate heat. The resulting state is called pyrexia, but it's more commonly known as a fever.

Here is about as far as the condition usually goes. A hand on the forehead reveals the situation, while a thermometer under the tongue confirms it. What comes next, recommended by Dr. Moms everywhere, is a dose of ibuprofen or acetaminophen. These drugs block fever-inducing signals from the hypothalamus, causing the body's temperature to return to normal.

But is this the correct course of action? While treating a fever seems like common sense, science presents a more nuanced view.

Fever isn't some bodily malfunction; it's a purposeful response that evolved as many as 400 million years ago. According to a report published in the journal Pediatrics:

"Fever retards the growth and reproduction of bacteria and viruses, enhances neutrophil production and T-lymphocyte proliferation, and aids in the body’s acute-phase reaction... Most fevers are of short duration, are benign, and may actually protect the host. Data show beneficial effects on certain components of the immune system in fever, and limited data have revealed that fever actually helps the body recover more quickly from viral infections..."

Despite the potential benefits of fever, many people suffer from what researchers have termed "fever phobia." Though there is no evidence that fever worsens an illness or causes long-term neurologic complications, many believe that it does. As Clay Jones reported at Science-Based Medicine:

"A 2001 paper revisiting fever phobia published in Pediatrics revealed that 91% of caregivers thought fever could cause harmful side effects including seizure (32%), brain damage (21%) and death (14%). Coma and blindness also made the list. With the exception of febrile seizures, a common and benign entity seen only in young children, fever just doesn’t do these things."

Much of the misunderstanding arises from confusing another condition, hyperthermia, with fever. They are not the same. Hyperthermia is potentially life-threatening, uncontrolled bodily overheating on account of an outside source. Fever is a controlled, internal process, which rarely, if ever, raises the body's temperature to a point where actual harm can be done.

A surprisingly small number of scientific studies have actually examined the effects of treating versus not treating a fever in humans. According to a recent systematic review of randomized, controlled trials, there doesn't seem to be any difference in outcomes between treating and not treating a fever in critically ill patients. On the other hand, treating a fever may slightly increase the severity of the common cold and pneumonia.

So if fever is potentially beneficial and rarely harmful, does that mean one should never treat a fever? Not at all. Fever can be very uncomfortable and can lead to lost sleep and dehydration, and when you're sick, both sleep and hydration are vital to getting better.

You should also call your doctor for a fever over 102 °F in children and 104 °F in adults.

In the end, treating fever should not be about reducing a number, but rather about alleviating symptoms. If you or a child is feeling miserable at 102 °F, go ahead and take that fever-reducer. If you decide to let the fever run its course, that's okay, too.

(Image: AP)

Why You Might Just Want a Hookworm Infection

Hookworm infection has been essentially eradicated in the United States... and that might be a problem.

But wait, you might be thinking, aren't hookworms parasitic creatures that infest our guts, suck our blood, infect up to 740 million people in poor countries, cause a host of disconcerting symptoms, and have surprisingly sharp and frightening teeth?


So why, in the name of Science, could it possibly be problematic that these blighters no longer reside inside most inhabitants of the developed world?

Good question!

Looking back over the past century, many scientists have noticed that as hookworm infection has steadily declined in the United States, the rate of autoimmune disorders has skyrocketed. Autoimmune disorders, which include allergies, arthritis, and celiac disease, currently affect as many as 50 million Americans! Now, correlation is not causation, of course, but there's actually good reason to speculate that the two trends might be linked.

When hookworms infect a human, often by burrowing through the skin or catching a ride onboard a parcel of food, they make their way straight to the gut and set up shop. There, the five- to ten-millimeter-long worms will start to mate and proliferate, with the female releasing tens of thousands of eggs per day. Despite those alarming numbers, the population of hookworms in the gut generally remains small, as almost all of the eggs leave the body via the host's feces. Usually, no more than a few hundred hookworms actually dwell inside the gut, where they dine on blood siphoned from the intestinal mucosa. In fact, the overwhelming majority of hookworm infections are largely asymptomatic.

Still, one has to wonder why the host's immune system doesn't kick those freeloading worms to the curb. The answer is that the worms have evolved an ingenious way to subvert the immune system. By releasing certain immunoregulatory molecules, they inhibit the function of cells that recognize and target foreign invaders just enough to fly under the radar.

The worms' tampering, however, might have a much broader effect than merely keeping the immune system's army of cells off their backs -- this is where autoimmune disorders come in. The hypothesis is that human and hookworm have likely been living together for many thousands of years, possibly even co-evolving to an extent. Scientists speculate that hookworms may have been keeping an overactive immune system in check, like a wise mentor urging restraint from an easily provoked protégé. Now, lacking the worms' influence, human immune systems in the developed world are running wild, deciding that your very own healthy cells are foreign invaders, and mercilessly attacking them.

The narrative is compelling, though still in the hypothetical stages. To test it, scientists are now infecting willing subjects with hookworms, and checking to see whether or not the worms can potentially alleviate conditions like celiac disease, Crohn's disease, multiple sclerosis, ulcerative colitis, and atherosclerosis. Results are inconsistent thus far, but with almost all of the trials being conducted in only the last decade, the research effort is still in the very early stages.

An intriguing conundrum researchers are grappling with is precisely how significant an infection is required to curtail the host's immune response. Too few worms may do nothing, but too many worms may create more problems than they solve. After all, severe hookworm infection can cause diarrhea, anemia, birth defects in babies, constipation, emaciation, or even heart failure in malnourished individuals.

Symptoms like these are why philanthropists like Bill and Melinda Gates are funding the effort to create a vaccine against hookworm, with the ultimate goal of eliminating it in the developing world. So, as some scientists toy with the idea of bringing hookworm back, many more are dedicated to destroying the parasite once and for all. Some scientists, like James Cook University professor Alex Loukas, straddle both desires. He wants to both create a vaccine to eliminate hookworm infection in undeveloped countries and harness the secrets of the immune-suppressing molecules that the worms secrete to treat autoimmune diseases in developed ones.

While lessening the burden of hookworm infection in the developing world is undoubtedly a noble and worthwhile cause, it's also worthwhile to remember that humans are superorganisms, collections of trillions of living organisms in addition to ourselves. And just like the environment, which is also a collection of living organisms, when one thing changes, there are inevitably downstream effects. We may not be able to foresee the full effects of eliminating hookworms.

After experiencing the positive effects of hookworm infection in a recent year-long clinical trial, eight sufferers of celiac disease turned down medication that would kill the worms, electing instead to keep their parasites, scary teeth and all.

(Images: CDC, AJC1/flickr)

Our Know-Nothing, Anti-Science, Anti-Intellectual Presidential Candidates

Other than getting a major fact wrong, the worst possible feeling for a journalist is the gut-wrenching notion that all of one's efforts are for naught. For me, it has become increasingly difficult to escape this dreadful feeling, given the state of politics in America.

Last week, it was revealed that pediatric neurosurgeon Ben Carson, who is currently one of the frontrunners for the Republican nomination, believes the Big Bang to be a fairy tale and the theory of evolution to be the work of the devil. This comes on the heels of a debate performance in which Dr. Carson voiced disagreement with the vaccine schedule for children, though there is no evidence to support his view that vaccines should be delayed. Rand Paul, an ophthalmologist, expressed similar skepticism.

Even worse, Donald Trump, who is currently leading the GOP race, announced that vaccines cause autism. This dangerous nonsense has been debunked so many times that it's not even fun to debunk it anymore.

Lest you come to the conclusion that only Republicans reject modern scholarship, the Democrats haven't done any better.

Hillary Clinton, the current Democratic frontrunner, joined the anti-vaccine bandwagon during her first presidential run in 2008. Amazingly, she has done a complete about-face on the issue, something worthy of both praise (because she is endorsing good science) and ridicule (because she acts as if she can hide from what she said in 2008). More recently, Mrs. Clinton came out against the Keystone XL pipeline because of climate change, as if a single pipeline will spell doom for the planet. Notably, Mrs. Clinton's opinion is in direct opposition to a State Department report that concluded the pipeline would have no effect on global greenhouse gas emissions.

Bernie Sanders, the champion of the far left, is also not terribly fond of science. He favors GMO food labels, even though America's finest scientists and medical doctors disagree. Mr. Sanders also opposes building more nuclear power plants, even though they are safe and constitute a necessary part of any serious climate policy. Then there's the whole socialism thing, even though the overwhelming majority of economists endorse capitalism as the best method to allocate resources efficiently in an economy.

We can shake our heads in disbelief at people like Dr. Carson, Dr. Paul, Mr. Trump, Mrs. Clinton, and Mr. Sanders. But, in a democratic society, our politicians are simply a reflection of who we are. They are us. Our politicians hold idiotic beliefs because a substantial proportion of Americans hold idiotic beliefs.

Until that changes, we will continue to get the government that we deserve.

(AP photo)

What the Fear of Death Does to Your Beliefs

Over a decade ago, psychologists John Jost, Jack Glaser, Arie Kruglanski, and Frank Sulloway launched a far-reaching and in-depth analysis of politically conservative belief. Sweeping through the scientific literature, they sought to determine what psychological variables predicted conservatism. Their resulting meta-analysis consisted of 22,818 cases from twelve countries. What they found was intriguing, yet unsurprising. Conservatism was tied to a need for order, structure, and closure as well as intolerance of ambiguity. It negatively correlated with being open to new experiences. 

One trait in particular led the pack, however, and this one was somewhat surprising. That trait? Death anxiety. That's right. There was no stronger predictor of conservative political belief than a "persistent fear of one's own mortality."

Death is inevitable, a fate destined for all of us. Yet despite its universality, it is the ultimate unknown. We can peer into the distant reaches of space and time, yet we will almost certainly never see past our own mortality. Death is an end. As conscious, living beings, it is only natural to be afraid of it.

And those who fear it more, it seems, tend to be ideologically conservative.

They also tend to be more religious.

A 2011 study measured religiosity and fear of death in college students in Malaysia, Turkey, and the United States. The researchers behind the study discovered that subjects who reported a greater fear of death were also more religious. Interestingly, another study examining death anxiety within religious groups showed that parishioners who reported stronger belief exhibited less death anxiety than those reporting weaker belief. Fear may drive people to religion, but religion also alleviates their fear of death.

Even nonbelievers aren't completely immune to death's belief-altering effects. In 2012, psychologists at the University of Otago in New Zealand brought students into the lab and asked them to write about their own deaths. Freshly primed with their own mortality, both religious and nonreligious students unconsciously showed increased levels of belief. In a subsequent computer questionnaire, religious participants were faster to press a button to affirm God’s existence than those who weren't primed with thoughts of death, but non-religious participants were slower to press a button denying God’s existence compared to their control counterparts.

"While death-priming made religious participants more certain about the reality of religious entities, non-religious participants showed less confidence in their disbelief,” Associate Professor Halberstadt stated in a press release.

So why does fear of death seem to drive people to conservatism and religious belief?

The intuitive answer where religion is concerned is that religion offers a reason not to be afraid of death, specifically, that death is only a doorway to a wondrous new existence in some sort of shimmering afterlife. However, after conducting a longitudinal analysis of 155 men and women in San Francisco, Wellesley College psychologists Paul Wink and Julia Scott reported a more nuanced explanation, "that firmness and consistency of beliefs and practices, rather than religiousness per se, buffers against death anxiety in old age."

What about conservatism?

"The core ideology of conservatism... is motivated by needs that vary situationally and dispositionally to manage uncertainty and threat," Jost and his colleagues explained. Look at the issues American conservatives tend to tout: a strong military, border control, the 2nd Amendment. All of these are intended to foster safety and security, and ultimately guard against mortality. 

(Image: Shutterstock)

New DNA Storage System Can Store 490 Exabytes Per Gram and Allows Data to Be Rewritten

By 2020, the amount of digital data produced worldwide is projected to total 40 trillion gigabytes! Yet that mind-blowing amount of information could be crammed into just 82 grams of DNA.

For that, we can thank Professor Olgica Milenkovic and the many other researchers who have made DNA data storage a reality. Milenkovic, based at the University of Illinois, believes that DNA-based storage will be the primary storage mechanism of many archival systems in the near future.

"The media is extremely durable and has exceptionally high storage density," she says.

Exceptional is a bit of an understatement. While the hard drives of many desktop computers sold today can store one terabyte of information, three years ago, researchers at Harvard created a technique that could store 700 terabytes on a single gram of DNA. A year later, another team raised the capacity to 2200 terabytes.

Just last week, however, Milenkovic and her team detailed a new system capable of storing 490 exabytes on a single gram, which is equal to 490 billion gigabytes! Besides smashing the previous storage record, her technique also permitted data stored on DNA to be selectively accessed and rewritten, two huge advances over previous DNA storage methods.
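That density also squares with the 82-gram figure mentioned earlier. A quick back-of-the-envelope check:

```python
# Sanity check: at 490 exabytes per gram, how much DNA would hold
# the ~40 trillion gigabytes of data projected for 2020?
GB_PER_EB = 1e9                  # 1 exabyte = 1 billion gigabytes
DENSITY_EB_PER_GRAM = 490        # density reported for the new system

total_eb = 40e12 / GB_PER_EB             # 40 trillion GB = 40,000 exabytes
grams = total_eb / DENSITY_EB_PER_GRAM   # ~81.6 grams
print(round(grams, 1))                   # 81.6 -- roughly the 82 grams cited
```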

With DNA storage, data is first translated to binary (1s and 0s), then converted to DNA bases (A, G, T, and C). Once the information is laid out, DNA is synthesized to match the data. To read the information back, scientists simply sequence the DNA and convert it to binary.

Milenkovic and her team synthesized DNA with 1000 bytes per strand. Each strand was flanked by two address sequences, one at each end. These addresses allowed the team to identify specific strands and thereby access specific information. Once they amplified the strand they wanted, they could rewrite the information it contained using conventional DNA editing techniques.
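Conceptually, the addressing scheme behaves like a key-value store: locate the strand flanked by the right tags, then swap out its payload. In the lab the "lookup" is done chemically, by amplifying strands whose end sequences match the chosen address primers; the sketch below, with invented tags and payloads, only illustrates the logic:

```python
# Conceptual sketch of address-based random access and rewriting.
# Tags and payloads here are invented for illustration; in the real
# system the lookup is chemical (PCR amplification by address primers).
def find_strand(pool, prefix, suffix):
    """Return the strand flanked by the given address sequences."""
    for strand in pool:
        if strand.startswith(prefix) and strand.endswith(suffix):
            return strand
    return None

def rewrite(pool, prefix, suffix, new_payload):
    """Replace the payload between the two address blocks."""
    old = find_strand(pool, prefix, suffix)
    pool.remove(old)
    pool.append(prefix + new_payload + suffix)

pool = ["AACC" + "GATTACA" + "TTGG",   # strand addressed AACC...TTGG
        "CCGG" + "ATATATA" + "GGCC"]   # strand addressed CCGG...GGCC

rewrite(pool, "AACC", "TTGG", "CCCCCCC")
print(find_strand(pool, "AACC", "TTGG"))  # AACCCCCCCCCTTGG
```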

The researchers tested their technique on Wikipedia.

"We encoded parts of the Wikipedia pages of six universities in the USA, and selected and edited parts of the text written in DNA corresponding to three of these schools," they wrote.

While Milenkovic's DNA storage method improves on prior techniques, it is also much more expensive. Encoding and storing 17 kilobytes (KB) of data cost $4,023. A previous technique stored 739 KB of data for $12,600.
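Per kilobyte, that gap is stark. Working from the figures above:

```python
# Cost per kilobyte, from the figures quoted above
new_cost_per_kb = 4023 / 17      # rewritable system: ~$236.6 per KB
old_cost_per_kb = 12600 / 739    # earlier technique: ~$17.1 per KB

print(round(new_cost_per_kb / old_cost_per_kb))  # ~14x more expensive
```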

This comparison points out a glaring problem with DNA storage.

"The costs of synthesizing DNA (i.e. recording the information) are prohibitively high at the moment to allow this technology to be competitive with flash or other memories," Milenkovic told RCS in an email.

Costs are declining extremely fast, however.

"Within the last seven months, the cost of synthesizing 1000 bps blocks reduced almost 7-fold," the researchers wrote.

When DNA data storage costs drop significantly enough, the method may be of use to large governmental, scientific, or historical organizations. The Large Hadron Collider, for example, creates 15 petabytes of data each year!

But could DNA storage ever find a home in personal computers?

"I do not see how one could directly connect classical computers with DNA storage media at the moment, as one needs to do some processing on the DNA output to make it readable by a computer," Milenkovic told RCS. "But this processing may be incorporated into new generations of DNA sequencers - I am not aware of anyone working on this subject at the moment, though."

Source: Tabatabaei Yazdi, S. M. H. et al. A Rewritable, Random-Access DNA-Based Storage System. Sci. Rep. 5, 14138; doi: 10.1038/srep14138 (2015)

(Image: Shutterstock)

Sex Would Be Simpler if We Were Bonobos

Sex is complicated. A lack of communication, psychologists and couples therapists are fond of telling us, is largely to blame. If only we were more open about our desires and intentions, then women wouldn't wonder, "Will he ever call me?" and bemused men wouldn't speculate, "Is she flirting with me?" The era of mixed signals would be over. Though likely too good to ever be true, this utopian vision of human sexual relations could become a reality if only we learned some communication skills from our bonobo cousins.

Like humans, bonobos are Great Apes, a group that also includes orangutans, chimpanzees, and gorillas. One behavioral trait, however, sets bonobos apart. Smithsonian describes it best:

"While chimpanzees and gorillas often settle disputes by fierce, sometimes deadly fighting, bonobos commonly make peace by engaging in feverish orgies in which males have intercourse with females and other males, and females with other females. No other great apes... indulge themselves with such abandon."

Wow. The free-loving hippies at Woodstock couldn't hope to achieve such hedonistic levels of orgasmic bliss.

Not only are bonobos liberal in their lovemaking, they also aren't shy about requesting it. Researchers report in the journal Scientific Reports that wild female bonobos will make blatant gestures asking for genital-on-genital rubbing. Subtlety is not their specialty. The two moves the scientists observed were foot-pointing, in which the female used her foot to point at her genitals, and the "hip shimmy," in which she wiggled her genitals to mimic rubbing. Some 83% of the time, another female responded, giving the signaller exactly what she wanted.

See? Communication works.

So, what is the point of female genital-on-genital rubbing? It appears to be a mechanism to reduce social tension and to increase cooperation, particularly when limited food supplies are at stake.

Finally, it may be worth noting that despite their rampant canoodling, bonobos are an endangered species, mainly because their human cousins don't care enough to protect them or their habitat. On the bright side, at least they're going out with a bang.

Source: Pamela Heidi Douglas & Liza R. Moscovice. "Pointing and pantomime in wild apes? Female bonobos use referential and iconic gestures to request genito-genital rubbing." Scientific Reports 5, Article number: 13999. Published: 11-Sept-2015. doi:10.1038/srep13999

(AP photo/National Geographic)

The Biggest Myth About Vitamins

Many health-minded Americans suffer from a mild, yet persistent case of "vitamania," a borderline obsessive infatuation with vitamins and minerals. And that's understandable. As kids, most of us ate breakfast from cereal boxes plastered with claims about the wealth of vitamins and minerals contained inside. A curious glance at the label reaffirmed our decision to eat a second bowl. Each serving contained Vitamin A, Iron, Riboflavin, Thiamin, Niacin, and Vitamin B12! Wow!

Adulthood would see vitamania come into full swing. Out in the world with only vague directions to eat fruits and veggies and avoid too much sugar, we locked onto labels as guides on our personal health quests. A peek at the side of our multivitamin bottle would comfort us again: 100% of our daily values of dozens of vitamins and minerals. Health targets attained!

But with every label that's read, a latent myth is perpetuated. That myth? The notion that we need 100% of our daily value -- or Recommended Dietary Allowance (RDA) -- of every vitamin and mineral to attain optimal health.

As journalist Catherine Price explained in her recent book Vitamania, that simply isn't so.

"In fact, the RDAs themselves -- which many of us use as personalized scorecards for our diets -- are actually not meant to be personal at all. Instead, they're designed to meet the nutritional needs of 97 to 98 percent of all people, which means that the majority of us could get by just fine on less."

How much less? The average healthy American adult would probably do just fine consuming the Estimated Average Requirement (EAR) of vitamins and minerals each day. These EARs are roughly 20-30% lower than the RDA amounts used on food labels.

But if you fall short of the EAR, don't worry either. It's not actually imperative to consume certain vitamins every day -- namely the fat-soluble vitamins: A, D, E, and K. When consumed, they are readily stored in the body for long periods of time and siphoned off as needed.

The vitamins we need more regularly are Vitamin C and the eight B complex vitamins. These are water-soluble, meaning that whatever our body doesn't need is excreted in urine. Luckily, water-soluble vitamins are fairly easy to come by. The B vitamins are found in whole grains, meats, and leafy green vegetables, while just a single orange will supply enough Vitamin C for one day.

With all the constant hubbub about vitamins, have you ever wondered why we actually need them? In Vitamania, Catherine Price succinctly summed up the reason. The human body functions thanks to ever-churning chemical reactions. These reactions are sped along by compounds known as enzymes. Vitamins essentially act as the cofactors these enzymes need to work. Without vitamins, the enzymes can't jumpstart the chemical reactions. And without the chemical reactions, the body ceases to function properly.

Considering how vital vitamins sound, it's easy to see why everyone seems to think we need so much of them. But, in reality, you don't need nearly as much as your box of Fruity Pebbles leads you to believe.

(Image: AP)

Why Women Are Always Cold

Women complain of feeling cold at much higher rates than men. Here, myriad anecdotes align neatly with actual evidence. In a 2012 review of the scientific literature, Sami Karjalainen of the VTT Technical Research Centre in Finland found that, "Females are more sensitive than males to a deviation from an optimal temperature and express more dissatisfaction, especially in cooler conditions."

This dissatisfaction occurs everywhere: in bed, at the movie theater, outside, at work, etc. As easy as it would be for men to simply dismiss women's complaints as overstated, there are actually a lot of physiological reasons why their gripes are legitimate.

For starters, women's bodies produce less heat than men's. Men generally expend more calories than women, about 23 percent more in fact. Spent calories are essentially burnt fuel. The body is a furnace, and a male body runs far hotter. Most of this variance is explained by the fact that men carry much more heat-generating muscle, but even when body composition and activity are accounted for, women's metabolic rates still run 3 to 10 percent lower. A good chunk of the energy we expend gets dissipated as heat, and this heat warms our skin, clothes, and immediate surroundings. As walking, talking space heaters, men are much more powerful.

Of course, being cold is not necessarily the same as feeling cold. But in this arena, too, women are more susceptible. Owing perhaps to their more limited distribution of muscle, women's extremities run much cooler than men's. A 1998 study found that women's average hand temperature hovers around 87.2 °F, while men's averages 90 °F.

"Women may be built to keep their vital organs nice and toasty, but their chilly fingers and toes could explain why they perceive themselves as cold more readily than men," io9's Robbie Gonzalez proffered.

Exacerbating women's feelings of coldness is the fact that an industry-standard formula used by many building managers to set indoor temperatures is based on a fifty-year-old estimate of the metabolic activity for a 155-pound male. This translates to a cooler building than most women would likely prefer.

"We should no longer neglect the more rigorous requirements that females have for indoor thermal environments. Gender differences indicate that females have, on average, a greater need for individual temperature control and adaptive actions than males," Karjalainen wrote back in 2012, adding, "If females are satisfied it is highly probable that males are also satisfied."

(Image: AP)

The Proxmire Amendment May Be the Most Anti-Science Law Ever Passed. It's Still in Effect Today.

Earlier this year, testing conducted by the New York State attorney general's office revealed that four out of every five herbal supplements sold at GNC, Target, Walgreens, and Walmart did not contain any of the herbs advertised on their labels. Instead, the products were filled with things like powdered rice, asparagus, garlic, and radish.

The damning discovery left some wondering, "Where was the FDA? Shouldn't they prevent this sort of wrongdoing?"

The disconcerting answer is that the FDA was doing precisely what Congress permits it to do: absolutely nothing. That's right, under current law, the FDA is essentially prevented from regulating the supplement industry unless there's clear evidence that a product is harmful to consumers.

For this sorry state of affairs, you can thank what may be the most influential anti-science legislation ever passed in American politics: the Vitamin-Mineral (Proxmire) Amendment to the Federal Food, Drug, and Cosmetic Act. Journalist Catherine Price succinctly summed up the law's effects in her new book Vitamania:

"Enacted on April 23, 1976... the amendment made it illegal for the FDA to ever establish standards for supplements, classify them as drugs, or require that they only contain useful ingredients. It forbade the FDA from ever setting limits on the quantity or combination of vitamins, minerals, or other ingredients that a supplement could contain, unless the FDA could prove that the formulation was unsafe."

The amendment's primary architect and namesake was long-serving Wisconsin senator William Proxmire, whose Golden Fleece Award for wasteful government spending frequently -- and many say unfairly -- targeted basic scientific research. Proxmire touted the legislation as a win for consumer freedom, guaranteeing access to health-giving vitamins, minerals, and supplements. FDA commissioner Alexander Schmidt called the amendment "a charlatan's dream."

Schmidt's assessment turned out to be correct. Basically unrestricted, the supplement industry saw its sales boom and its political power grow. In 1994, with key support from Republican Senator Orrin Hatch and Democratic Senator Tom Harkin, the industry helped author the Dietary Supplement Health and Education Act, more commonly known as DSHEA. The law added insult to the injury inflicted by the Proxmire Amendment: DSHEA once and for all removed what modest power the FDA still had to review and approve dietary supplements before they hit the market. Moreover, manufacturers were permitted to make unproven claims about the effect of their products on the structure or function of the body.

Largely thanks to Proxmire and DSHEA, the supplement industry has ballooned to roughly $30 billion in revenue today, with more than 85,000 products on store shelves worldwide. The vast majority of them do not improve health in any way. Even vitamin and mineral supplements, by far the most popular with consumers, don't seem to have much effect, if any, on a variety of health outcomes, including bone density, cancer, cardiovascular disease, or mortality.

A lack of effect, either positive or negative, is a key reason why Proxmire and DSHEA have remained laws of the land. Most dietary supplements aren't harmful; they're just useless. Evidence-based claims of ineffectiveness are easily drowned out by unbelievable anecdotes and flashy marketing. Unfortunately, unless news of widespread supplement industry abuses or dangerous supplements grows to a clamor, it's unlikely that politicians will place the supplement industry under evidence-based regulation.

(Image: Shutterstock)

Parents Frivolously Sue School over Nonexistent Wi-Fi Sickness. Why Are Courts Actually Giving Them a Chance?

Last week, the Worcester Telegram & Gazette reported that the Fay School in Southborough, Massachusetts was sued by parents who believe that the school's wi-fi signal is making their 12-year-old son sick. According to the article, their child:

"...suffers from Electromagnetic Hypersensitivity Syndrome, a condition that is aggravated by electromagnetic radiation. The boy was diagnosed after he frequently experienced headaches, nosebleeds, nausea, and other symptoms while sitting in class after the school installed a new, more powerful wireless Internet system in 2013, the suit says."

Reading this drivel has given me a headache, nosebleed, and nausea. (Maybe I suffer from Ignoramus Hypersensitivity Syndrome?)

Let's be crystal clear: There is no such thing as Electromagnetic Hypersensitivity Syndrome (EHS). Your body is not affected in any meaningful way by the microwaves used by wi-fi devices or cell phones. This is because they are very, very low power. A microwave heats food using about 1,000 watts, while a wi-fi router emits about 1 watt. Also, keep in mind that this power dissipates rapidly with distance, which explains why WiMAX towers don't cook people. In addition to all of this basic physics and logic, a systematic review published in 2010 in the journal Bioelectromagnetics concluded that "repeated experiments have been unable to replicate [electromagnetic hypersensitivity] under controlled conditions."
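That rapid falloff is just the inverse-square law: an emitter's power spreads over a sphere of area 4πr², so intensity drops with the square of distance. Treating a 1-watt router as an ideal isotropic source (a simplification), the numbers are tiny:

```python
import math

def intensity(power_watts, distance_m):
    """Power per unit area from an ideal isotropic source (W/m^2)."""
    return power_watts / (4 * math.pi * distance_m ** 2)

print(intensity(1, 1))    # ~0.08 W/m^2 one meter from a 1 W router
print(intensity(1, 10))   # ~0.0008 W/m^2 at ten meters
```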

In short, physics and biology have proven beyond a shadow of a doubt that EHS is not real. So case closed, right? Alas, probably not.

A few years ago, a parent in Portland, Oregon (where else?) sued the public school system for the exact same reason. Though the suit was eventually dismissed, the school district spent $172,000 fighting it. The Fay School had better prepare for a protracted battle, especially since the plaintiffs are seeking $250,000 in damages.

All of this goes to show one thing: The judicial system is too slow, inept, and scientifically illiterate to handle medical lawsuits. An entirely separate legal system should be set up to handle such cases. Instead of judges and juries, doctors and scientists should determine verdicts. And verdicts ought to be rendered rapidly, particularly for lawsuits based on pseudoscience.

Much fuss has been made over the years in regard to tort reform. Creating a medical court system would be a great way to start.

(Image: AP)

Most Gluten-Sensitive Individuals Can't Tell If They're Eating Gluten-Containing Flour or Gluten-Free Flour

As the gluten-free fad presses on amongst consumers, a new study shows that only a minority of patients diagnosed with non-celiac gluten sensitivity experience adverse symptoms in response to gluten intake.

The research, spearheaded by a team of gastroenterologists at the University and Spedali Civili of Brescia in Italy, joins the growing ranks of studies casting doubt on the prevalence and even the existence of the condition.

Non-celiac gluten sensitivity (NCGS) is diagnosed in patients confirmed not to have celiac disease but who still experience symptoms like bloating, diarrhea, headache, and abdominal pain after eating gluten-containing foods. Wide-ranging estimates have placed its prevalence between 0.6 and 6 percent of the population, meaning that as few as two million to as many as twenty million Americans could be affected. The condition's existence remains in doubt, however, as no discernible biomarkers have been found. The only evidence we have is sufferers' self-reported symptoms.

In the new study, published in the journal Alimentary Pharmacology & Therapeutics, researchers placed the diagnosis of NCGS under rigorous scrutiny. Thirty-five subjects diagnosed with NCGS, all of whom had maintained a strict gluten-free diet for the previous six months, took part in a multi-week gluten challenge. As the authors described:

The challenges comprised gluten-containing and gluten-free flours. These were dispensed in sealed sachets labelled A and B. The participants and the investigators were blind to the contents of the sachets... Each sachet contained 10 grams of flour, and instructions were given to the participants to sprinkle the contents of the sachet over pasta or soup; one sachet to be consumed each day for 10 consecutive days. This was followed by a 14-day washout period, and then by a further 10-day challenge, when participants were crossed over to receive the other flour.

Participants were instructed to rate their symptoms of pain, reflux, indigestion, diarrhea, and constipation on a scale of 1 (no discomfort) to 7 (very severe discomfort) throughout the trial and to write down any other adverse symptoms. At the end of the challenges, participants were asked to guess, based on their symptoms, which of the flours contained gluten. Those who guessed correctly were classified as having NCGS.

At the conclusion of the trial, just twelve of the 35 subjects correctly identified the gluten-containing flour and reported experiencing worse symptoms during the challenge involving gluten. Of the remaining subjects, 17 identified the gluten-free flour as causing symptoms and six reported no adverse symptoms during the trial whatsoever. (Below: The figure shows the symptom scores for the patients who correctly identified the flour containing gluten.)

Also of note, most subjects tended to experience very mild symptoms throughout the trial. On average, participants rated the majority of gastrointestinal symptoms at 3 or lower on the aforementioned scale.

The study was plagued by the usual methodological limitations of nutrition research. Symptoms were self-reported, the sample size was somewhat small, and the researchers relied on subjects to carry out the challenges faithfully. It is also somewhat strange that the researchers would consider correctly identifying the gluten-containing flour to be evidence of NCGS, as it seems logical that fifty percent of subjects would guess the flour by pure chance.
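In fact, chance alone predicts more than twelve correct guesses. Assuming each of the 35 participants independently had a 50% shot at naming the right flour:

```python
from math import comb

n = 35  # participants
# Each guess between two sachets has a 50% chance of being right,
# so chance alone predicts n * 0.5 = 17.5 correct identifications.
expected = n * 0.5

# Probability that 12 or more of 35 guess correctly by pure luck
p_at_least_12 = sum(comb(n, k) for k in range(12, n + 1)) / 2 ** n

print(expected)                 # 17.5
print(round(p_at_least_12, 2))  # 0.98 -- twelve correct is below chance
```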

Ultimately, what the study shows, in line with previous scientific research, is that the majority of people who believe they are sensitive to gluten in fact are not, and if adverse symptoms do develop they may be attributed to the nocebo effect or the presence of hard-to-digest carbohydrates called FODMAPS.

"Our study has shown that gluten challenge leads to a recurrence of symptoms in only a third of patients fulfilling the recognised diagnostic criteria for the clinical diagnosis of NCGS," the researchers concluded. "Consequently, NCGS is likely to be the correct diagnosis in only a minority of those who do not have celiac disease, but whom themselves choose to follow a gluten-free diet."

Source: Zanini, B., Baschè, R., Ferraresi, A., Ricci, C., Lanzarotto, F., Marullo, M., Villanacci, V., Hidalgo, A. and Lanzini, A. (2015), Randomised clinical study: gluten challenge induces symptom recurrence in only a minority of patients who meet clinical criteria for non-coeliac gluten sensitivity. Alimentary Pharmacology & Therapeutics. doi: 10.1111/apt.13372

(Image: Shutterstock)

Great Theologian Quotes on Science

From the pulpit of University Presbyterian Church in Seattle, the now-retired Rev. Earl Palmer once said, "We have nothing to fear from science." It was one of the many times I knew that I had found the right church for me.

Rev. Palmer is not alone in his embrace of science. Indeed, theologians past and present have found science to be illuminating rather than threatening, undermining the widespread notion of a "war" between science and religion. Just as many prominent scientists have spoken kindly of religion, many theologians have spoken highly of science. We have compiled some of the best quotes below:

"Men...have a moral responsibility to be intelligent. Must we not admit that the church has often overlooked this moral demand for enlightenment? At times it has talked as though ignorance were a virtue and intelligence a crime."
     --Rev. Dr. Martin Luther King, Jr. (Strength to Love)

"Art along with science is the highest gift God has given [man]."
     --Pope Benedict XVI

"No doubt those who really founded modern science were usually those whose love of truth exceeded their love of power."
     --C.S. Lewis (The Abolition of Man)

"Science can purify religion from error and superstition; religion can purify science from idolatry and false absolutes."
     --Pope John Paul II (Letter to Director of the Vatican Observatory)

"Even the other sciences and their development help the church in its growth in understanding... The thinking of the church must recover genius and better understand how human beings understand themselves today, in order to develop and deepen the church's teaching."
     --Pope Francis (Interview with America magazine)

"I don't think that there's any conflict at all between science today and the Scriptures. I think we have misinterpreted the Scriptures many times and we've tried to make the Scriptures say things that they weren't meant to say, and I think we have made a mistake by thinking the Bible is a scientific book. The Bible is not a book of science."
     --Rev. Billy Graham

"Those theologians who are beginning to take the doctrine of creation very seriously should pay some attention to science's story."
     --Rev. Dr. John Polkinghorne

(AP photo)

The Average Body Temperature Is Not 98.6

Pioneering 19th century German physician Carl Reinhold August Wunderlich is widely known for championing the empirical observation of hospital patients and sagely spreading the idea that fever is a symptom, not a disease, but he is best known for persistently sticking a one-foot rod in the armpits of thousands of people.

The rod was a thermometer, of course, and the temperature measurements he recorded led him to reveal in his 1868 magnum opus, Das Verhalten der Eigenwärme in Krankheiten, a number that remains with us even today: 37 °C, or 98.6 °F, the average human body temperature.

Except that number is wrong.

In 1992, researchers at the University of Maryland used the latest equipment and employed rigorous methodology to determine the average human body temperature from a sample of 148 healthy men and women aged 18 through 40 years. Taking over 700 temperature readings spaced out at various times throughout the day, they found that the average human body temperature is closer to 98.2 °F, and, in a bold conclusion that flew in the face of 120 years of common knowledge, stated:

"Thirty-seven degrees centigrade (98.6 °F) should be abandoned as a concept relevant to clinical thermometry."

Even if Mr. Wunderlich is truly wrong -- and it's not looking good for him -- one can't really denigrate his effort. His readings reportedly came from a sizable sample: as many as 25,000 individuals! Unfortunately, they also came from a now-antiquated device. Wunderlich's data were almost certainly hindered by shoddy thermometers.

"Thermometers used by Wunderlich were cumbersome, and had to be read in situ, and, when used for axillary measurements [under the arm] required fifteen to twenty minutes to equilibrate," the researchers noted.

With 98.2 °F now considered to be the correct average human body temperature, it's worth mentioning that, if you took your temperature right now, it almost certainly won't be that number. Your body's temperature fluctuates throughout the day, from roughly 97.6 °F at six in the morning to 98.5 °F at six in the evening. In fact, a temperature as high as 99.5 °F is still considered healthy.

The human body relies on consistent temperatures to function properly. A body temperature ten degrees too warm or twenty degrees too cool more often than not results in death. But there are notable exceptions to the norm. Willie Jones of Atlanta, Georgia survived after reaching an internal temperature of 115.7 °F during a bout of heatstroke. Two-year-old Karlee Kosolofski's body temperature dipped down to 57.5 °F after spending five hours outside in a Canadian winter, yet she lived to tell the tale.

(Image: Shutterstock)

Punish and Reform the EPA

The EPA says it focuses on environmental protection. The Animas River disaster shows that it is more concerned with protecting itself.

The accidental spill of toxic wastewater into Colorado's Animas River is an ironic case study: The very organization meant to protect Americans from environmental catastrophes was responsible for causing one. How should the Environmental Protection Agency be held accountable?

Colorado, and the states downstream of the spill, should sue the EPA. But, instead of merely recovering the cost of environmental damage, the lawsuit should focus on taming the leviathan the EPA has become.

Created in 1970 by President Richard Nixon, the EPA, at its best, has been an important part of improving air and water quality. Clear standards, enforced in a straightforward way, have been successful. The fact that the American environment is cleaner and safer than it has been in a century is partially due to EPA action.

In recent years, however, the EPA has moved away from those clear standards, preferring to exercise vague discretion in a way that is costly and often ineffective.

After the Gulf oil spill, the agency was vindictive in its treatment of BP. It banned the oil company, as well as 21 subsidiaries unconnected to the spill, from obtaining new federal contracts due to a “lack of business integrity.” The ban was lifted only after BP sued the EPA. In total, BP paid $54 billion in settlements, including $5.5 billion to the EPA for violating the Clean Water Act.

To be clear, it is not vindictive to hold BP – or anyone else – accountable for environmental damage. But, it is not responsible for the EPA to strain its authority to engage in a self-serving money grab.

The situation with the Animas provides more evidence that the EPA's desire to expand or protect its power can too often trump environmental stewardship.

For example, EPA Director Gina McCarthy told reporters, “The good news is [the Animas River] seems to be restoring itself.” Imagine the (justifiable) outrage from the EPA had BP made such a claim only a few days after the Gulf spill was capped when much of the damage had yet to be assessed.

And it’s not just British oil companies the EPA targets. The EPA threatened a Wyoming man with a $75,000-per-day fine for building a pond on his own property. Such behavior led a Washington Post editorial to observe, “The EPA is earning a reputation for abuse.”

The EPA often argues that money should be no object when protecting the environment. The same agency, however, has been notably reluctant to pay the significant costs of the damage it caused.

The wide gap between the agency's cavalier attitude toward businesses and personal property rights and its squeamishness about holding itself accountable demonstrates that institutional -- rather than environmental -- protection is a decisive factor in EPA decision-making.

If the EPA chooses to protect its own, rather than holding employees accountable, can we accuse Director McCarthy of a "lack of integrity"? To what standard will she be held?

The contrasting ways the EPA dealt with BP and with its own damage at the Animas River demonstrate that agency motives are not always entirely pure. The agency is quick to demand that others pay and cede it power, using the environment as a lever. But when its own funding and power are questioned, it minimizes the environmental damage and cost. Director McCarthy even had the lack of awareness to tell the people of Colorado not to worry because the "EPA is here."

The bottom line is that while the EPA has done much good, it has come to associate environmental protection with its own aggrandizement. Now is the time to make it clear that environmental protection, not a self-serving power grab, is what the public wants.

(AP photo)

What Would It Be Like to Fall Through Jupiter?

Gas giants are remarkable planets, if for no other reason than that they are so unlike our own. But their name is misleading -- most of the matter in gas giants is not in gaseous form. Owing to the immense temperatures and pressures within these planetary monstrosities, most matter exists as neither liquid nor gas, but as a supercritical fluid, which shares the properties of both.

The two gas giants in our solar system -- Jupiter and Saturn -- are both visible with the naked eye. But seeing, and imagining, are far from actually being there. No human has yet experienced the grandeur of Jupiter from up close. The Galileo spacecraft is one of the few manmade objects to have touched the Jovian atmosphere. On September 21st, 2003, it plunged into the planet at a speed of thirty miles per second, never to be heard from again.

What happened to Galileo? Did it disintegrate? Is its wreckage floating in Jupiter's supercritical atmosphere? We can only speculate. And on that matter, some of the best speculation recently appeared over at the AskScience section of Reddit.

In response to the question, "Could you stand on a gas planet or would you fall to the center?" user Astromike23 described what it would be like, as a human, to fall through Jupiter's atmosphere. A fair warning, if you're not a fan of heights, this story may disturb you:

You start falling through the high, white ammonia clouds starting at 0.5 atmospheres, where the Sun is still visible. It's very cold here, -150 C (-240 F). Your rate of descent is roughly 2.5x that of Earth, since gravity is much stronger on Jupiter.

You emerge out the bottom of the cloud deck somewhere near 1 atmosphere. It's still somewhat bright, with sunlight filtering through the ammonia clouds much like an overcast day on Earth. Below, you see the second cloud-deck made of roiling brown ammonium hydrosulphide, starting about 2 atmospheres.

As you fall through the bottom of this second cloud deck, it's now quite dark, but warming up as the pressure increases. Beneath you are white water clouds forming towering thunderstorms, with the darkness punctuated by bright flashes of lightning starting somewhere around 5 atmospheres. As you pass through this third and final cloud-deck it's now finally warmed up to room temperature, if only the pressure weren't starting to crush you.

Emerging out the bottom, the pressure is now intense, and it's starting to get quite warm, and there's nothing but the dark abyss of ever-denser hydrogen gas beneath you. You fall through this abyss for a very, very long time.

You eventually start to notice that the atmosphere has become thick enough that you can swim through it. It's not quite liquid, not quite gas, but a "supercritical fluid" that shares properties of each. Your body would naturally stop falling and settle out somewhere at this level, where your density and the atmosphere's density are equal. However, you've brought your "heavy boots" and continue your descent.

After a very, very long time of falling through ever greater pressure and heat, there's no longer complete darkness. The atmosphere is now warm enough that it begins to glow - red-hot at first, then yellow-hot, and finally white-hot.

You're now 30% of the way down, and have just hit the metallic region at 2 million atmospheres of pressure. Still glowing white-hot, hydrogen has become so dense as to become a liquid metal. It roils and convects, generating strong magnetic fields in the process.

Most materials passing through this deep, deep ocean of liquid metallic hydrogen would instantly dissolve, but thankfully you've brought your unobtainium spacesuit...which is good, because it's now 10,000 C (18,000 F). Falling ever deeper through this hot glowing sea of liquid metal, you reflect that a mai tai would really hit the spot right about now.

After a very, very, very long time falling through this liquid metal ocean, you're now 80% of the way down...when suddenly your boots hit a solid "surface", insomuch as you can call it a surface. Beneath you is a core weighing in at 25 Earth-masses, made of rock and exotic ices that can only exist under the crushing pressure of 25 million atmospheres.
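The "roughly 2.5x" descent rate quoted above follows directly from Newtonian surface gravity, g = GM/r². A quick back-of-the-envelope sketch in Python (the constants are standard published values for Jupiter, not figures from the Reddit post itself):

```python
# Surface gravity of Jupiter vs. Earth, g = G*M / r^2.
# Constants are standard published (approximate) values.
G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
M_JUPITER = 1.898e27  # Jupiter's mass, kg
R_JUPITER = 7.149e7   # Jupiter's equatorial radius, m
G_EARTH = 9.81        # Earth's surface gravity, m/s^2

g_jupiter = G * M_JUPITER / R_JUPITER**2
print(f"Jupiter surface gravity: {g_jupiter:.1f} m/s^2")  # ~24.8
print(f"Ratio to Earth: {g_jupiter / G_EARTH:.2f}")       # ~2.5
```

Strictly speaking, a gas giant has no solid surface, so this is the gravitational acceleration at the visible cloud tops; it confirms that a falling observer would initially accelerate about two and a half times faster than on Earth.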

(Image: NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute, Ron Miller via NASA)

Scurvy Is Much Worse Than You Think

Scurvy doesn't just turn your skin yellow.

In fact, in the later stages of the disease, the skin turns black, often right before you die, horribly, from massive internal hemorrhaging near the brain or heart.

While in middle school health class, you probably learned that sailors of centuries past suffered scurvy when they didn't eat enough oranges. But what you didn't hear was that between 1500 and 1800, an estimated two million of them died from it!

"It was such a problem that ship owners and governments counted on a 50 percent death rate from scurvy for their sailors on any major voyage," science journalist Catherine Price wrote in her book Vitamania. "[A]ccording to historian Stephen Bown, scurvy was responsible for more deaths at sea than storms, shipwrecks, combat, and all other diseases combined."

British Commodore George Anson's celebrated voyage around the world may have earned him fame and fortune, but it also resulted in the deaths of 65% of his crew. Some 1,300 sailors, stationed across six ships, lost their lives, the vast majority of them to scurvy. Nice job, Commodore.

Scurvy wasn't simply a recurring nuisance; it was an appalling scourge, one exacerbated by its gruesome symptoms.

"Scurvy starts with lethargy so intense that people once believed laziness was a cause, rather than a symptom, of the disease," Price wrote. "Your body feels weak. Your joints ache. Your arms and legs swell, and your skin bruises at the slightest touch. As the disease progresses, your gums become spongy and your breath fetid; your teeth loosen and internal hemorrhaging makes splotches on your skin. Old wounds open; mucous membranes bleed."

That's not even the worst of it. Two separate accounts -- one from a chaplain and the other from a surgeon -- describe how the gums engorge and grow over the teeth. If not cut away, the tissue may protrude from the mouth and start to decay. The dying tissue endowed sufferers with the worst breath imaginable.

The scope and severity of scurvy were remarkable, especially considering how easy the disease is to prevent and treat. Scurvy results from a deficiency of vitamin C, which is commonly found in citrus fruits, peppers, and a variety of other plant sources. As little as 10 milligrams of the vitamin per day -- one-fifth the amount found in a single orange -- administered over a week or so, can bring a scurvy sufferer back from the brink of death.
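The dose arithmetic above is easy to check: if 10 milligrams is one-fifth of an orange's vitamin C, a single orange supplies about 50 milligrams, and the minimal curative course over a week totals a mere 70 milligrams. A trivial sanity check of the article's own numbers:

```python
# Sanity-check the vitamin C figures quoted above.
daily_dose_mg = 10       # minimal curative daily dose, per the article
orange_fraction = 1 / 5  # that dose is one-fifth of an orange's vitamin C
days = 7                 # "a week or so"

mg_per_orange = daily_dose_mg / orange_fraction
weekly_total_mg = daily_dose_mg * days

print(mg_per_orange)    # 50.0 -- vitamin C per orange implied by the article
print(weekly_total_mg)  # 70 -- total milligrams needed over the week
```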

But back when scurvy was at its deadliest, humanity was unaware of the existence of vitamins. Ships on long overseas voyages were also ill equipped to store fresh fruits and vegetables. Moreover, cooks on board didn't know that vitamin C is destroyed by heat, by cutting, and even by exposure to air. But perhaps the biggest impediment to solving scurvy was the slow spread of information. Journals of ship physicians dating back to the 17th century reveal that a good few stumbled upon the healing powers of oranges, limes, lemons, and cabbage, but their discoveries never became common knowledge.

Scurvy's decline began in 1747, when James Lind demonstrated and publicized citrus' power to treat the disease. Over the 19th century, scurvy dwindled at a healthy pace. Today, it is largely a disease of the past, but citizens of underdeveloped countries -- particularly children -- remain susceptible.

(Image: CDC)

Source: Price, Catherine. Vitamania: Our Obsessive Quest For Nutritional Perfection. Penguin Press, 2015. The science and technology focused Alfred P. Sloan Foundation helped make this book possible.