Why We Need to See the Stars

Staring up at a night sky unfettered by artificial light, in all its majesty, is a transcendent experience. Sadly, it's an experience that many citizens of the developed world are denied.

“Eight of ten kids born in the U.S. will never live where they can see the Milky Way," author Paul Bogard recently lamented to The Atlantic.

"We’ve taken what was once one of the most common human experiences, which is walking out your door and coming face-to-face with the universe, and we’ve made it one of the most rare human experiences."

The culprit, of course, is light. Like cunning thieves in the night, street lamps and other light sources beguile us with a warm, tender glow, permitting us to comfortably navigate our darkened surroundings. But at the same time, they steal away our view of the heavens. Through illumination of the close, we are blinded to the beyond.

"When you get to where there’s no light around except what Nature’s giving you, the sky is amazing because there’s stars everywhere. There’s color in the stars. There’s so much range of brightness… Familiar constellations can become lost," says Scott Kardel, an astronomer and director of the International Dark-Sky Association.

Our ancestors were not blindfolded as we are today.

“Before we devised artificial lights and atmospheric pollution and modern forms of nocturnal entertainment we watched the stars," Carl Sagan wrote. "There were practical calendar reasons of course but there was more to it than that. Even today the most jaded city dweller can be unexpectedly moved upon encountering a clear night sky studded with thousands of twinkling stars. When it happens to me after all these years it still takes my breath away.”

Throughout all of history, the stars have served as humanity's quintessential source of curiosity.

What happens when we are shielded from celestial inspiration, and from truly seeing our place in the Cosmos? Do we look down and wonder less? Do we lose our sense of scale? Does our ingrained drive to explore dwindle?

While the philosophical drawbacks may never be entirely known, the physical effects are far easier to quantify. When light pervades our nights, it disrupts our circadian rhythms and our sleep, potentially contributing to depression, cancer, and weight gain.

Luckily, the cures are simple, none more so than flipping a switch. New forms of city lighting are also more directional and efficient. LED streetlights, for example, aim their glow at the ground, preventing pollution of the sky.

The dark can be a frightening place. But though we may fumble and fall, we are never lost or alone. As Carl Sagan reminded us, we are all made of starstuff. When we look skyward, we see our family, our friends, and, in a way, ourselves.

It's time to get reacquainted.

(Image: Shutterstock)

Universe's First Molecule Is Shrouded in Mystery

Somewhere, out there, is the first molecule thought to have formed in the universe. But strangely, scientists have yet to see it in the wild.

That molecule is the helium hydride ion.

Roughly 300,000 years after the Big Bang, the first atoms formed. Back then, there were only hydrogen, helium, and lithium. In this time of chemical simplicity, when the average temperature of the universe was a scorching 4,000 Kelvin, helium bonded with an ion of hydrogen. It was the first time two different elements joined together: the original cosmic love story.

13.7 billion years later, chemists here on Earth created the helium hydride ion (HeH+) in the laboratory, concocting an extraordinary molecule that defies the odds. Helium is the most unreactive element in existence. With two electrons completely filling its innermost shell, it has no desire to seek out electrons from other elements or to share its own. Yet, with some prodding, helium renounces its selfish ways to form the stable helium hydride ion. The molecule contains just two electrons: one belongs solely to helium, and the other is shared between helium and the single hydrogen proton. The arrangement renders the helium hydride ion extremely reactive. In fact, it is the strongest acid known.

Reactivity may be one reason the helium hydride ion has evaded detection by astronomers and astrophysicists. In the lab, it exists only in isolation, as it will protonate any molecule it contacts. Another reason is that the helium hydride ion seems very prone to photodissociation, whereby a photon of light smacks the molecule and grants it enough energy to break the link between the conjoined atoms -- in this case, helium and hydrogen. The helium hydride ion is also notoriously difficult to spot with a spectroscope.

"One of its prominent lines, the ‘fingerprints’ used to identify chemical elements optically, overlaps with lines in the spectrum of the CH methylidene radical," Chemistry World's Brian Clegg explained.

All of these difficulties have repeatedly thwarted astronomers' efforts to detect the helium hydride ion. To date, there have been several attempts, none conclusive. Astronomers will keep trying, however. Finding the elusive helium hydride ion, likely within a nebula or a white dwarf star, would inform long-held models of primordial star formation. It's thought that the molecule acts as a molecular coolant of sorts.

Models predict that the universe is flooded with helium hydride ions. But models can be wrong, a fact that atomic physicist Jérôme Loreau readily admits.

"The abundance of HeH+ mystery remains very much unsolved and it is hoped that our calculations, as well as observations from infrared telescopes such as the Herschel Space Telescope and the Spitzer Space Telescope, can shed some new light on the issue. Indeed, observations... could invalidate current models of the appearance of the first molecules, with consequences on our theories of the formation of stars and galaxies in the early universe."

(Image: NASA)

Is Breathing Oxygen a Terrible Idea?

Okay, Evolution. I have a bone to pick with you.

Oxygen... why?

Of all the elements on the periodic table, we breathe and persist on one that's inherently corrosive and readily promotes combustion*. Toxicologists writing in the journal Trends in Pharmacological Sciences even labeled it as "one of the most toxic compounds known."

In fact, as a result of its intrinsic chemical reactivity, oxygen rips electrons from bodily sources, mucking with cellular processes and even causing mutations in DNA, which can give rise to cancer.

There's plenty of nitrogen in the atmosphere... why can't we breathe that?

I mean, really, Evolution... Are you drunk?

As it turns out, the qualities that make oxygen look terrible on paper are the same qualities that grant its life-fueling powers.

For example, it's precisely because oxygen is so reactive that complex life has evolved to use it. Oxygen loves gobbling up electrons. It hungrily accepts them from all sorts of other elements and compounds. Selfishness like this is exactly what's needed to make cellular respiration work.

The table above shows electronegativity, which is basically a proxy for how much an element loves electrons. See oxygen at the far right?
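For a rough sense of scale, here's a minimal Python sketch ranking a handful of elements by approximate Pauling electronegativity. The values are illustrative additions of mine, not numbers taken from the table above.

```python
# Approximate Pauling electronegativities (illustrative values)
electronegativity = {
    "fluorine": 3.98,
    "oxygen": 3.44,
    "chlorine": 3.16,
    "nitrogen": 3.04,
    "carbon": 2.55,
    "hydrogen": 2.20,
    "iron": 1.83,
    "sodium": 0.93,
}

# Rank elements by how strongly they attract electrons
for element, value in sorted(electronegativity.items(), key=lambda kv: -kv[1]):
    print(f"{element:>9}: {value}")
# Oxygen sits near the very top -- only fluorine is greedier for electrons
```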

But as stated earlier, chemical reactivity is a double-edged sword. In its hunt for electrons, oxygen occasionally inflicts some collateral damage. Scientists call it oxidative stress. In response, the body maintains an arsenal of antioxidants, which donate their electrons to sate oxygen's cravings.

There are other chemicals nearly as electron-hungry as oxygen, like fluorine, chlorine, and nitrogen. Theoretically, these elements could also be used to power some alien form of respiration. Life based on such processes would be radically different, however. Nitrogen isn't as reactive as oxygen on account of its reinforced triple bond, but let's pretend that -- somewhere in the vast universe -- there are life forms that breathe it. They might very well reside on a world with high atmospheric pressures and oceans of ammonia, instead of water.

Chlorine might be a better candidate than nitrogen. Life that breathes chlorine might live on a world tinted green and filled with hydrogen chloride. There wouldn't be an ozone layer, so the organisms would need to be resistant to radiation.

Compared to such faraway, exotic worlds, Earth may seem mundane. But it is undeniably pleasant. Breathing oxygen on a blue-green world isn't so bad.

Evolution, I owe you an apology. I'm sorry for calling you drunk.

(Images: AP, Ittiz)

*Correction 1/29: This article originally referred to oxygen as "highly flammable." It is not itself flammable; however, it does make combustible substances much more prone to combust.

Why Science Is Anti-Antioxidant Supplements

In the 1990s, preliminary studies showed that people who regularly consume antioxidant-rich fruits and vegetables have lower rates of cancer, cardiovascular disease, and vision loss compared to people who eat fewer of those foods. The rest, as they say, is media hype and marketing.

Today, a great many consumers down antioxidants as if they were candy. Sales of antioxidant supplements account for $500 million of the massive $32 billion supplement industry.

For the uninitiated, antioxidants, including Vitamin C, Vitamin E, beta-carotene, selenium, manganese, and coenzyme Q10, are chemicals that corral potentially harmful free radicals. Free radicals are a fixture of life. They're everywhere: in your food, in the air, inside you. They come in many shapes and sizes, but what they all have in common, as described by the Harvard School of Public Health, is a "voracious appetite for electrons, stealing them from any nearby substances that will yield them." This "theft" can alter the instructions in a strand of DNA, potentially leading to a cancerous mutation, or cause "bad" LDL cholesterol to become trapped in a cell wall, increasing the risk of cardiovascular disease. Free radicals are like raucous, mischievous six-year-olds: you don't want them running amok.

That's where antioxidants come in. They appease free radicals by selflessly donating their own electrons.

So... more antioxidants is better... right?

Here is where the science gets lost in translation. It's ever alluring to view optimal health as attainable via a simple fix, pill, or diet -- that's what the supplement industry capitalizes on. But it's not that one-dimensional.

The same goes for antioxidants. They all perform complex roles in differing niches; their mission isn't solely to combat free radicals. Analyses have demonstrated that, in certain circumstances, antioxidants may actually serve as pro-oxidants, grabbing electrons instead of giving them up. Among the factors that determine how they act are the chemistry of their surroundings and the amount of antioxidant present.

Other scientists also raise the point that oxidative stress from free radicals isn't as detrimental as it's been made out to be.

"Oxidative stress is not only an inevitable event in a healthy human cell, but is responsible for the functioning of vital metabolic processes, such as insulin signaling and erythropoietin production," a team of researchers wrote in 2012.

Moreover, pretty much all of antioxidants' supposed benefits, optimistically touted decades ago, have not materialized. Antioxidant supplements do not improve cognitive performance or reduce the risk of dementia. There's little to no evidence that they reduce cancer risk. None of them is a "silver bullet" against cardiovascular disease, although Vitamin E might have a tenuous benefit.

To the contrary, antioxidant supplements may do more harm than good. Examining longitudinal studies of beta-carotene, Vitamin A, and Vitamin E supplementation, scientists were surprised to discover a small, but noticeable increase in mortality.

Make no mistake, consuming antioxidants via a diet rich in fruits and vegetables seems to be nothing but healthful. But available evidence shows that further supplementation is entirely unwarranted.

"In the light of recent physiological studies it appears more advisable to maintain the delicate redox balance of the cell than to interfere with the antioxidant homeostasis by a non-physiological, excessive exogenous supply of antioxidants in healthy humans," researchers from Gottfried Wilhelm Leibniz University in Germany wrote.

In layman's terms, that translates to "don't waste your money on those Vitamin A pills."

(Image: AP)

Thinking About Sex vs. Actually Doing It

People put an awful lot of effort into having sex. Other than hunger, thirst, and seeking shelter, it is perhaps the strongest urge that humans have. It's no wonder that young men spend time in the gym and buy fancy sports cars; equally, it's no wonder that young women spend hours in front of the mirror and buy lots of clothing. We have a strong biological impulse to get laid, and these are among the most widespread strategies we use to accomplish that.

Setting all those countless hours of grooming and flaunting aside, how much time do we spend thinking about sex? And how does that compare to the time we spend actually having sex?

It is an oft-repeated myth that men think about sex every seven seconds. Two studies reported in BBC Future came to radically different -- and far more realistic -- conclusions. The first suggested that college-aged men think about sex 19 times per day, while college-aged women think about it 10 times per day; the second suggested that adults thought about sex merely once per day. The latter study was designed in such a way that the result is almost certainly an underestimate. So, let's pick 10 times per day as the average.

How long does a sexual thought last? There is almost certainly no (reliable) data on that. We will have to estimate. A sexual thought could be as fleeting as a momentary feeling of lust after seeing an attractive person, or it could be an elaborate mental fantasy. Let's assume, just for the sake of argument, that the average thought lasts 10 seconds. On average, then, we spend about 100 seconds per day thinking about sex.

How much time do we spend actually having sex? First, we need to figure out how often we have sex. The numbers, from the Kinsey Institute, vary greatly. In general, the younger and married have more sex than the older and unmarried. Contrary to popular belief, single people aren't getting much action, while married couples (particularly newlyweds) are knocking boots quite often. Since the statistics vary so much, it is probably easiest to restrict our analysis to college students, who get most of the attention from psychologists, anyway.

Though worried parents lament the college "hook-up" culture, Live Science reports just under 60% of college students have sex at least once per week. That means 40% have sex less often than once per week, and some may not be having sex at all. Again, we will have to estimate. Let's say that the average college student has sex once every 10 days.

Finally, we need to know how long the average whoopee session lasts. As it turns out, not long. Dr. Harry Fisch says that the average encounter is 7.3 minutes (438 seconds).

Now, we can do the math.

In the span of 10 days, the average college-aged student will think about sex for 1,000 seconds, but will only have sex once. That session, on average, will last 438 seconds. Excluding the countless hours of behaviors (grooming, shopping, etc.) that prepare us for sex -- not to mention watching porn, which consumes a considerable portion of the average college-aged male's time -- this back-of-the-envelope calculation suggests that young adults spend roughly twice as much time thinking about sex as actually having sex. If those other behaviors were factored in, then the ratio could be as high as 50 or even 100 to 1!
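Here's the arithmetic above as a quick back-of-the-envelope sketch in Python; the thought count, thought length, and frequency figures are the rough estimates adopted in this post, not measured data.

```python
# Back-of-the-envelope estimate using the figures discussed above
thoughts_per_day = 10        # estimated sexual thoughts per day
seconds_per_thought = 10     # assumed duration of each thought, in seconds
window_days = 10             # window in which the average student has sex once
seconds_per_session = 438    # 7.3 minutes, per Dr. Fisch

thinking_seconds = thoughts_per_day * seconds_per_thought * window_days  # 1,000 s
doing_seconds = seconds_per_session                                      # 438 s

ratio = thinking_seconds / doing_seconds
print(f"Thinking: {thinking_seconds} s, doing: {doing_seconds} s, ratio: {ratio:.1f}:1")
# Roughly 2.3:1 -- about twice as much time thinking about sex as having it
```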

Just imagine if an equivalent effort was put into studying!

(Image: Sex via Shutterstock)

The Smartest Person Who Ever Lived

Who was the smartest person to ever live? There are certainly many worthy contenders. Today, the very name of "Einstein" is synonymous with genius. Others may suggest Stephen Hawking. Those who appreciate literature and music may proffer William Shakespeare or Ludwig van Beethoven. Historians may recommend Benjamin Franklin.

Before I submit my own suggestion, we must first discuss what we even mean by smart. Colloquially, we routinely interchange the words smart and intelligent, but they are not necessarily the same thing. There is an ongoing debate among psychologists, neuroscientists, and artificial intelligence experts on what intelligence actually is, but for our purposes here, a simple dictionary definition will suffice: "capacity for learning, reasoning, understanding, and similar forms of mental activity; aptitude in grasping truths, relationships, facts, meanings, etc."

Implicit in this definition of intelligence is general knowledge. An intelligent person capable of understanding quantum mechanics is useless to society if he is ignorant of everything else. So, a truly smart person will know a lot of things, preferably about many different topics. He should be a polymath, in other words.

Finally, there is the element of creativity. Creative people think in ways in which most other people do not. Where society sees a dead end, a creative person sees an opportunity.

Which person from history was the bodily manifestation of intelligence, knowledge, and creativity? Our blog's namesake, Isaac Newton, of course!

What was Newton's IQ? It's impossible to say. IQ tests didn't exist in the 17th Century, and if they had, Mr. Newton certainly would not have deigned to spend 90 minutes filling out ovals on a multiple choice test. Besides, he likely would have finished the test early and then spent the remaining time correcting errors and devising more difficult questions.

Nobody doubts that Isaac Newton was an intelligent man, but he also exhibited in spades the two other characteristics outlined above: knowledge and creativity.

Newton was a true polymath. Not only did he master physics and mathematics, but he was also a theologian. He was obsessed with eschatology (end-times prophecy), and he calculated -- based on his interpretation of the Bible -- that Jesus Christ would return to Earth in 2060. His dedication to religion was so great that, according to Nature, more than half of his published writings were on theology.

He also became well versed in alchemy. Do not hold that against him. Many great scientists of his time believed that any metal could be transmuted into gold. The Economist explains why the notion was not entirely unreasonable in Newton's time:

Alchemical theories were not stupid. For instance, lead ore often contains silver and silver ore often contains gold, so the idea that lead 'ripens' into silver, and silver into gold, is certainly worth entertaining. The alchemists also discovered some elements, such as phosphorus.

Furthermore, later in life, Newton dabbled in economics. James Gleick, author of the truly excellent biography Isaac Newton, wrote that "[h]e wrestled with issues of unformed monetary theory and international currency." As Master of the Mint, Newton was tasked with tracking down currency counterfeiters, which he did, as Gleick wrote, "with diligence and even ferocity." He showed no pity in his relentless pursuit of justice. When notorious counterfeiter William Chaloner attacked Newton's personal integrity, Newton redoubled his efforts to catch him. Mental Floss reports:

Acting more the grizzled sheriff than an esteemed scientist, Newton bribed crooks for information. He started making threats. He leaned on the wives and mistresses of Chaloner's crooked associates. In short, he became the Dirty Harry of 17th-century London.

Newton's sleuthing worked. Chaloner was caught and hanged.

Impressive as all that is, what truly separates Newton from other luminaries is his unparalleled creativity. He created multiple tools that simply did not exist before. For example, in order to study acceleration, the change in velocity, a tool beyond basic algebra was required. That tool, called the derivative, is the most basic function in calculus. It didn't exist in the 17th Century. Newton invented it.

In order to find the area beneath a curve, another tool beyond basic algebra was needed. That tool, called integration, is the second most basic function in calculus. Like the derivative, it did not exist in the 17th Century. So, Newton invented it. He also invented a reflecting telescope and the ridges on coins, which serve as an anti-theft measure that prevents "coin clipping."

Newton's inventiveness is perhaps best summarized by the epigraph to Gleick's biography, which was written by his niece's husband in 1726:

I asked him where he had it made, he said he made it himself, & when I asked him where he got his tools said he made them himself & laughing added if I had staid for other people to make my tools & things for me, I had never made anything...

Sadly, despite his fame, Isaac Newton led a very lonely life. His incomparable brilliance came at a hefty cost; his reclusive and anti-social nature strongly suggests that he was autistic, and his obsessive and disagreeable nature suggests mental illness, perhaps obsessive-compulsive disorder. Mental Floss not-so-charitably describes Newton as suffering from "everything":

[H]istorians agree he had a lot going on. Newton suffered from huge ups and downs in his moods, indicating bipolar disorder, combined with psychotic tendencies. His inability to connect with people could place him on the autism spectrum. He also had a tendency to write letters filled with mad delusions, which some medical historians feel strongly indicates schizophrenia.

The more I study Isaac Newton, the more fascinating he becomes. In my opinion, the genius of the precocious boy from Woolsthorpe has never been, nor ever will be, surpassed.

(Image: Great Idea via Shutterstock)

What a Ginormous Laser Can Teach Us About the Insides of Massive Planets

Man will never intimately know what it's like at the center of Jupiter. The massive planet's core, which measures in at searing temperatures of 64,300 °F and crushing pressures equal to 30 million Earth atmospheres, is inhospitable, to say the least. But with the help of the largest laser ever created, scientists can begin to comprehend the nature of things at such blistering extremes.

Operating the gigantic, powerful laser at the National Ignition Facility (NIF) may well be the dream of every young child who ever burned objects with a magnifying glass. In well under a second, the device can deliver a 500 terawatt flash of light that applies pressures up to 100 billion atmospheres and heats objects to temperatures six times hotter than the core of the sun.

The best part is that the laser is far from ornamental. Science at NIF is often fairly straightforward: You place the object you want to study into the chamber. You aim the laser, which is composed of 192 smaller lasers, at it. And finally, you fire the laser (while observing what happens)!

Yesterday, in the journal Science, a team of physicists reported the results of one of these experiments. They wanted to know what would happen to silica, one of the most common constituents of planetary interiors, when it's blasted under the kind of temperatures and pressures seen at the core-mantle boundaries of Uranus, Neptune, or a "Super-Earth" about five times more massive than Earth. The results were fascinating.

At 500 gigapascals, equal to the pressure of five million Earth atmospheres, the melting point of silica rose to 8300 Kelvin.
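As a quick sanity check on that conversion (taking one standard atmosphere to be 101,325 pascals):

```python
# Convert 500 gigapascals into Earth atmospheres
pressure_pa = 500e9          # 500 GPa expressed in pascals
atmosphere_pa = 101_325      # one standard atmosphere in pascals

print(f"{pressure_pa / atmosphere_pa:,.0f} atm")
# -> about 4.9 million, i.e. roughly five million Earth atmospheres
```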

"We therefore conclude that silica and magnesium oxide are solid in the deep interior of icy giants like Uranus and Neptune, as well as—with extrapolation—in the rocky core of Saturn and Jupiter," the team, led by UC-Berkeley's Marius Millot, announced.

The result further supports the idea that the cores of giant planets like Jupiter and Uranus are solid.

Millot and his team also discovered that molten stishovite, a polymorph of silica, is conductive, and may contribute to the generation of planetary magnetic fields. Such fields are important, as they shield planet surfaces -- as well as any life that subsists on them -- from the energetic, charged particles constantly careening through space. Magnetic fields are created by a moving charge. Thus, planet-sized magnetic fields arise when a conducting, metallic liquid in the interior is sloshed around, often by planetary rotation.

Millot's work is not the first of its kind to be conducted at NIF. Last summer, researchers vaporized diamond, the least compressible material known to man. Several more experiments of a similar nature are slated to take place this fall, including one on the intriguing topic of primordial nucleosynthesis, the production of atomic nuclei that many cosmologists believe occurred between 10 seconds and 20 minutes after the Big Bang.

[Image: LLNL/NIF/NASA/E. Kowaluk(LLE)]

Why Rich People Don't Care About You

Examine the income ladder of the United States, and you'll soon stumble upon a surprising fact: Rich people donate a smaller portion of their income to charity than poor people. In 2011, people in the bottom 20% donated 3.2 percent of their earnings. People in the top 20% donated just 1.3 percent.

These numbers don't seem to be anomalous, but there is some nuance. Data from the National Center for Charitable Statistics shows that taxpayers making less than $60,000 donate around 3.75% per year, while those making between $200,000 and $10 million donate less than 3%. However, those making more than $10 million are the most generous of all, donating nearly 6% of their income.*

Psychologists have examined this dynamic even further.

"What we've been finding across dozens of studies and thousands of participants across this country is that as a person's levels of wealth increase, their feelings of compassion and empathy go down, and their feelings of entitlement, of deservingness, and their ideology of self-interest increases," Paul Piff, an Assistant Professor of Psychology and Social Behavior at the University of California, Irvine, announced in a 2013 TEDx talk.

In one study, Piff brought rich and poor members of the community into the lab and gave them each $10. Participants were told that they could keep the money or share it with a stranger. The poorest subjects, those making less than $25,000 a year, gave 44% more than those making between $150,000 and $200,000.

In another instance, Piff and his associates ventured out to a California crosswalk, where motorists are required by law to stop for pedestrians waiting to cross the street. They then watched the actions of drivers as one of the researchers would stand at the side of the road, plainly waiting to walk across. Over 152 observations, they noted that not a single driver in the least-expensive car category buzzed through the crosswalk -- they all stopped. On the other hand, 50% of drivers in newer, more expensive cars like BMWs and Priuses drove right through.

Perhaps the most publicized of Piff's studies involved that quintessentially American board game: Monopoly. Piff brought numerous pairs of students into a small room and filmed them as they played the game. At the outset, students flipped a coin to determine whether they would be in a poor or privileged position. The privileged players were given more cash, collected twice as much money as they passed Go, and were permitted to roll the dice twice. Piff described what happened:

"The rich players actually started to become ruder toward the other person, less and less sensitive to the plight of those poor, poor players, and more and more demonstrative of their material success, more likely to showcase how well they're doing."

Piff also noticed something fascinating in the privileged players' responses at the conclusion of the game.

"They talked about what they'd done to buy those different properties and earn their success in the game, and they became far less attuned to all those different features of the situation, including that flip of a coin that had randomly gotten them into that privileged position in the first place."

Piff and his colleagues theorize that the reason the rich seem to be less caring and compassionate compared to their peers is that their wealth affords them the luxury of not having to rely on others. Over time, their sense of empathy can grow less sensitive.

Amazingly, UC-Berkeley psychologist Dacher Keltner has found a physiological manifestation of this deficit. He said in a recent Figure 1 video:

"Our lab and other labs are interested in something called the vagus nerve. It's the longest bundle of nerves in the human nervous system. In our research on compassion, the feeling of caring for someone in need activates the vagus nerve. Lower class individuals, if we show them images of suffering, they have a vagus nerve response. You don't see that in upper class individuals."

So what does it matter if the wealthy don't care about the rest of society? The problem is that they are increasingly making society's decisions. For the first time last year, more than half the members of the U.S. Congress were millionaires, with roughly twice the net worth of the average American household.

Of course, the wealthy aren't doomed to be Scrooges. For instance, the studies did not examine whether there were behavioral differences between those who earned their wealth and those who simply lucked into it. Also, Keltner insists that the human brain is hardwired to care. The wealthy just have to consciously work to be more cognizant of their fellow humans.

(AP photo)

*Section added 1/22

Meet the Deniers of Newton's Third Law

Last fall, news broke of a NASA test that gave credence to a seemingly impossible new rocket engine that produces output thrust from no input fuel. Of course, it was utter junk science. That's troubling, considering NASA of all agencies shouldn't have been duped by a fantasy rocket.

This month, the inventors of the "engine" uploaded a new paper to their site claiming to explain why it is so hard to make good measurements of the engine. Does this new information shed light on why the device might be feasible after all? Have critics been missing something?

Let's examine this the way a physicist would.

The paper begins with a force diagram, which attempts to lay out a picture of the physical system so that Newton's laws can be applied to the forces acting upon it. Thrust pushes out the back of the engine, away from the device. As the more alert people at NASA will tell you, this causes an equal and opposite reaction force pushing the engine forward, due to Newton's third law. This is the central principle of all rocketry. Now, a test rig is proposed where the engine is hung from a scale, like putting a fruit on a supermarket produce scale. (Perhaps a better analogy is that the setup should be like hanging meat from a scale with a hook, since the engine should not sit in a pan, but rather hang with nothing below.)

Now, if the rocket is producing thrust, some of the force of gravity pulling down on the string will be negated by the upward push of the engine, right? The apparent weight of the engine will be less, just as the apparent weight of a person on a downward accelerating elevator is less on a scale. (Try this yourself. It's true!) Similarly, an upward accelerating elevator will make your apparent weight increase. You could even turn the engine upside-down and thrust downward on the scale and measure the increase in apparent weight. The point is: Testing rocket thrust with a scale is a simple, accurate and well-known method.
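To make the point concrete, here is a minimal sketch of the hanging-scale test; the engine mass and thrust values are made-up numbers purely for illustration.

```python
# A thrusting engine hung from a scale: thrust shows up as "missing" weight
g = 9.81            # gravitational acceleration, m/s^2
mass_kg = 0.100     # hypothetical engine mass (100 g)
thrust_n = 0.001    # hypothetical upward thrust (1 millinewton)

true_weight = mass_kg * g                 # ~0.981 N pulling down
scale_reading = true_weight - thrust_n    # upward thrust relieves the scale

print(f"True weight: {true_weight:.4f} N, scale reads: {scale_reading:.4f} N")
# Any genuine thrust appears directly as a measurable drop in apparent weight
```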

Strangely, the anonymous writer of the document disagrees. He claims that the rocket thrust and the resulting force pushing up on the engine cancel out. At first take, this looks like a misunderstanding of measurement techniques. But in reality, it appears to be something much worse: The author is in denial over Newton's third law itself.

So, in one corner, we have the time-tested physical laws of Sir Isaac Newton; in the other corner, we have an anonymous proponent of fantasy rocketry.

A closer reading supports this conclusion. In the next force diagram, the thrust force and the resulting Newton's third law force pushing the rocket the other way again "cancel." This is like a freshman-level introductory physics homework solution that earns 1/10 credit simply for not leaving the paper blank. It's not a small error in calculation; it is an absurd misunderstanding of the entire underlying principle.

The paper then veers into a discussion of an unnecessarily complicated measurement, followed by an overly simplified discussion of thermal expansion. This portion of the report would be entirely gratuitous if the authors were not demanding an unnecessary method of measurement.

Thus, the paper presents no additional evidence to believe in the success of a miraculous rocket thruster. By comparison, claiming that we could power the Earth using rainbows and unicorn tears looks far more credible.

(AP photo)

Man Is Sabotaging His Best Friend. It's Time to Rethink How We Breed Dogs.

A hundred and fifty years ago, Bulldogs were much different. They were slender-legged, with longer snouts and livelier demeanors.

Today, it's nearly impossible for a purebred Bulldog to reproduce without assistance. Most have to be artificially inseminated, since the female cannot support the male's weight during mating. 80% of Bulldog puppies are delivered by caesarean section; their signature broad, flat heads are simply too large for the birth canal. Many Bulldogs also struggle with breathing problems on account of their stubby noses, and 71.6% have hip dysplasia, an abnormal formation of the hip socket that often results in painful arthritis or even lameness.

All of that, to get this:

The Bulldog isn't the only breed of canine facing health issues. Up to one-third of Cavalier King Charles Spaniels have a skull that's simply too small for their brains. In a 2009 BBC One documentary, Veterinary Neurologist Dr. Claire Rusbridge described the brain as a "size 10 foot that's been shoved into a size 6 shoe." 38.5% of Boxers die from cancer. The ears of Basset Hounds are so long that, as puppies, they trip over them and accidentally chew on them while eating.

The blame for all of these health problems lies squarely with the leash holder. Through years of highly questionable breeding practices, humans have bred the genetic diversity out of a great many dogs, almost entirely for cosmetic reasons. As dogs lose their hybrid vigor, deleterious genetic traits seep in and grow commonplace. Man is inadvertently sabotaging his best friend.

A study published in 2008 in the journal Genetics drove this point home. Examining breeds like Boxers, Chows, Bulldogs, Golden Retrievers, Greyhounds, and Labradors, the research team discovered that in just six generations of breeding, the dogs had lost 90% of their genetic variation. Nine of the ten breeds studied were extremely inbred. The only one that wasn't was the Greyhound.

In 2013, scientists from the University of California-Davis examined the prevalence of 24 genetic disorders, including various types of cancers, orthopedic problems, allergies, cataracts, and epilepsy, in 90,000 mixed- and pure-bred dogs. They found that 13 of the 24 disorders occurred at comparable levels in each group, ten were found more frequently among purebred dogs, and only one was more common among mixed-breeds.

A more recent study found that inbreeding reduced longevity and reduced litter size in seven French dog breeds.

Dr. Patricia McConnell, an applied animal behaviorist, dog expert, and professor at the University of Wisconsin-Madison, is not a fan of this unfortunate situation.

"Heaven help me, because I know I’ll take flack for this, but as a biologist and a dog lover, I just have to comment that there is something terribly wrong with the way we are defining 'pure bred' dogs now," she wrote.

She has some ideas about how to fix our pooches' predicament. One of them is to do away with the dogmatic, idealized notion of "purebred."

"Insisting on 100% “purity” of blood lines is relatively new: It was common in the past, less than 100 years ago, to mix and match lineages and breeds to combine desired traits and keep the lines healthy. The idea of bringing in new genetics, if necessary, was considered to be a good thing, not something that would destroy the breed."

Another is to take a science-based approach to purebred breeding. One of the biggest problems is the popular sire phenomenon, in which a handful of prized male dogs are mated to a host of females. This practice should be limited. Genetic testing is also becoming increasingly available and affordable. The best breeders, like the Seeing Eye in Morristown, New Jersey, keep a genetic database for all their dogs and actively monitor and enhance the gene pool.

Of course, if dog breeders are going to enact needed changes, they need to be pressured by consumers. Dog buyers are frequently unable to see past the floppy ears and big eyes of adorable puppies to the underlying health of the animals. Potential owners of purebred dogs must educate themselves on genetics and purchase only from responsible breeders.

(Images: Public Domain, Quizillafreak)

The Beautiful Math Inside All Living Things

Watching a cell divide is mesmerizing.

But it gets even more spellbinding when you realize there's a mathematical pattern underneath it. Over many divisions, the generations of cells reproduce mathematician Blaise Pascal's legendary triangle, in which the two numbers directly above add up to the one below them.

At first there is one cell: A. Then there are two cells: A and B. At the next division, the A cell again yields an A and a B, while the B cell from the first division yields a B and a new type of cell, which we'll label "C." So there are now four cells: 1 "A," 2 "Bs," and 1 "C." The pattern continues. At the next division there will be 1 A, 3 Bs, 3 Cs, and 1 D. And so on and so forth.
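Here is a minimal sketch of that bookkeeping in Python, assuming the simple rule described above: at each division, every cell keeps its own type and spawns one cell of the next type. The tallies that fall out are exactly the rows of Pascal's triangle.

```python
from collections import Counter

# Start with a single "A" cell (type 0); each division, every cell of type i
# persists and also produces one cell of type i + 1.
cells = Counter({0: 1})

for division in range(5):
    counts = [cells[t] for t in sorted(cells)]
    labels = "".join(chr(ord("A") + t) for t in sorted(cells))
    print(f"division {division}: {counts}  (types {labels})")

    next_generation = Counter()
    for cell_type, count in cells.items():
        next_generation[cell_type] += count       # the existing type carries on
        next_generation[cell_type + 1] += count   # ...and spawns the next type
    cells = next_generation

# Prints [1], [1, 1], [1, 2, 1], [1, 3, 3, 1], [1, 4, 6, 4, 1] -- Pascal's triangle
```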

Mathematical patterns manifest across all corners of life. Sometimes they're broad and sweeping, like Swiss biologist Max Kleiber's Law. It states that an organism's basal metabolic rate -- the amount of energy it consumes at rest -- is roughly proportional to its mass raised to the three-quarters power. Kleiber's Law has also been found to apply roughly to lifespan in both plants and animals! In general, the larger an organism, the longer it lives. There are many exceptions to Kleiber's Law, prompting some biologists to take issue with its generalized nature. Nevertheless, the notion definitely has traction, receiving thousands of citations in the scientific, peer-reviewed literature.
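To see what that three-quarters exponent implies, here's a small illustrative sketch; the body masses are ballpark figures I've plugged in, not data from Kleiber's work.

```python
# Kleiber's law: basal metabolic rate scales roughly as mass^(3/4)
# Ballpark masses in kilograms, chosen purely for illustration
animals = {"mouse": 0.03, "cat": 4.0, "human": 70.0, "elephant": 5000.0}

reference_mass = animals["mouse"]
for name, mass in animals.items():
    mass_ratio = mass / reference_mass
    metabolic_ratio = mass_ratio ** 0.75
    print(f"{name:>9}: {mass_ratio:>10,.0f}x the mass, "
          f"~{metabolic_ratio:>7,.0f}x the resting energy use")

# An elephant ~167,000 times heavier than a mouse burns only ~8,000 times
# the energy at rest
```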

In some places, mathematics is less observational and more integral. Fractals, self-similar branching figures, can be found in all sorts of places: "leaves, gills, lungs, guts, kidneys, chloroplasts, and mitochondria, the whole-organism branching architectures of trees, sponges, hydrozoans, and crinoids, and the treelike networks of diverse respiratory and circulatory systems," physicist Geoffrey West noted.

(Image: http://user.engineering.uiowa.edu/~yoyin/index_files/breathingLung.gif)

But why? A prominent explanation, which I previously wrote about, is that fractal-like shapes are fantastic at maximizing surface area. This extra area allows nutrients and gases (like oxygen in your lungs) to be transported more efficiently within and throughout biological entities and structures.

For most evolved life, efficiency is everything. It is in pursuit of this perfection that some of nature's most astounding patterns have arisen. Ever count the petals of a flower or the spirals of a pinecone? Each will almost always* be a number from the Fibonacci Sequence, in which the previous two numbers add up to the next: 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, etc.

At first, this is mind-boggling. Why would Nature do this? But as YouTube educator Vi Hart pointed out, the reason is beautifully simple. Plants want to maximize the amount of sunlight they receive, so logically, each petal should never completely block another out. Thus, successive petals are placed according to an irrational "Golden" ratio, 1.61803..., which ratios of consecutive Fibonacci numbers closely mimic!
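You can watch the Fibonacci Sequence sneak up on that ratio with a few lines of Python:

```python
# Ratios of consecutive Fibonacci numbers converge on the golden ratio (~1.61803)
a, b = 1, 1
for _ in range(12):
    a, b = b, a + b
    print(f"{b}/{a} = {b / a:.5f}")
# The ratio settles at 1.61803..., the irrational proportion behind petal placement
```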

"This design provides the best physical accommodation for the number of branches, while maximizing sun exposure," the University of Georgia's Nikhat Parveen described.

The Golden Ratio also appears on you! Various proportions of the human body -- on our face, fingers, and arms -- roughly average out to 1.61803. How quaint: humanity is inherently irrational!

*Correction 1/19: An earlier version of this article made it seem like the Fibonacci numbers were a constant in plants. They are not constant, just extremely common.

(Images: Flickr/Didier.bier, NikonU, Youbing Yin, UGA)

That's Not Food. That's a Disease!

Cancer cells or coffee beans?

Thankfully, the Brenner tumor cells shown above are often benign. Scientists spot this rare ovarian tumor by the characteristic coffee bean nuclei of its cells. Afterwards, they grind the cells up, blend them with hot water, and guzzle them down with an apple danish. (Not really.)

The example illustrates one way food analogies crop up in medical terminology. It's not the only one. There are dozens, in fact, if not hundreds!

Chocolate-colored blood occurs due to high concentrations of methemoglobin, not an infusion of cocoa. Blueberry muffin rash is a discoloration of a baby's skin resulting from blood formation outside of bone marrow. Disordered metabolism of an amino acid called methionine causes urine to smell like an oasthouse, a building where hops are dried. When the abdominal wall muscles are absent or underdeveloped, often due to a genetic birth defect, the stomach is described as a prune belly. Maple syrup urine might sound tasty, but you don't want to squirt it on your pancakes.

Gwinyai Masukume, a public health researcher at the University of the Witwatersrand, and Lisa Kipersztok, a fourth-year medical student at Tufts University, recently described 38 food-related metaphors in a delectable, pun-laden paper published to the Malta Medical Journal.

"These terms have utility in broadening differential diagnoses when confronted with symptoms and signs of disease, and are easily memorable for rapid recognition in classic cases," they noted. "Food-related eponyms continue to be taught in medical school classrooms and physical diagnosis courses, and are likely to be retained in the ‘visual specialties’ of radiology and pathology."

"We follow in the footsteps of hungry giants," Masukume humbly wrote in Improbable. "In the late 1970s, Terry and Hanchard in their seminal paper, titled 'Gastrology: the use of culinary terms in medicine', appearing in the British Medical Journal, offered the real first course of food-related medical terms in medical literature.

Indeed, in their paper, Terry and Hanchard reviewed the scientific literature and noted 121 allusions to food and beverage items, a veritable "harvest." Within the cornucopia were references to strawberry tongues, spinach stools, onion skin, chicken fat clots, teapot stomachs, and honeycomb lungs, among other sumptuous offerings.

However, the duo warned that "Any new attempts to introduce culinary terminology in medicine should be done with caution and the terminology should be kept simple rather than exotic."

Got that, future doctors? Even though that newly discovered bacterium may resemble a rambutan or a jabuticaba, for sanity's sake, do not name it after a rambutan or a jabuticaba.

H/T Improbable

(Images: Nephron, Lisa Kipersztok)

The Biggest Myth About Debunking Myths

Sugar doesn't make you hyper. A penny dropped from the top of the Empire State Building won't kill you. We don't just have five senses. Napoleon wasn't short. Caffeine doesn't dehydrate you. The Great Wall of China is not visible from space.

Everything you know isn't wrong. But a lot of it is.

Myths are everywhere: on the Internet, at your work, in your head. Even worse, they're difficult to dislodge. Psychologist Stephan Lewandowsky at the University of Bristol has made a career out of trying to loosen their extensive hold. After decades of effort, he's learned a lot. His biggest discovery, perhaps, is that attempting to debunk a myth can backfire, resulting in the myth being strengthened instead of removed.

He also disproved a huge myth about debunking myths: "that removing [a myth's] influence is as simple as packing more information into people's heads."

It's simply not that easy.

"This approach assumes that public misperceptions are due to a lack of knowledge and that the solution is more information - in science communication, it’s known as the 'information deficit model'," Lewandowsky wrote in The Debunking Handbook. "But that model is wrong: people don’t process information as simply as a hard drive downloading data."

Often, the problem is not a lack of information, but too much of it. With such a bounty now available, of both credible and dubious origins, people can pick the "facts" that fit their preferred ideology or worldview. How is the layperson to tell truth from untruth?

Lewandowsky has some recommendations.

"To successfully impart knowledge, communicators need to understand how people process information, how they modify their existing knowledge and how worldviews affect their ability to think rationally. It’s not just what people think that matters, but how they think," he says.

To the human mind, facts are minutiae. What matters most is the overarching narrative. For a single fact or even a group of facts to topple a mindset is an immense task, like David facing off against Goliath... if Goliath was twice as tall and encased in graphene body armor.

So to help dispel a myth, use these three steps. First, emphasize the core facts of the topic without even mentioning the misinformation. Take the 10% brain myth, for example. Simply say, "Humans make complete use of our brains; this is clearly demonstrated with brain scan technology." Second, state the myth, but preface it with an explicit warning. "There is a lot of pervasive misinformation about the brain. For example, 65% of the public falsely believes that we only use 10% of our brainpower." Third, present an alternative explanation for why the myth is wrong. Neuroscientist Barry Beyerstein can help here: "Brain cells that are not used have a tendency to degenerate. Hence if 90% of the brain were inactive, autopsy of adult brains would reveal large-scale degeneration." Yet we don't see this.

One myth down, a multitude to go.

Source: The Debunking Handbook, by Stephan Lewandowsky and John Cook

(Image: Shutterstock)

Are Tasers Torture?

Ever wonder what it's like to be Tased? Fox News' Mike Straka was curious. So he gave it a try.

"It hurt like a bitch." he candidly reported.

"I felt my body stiffen like a board, and lost all motor skills; however, I was completely cognizant of what was going on. It was the strangest feeling in the world."

A Taser is a conducted electrical weapon, firing two dart-like electrodes, which remain connected to the main unit via conductors, to deliver electrical current that disrupts muscle control. According to Taser International, the company that manufactures them, Tasers are used by over 17,000 law enforcement agencies, and are deployed 904 times each day.

As you might've heard, they're also a tad controversial.

Jude McCulloch, a professor of criminology at Monash University in Australia, contends that Tasers are not the "non-lethal" alternatives to deadly firearms that they've been made out to be. She cites numerous cases of Taser-related deaths and argues that -- in the hands of police -- Tasers are "inherently open to abuse" and the use of excessive force.

Recently published results from a yearlong investigation into the use of Tasers by Miami Police, Miami-Dade Police, and Miami Beach Police seem to corroborate that narrative. The Miami New Times found that cops used Tasers against the homeless in order to make them move, against the mentally ill when they couldn't understand instructions, and even once against an unruly six-year-old. All told, over the 8-year period examined, 11 men died in the wake of being Tased.

"The reality is simple. Tasers are not suitable police tools in a democratic society," Professor McCulloch wrote.

Is she correct?

Dr. James Jauchem, an expert in conducted electrical weapons and a former Senior Research Physiologist from the U.S. Air Force’s Directed Energy Division, believes that's a matter best left to policymakers. But he also thinks that both opponents and proponents of Tasers should at least get the science right.

In a recent review article published in the journal Forensic Science, Medicine, and Pathology, Jauchem examined a great many misconceptions about Tasers and their use. One of the biggest, he says, is that Tasers have been causally associated with death. That's a matter of interpretation. Taser-related deaths are extremely rare and overwhelmingly occur in subjects with pre-existing cardiovascular disease or under the influence of illicit drugs. In those cases, the mode of death is usually listed as an overdose or heart condition. It's unknown how a Taser may or may not have contributed.

The most rigorous study yet conducted on Taser use indicates that they are relatively safe. In 2007, researchers from Wake Forest University examined 1,000 instances of Taser use and found that in 997 of them there were no injuries, or only mild injuries that did not require hospitalization. Of the three instances that did require hospitalization, two were tied to the fall resulting from the Taser shot and the third was unverified.

Another misconception is that Tasers reduce cognitive ability in those hit by them. In fact, all uses of force cause a small, temporary drop in cognitive functioning. The Taser is no different. This is likely the result of a brief, highly stressful stimulus engaging the human fight-or-flight response.

Critics also commonly state that Tasers "electrocute" people with "50,000 Volts." In fact, a Taser only delivers 1,200 Volts to the body, at an amperage that's one-tenth that of a strong electric shock.

At the more extreme end of misconceptions, critics have contended that the use of a Taser constitutes torture. But the United Nations Committee against Torture defines the act as only occurring when "severe pain or suffering... is intentionally inflicted on a person for such purposes as obtaining... information or a confession, punishing... or intimidating or coercing..." If law enforcement officials use Tasers as intended, it's clearly not torture.

But are Tasers being used by law enforcement as intended? That's the key question. Critical stories in the media make plain that they occasionally aren't, but misuse doesn't seem to be the norm.

"In general, the use of conducted electrical weapons is not usually associated with a greater number of liability claims of excessive force against police departments, at least in the U.S.," Jauchem noted.

So, for now at least, the use of Tasers will continue relatively unfettered.

How Chemistry Overcame Its Greatest Blunder

By the waning days of summer 1666, London had been mired in drought for ten months. The wooden beams that comprised most of the city's structures were bone dry. So it was no surprise when, on Sunday, September 2nd, a fire that broke out in a bakery quickly spread to nearby buildings. For four days, fires raged, consuming houses and shops like they were dry leaves. By the time the flames finally died down, 60% of the city had been razed.

The devastating event raised some glaring, unanswered questions: "What is fire? And what causes it?" Scientists still didn't know. Just a year later, German alchemist Johann Joachim Becher provided an answer. He proposed that a specific substance called terra pinguis could be found within all inflammable materials. It was this substance that burned, he claimed. In 1703, a student of Becher's, German chemist Georg Ernst Stahl, renamed terra pinguis to phlogiston. Phlogiston, he suggested, was odorless, colorless, tasteless, and largely weightless. When a substance containing phlogiston burns, the phlogiston is simply released into the air, where it becomes so diffuse that it can't ignite. Wood, charcoal, phosphorus, and sulfur must contain copious amounts of phlogiston, he reasoned.

Stahl, a professor at the University of Halle, likely proposed his theory with noble, truth-seeking intentions, but little did he know that he was inadvertently sending scientists up what physicist Jim Al-Khalili described as "one of the greatest blind alleys in the history of chemistry.”

Phlogiston did not exist.

The theory of phlogiston propagated slowly at first, but by the time Stahl passed away in 1734, the vast majority of Europe's most influential chemists accepted it. That wouldn't change for fifty years.

"The initial success of the phlogiston theory was its being the first consistent general theory that tried to explain chemical reactions in general and combustion in particular, as well as being a broad conceptual scheme into which could be fitted most of the chemical phenomena known in the eighteenth century," Ben-Gurion University's Jaime Wisniak explained.

It also helped that phlogiston was so ephemeral. Chemists' tools were relatively limited back then, making it difficult to disprove the existence of a gas that's "odorless, colorless, tasteless, and largely weightless."

There was an inescapable piece of evidence contradicting the theory of phlogiston, however. When certain substances, like magnesium or tin, burned, they got heavier! If phlogiston were released during burning, their masses either shouldn't change or should decrease! Phlogiston proponents temporarily weaseled their way out of this issue with flawed logic, insisting that, as a near weightless or even "negative-mass" substance, phlogiston would buoy the compounds in which it resides, making them lighter.

It wasn't until the mid-1770s that scientists would properly start to scrutinize phlogiston. One scientist in particular, Antoine Lavoisier, performed the brunt of the debunking. In a key experiment, he heated mercury inside a sealed container, finding that it turned into another substance, which we now know as mercury oxide. He then measured the amount of air taken in by the compound. Afterwards, he heated the mercury oxide and measured the amount of air released. He found that the two amounts were identical. He concluded that the substance being taken in and released couldn't be phlogiston; it must be a new element. Indeed it was. It was oxygen. Lavoisier would spend years gathering a bounty of evidence to support a new theory of combustion, and in 1783, he mounted a full scale attack on the theory of phlogiston, asserting that it was time "to lead chemistry back to a stricter way of thinking" and "to distinguish what is fact and observation from what is system and hypothesis." It took ten years, but Lavoisier's notion of combustion won out, supplanting phlogiston for good.

Today, with well over 100 elements on the periodic table and an advanced world largely forged through mixing them, it's difficult to look back on the time and effort lost in the fruitless search for phlogiston with too much regret. In the end, chemistry overcame its tall hurdle, and science as a whole was stronger because of it.

(Image: Shutterstock)

When an Avalanche of Trash Killed 143 People

IN THE EARLY morning of February 21st, 2005, residents living in the shadow of the massive Leuwigajah dumpsite near Bandung, Indonesia, awoke to the sound of muffled blasts -- one, two, three of them -- in rapid succession. The blasts heralded a new sound, akin to an oncoming freight train. On it came, deafening, unceasing. For many of those people, it was the last sound they would ever hear.

In a torrent of flame, refuse, and melted plastic, 143 people and 71 houses were buried in a matter of seconds, crushed under an avalanche of trash that, when settled, extended 1000 meters and rose over thirty feet high. In a painful piece of irony, almost all those who died were scavengers who made a living by scouring the gigantic dumpsite for anything of value. They lost their lives to that which had previously sustained them.

The waste avalanche at Leuwigajah was not the first tragedy of its kind, nor was it the deadliest. That macabre distinction goes to the Payatas disaster, which occurred five years prior in the Philippines and claimed the lives of 218. The Leuwigajah incident was the largest, however -- over 2.7 million cubic meters of waste material was set in motion during the slide. More recently, it became the best studied, with a new, probing report just released in the journal Geoenvironmental Disasters.

IN THE DAYS, weeks, and months preceding the avalanche, the Leuwigajah dumpsite was receiving nearly 4,500 tons of solid waste per day. Though the site started off well managed when it was set up in 1987, it eventually fell into bureaucratic disarray. By 2000, three different authorities were depositing trash in a haphazard fashion. Each shirked responsibility.

As the mountain grew, smoke began emanating from the hulk, and fires sprang up at odd times. Decomposing waste produces methane, a flammable hydrocarbon gas that can ignite with just a spark. Franck Lavigne, a physical geographer at the University of Paris and lead author of the report, also suggests that oxygen may have started accumulating deep within the mountain of trash, creating a contained combustion zone. This localized supply of oxygen likely fueled the rise of aerobic bacteria, which then produced heat, along with toxic gases like carbon monoxide.

According to Lavigne, the build-up of gases likely transformed the dumpsite into a ticking time bomb that could be set off with the slightest prod. That prod eventually came, surprisingly, in the form of rain.

Here's what Lavigne thinks happened on that fateful February morning:

(1) Heavy rainfall on the dumpsite led to increased water percolation through the waste material.

(2) The percolating water reached the top of the combustion zone deep inside the waste mass.

(3) The water vaporized, raising the gas pressure at the base of the dump enough to lift part of it, and the release of the accumulated, pressurized gas produced an explosion-like sound.

(4) Together, the vaporization and the lifting drew fresh air into the combustion zone, setting off the second and probably the third explosion.

(5) As the destabilized mass began to slide, the fire propagated to the whole moving mass, helped by the significant amount of methane trapped in the waste material (numerous swirling flames were reported by eyewitnesses and were probably due to the gas trapped in the plastic bags).

(6) The combustion of each plastic bag within the moving mass produced gas that allowed the whole mass to expand. Once started, this mechanism, along with water vaporization, drove a self-maintaining expansion during transport. This self-maintaining expansion... accounts for the unusually high mobility of the moving mass and can also explain why the material was so deeply burned.

Lavigne also thinks it's possible that the added weight of the water alone could have triggered the avalanche, which subsequently released the trapped methane and carbon monoxide gases, creating the flaming inferno that accompanied the torrent of trash.

AFTER THE 2005 avalanche, the Leuwigajah dumpsite was abandoned and has since been reclaimed by natural vegetation. Greenery now conceals the scarred landscape. From far overhead, one would hardly guess that the area was the site of such a horrible tragedy.

Sadly, local officials seem to have forgotten, too.

"Most of the other open dump sites in Indonesia are still uncontrolled, and therefore constitute a significant hazard for many communities," Lavigne says.

In fact, four more slides have occurred at other dumpsites in Indonesia since the Leuwigajah waste avalanche, resulting in 33 deaths.

Lavigne urgently recommends that all uncontrolled dumpsites be converted into controlled landfills. People don't always consider how much or what they throw away, but it's vital that we manage that waste responsibly.

Source: Lavigne et al. "The 21 February 2005, catastrophic waste avalanche at Leuwigajah dumpsite, Bandung, Indonesia." Geoenvironmental Disasters 2014 1:10.

(Images: F. Lavigne, Google Earth)

Weird Diseases People Acquired from Animals


A recent issue of the CDC's Morbidity and Mortality Weekly Report told of two curious cases of infectious disease that were transmitted from animals to humans, also known as zoonotic infections.

The first was a real-life manifestation of the cliché "no good deed goes unpunished." A truck carrying 350 very young calves overturned in Kansas, killing many of them and injuring others. Fifteen emergency responders arrived to help. A few days later, six of them became severely ill with diarrhea, cramps, and vomiting. A subsequent epidemiological investigation revealed that they were infected with Cryptosporidium, a chlorine-resistant parasite transmitted via the fecal-oral route. Responders who carried injured calves or came into contact with their feces were far more likely than the other responders to become sick.

Cryptosporidiosis is a rather nasty disease. The diarrhea it causes can last for more than a week. In 1993, Cryptosporidium contaminated the Milwaukee water supply, sickening 403,000 residents and killing 69 of them. The public health disaster prompted The Onion, which was founded in nearby Madison, to create a T-shirt in commemoration.

The second case the CDC described involved a child who was scratched by his pet rat. The child began vomiting and developed a fever. His doctor misdiagnosed him as having viral gastroenteritis, and two days later, he was dead. The subsequent investigation determined that one of his rats carried Streptobacillus moniliformis, the bacterium responsible for "rat-bite fever."

Oddly, most domesticated rats carry this bacterium, and according to the CDC, 0.1% of U.S. households have pet rats. Why this child became infected and died -- while so many other pet rat owners never even become infected -- is a bit of a mystery. A quirk of genetics, however, may be to blame. It is known, for instance, that a single mutation can double a person's chance of catching the flu. Perhaps the child carried a similar mutation that made him susceptible to this particular bacterium.

It should be noted that many of our pets can make us sick. Dog bites can transmit Capnocytophaga, and cats can transmit Bartonella (cat-scratch fever) and Toxoplasma. While our fluffy furballs are cute, always remember that they often possess noxious microbes.

Source: Centers for Disease Control and Prevention. "Outbreak of Cryptosporidiosis Among Responders to a Rollover of a Truck Carrying Calves — Kansas, April 2013." MMWR 63 (50): 1185-1188.

Source: Centers for Disease Control and Prevention. "Notes from the Field: Fatal Rat-Bite Fever in a Child — San Diego County, California, 2013." MMWR 63 (50): 1210-1211.

(AP photo)

The Science of the Booty Call

It's New Year's Eve. So make those resolutions (even though they'll likely fail) and celebrate like it's almost 2015!

Today is one of the top drinking days of the year, which also probably makes it one of the most popular days for booty calls. Unfamiliar with the booty call? It's a casual, brief sexual relationship arranged through text or phone call, often between friends or former partners. While the timeless one-night stand is a chance occurrence hatched with an acquaintance, the booty call involves a bit more forethought and emotion. It's a novel, no-strings-attached sexual encounter for a technology-enabled generation. Around half of college-aged individuals have been involved in one.

In the prestigious annals of sexual research, there have been just three studies examining the booty call, and University of Western Sydney psychologist Peter Jonason has conducted every single one. An up-and-comer in the world of psychology, Jonason recently was awarded an Ig Nobel Prize for research showing that night owls are more likely to be narcissistic, psychopathic, and anti-social.

Fundamentally, the booty call is a compromise, Jonason says.

"For men who engage in this type of relationship, a booty call offers sexual access at a low, although not minimal, cost. For women, a booty call relationship offers more affection than a one-night stand."

In 2009, Jonason found that female college students at the University of Texas reported receiving roughly six booty call requests in the past year. Males only reported receiving one. Phone calls were the most prevalent mode of solicitation, followed by text messages. At the time, Jonason predicted that text requests would soon eclipse phone calls as the primary method.

"Text-based communication may allow individuals to send out multiple booty call requests at once, thus increasing their odds of successfully finding one," he sagely predicted.

Also in 2009, Jonason asked students at the University of New Mexico to explain their thought processes in accepting or rejecting a booty call. Physical attractiveness and timing were key for both men and women, but especially for men. Men were generally receptive to booty call requests, citing availability as the biggest factor in accepting or declining. Women were more selective. They were more likely to accept a booty call from a friend, and more likely to reject one if the solicitor was known to be promiscuous or arrogant.

The findings, Jonason said, supported the idea that the booty call is a compromise between stereotypical mating strategies. Men viewed booty calls as a relatively easy way to have sex, whereas women were interested in both sexual gratification and the potential of finding a long-term mate. Placed on a continuum running from purely short-term to purely long-term mating, the booty call would sit somewhere in the middle.

"The booty call may be a modern-day by-product of not only the conflict between the genders created by different evolved psychologies, but also the availability of modern communication technologies with which to develop and solicit repeated sexual relations," Jonason wrote.

By extension, one has to wonder whether the technological ease of arranging sexual encounters is contributing to American adults' newfound single lifestyle. In August, the Bureau of Labor Statistics reported that unmarried adults outnumbered married adults for the first time since data began being collected in 1976.

(Images: AP, Jonason, Li, & Cason)

Are Conservatives Really Conspiracy Whackjobs?

From Birtherism to Benghazi, left-leaning media sites love to accuse conservatives of believing conspiracy theories at far higher rates than liberals. But does the data bear this claim out?

According to a 2013 poll by Public Policy Polling (D), on many issues, it does. Conservatives are more likely to believe that global warming is a hoax, that elites are seeking to form a New World Order, that Obama is the Anti-Christ, that Obama is trying to stay in office after 2017, that Muslims want to implement Sharia Law in the United States, and that Big Pharma creates diseases so they can profit off of making drugs to combat them. They were even ever-so-slightly more inclined to believe that vaccines cause autism, a belief typically associated with the Left.

Now, this is just one poll, so what are we to make of the data? Are conservatives really more conspiratorial? Mother Jones, Salon, MSNBC, and Huffington Post would undoubtedly agree. Two of the foremost experts on conspiracy theories aren't so sure, however.

University of Miami political scientists Joseph Uscinski and Joseph Parent recently conducted an unprecedented analysis into conspiratorial thinking, publishing their results in a book: American Conspiracy Theories. From a large Internet survey with thousands of participants, they found that predisposition to conspiratorial thinking is fairly even across the liberal-conservative spectrum. It also extends across party lines.

"Those ranking high on the conspiracy dimension range between 24 and 40 percent on the Democratic side and between 27 and 33 percent on the Republican side." they reported.

Uscinski and Parent also asked about a hypothetical situation involving voter fraud.

"When we asked respondents how likely voter fraud would have been involved if their preferred presidential candidate did not win the election, 50 percent of Republicans said it would be very or somewhat likely, compared to 34 percent who said it would be very or somewhat unlikely. For Democrats 44 percent said fraud would have been likely compared to 37 percent who said it would be unlikely. Again, it is about a tie."

It seems that when the task of gauging conspiratorial thinking is broken down to a basic level and stripped of bias, conservatives aren't the "anti-intellectual" and "tribal" beings whom Richard Hofstadter, a respected historian, described.

"We believe that the notion of asymmetry has persisted because academics and journalists align largely with the left," Uscinski and Parent explain. "This pushes these two institutions to disproportionately dwell on conspiracy theories held by the right but overlook conspiracy theories closer to home... In political science at least, much of the study of conspiratorial beliefs has focused on conspiracy theories accusing actors on the left, especially since Obama's election in 2008."

"The cumulative effect is that our knowledge-generating and knowledge-disseminating institutions make the right look chock-full of cranks and the left look sensible and savvy. There is no conspiracy here; ideology drives the worldviews of professors and journalists like it does everyone else."

(Image: AP)

The Surprising Most-Cited Physics Research of All Time

Recently, the top-dog science journal Nature published a list of the 100 most-cited scientific papers of all time. How is a study's standing measured? Papers on Nature's list are ranked by citation count: the number of times other peer-reviewed manuscripts have referenced them.

The list categorizes papers by field. The very top entrants are mostly classified as "Biology Lab Technique." Honored manuscripts include bioengineering breakthroughs, papers on statistical methods for data analysis, and materials science work on simulating and measuring crystals. No astronomy or math papers crack the list.

What was the top physics paper? Maybe you'd guess Einstein's famous work describing absorption and emission of light in photon quanta. Or maybe his manuscript on Brownian motion, or his work on relativity. Perhaps a paper on black holes, on nearing absolute zero, or uncertainty in quantum mechanics.

Surprisingly, it's carbon nanotubes. (The third-highest physics paper is about graphene, which is essentially an unrolled nanotube.) Why would these tiny structures account for the biggest physics papers of all time? The answer reveals how the inner workings of peer-reviewed research shape the notoriety and perceived importance of discoveries.

Academic researchers grade their papers by how many citations those papers receive in other published work. Citation count has its flaws as a metric, but nearly every researcher uses it as a first judgment of someone else's work, and the career fate of most academics rests largely on the citation counts their papers accumulate. That's not to say the metric has no merit. In a field outside my own, I might look at a paper's figures, but if it has many citations, my initial guess will be that it's an important work.
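For readers unfamiliar with the metric: a citation count is nothing more exotic than a tally of how many reference lists a paper appears in. A minimal sketch, with made-up paper names purely for illustration:

```python
from collections import Counter

# Hypothetical papers mapped to the works their reference lists cite.
reference_lists = {
    "Paper A": ["Iijima 1991", "Einstein 1905"],
    "Paper B": ["Iijima 1991"],
    "Paper C": ["Iijima 1991", "Einstein 1905", "Paper A"],
}

# A work's citation count is how many times it shows up across those lists.
citation_counts = Counter(
    cited for refs in reference_lists.values() for cited in refs
)

print(citation_counts.most_common())
# [('Iijima 1991', 3), ('Einstein 1905', 2), ('Paper A', 1)]
```

Real bibliometric databases layer on deduplication and other refinements, but the core of the metric really is this simple.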

Back to the original question. Why nanotubes? This particular paper, which I have cited myself, is something of an anomaly in modern research. Several physicists around the world noticed nanotube-like structures in their microscope images in the late 20th century; these were generally reported as "graphitic fibers." Soviet physicists published them in Russian, in a journal that was nearly impossible to access outside the USSR. Japanese physicist Sumio Iijima, working at NEC Corporation, saw the structures in 1991 and published the result in Nature -- right at the time that nanotechnology was gathering buzz as a field.

Unlike most cases, where a discovery is anticipated and built toward bit by bit over years by many researchers, Iijima published his result alone, in a single paper. Thanks to the obscurity of the earlier reports and the prominence of this singular one, a huge number of subsequent nanotube papers cite Iijima as the originator of the field. Very few other discoveries have had such a sudden, high-profile coming out. But Einstein's works were singular too. Why aren't they cited as heavily? Because a work's specific applicability, and the maturity of its field, shape citation counts as well.

Theories of spacetime, the quantum nature of the universe, dark matter, and so forth are intellectually exciting and important, but they are less immediately applicable. A method for reading the DNA code is immediately useful to thousands of biological and medical studies and millions of procedures; black holes are not.

Often, the physics discoveries that prove the most immediately applicable to the rest of the world are unglamorous reports of new materials. Materials science and condensed matter physics (the physics of solids) are the biggest fields in physics. They receive the most money, produce the most immediately applicable research, and perhaps improve the world the most directly. They shape the advanced technology that will be ours decades from now.

Also, Einstein published his work at a time when practicing career physicists numbered a tiny fraction of today's total. The explosion in academic research over the past century means that every field has far more researchers publishing far more material, so any comparable work published today will receive more citations simply because the research universe has expanded. Further, after a while even the most groundbreaking advances of the past become common assumptions that no longer require citation. Einstein, Feynman, Dirac, and others entirely reshaped physics, but their discoveries are taken as basic fact today, rarely needing attribution.

So, while we marvel at theories of the cosmos, the academic citation record points to another angle: the best science is often the science with the most immediate use in other research and in applications out in the world. The most cited physics research of all time is precisely that type of work.

(Photo: GigaOm)