This year, three National Football League players -- Adrian Peterson, Ray Rice, and Greg Hardy -- have either admitted to or been convicted of domestic violence. Their stories coalesced into a storm this past week with the release of a damning new video of Ray Rice punching his wife (then fiancée) and the indictment of Adrian Peterson, arguably the NFL's best running back, for child abuse.
The media onslaught of updates, analysis, and opinion on what has been called the National Football League's "worst week ever" leaves a distinct impression: the NFL is a league stocked full of criminals.
Evidence, however, doesn't bear that out.
Back in 1999, leading criminologist Alfred Blumstein teamed up with author Jeff Benedict, who has written five books focused on crime and athletics, to compare rates of criminal violence among NFL players with those of the general population. Controlling for age, they found that the annual rate of assault and domestic violence among NFL players was less than half that of the general population.
But Blumstein and Benedict's analysis is fifteen years old. Perhaps things have changed since then?
It doesn't appear they have. Back in July, FiveThirtyEight's Benjamin Morris tallied up the incidents in USA Today's NFL Arrests Database to discern crime rates among NFL players. He then compared those numbers to the national averages among 25-29 year olds, and found the rate of domestic violence in the NFL to be 55.4% that of the general population. And the overall crime rate was a mere 13% of the national average.
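Morris's comparison boils down to simple rate arithmetic: annualize arrests per person, normalize to a common denominator, and divide by the national figure. Here is a minimal sketch of that calculation, using hypothetical placeholder counts (not the actual USA Today or national arrest figures):

```python
# Sketch of how a relative crime rate like Morris's is computed.
# The counts below are hypothetical placeholders for illustration only.

def rate_per_100k(incidents: int, population: int, years: float) -> float:
    """Annualized incidents per 100,000 people."""
    return incidents / (population * years) * 100_000

# Hypothetical inputs: arrests among ~1,700 active NFL players over 14 seasons
nfl_rate = rate_per_100k(incidents=700, population=1700, years=14)

# Hypothetical national rate for the comparison age group, per 100,000
national_rate = 4000.0

relative_rate = nfl_rate / national_rate
print(f"NFL rate: {nfl_rate:.0f} per 100k, or {relative_rate:.0%} of national")
```

With real arrest counts and person-year denominators plugged in, the same division yields figures like the 55.4% and 13% ratios cited above.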
So why then do 69% of Americans believe that the NFL suffers a "widespread epidemic of domestic violence problems"? The answer is rooted in how we think. Humans are prone to rely on examples and experiences that can be easily recalled. The idea is that if we can remember it, it must be important. This mental shortcut is termed the availability heuristic. A key drawback of the heuristic is that it leads us to overestimate the prevalence of memorable events. Here, you can legitimately blame popular media. Because plane crashes are widely covered, many erroneously view flying as more dangerous than driving. Thanks to Shark Week, people are warier of sharks than deer. Because 91% of people have seen, read, or heard something about Ray Rice's domestic violence, they overestimate the problem of domestic violence in the NFL.
That's not to say that domestic violence isn't a problem in the NFL. By type of crime, domestic violence is the closest the NFL comes to the national average. Moreover, Morris noted that NFL players do seem to commit acts of domestic violence at a higher rate than individuals with a similar socioeconomic status, though a direct comparison wasn't available.
As public figures, football players must hold themselves to higher standards, and be punished appropriately when they fail to meet them. But more fundamentally, as human beings, they need to recognize that unprovoked violence against others, particularly those not able to defend themselves, is utterly reprehensible.
Every man comfortable enough with his masculinity to squeeze into performance-enhancing lycra athletic body wear has sooner or later confronted the next frontier: The question of whether to shave his legs!
Aside from perhaps staunch feminists, female athletes don't face this social conundrum. Among male athletes, the otherwise socially uncool body-smoothing "manscaping" has long been a tradition. Cyclists will tell you that it might improve recovery from road rash or make leg massages better. They'll also admit that it's fashion. Tough, big-deal bike riders do it. Nobody races Le Tour de France with hairy legs. Male shaving is a form of machismo.
It also has negligible performance benefits, or so the conventional wisdom goes. Now, new data from men on bikes in wind tunnels contradicts this view. Bicyclists were measured to move more quickly with shaved legs! In theory this makes sense. Generally, smooth surfaces are more aerodynamic than rough or uneven ones.
Aerodynamics is so important to cyclists because the practical limit of their speed is not their muscle power, but the aerodynamic drag of their ride: bike and body. Terminal velocity on level ground (on a properly geared ride) is determined by how cleanly the forward-facing shapes cut into the wind. The more carefully a surface cleaves oncoming air into parts without disturbing it into a chaotic turbulent mess, the faster the rider goes.
An everyday pleasure rider may hit 15 mph on a brisk ride, a commuter may cruise at 16-18, and a professional racer can hold speeds in the mid 20s. A rider on a bike with an extremely aerodynamic fairing like the nose of a rocket can reach speeds of more than 80 mph!
"Aero" has become a huge buzzword and selling point in the cycling industry. Most competitive races have actually banned certain bike designs for being too fast. Within a limited bicycle geometry range, the next gains to be made are those from the other half of the aerodynamics of the system: the rider himself. Riders often employ a hunched position, with the arms out and the head tucked down, to reduce aerodynamic profile. They may smooth even their natural body profiles with seamless skinsuits.
Here's where the hairy legs come in. Smooth legs should be slightly more aerodynamic than hairy or, heaven forbid, "stubbly" legs, right?
Previous tests said no, there was no measurable effect; leg-shaving is just machismo. This new test says otherwise. A cyclist going in for aero testing at the wind tunnel of bike manufacturer Specialized forgot to shave his legs first, and his test showed significantly higher drag. Surprised, he came back days later with legs as smooth as a baby's cheek and recorded a 7% gain in aerodynamic slipperiness!
It was a repeatable result, too. Several more cyclists tested in the same wind tunnel gained a similar aerodynamic advantage. A 7% reduction doesn't sound massive, but it can mean more than a minute saved in a one-hour race against the clock. That is a huge competitive advantage. A similar aerodynamic gain might require hundreds of dollars of specialized bike parts.
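The minute-per-hour figure follows from the cube relationship between power and speed. A rough sketch, assuming all of the rider's power goes to aerodynamic drag (rolling resistance and drivetrain losses are ignored, so this slightly overstates the gain):

```python
# Estimate what a 7% drag (CdA) reduction is worth in a time trial.
# At race speeds, power required is roughly P = 0.5 * rho * CdA * v^3,
# so at constant power, speed scales with CdA^(-1/3).

def time_saved_minutes(baseline_minutes: float, drag_reduction: float) -> float:
    """Minutes saved over a fixed course for a fractional CdA reduction."""
    speed_ratio = (1.0 - drag_reduction) ** (-1.0 / 3.0)  # v_new / v_old
    return baseline_minutes * (1.0 - 1.0 / speed_ratio)

saved = time_saved_minutes(baseline_minutes=60.0, drag_reduction=0.07)
print(f"~{saved:.1f} minutes saved in a 60-minute effort")
```

A 7% drag cut works out to roughly a 2.4% speed increase, which over an hour-long effort is on the order of a minute and a half -- consistent with the "more than a minute" claim.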
These results fly in the face of the last serious study of the subject, conducted in the 1980s. The big lesson: verifying previous results is really important. Also, men need to get to work with those razors. I'll no longer be making fun of you for your "macho fashion."
If eating were merely a matter of stuffing something in a hole, whether up top or down below, the answer would be "yes." But while insertion is certainly a key part of the eating process, it's only equivalent to walking through a room door. What happens inside is what matters most.
Let's first follow the path of a piece of pizza consumed the good old-fashioned way: through the mouth. Once chewed and swallowed, the food travels down the esophagus and into the stomach, where gastric juices begin to break down proteins. After roughly one or two hours, a valve opens and the pizza -- now liquefied and unrecognizable from its once-delicious form -- continues its journey, first through the duodenum -- basically the waiting room for the small intestine -- and eventually into the small intestine itself. Here is where the digestive heavy lifting occurs. Roughly 95% of all nutrient absorption occurs within the small intestine. The pizza's next stop is the large intestine -- or colon -- where water, minerals, and some vitamins are taken in. Lastly, what remains of the pizza arrives at the rectum, to be excreted at your convenience.
As recently as 1926, the medical community believed that process could be reversed to an extent. Doctors regarded rectal feeding as a legitimate method for sustaining patients unable to eat normally. It made some sense. After all, the rectum is much closer to the small intestine than the mouth is. Who's to say that food won't wind its way upwards?
Well, as research would elucidate, the digestive system is not a two-way street. Scientists examining cadavers found that the small intestine was unreachable from below. A 1926 study using live medical students as guinea pigs was even more conclusive. The researchers found that nutrient enemas were only good for hydration. Food would enter the colon via the rectum. Sit around for a bit. Then come right back out... smelling a whole lot worse.
Source: Gulp: Adventures on the Alimentary Canal, Mary Roach, 2013
World events have made it quite clear to most Americans that we should develop more of our own energy sources. Reducing our reliance on foreign oil by exploiting the natural gas under our feet is not only smart foreign policy but also smart environmental policy: Natural gas burns cleaner than coal or oil, and it has already lowered our CO2 emissions. Natural gas is a win for America and the planet.
But not according to anti-technology environmentalists, who have made all sorts of wild, unsubstantiated claims about the supposed harms of fracking. Three claims in particular are worth examining: (1) Fracking causes a dangerous leakage of methane into drinking water; (2) Fracking causes earthquakes; and (3) Fracking chemicals contaminate drinking water.
Claim #1 should be considered thoroughly debunked. The "documentary" Gasland, which depicted a guy lighting his tap water on fire, kickstarted the anti-fracking movement. The infamous scene, however, was built upon a lie: The methane in his tap water was due either to natural methane migration or to faulty well casings, not to fracking itself. And methane is neither toxic nor likely to cause your house to explode, so the note above the faucet, which read, "Do not drink this water," was nothing more than theatrics.
Even if basic chemistry and physics do not constitute sufficient evidence against Claim #1, then a new study in the journal PNAS should provide the final nail in the coffin. The researchers closely examined eight instances of drinking water contamination associated with the Marcellus and Barnett Shales. Their analysis reconfirmed the emerging consensus: Fracking itself does not cause methane to contaminate groundwater, but shoddy construction work can. Specifically, the researchers blamed leaky annulus cement and production casings.
Claim #2, that fracking causes earthquakes, is also misleading. Anti-fracking activists, including Rachel Maddow, have ignored research that suggests a nearby existing fault is necessary for fracking to trigger an earthquake. And as Bryan Walsh reported in TIME, the earthquakes are relatively minor and caused not by fracking itself but by the wastewater injection wells. (It should also be noted that injection wells are used for other things besides disposing of wastewater from fracking. These injection wells can also trigger earthquakes.)
Claim #3, that fracking contaminates drinking water with various chemicals, is the only one that might have legs. The EPA detected carcinogenic benzene in Wyoming groundwater, and other researchers found arsenic in Texas groundwater.
If it is true that fracking is responsible for various chemicals leaking into groundwater, then the next step should be to determine if the pollutants are at unsafe levels. If they are, then the government should tighten regulations. Alternatively, Mr. Walsh suggested that companies "work on ways to clean, recycle and reuse wastewater from wells, eliminating the need for the deep injection wells." That's a good idea. It would prevent both minor earthquakes and groundwater contamination.
The EPA is set to publish a comprehensive report on fracking, but it has been delayed until 2016. Until then, there will probably be a lot more fearmongering in need of nuance.
Source: Thomas H. Darrah, Avner Vengosh, Robert B. Jackson, Nathaniel R. Warner, and Robert J. Poreda. "Noble gases identify the mechanisms of fugitive gas contamination in drinking water wells overlying the Marcellus and Barnett Shales." PNAS. Published online before print: 15-Sept-2014. doi: 10.1073/pnas.1322107111
As long as money, fame and love are to be won, professional athletes will continue to swallow, snort, shoot up, huff and "suppositorize" strange chemicals. Word of the benefits of gulping down rare xenon gas has been out for years now. How does it work, and why do they do it?
First, a brief history of the pervasive nature of chemically enhanced cheating in sports. Let's start with the most grueling competition: Le Tour de France.
Cyclists in the first half of the 20th century eased their pain and boosted their stamina with numerous harsh substances: alcohol, rags soaked in ether, strychnine, and even cocaine. After World War II, amphetamines became prevalent in the sport until heart attack deaths during competition, such as Tom Simpson's death on the slopes of Mont Ventoux in the 1967 Tour, brought about bans and tests. Around this time, injected male hormones and steroids also began to be used. The 1980s and 1990s saw the rise of blood doping. Impossible performances were recorded and riders died. Men in their late 20s weighing less than 150 pounds had heart attacks in their sleep. Lance Armstrong, of course, was the most infamous of this generation of cheaters.
New technologies such as better synthetic hormones and the Pandora's box of genetic doping are now becoming possible. Cheaters always win for a while -- that is, until rule-makers catch on and athletes are pushed to find something new.
Most other professional sports have a similar legacy: An evolution of drug-taking to improve performance, counterattacked by rules to thwart it. Often a chemical is taken for many years before authorities even devise methods to test for it, let alone become aware of it. As this pharmaceutical arms race continues, the xenon gas technique has emerged. It relies upon the same fundamental biology as many other performance-enhancing drugs: It boosts oxygen levels in the bloodstream.
The most obvious method to pack more oxygen into blood is to directly inject concentrated red blood cells (called blood doping). This method was used with great success in the 1980s and 1990s before tests were implemented to sniff its telltale tracks. Populations of red blood cells in one body with different surface antigens (protein "fingerprints") indicate transfusion from one person to another. Reinjection of an athlete's own blood, drawn and stored weeks in advance, is harder to detect chemically. Careful recording of the blood content of an athlete over time can catch these activities as certain blood indicators spike abnormally (or reach levels naturally impossible for an adult human being). It makes cheating more difficult, but there are workarounds.
Dopers responded by taking medical research into a dark alley. They learned to inject the body with substances that cause the body's own natural red cell production to skyrocket. These chemicals, primarily erythropoietin (EPO), do not introduce new blood cells directly, so they must be caught by direct detection of the chemical itself or of the agents used to mask its presence. EPO is naturally produced by the kidneys to regulate red blood cell production in the bone marrow, but testing can distinguish the natural hormone from the synthetic version commonly injected. Using EPO and similar chemicals is now harder to get away with, and some athletes rely on microdosing, which reduces effectiveness.
This is where the huffing of gas comes in. Recent medical research has shown that breathing concentrated xenon, argon and possibly other noble gases (those on the far right column of the periodic table) triggers production of natural EPO in the body. These studies originally looked at xenon as a well-known anesthetic or as a treatment to alleviate lack of oxygenated blood flow to tissue, such as a kidney after injury.
The dark science of sports doping keeps very close tabs on medicinal research. As soon as it read that these benefits were the result of enhanced natural production of EPO, xenon began to appear in athletic competitions. As well as being too new to be understood and banned, it was easier to get away with: It is not uncommon to see athletes inhaling oxygen through a gas mask apparatus on the sidelines. Breathing in a mixture of oxygen and xenon acts almost immediately to trigger an increase in red blood cell production. As explained in this wonderfully thorough article, Russian companies who produce inhalable oxygen/xenon mixtures claim that the effects begin in minutes and last up to three days.
Last week, intentional inhalation of all gases that increase EPO production was officially banned by WADA, the biggest anti-doping authority in sports. However, there's no test yet. Catching an athlete in the act is difficult, but governing bodies say that they will be able to detect it soon.
How they will do this is not easy to guess. Xenon is very quickly eliminated from the blood; within a minute, half of it is gone. In an hour, the concentration in the bloodstream is probably indistinguishable from what is to be expected from breathing atmospheric xenon. Detection may rely upon finding a side effect caused by inhalation of unusual gas mixtures.
Will we see athletes busted for smuggling tanks of inert gas? Worse, will there be adverse medical effects discovered down the line?
Many athletes probably don't care. They just want to win. Cheaters gonna cheat.
In 2008, University of Virginia microbiologist Martin Schwartz recalled a meeting with an old friend, one who had been a Ph.D. student with him and had left to attend Harvard Law School instead. At one point during their meeting, he asked why she dropped out.
"She said it was because it made her feel stupid. After a couple of years of feeling stupid every day, she was ready to do something else."
Schwartz was astonished at the answer.
"I had thought of her as one of the brightest people I knew and her subsequent career supports that view," he wrote.
Schwartz pondered on what his good friend had told him.
"What she said bothered me. I kept thinking about it; sometime the next day, it hit me. Science makes me feel stupid too. It's just that I've gotten used to it. So used to it, in fact, that I actively seek out new opportunities to feel stupid. I wouldn't know what to do without that feeling. I even think it's supposed to be this way."
Science humbles even the most brilliant people, bringing them to their intellectual knees. Such is the nature of an enterprise that delves into the unknown.
Schwartz' meeting with his friend inspired an essay: "The importance of stupidity in scientific research," published in 2008 in the Journal of Cell Science. In it, he argued that feeling stupid is not only okay, but a necessity.
He began his explanation with a simple and true statement.
"For almost all of us, one of the reasons that we liked science in high school and college is that we were good at it."
But unfortunately, that leaves aspiring scientists with a misleading impression. Because, as most established scientists know, science is not about taking tests or getting correct answers! Even the laboratory work most students perform in high school and college is structured to reach a predetermined end. In research, the conclusion is never known at the outset. Researchers may have a strong inkling what might happen, but they don't know for certain.
When aspiring scientists reach graduate school and doctoral programs, being correct is no longer the goal. The goal is solving problems. It's not the same.
"A Ph.D., in which you have to do a research project, is a whole different thing," Schwartz wrote. "For me, it was a daunting task. How could I possibly frame the questions that would lead to significant discoveries; design and interpret an experiment so that the conclusions were absolutely convincing; foresee difficulties and see ways around them, or, failing that, solve them when they occurred?"
Schwartz' personal breakthrough came when he realized that nobody, not even the advisors he looked up to, had the answers to his problem.
"The crucial lesson was that the scope of things I didn't know wasn't merely vast; it was, for all practical purposes, infinite. That realization, instead of being discouraging, was liberating. If our ignorance is infinite, the only possible course of action is to muddle through as best we can."
Muddling earned Schwartz his Ph.D, as it has for countless other students. In fact, muddling is simply what researchers do. Science is like wading through a swamp only to reach a vast unexplored ocean.
"Science involves confronting our 'absolute stupidity.' That kind of stupidity is an existential fact, inherent in our efforts to push our way into the unknown," Schwartz wrote.
He believes scientists should embrace that stupidity.
"One of the beautiful things about science is that it allows us to bumble along, getting it wrong time after time, and feel perfectly fine as long as we learn something each time. No doubt, this can be difficult for students who are accustomed to getting the answers right. No doubt, reasonable levels of confidence and emotional resilience help, but I think scientific education might do more to ease what is a very big transition: from learning what other people once discovered to making your own discoveries. The more comfortable we become with being stupid, the deeper we will wade into the unknown and the more likely we are to make big discoveries."
In the six years since it was published, Schwartz' essay has become a source of solace for despairing doctoral students, a reminder that feeling lost is a sign you're on the right course.
Last week, I was at a coffee shop working when a lady approached me and invited me to attend a science discussion group. The topic was the "limits of science." Intrigued, I put away my laptop and joined the group, which consisted mainly of elderly people who were thoughtful, well-spoken, and seemingly intelligent. I had no idea what to expect in terms of the tone of the conversation, so I listened eagerly as the discussion leader (who has a master's degree in geology) started the meeting.
"Science is subjective, though we like to think of it as objective," he began. "When I speak of 'facts,' I put them in quotation marks." He elaborated that things we once thought to be true were later overturned by further study.
Right away, I knew I was going to be in for a ride. While the geologist didn't clarify exactly what he meant, we can deduce one of two things: Either (1) he does not believe facts are real or (2) he believes facts are not accessible to scientific investigation.
Both of these beliefs are problematic from a scientific viewpoint. The first implies that there is no such thing as a fact, and hence, no such thing as truth. My favorite philosophy professor, former mentor, and (I'm honored to say) friend, Robert Hahn of Southern Illinois University, once quipped, "If the ultimate truth about the universe is that there is no truth, what sort of truth is that?" I would add that if there is no such thing as truth, then science is merely chasing after the wind. Science would be pointless. As fictitious Tottenham Hotspur coach Ted Lasso would say, "Why do you even do this?"
The second belief poses a much bigger challenge to science because there is no convincing response to it. Philosopher Immanuel Kant wrote of the noumenon (actual reality) and the phenomenon (our experience of reality). Because we experience reality through our imperfect senses, we do not have direct access to it. For instance, we perceive plants as green, but that is simply the result of our eyes and brains processing photons and interpreting them as the color green. How do we know that perception is reliable? Isn't it possible that plants are actually some other color? Given that we are limited by our sensory capabilities, we can never know the answer to that question. Our experience of the greenness of a plant (phenomenon) is separate from the underlying reality of a plant's color (noumenon).
Humans in general, and scientists specifically, ignore this philosophical challenge. We assume that our perception of reality matches actual reality. Do we have any other option? How could we live daily life or accept the findings of scientific research if we believed otherwise?
The point of that lengthy aside is that the geologist's comment was at odds with a practical scientific worldview. But, things got even weirder after that.
When our conversation turned to the reliability of the scientific method, I commented, "Scientific laws are generalized in such a way that if you perform an experiment like a chemical reaction on Earth or on Mars, you should get the same result."
One of the ladies asked, "But how do we know? We've never been to Mars."
I answered, "We have a basic understanding of how chemical reactions work. To our knowledge, they aren't affected by gravity.* So, we should get the same reaction on Mars."
Well, yes, in theory. But this sort of extreme skepticism is difficult to address. Chemistry is a mature science whose basic principles are well understood. Until we have sufficient reason to believe otherwise, we should expect chemical reactions to be identical whether they are performed on Earth or on Mars.
Strangely, a bit later on, the same skeptical lady asked me, "How do you explain telepathy?" She added that there have been times when, as she was speaking to another person, she knew what the other person was going to say before she said it.
"Scientists don't believe telepathy is real. That's how I explain telepathy," I responded.
"Some scientists do believe in it," retorted the geologist.
True. But, some scientists believe that HIV doesn't cause AIDS. That doesn't mean we should take them seriously. I decided to elaborate: "Think of all the times that you thought of words, but nobody said them. Or all the times you thought of somebody, but they didn't call. You forget all of those, but you remember the few times where a coincidence occurred. That's called confirmation bias."
Unsurprisingly, I didn't win her over. The conversation then took one final turn.
The skeptical lady believed the future would be run entirely by robots and machines. This is referred to as the "singularity" and has been popularized by Ray Kurzweil. It is also probably bunk. Not only are we unable to model a worm's brain accurately, but the scientific knowledge and sheer computing power necessary to properly replicate a human brain -- with its 86 billion neurons and some 100 trillion synapses -- are massive. Besides, there is no compelling reason to believe that computing power will grow exponentially forever. Eventually, some mundane physical factor will limit our technological progress. If (and that's a big if) the "singularity" is even possible, it is likely centuries away.
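A back-of-envelope estimate gives a sense of the scale involved. The bytes-per-synapse figure below is a loose assumption of my own; a faithful model would need far more state per connection, plus the dynamics, so this is a floor, not an estimate of the true cost:

```python
# Naive storage estimate for a synapse-level brain model, using the
# synapse count from the text. BYTES_PER_SYNAPSE is an assumed placeholder.

SYNAPSES = 100e12        # ~100 trillion synapses
BYTES_PER_SYNAPSE = 8    # assumed: one double-precision weight each

total_bytes = SYNAPSES * BYTES_PER_SYNAPSE
petabytes = total_bytes / 1e15
print(f"~{petabytes:.1f} PB just to store a static synaptic weight map")
```

Nearly a petabyte just for a frozen snapshot of connection strengths, before simulating a single millisecond of activity, hints at why "centuries away" is not an unreasonable guess.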
Our evening ended there. Over the next 24 hours, I pondered: What could make otherwise intelligent people embrace pseudoscience and science fiction? Moreover, what could make a person doubtful of chemistry, but accepting of telepathy?
I'm still not sure, but I have a clue. Conspiracy theorists are known to believe contradictory ideas. For instance, as Live Science reported, "people who believed [Osama] bin Laden was already dead before the raid were more likely to believe he is still alive." Similarly, the lady who believed that science wasn't advanced enough to fully understand chemistry yet also somehow so advanced that it could build Earth-conquering robots may be engaging in conspiracy-like thinking. She had no awareness that her skepticism of chemistry and credulity toward telepathy were, in many ways, fundamentally incompatible.
Extreme skepticism and extreme credulity are anathema to the scientific mindset. Successful scientists accept the reliability of the scientific method but question extraordinary claims that are not founded upon extraordinary evidence. That is healthy skepticism, and it was curiously absent from the science discussion group.
*Note: The kinetics of chemical reactions could possibly vary under different gravitational conditions. See an interesting discussion here.
Since the rise of both modern medicine and society, a large subset of the Western World's population has required a scapegoat to explain their everyday ills. Today, it's gluten. A decade ago, it was monosodium glutamate (MSG). One hundred years ago, it was poop.
Yes, poop. But I'm not talking about the occasional dog doo along the side of the road. (Though in the early 1900s, there was plenty of horse manure to go around.) I'm referring to the feces stored inside you, within the wondrous trash receptacle that is the colon.
Thousands of years ago, the ancient Egyptians were affronted by the idea that, at any given time, feces were inside their bodies. If it was so nasty coming out, surely it must be equally nasty roiling about within! They reasoned that putrefying poop releases toxins that leach into the circulatory system, causing fever, creating pus, and making people sick.
We can excuse the ancient Egyptians for their naiveté, but it's harder to go easy on physicians of the early 1900s. With a shiny new name -- autointoxication -- and just a few preliminary studies to back it, the theory became widespread. Admittedly, it must have been difficult to go against Ilya Ilyich Mechnikov, who won the 1908 Nobel Prize in Medicine for his work on phagocytosis, the process in which a cell -- often a white blood cell -- engulfs an invading particle or microbe. Mechnikov argued that intestinal toxins shorten lifespan, and that lactic acid could break them down. That's why he drank sour milk every day.
Sir William Arbuthnot Lane was also an influential proponent of autointoxication. But it was his wild overreaction to the theory that eventually helped reveal it as pseudoscience. Lane advocated irrigating, and sometimes even removing, the colon entirely to treat conditions ranging from general fatigue to epilepsy. Needless to say, both approaches did far more harm than good. Colon removal was particularly ill-advised. After all, if you eliminate an integral part of a sewage treatment plant, pretty soon you'll find $%#@ everywhere. When Walter Alvarez publicly pointed out the sheer lunacy of demonizing a vital bodily organ in the Journal of the American Medical Association in 1919, physicians everywhere finally shook off their fecal infatuation.
"Autointoxication was one of the most pervasive and enduring concepts in the long, bloated history of medical pseudoscience," Mary Roach wrote in Gulp. "It made no difference that neither the specific poisons nor the mechanisms by which they might be causing harm were known or named. In the realm of quackery, vague is better."
"It met a need," wrote James Whorton, a historian of science at the University of Washington, "that medicine has felt in every age, providing an explanation and diagnosis for all those exasperating patients who insist they are sick but are unable to present the physician with any clear organic pathology to prove it."
"Autointoxication was the gluten of the early 1900s," Roach commented.
Today, autointoxication lives on in the form of fruitless cleanse diets and enemas of all sorts. The lingering stench of pseudoscience never fully dissipates, especially when it comes to bull@#$%.
Source: Gulp: Adventures on the Alimentary Canal, Mary Roach, 2013
Will Reid was getting tired of asking his teenage children to change the toilet paper roll, so he did what any enlightened father would do: call them out on YouTube. His "instructional video" on toilet roll changing has amassed over one million views since being posted on August 29th.
Of course, laziness isn't endemic to teenagers. Countless others willingly neglect to empty the overflowing trash bin, clean the leaning tower of dirty dishes, or replace the toilet paper roll.
Why do millions of Americans persistently put off easy chores like these? The answer is tied to motivation, or rather, a lack thereof.
Richard Ryan and Edward Deci, a prolific duo of psychologists based at the University of Rochester in New York, are the preeminent researchers on the science of motivation. They've narrowed down the basis of human action to two main drivers: intrinsic and extrinsic. We either perform an activity out of interest or enjoyment for the activity itself -- intrinsic -- or we perform an activity to attain an external, separate outcome -- extrinsic. Intrinsic activities are inherently motivating. Extrinsic ones are not.
After outlining these two categories, it's already easy to see why menial chores are often ignored. They're not stimulating in the slightest, so they certainly aren't intrinsically motivated. And as extrinsic activities, they lack enticing outcomes. For example, if you take out the trash, you're rewarded with an empty bin and a guarantee that you'll have to repeat the chore in a couple of days. Totally lame.
According to Ryan, there also seems to be an inherent "control issue."
"One side is pressuring and demanding—the other (procrastinator) side is either unmotivated or rebelling," he explained in an email.
Ryan and Deci break down motivation as part of a framework called Self-Determination Theory. For humans to really want to do something, they say, the task must satisfy three psychological needs: competence, autonomy, and relatedness. It must be hard enough to make us feel like we're accomplishing something and challenging ourselves: competence. Replacing the toilet paper roll comes up short here. The task must also grant some degree of freedom, like we're not being controlled: autonomy. If doing the dishes isn't a form of bondage, I don't know what is. Without clean pots, pans, dishes, or utensils, the task of feeding oneself can seem insurmountable. We've become slaves to modern modes of cooking and eating. Lastly, the task should at least partially sate our desires to feel that we belong to something grander than ourselves and that we are connected to others: relatedness. In co-op living arrangements, chores fulfill this psychological need. But in typical households, there's a fundamental disconnect. It's often every roommate or family member for themselves.
ENHANCING FEELINGS OF competence, autonomy, and relatedness surrounding a boring task has been shown to significantly boost motivation without altering the task itself. In 1994, Deci completed an experiment in which subjects were seated in front of a computer and told to press the spacebar whenever a dot of light randomly appeared on the screen. One group simply completed the task with sufficient, but minimal, instruction, while another group performed it after having it presented in a slightly different manner.
"Doing this activity has been shown to be useful," the researchers told the participants. "We have found that those subjects who have done it have learned about their own concentration."
Researchers also acknowledged the subjects' dislike for the task. "I know that doing this is not much fun; in fact many subjects have told me that it's pretty boring."
Lastly, the wording in the task description was changed to make the subjects feel less compelled to take part, a subtle hint that they were free individuals who could walk away at any time.
The participants in the latter group reported feeling happier with the task, as well as more motivated to complete it.
CAN SELF-DETERMINATION Theory be put to use where menial housework is concerned? It probably won't work if you experiment on yourself, but you can certainly try it out on your indolent roommate or your neglectful kids! Tell them that taking out the trash builds character and competence, and that loading the dishwasher is an exercise in problem solving (How can I arrange the dishes most efficiently?). Impress upon them how important it is to you and the rest of the family that everyone pitch in, imbuing a sense of relatedness. And lastly, grant some autonomy in how and when they perform the chores.
If you're the guilty one, Ryan also offered a blunter tidbit of motivation.
"Remember why taking out the trash is worthwhile."
In 1992, UC-Riverside mathematician and physicist John Baez was overloaded, not with his day-to-day activities, but with emails from people touting "revolutionary ideas" that required his learned fine-tuning. This would have been fine, had the ideas at least had a foundation in reality. Sadly, almost none of them accorded with the recognized laws of nature.
In response, Baez created The Crackpot Index: "A simple method for rating potentially revolutionary contributions to physics." The index comprises 36 items tailored to determine whether an idea and the person behind it are brilliant or daffy. If your score is low, you might have something. But as it starts inching up, you might want to consider donning a hat made from aluminum foil and reconsidering your perception of reality.
Here are a few of the items:
1 point for every statement that is widely agreed on to be false.
5 points for using a thought experiment that contradicts the results of a widely accepted real experiment.
10 points for each new term you invent and use without properly defining it.
20 points for talking about how great your theory is, but never actually explaining it.
40 points for comparing those who argue against your ideas to Nazis, stormtroopers, or brownshirts.
Now, let's put The Crackpot Index to use. Andrea Rossi and Sergio Focardi's cold fusion Energy Catalyzer (E-Cat) should do nicely. A brief visit to E-Cat's website provides a number of examples:
1 point for every statement that is widely agreed on to be false.
Too many to count. Rossi and Focardi's international patent application for the E-Cat was judged to "offend against the generally accepted laws of physics and established theories."
10 points for offering prize money to anyone who proves and/or finds any flaws in your theory.
"So Rossi arranged a challenge for Prof. Focardi, telling him 'I will give you a prize (size non-disclosed) if you can show me that what I have done is wrong and does not work.'"
20 points for suggesting that you deserve a Nobel Prize.
50 points for claiming you have a revolutionary theory but giving no concrete testable predictions.
"Rossi knew he was on to something big, something so powerful it could change the world forever." Yet Rossi has repeatedly conducted misleading, "black box" demonstrations, denying independent reviewers full access.
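For fun, the tally above can be run as a quick script. A minimal sketch: the point values are Baez's, but which items apply to the E-Cat (and how many times) is this article's judgment call, so the first item is counted just once, conservatively.

```python
# Tally Crackpot Index points for the E-Cat, using the items cited above.
# Point values come from Baez's index; counting each applicable item once
# (including the "widely agreed to be false" item) is a conservative,
# illustrative reading rather than an official score.
ecat_points = {
    "statement widely agreed to be false": 1,
    "offering prize money to anyone who finds flaws": 10,
    "suggesting you deserve a Nobel Prize": 20,
    "revolutionary theory, no testable predictions": 50,
}

total = sum(ecat_points.values())
print(total)  # 81 -- deep into tinfoil-hat territory
```

Even this deliberately generous count lands well past the point where Baez would suggest the aluminum-foil hat.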
Despite crafting the index, Baez is very empathetic to crackpots and cranks. As he told This American Life in 2005:
"I think they do it because they really want to understand the universe and they have very noble albeit grandiose motivations trying to do what us regular physicists are also trying to do... And I think what distinguishes them from physicists who can make a useful contribution is that they don't want to be somebody whose epitaph says they tightened the screws on a particle accelerator that made a great experiment, they want to be Einstein. And most of us can't be Einstein."
Why waste time and money testing medical treatments that defy the laws of physics and chemistry?
That's the pointed question posed by Drs. David Gorski and Steven Novella in a new op-ed published in the journal Trends in Molecular Medicine. To most, the answer is obvious: we shouldn't. But in the past decade, alternative medicines without any basis in science, like acupuncture, homeopathy, and chiropractic, have received hundreds of millions of dollars from the U.S. government, which, in turn, has been used to fund hundreds of randomized clinical trials.
Alternative medicine supporters insist that these trials are necessary to find out what does and does not work. That seems reasonable. But unlike proper scientists, they don't cast off that which evidence shows to be worthless. When a study's result is negative -- and almost all of them are -- they ignore it. And on the rare occasion when a study's result is positive -- however minuscule the effect may be -- they cling to it like there's no tomorrow. In the eyes of the alternative medicine proponent, more research will always be needed.
So what we're left with is a medical community endlessly analyzing treatments that amount to nothing more than a placebo, thus lending credibility to the practices themselves.
Evidence is the lifeblood of science and rational thought. But should we analyze hocus-pocus? Take homeopathy, for example.
"Homeopathy violates multiple laws of physics with its claims that dilution can make a homeopathic remedy stronger and that water can retain the ‘memory’ of substances with which it has been in contact before," Gorski and Novella write.
In other words, it's based on magic.
"Thus, treatments like homeopathy should be dismissed as ineffective on basic scientific grounds alone."
In evidence-based medicine, a treatment must first be shown to be plausible with basic science, then further studied in vitro on cell cultures and in vivo on animals. Only then is it allowed to continue to clinical trials in humans. But alternative medicine consistently seems to get a pass on the first three steps, proceeding straight to human trials, Gorski and Novella say. It is in these clinical trials that confounding variables seep in and occasionally produce false positives. Moreover, it's ethically dubious to test implausible alternative treatments on patients with serious medical conditions. The $30 million TACT study analyzed unsubstantiated chelation therapy on patients with heart disease, who -- unsurprisingly -- received no benefit. Another trial examined an alternative treatment strategy for pancreatic cancer in which patients drank juices, used coffee enemas, and took large quantities of supplements. The results of this disturbing trial were tragically unsurprising.
"One year survival of subjects undergoing this protocol was nearly fourfold worse than subjects receiving standard-of-care chemotherapy," Novella and Gorski describe.
Terrible research like that can be avoided with a simple rule.
"All clinical trials should be based on scientifically well-supported preclinical observations that justify them," the duo says.
Until alternative medical practices pass the basic science test, we shouldn't waste time or money testing them on humans.
Source: David H. Gorski, Steven P. Novella. "Clinical trials of integrative medicine: testing whether magic works?" Trends in Molecular Medicine. August 2014. DOI: http://dx.doi.org/10.1016/j.molmed.2014.06.007
1996 was the last year that a commercial nuclear reactor came online in the U.S. That project, at the Watts Bar plant in Tennessee, broke ground all the way back in 1973. We haven't started building a new facility in roughly 40 years.
A new startup called UPower is hoping to thaw some of this frozen market. Their plan: think small.
Currently, it is nearly impossible to open a new plant in the U.S. The reasons for this are well laid out here; they boil down to overregulation. A continuous increase in the number and complexity of regulations beginning in the early 1970s caused materials and construction costs to rise dramatically and the time required to build a plant to nearly triple.
Vastly longer construction time has two huge negative effects. First, the loans needed to pay the high initial cost of building a plant accrue far more interest during those extra years of construction. Thus an exponential increase in cost occurs before the plant can begin its very profitable operating years. Second, during construction, new regulations often are introduced. This can require a redesign and perhaps even a partial tear-down and rebuild before the plant even opens.
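To see why the interest effect is so punishing, here's a back-of-envelope sketch. All figures are hypothetical (a $5 billion construction loan at 8% annual interest, compounding until the plant opens); the point is only the shape of the curve.

```python
# Hypothetical illustration of how longer construction inflates financing
# costs: interest compounds on the construction loan before the plant
# earns a cent. Figures are assumptions, not UPower's or industry numbers.
principal = 5e9   # assumed construction loan, in dollars
rate = 0.08       # assumed annual interest rate

def owed(years):
    """Debt at opening day after compounding for the build period."""
    return principal * (1 + rate) ** years

# Tripling the build time from 5 to 15 years more than doubles the debt
# owed before a single kilowatt-hour is sold:
print(round(owed(5) / 1e9, 1))   # 7.3  ($ billions after 5 years)
print(round(owed(15) / 1e9, 1))  # 15.9 ($ billions after 15 years)
```

The exponential term is the whole story: every extra year of regulatory delay multiplies the debt, which is why build time, not raw construction cost, dominates the economics.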
The worst part? Most of these regulations would have done little to prevent previous accidents. Many nuclear engineers and scientists doubt they are useful at all. Rules stay because it's bad politics to oppose them.
Nuclear regulations, driven by a hype-fueled media and anti-nuclear fearmongers such as the Union of Concerned Scientists, have strangled the building of nuclear plants. Ironically, these policies have directly contributed to our nation's reliance on fossil fuels, further damaging the environment and empowering tyrants in the Middle East. Given that nuclear power is our best energy strategy (as well as a good foreign policy strategy), what can be done to thwart this mess?
Go small. UPower's proposed reactor is tiny, making its design, testing, and implementation much easier. A typical U.S. nuclear reactor produces roughly 700-1300 megawatts (MW) of power at all times. In the current toxic regulatory environment, these reactors cost billions of dollars and take more than a decade to build. UPower's nuclear reactor would produce only about 2 MW of usable electricity. But after initial production of the first few units, they are hoping to reach a complete cost below $10 million each. The small, simple design allows a fast build time and easier accommodation of future regulatory burdens.
Its nuclear core can be made of several common nuclear fuels, depending upon availability, but it will not be suspended in water like most current nuclear plants. Instead, the reactor cycles coolant through an enclosed system within the device, carrying heat from the core to the outside. A particular strength of the design is that it is self-contained. No water, steam, or external electricity needs to be hooked up. The unit is placed in the ground and runs for more than a decade without needing constant micromanagement.
There are some hurdles. The reactor unit does not directly produce electricity: its output will be heat. UPower will need to design and package the machinery for turning that heat into electricity. It's a relatively simple engineering task that has been well understood for more than a century. Current nuclear, coal, natural gas, solar thermal, and geothermal power plants all generate power by converting heat into electricity via steam turbines.
In addition, they haven't yet produced a working model. However, nuclear reactor engineering is a technologically mature field. Thousands of nuclear fission reactors run all over the world today (e.g., in nuclear submarines and aircraft carriers), many of them cranking out more than their originally rated power output day after day without incident. And remember that these are 40-year-old designs; far better ones now exist, despite being stifled in the U.S.
What about the nuclear waste? UPower claims that it will be minimal. After the plant runs for 12 years, the reactor is shut down, leaving some matter behind. This doesn't immediately become waste however; they claim that this spent fuel can easily be converted to a second material that can power the plant for a second 12-year cycle. Then after 24 years total, the fuel is spent and becomes waste. How much? Roughly the volume of a basketball. Not bad!
Whether this vision reaches commercial reality is anybody's guess. The idea, however, seems sound and could help melt the glacier of nuclear regulation in America.
One of the hazards of science journalism is the regularity with which we are called names, by both the Left and the Right. "Shills for Monsanto," "lackeys for the pharmaceutical industry," "enablers of the global warming hoax," and (of course) "Nazis" are some of the nicer things that have been said. But just like an auto mechanic who spends his day with oily, greasy hands, we too don't mind getting a little dirtied up for the sake of science. It's all in a day's work.
Because the relentless pursuit of data-based knowledge is our sole guiding principle at RealClearScience, we are not wedded to any particular scientific outcome. For instance, we are staunch supporters of the Big Bang, not because we want there to have been a Big Bang but because we accept the overwhelming data that backs it. The same goes for evolution, anthropogenic climate change, the benefits of GMOs, and so many other supposedly hot-button topics. However, if the evidence changes, our opinion changes. That is the primary benefit of having a fact-based worldview.
After reading literally thousands of articles and writing hundreds, we have become quite familiar with the scientific evidence favoring or opposing various controversial issues. The editorial team thought it would be useful if we compiled a list of those issues, categorizing them based on how well supported (or unsupported) they are by current evidence. For those issues in which we have written an article that further explains our position, we have provided a link.
The weight of scientific evidence FAVORS:
The weight of scientific evidence OPPOSES:
Based on current scientific evidence, we are CAUTIOUSLY OPTIMISTIC toward:
Based on current scientific evidence, we are SKEPTICAL of:
Again, we are not wedded to any of these conclusions. If the data changes, so too will our opinion!
BY NOW, THOUSANDS, perhaps millions, of Americans have already filmed themselves dumping ice water on their heads in the name of amyotrophic lateral sclerosis (ALS) -- Lou Gehrig's disease. Thousands more will follow suit. Whether or not you're a fan of the Ice Bucket Challenge -- and particularly its narcissistic nature -- you cannot deny that it's been extremely successful. As of Thursday, the ALS Association had received $41.8 million in donations from July 29th to August 21st, compared with just $2.1 million during the same period last year.
The Ice Bucket Challenge has been an undeniable boon to the fight against ALS and online egos everywhere (Look at all the Facebook "likes!"), but how much awareness has it truly raised? While videos of people creatively dousing themselves with cold water abound on social media, the story and the science of ALS seem either absent or drowned out.
This is an attempt to fill that void. If social pressure isn't enough to convince you to donate to ALS research, the heart-wrenching story of Lou Gehrig and the science behind the illness that shares his name should be.
THOSE ATTENDING YANKEES spring training in 1939 saw slugging first baseman Lou Gehrig, the "Iron Horse," set to return for his 17th season. Up until that point, Gehrig had been an "institution of the American League," hitting 493 home runs, batting .341, and playing in 2,122 consecutive games. But onlookers wondered how long that could continue. Gehrig was now 35, and his prior season had been a bit off his usual pace. He hit only .295, an amazing feat by most standards but squarely subpar for Lou. Yankees fans hoped that Gehrig would get back on track in 1939.
However, as spring training pressed on, it was clear something was amiss. Sports writers picked up on it.
"They watch him at the bat and note he isn't hitting the ball well. They watch him around the bag and it's plain he isn't getting the balls he used to get. They watch him run and they fancy they can hear his bones creak and his lungs wheeze as he lumbers around the bases," the New York World Telegram's Joe Williams wrote.
"On eyewitness testimony alone, the verdict must be that of a battle-scarred veteran falling apart."
A rare few, like the New York Sun's James Kahn, were more perceptive.
"I think there is something wrong with him. Physically wrong, I mean. I don't know what it is, but I am satisfied that it goes far beyond his ball-playing. I have seen ballplayers 'go' overnight, as Gehrig seems to have done. But they were simply washed up as ballplayers. It's something deeper than that in this case, though. I have watched him very closely and this is what I have seen: I have seen him time a ball perfectly, swing on it as hard as he can, meet it squarely — and drive a soft, looping fly over the infield. In other words, for some reason that I do not know, his old power isn't there... He is meeting the ball, time after time, and it isn't going anywhere."
Things didn't improve when the season began. One day at batting practice, Yankees teammate Joe DiMaggio watched as Gehrig swung at and missed ten "fat" pitches in a row. Eight games in, right before a game against the Detroit Tigers, and despite the protests of his teammates and manager, Gehrig benched himself "for the good of the team." Everyone, even the stadium announcer for the Tigers, was shocked. "Ladies and gentlemen," he announced, "this is the first time Lou Gehrig's name will not appear on the Yankee lineup in 2,130 consecutive games." Gehrig received a standing ovation from the Detroit fans. Tears glistened in his eyes.
A month later, Gehrig visited the Mayo Clinic in Rochester, Minnesota. The six-day visit produced the following diagnosis from Dr. Harold C. Habein:
"After a careful and complete examination, it was found that he is suffering from amyotrophic lateral sclerosis. This type of illness involves the motor pathways and cells of the central nervous system. The nature of this trouble makes it such that Mr. Gehrig will be unable to continue his active participation as a baseball player."
Gehrig was also informed that the disease was incurable, and that he likely did not have long to live. Despite the earth-shattering diagnosis, he remained optimistic.
"The road may come to an end here," he wrote his wife. "Seems like our backs are to the wall. But there usually comes a way out. Where and what I know not, but who can tell that it might lead right on to greater things."
On July 4, between games of a doubleheader against the Washington Senators, a ceremony was held to commemorate Lou Gehrig and allow him to announce his retirement. 61,808 hushed fans watched as Yankees manager Joe McCarthy -- who had been like a father to Gehrig -- handed the outgoing slugger a trophy. They watched as Gehrig bent down with the apparent effort of a man forty years his senior to set it on the ground. They watched as Gehrig stood silently with his head slightly turned down, too moved to move. And then, they watched as Gehrig gathered himself, walked to the collection of microphones, and gave one of the greatest, most humble speeches ever delivered.
"Fans, for the past two weeks you have been reading about the bad break I got. Yet today I consider myself the luckiest man on the face of the earth...
When the New York Giants, a team you would give your right arm to beat, and vice versa, sends you a gift—that's something. When everybody down to the groundskeepers and those boys in white coats remember you with trophies—that's something. When you have a wonderful mother-in-law who takes sides with you in squabbles with her own daughter—that's something. When you have a father and a mother who work all their lives so that you can have an education and build your body—it's a blessing. When you have a wife who has been a tower of strength and shown more courage than you dreamed existed—that's the finest I know.
So I close in saying that I might have been given a bad break, but I've got an awful lot to live for. Thank you."
Two years later, Lou Gehrig died.
ALS NOW AFFECTS more than 30,000 Americans. In those diagnosed, the motor neurons -- the cells that signal muscles to move -- suddenly and mysteriously start to degrade. As the motor neurons dwindle, the muscles they formerly controlled diminish as well from underuse. Paralysis eventually sets in, but cognitive function is often spared. In this respect, ALS is the opposite of Alzheimer's: the body goes, but the mind remains. Still incurable today, the disease is often fatal within five years of diagnosis. Most patients die from respiratory failure.
Though precise causes and risk factors haven't been identified, a number of genes and mutations have been linked to ALS. That means that those with a family history of the disease can get tested and receive an imperfect estimation of their risk.
In February, researchers revealed how ALS is spread from neuron to neuron. It seems that a mutant of the enzyme SOD1 causes the cells to go haywire. The researchers also found that certain antibodies can block SOD1 from being transmitted, which could potentially halt the progression of ALS. The method has yet to be tried in humans.
The ALS Ice Bucket Challenge has arrived at an "exciting time" for ALS research. With new drugs undergoing clinical trials and promising research pathways being elucidated, the money raised is sure to be put to good use. To donate, visit the website of the ALS Association.
Now you can dump that bucket of ice water on your head.
A LITTLE OVER a dozen years ago, "la merde... hit le ventilateur" in the world of wine.
Nobody remembers the 2001 winner of Amorim Academy's annual competition to crown the greatest contribution to the science of wine ("a study of genetic polymorphism in the cultivated vine Vitis vinifera L. by means of microsatellite markers"), but many do recall the runner-up: a certain dissertation by Frédéric Brochet, then a PhD candidate at the University of Bordeaux II in Talence, France. His big finding lit a fire under the seats of wine snobs everywhere.
In a sneaky study, Brochet dyed a white wine red and gave it to 54 oenology (wine science) students. The supposedly expert panel overwhelmingly described the beverage like they would a red wine. They were completely fooled.
The research, later published in the journal Brain and Language, is now widely used to show why wine tasting is total BS. But more than that, the study says something fascinating about how we perceive the world around us: that visual cues can effectively override our senses of taste and smell (which are, of course, pretty much the same thing).
WHEN BROCHET BEGAN his study, scientists already knew that the brain processes olfactory (taste and smell) cues approximately ten times slower than sight -- 400 milliseconds versus 40 milliseconds. It's likely that in the interest of evolutionary fitness, i.e. spotting a predator, the brain gradually developed to fast track visual information. Brochet's research further demonstrated that, in the hierarchy of perception, vision clearly takes precedence.
Here's how the research went down. First, Brochet gave 27 male and 27 female oenology students a glass of red and a glass of white wine and asked them to describe the flavor of each. The students described the white with terms like "floral," "honey," "peach," and "lemon." The red elicited descriptions of "raspberry," "cherry," "cedar," and "chicory."
A week later, the students were invited back for another tasting session. Brochet again offered them a glass of red wine and a glass of white. But he deceived them. The two wines were actually the same white wine as before, but one was dyed with tasteless red food coloring. The white wine (W) was described similarly to how it was described in the first tasting. The white wine dyed red (RW), however, was described with the same terms commonly ascribed to a red wine.
"The wine’s color appears to provide significant sensory information, which misleads the subjects’ ability to judge flavor," Brochet wrote of the results.
"The observed phenomenon is a real perceptual illusion," he added. "The subjects smell the wine, make the conscious act of odor determination and verbalize their olfactory perception by using odor descriptors. However, the sensory and cognitive processes were mostly based on the wine color."
Brochet also noted that, in general, descriptions of smell are almost entirely based on what we see.
"The fact that there are no specific terms to describe odors supports the idea of a defective association between odor and language. Odors take the name of the objects that have these odors."
Now that's deep. Something to ponder over your next glass of Merlot, perhaps?
A FEW YEARS after publishing his now famous paper, the amiable, bespectacled, and lean Brochet turned away from the unkind, meritocratic, and bloated culture of French academia and launched a career that blended his love for science and his passion for "creating stuff."
Yep. You guessed it. He makes wine.
(Images: AP, Morrot, Brochet, and Dubourdieu)
Not an Ebola expert.
The Ebola outbreak in West Africa, which continues to rage and has now claimed the lives of more than 1100 people, offers some big lessons for America.
#1. For all its flaws, the American public health system is pretty good. We transported two patients from the middle of a hot zone who were infected with one of the world's deadliest viruses to a major metropolitan area in the United States. We did this without infecting anybody else or putting the public in danger. The two Americans were treated with a "secret" remedy (that we reported on two years ago) and are continuing to improve. One of them may actually be discharged soon.
#2. Bringing the sick Americans home was the right thing to do. On August 1, the ever-present Donald Trump tweeted: "The U.S. cannot allow EBOLA infected people back. People that go to far away places to help out are great-but must suffer the consequences!" If Ebola were as infectious as, say, measles or influenza, then Trump would be right to be concerned. If such a virus were to emerge, quarantining the patients abroad would probably be the appropriate course of action to prevent unnecessary risk to the American public. But Ebola is not that infectious. Ignorance is no excuse to stir up public anxiety, and Trump's comments were completely out of line.
#3. Biotechnology and GMOs save lives. The antibody cocktail that was used to treat the patients was the product of biotechnology, specifically GMOs. Mouse genes were modified to become human-like, and then they were placed inside of a tobacco plant. The medicine was then extracted from the plant and given to the patients. (Read John Timmer's excellent article for the details.) Keep in mind that this is the sort of life-saving research that anti-GMO activists are fighting to prevent.
#4. Do not destroy smallpox. A few months ago, the world was once again debating whether or not to destroy the known vials of smallpox that exist at the CDC in Atlanta and at a facility in Russia. Since that debate, the Ebola outbreak exploded, and some previously forgotten vials of smallpox reappeared in an NIH storage room. When scientists say we should keep smallpox around "just in case," these are the sorts of surprises they are talking about. Yes, there is a real risk that smallpox (or some other deadly pathogen) could escape from a laboratory. But is the world really better off if we forego research out of fear?
#5. Americans need to pay more attention to global affairs. Separated by two vast oceans, and bordered by two friendly neighbors, we tend to be rather insular in terms of our global perspective. Unless there is a war or some other geopolitical instability that directly threatens our interests, we remain uninterested in the rest of the world. Even then, we still may not be able to find the troubled spot on a map, as 84% of Americans were unable to do with Ukraine. If merely 1 in 6 Americans can find a gigantic country bordering Russia on a map, just how few could find Liberia, Guinea, or Sierra Leone -- the center of the outbreak? In our modern, interconnected world, what happens on one side of the globe can and will affect the other side. Maybe it's time to teach more geography in school.
#6. NIH funding should be increased. The U.S. government has neglected the National Institutes of Health (NIH), more or less letting funding slide ever since 2003. As Pacific Standard reported last year, "the Obama administration's budget request for the 2014 fiscal year is $31.3 billion, more than 23 percent lower than the 2003 funding level in purchasing power." If the U.S. wants to remain globally competitive and ready to fight disease, this downward trend needs to be reversed. Maybe the Ebola outbreak will force some very much needed bipartisanship.
Epigenetics is the next big field that the media, fearmongers, and political hacks will attempt to exploit. How do we know? Because there is a flurry of research in the field (which is not always a good sign), and journalists are already hacking away. You can find articles blaming epigenetics for obesity, cancer, personality, homosexuality, and (absurdly) how we vote.
Never mind the fact that there is serious reason to believe epigenetic changes are temporary and may not be passed down to multiple generations, particularly among mammals.
To be sure, epigenetic changes are probably linked (albeit very slightly) to some of those things. But epigenetics will turn out to be just like regular genetics: any one allele or epigenetic variation will make a person only ever so slightly more or less inclined toward a particular health outcome. That is because, out of thousands or even millions of such potential variables, the impact of any single one is usually marginal.
In other words, epigenetics is incredibly complicated. The field is still in its infancy, and there is much left to learn. If the human genome is the black box of an aircraft, the epigenome is the black box of a UFO. Therefore, be wary of over-simplifications and grand pronouncements. They will almost certainly be incorrect.
Other researchers have made similar observations. A commentary in the journal Nature indicates that the biggest headline-grabbers mostly involve women, specifically mothers:
‘Mother’s diet during pregnancy alters baby’s DNA’ (BBC), ‘Grandma’s Experiences Leave a Mark on Your Genes’ (Discover), and ‘Pregnant 9/11 survivors transmitted trauma to their children’ (The Guardian). Factors such as the paternal contribution, family life and social environment receive less attention.
The authors fear that "exaggerations and over-simplifications are making scapegoats of mothers, and could even increase surveillance and regulation of pregnant women."
They have a point. And they used a particularly persuasive example to illustrate it.
Everybody knows that pregnant women shouldn't drink. But everybody is wrong. Though it is completely taboo for a pregnant woman to be seen sipping a glass of wine, research has shown that moderate amounts of alcohol probably do not harm the developing fetus. Yet the concern surrounding "fetal alcohol syndrome" -- a serious condition that can arise when an expectant mother drinks excessively -- has resulted in sweeping government regulation, in some cases even making it illegal for pregnant women to drink at all.
The authors worry, perhaps rightly so, that the media hype surrounding epigenetics will once again turn its focus on mothers. Will the government once again regulate what pregnant women can eat, drink, and do? And if so, why not regulate the behavior of men, as well? Epigenetics, after all, can affect sperm quality.
The authors' conclusion provides an excellent framework for the media and policymakers to make sense of epigenetic studies:
First, avoid extrapolating from animal studies to humans without qualification. The short lifespans and large litter sizes favoured for lab studies often make animal models poor proxies for human reproduction. Second, emphasize the role of both paternal and maternal effects. This can counterbalance the tendency to pin poor outcomes on maternal behaviour. Third, convey complexity. Intrauterine exposures can raise or lower disease risk, but so too can a plethora of other intertwined genetic, lifestyle, socioeconomic and environmental factors that are poorly understood. Fourth, recognize the role of society. Many of the intrauterine stressors that DOHaD [developmental origins of health and disease] identifies as having adverse intergenerational effects correlate with social gradients of class, race and gender. This points to the need for societal changes rather than individual solutions.
This is eminently reasonable. Media and politicians, the ball is in your court.
Source: SS Richardson et al. "Don't Blame the Mothers." Nature 512:131-132. (2014)
The prospect of eating stomach is not something that would excite a great many Americans. But what's disgusting to us is a delicious, balanced meal to a great many animal predators. I mean, think about it.
"Stomachs are especially valuable because of what's inside them. The predator benefits from the nutrients of the plants and grains in the guts of its prey," science writer Mary Roach wrote in Gulp.
It's not just stomachs, of course. Bodily organs in general are amazingly nutritious.
"A serving of lamb spleen has almost as much vitamin C as a tangerine. Beef lung has 50 percent more," Roach wrote.
Yet thanks in part to our culturally evolved sense of disgust, most citizens of the Western world tend to turn up their noses at organs and ship them elsewhere.
"In 2009, the United States exported 438,000 tons of frozen livestock organs... Egypt and Russia are big on livers. Mexico eats our brains and lips. Our hearts belong to the Philippines."
Are we missing out? Alternative medicine proponent and supplement guru Joseph Mercola thinks so. But Mercola, who funds anti-vaccine and anti-fluoridation groups, regularly spouts woo-y mumbo jumbo. What are the facts?
Right now, Americans binge on muscle meat -- wings, legs, breasts, ribs... you get the idea. Muscle meat, however, contains plenty of protein but little else. By comparison, a single 100-gram serving of liver contains roughly 800% of your daily value of vitamin A and 1,100% of your daily value of vitamin B12. Most organs sport similarly copious amounts of B vitamins, and only slightly less protein than muscle meat. They also generally contain more fat and cholesterol. For example, a serving of liver has more cholesterol than an egg, while a 4-ounce serving of brain has as much as ten eggs! (Coincidentally, brains are often served scrambled.)
It's obvious that organs are more nutritionally complete, but we live in an era of overnutrition, not undernutrition. To live a healthy life, you don't need a serving of liver each and every day. In fact, that might be unhealthy. The Institute of Medicine, a part of the National Academy of Sciences, recommends consuming on average no more than 3,000 micrograms of vitamin A each day, and one serving of pork liver alone contains 6,500! The vitamin A that we don't use is stored mostly in the liver (your own, not the one you'd be eating) and fat to be used later. So consuming more than needed over long periods can lead to a toxic build-up, known as hypervitaminosis A.
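The arithmetic above can be spelled out with a quick back-of-the-envelope calculation. This is a minimal sketch using the two figures from the text (3,000 micrograms as the upper limit, 6,500 per serving of pork liver); treat the numbers as approximate, not dietary advice:

```python
# Figures from the text above (approximate).
UPPER_LIMIT_UG_PER_DAY = 3000   # IOM's recommended daily ceiling for vitamin A
PORK_LIVER_SERVING_UG = 6500    # vitamin A in one serving of pork liver

# One weekly serving, averaged over seven days, stays under the ceiling:
weekly_average = PORK_LIVER_SERVING_UG / 7
print(round(weekly_average))    # roughly 929 micrograms/day

# A daily serving, by contrast, more than doubles the ceiling every day:
daily_ratio = PORK_LIVER_SERVING_UG / UPPER_LIMIT_UG_PER_DAY
print(round(daily_ratio, 2))    # roughly 2.17x the recommended limit
```

In other words, the occasional serving of liver is fine; it is the everyday habit that invites a toxic build-up.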
Several members of a polar expedition to Franz Josef Land in the late 1800s reportedly suffered from vitamin A toxicity after consuming a stew made from the liver, heart, and kidneys of a single polar bear. (The organs of walruses, bears, seals, and moose contain especially high amounts of vitamin A.) Within four hours, the explorers who partook of the stew grew nauseous, drowsy, and suffered severe headaches. Over the next 24 hours, the skin of half of those individuals began peeling off. All eventually recovered.
Organs from livestock animals such as cows, pigs, sheep, and chickens can certainly be consumed safely, but I'd caution against making liver an everyday staple. Eating brain can also be a risky proposition. You might think it would make you smart, but it can actually transmit neurodegenerative diseases like bovine spongiform encephalopathy, more commonly known as mad cow disease.
Now that you're sufficiently informed, care for some skewered heart?
Source: Gulp: Adventures on the Alimentary Canal, Mary Roach, 2013
(Image: AP, Shutterstock)
In modern society, antiperspirants are widely hailed as a godsend, dispelling the inconvenient odors wafting from armpits everywhere. But a new study casts doubts on their vaunted position. As it turns out, antiperspirants may actually make you smell worse in the long run.
For 90% of all Americans, slathering on deodorants and antiperspirants is a daily occurrence, a precautionary measure against foul odors and unsightly sweat stains. The odors arise when bacteria living in our armpits break down lipids and amino acids excreted in sweat into more smelly substances. Deodorants employ antimicrobial agents that kill off bacteria, as well as chemicals that replace noxious odors with pleasant aromas. Deodorants that double as antiperspirants, like Degree, Old Spice, and Dove, take the process one step further by physically plugging sweat glands with aluminum-based compounds.
While most of us might only concern ourselves with the dry, aromatic benefits of antiperspirants and deodorants, researchers at the Laboratory of Microbial Ecology and Technology at the University of Ghent in Belgium are more interested in the effects on bacteria. Billions of bacteria dwell in the "rain forests" under our arms, and the substances we apply are mucking with their habitats!
To uncover how deodorants and antiperspirants affect armpit bacteria, Chris Callewaert, a Ph.D. student specializing in microbial ecology, and a team of researchers recruited nine subjects for a task a great many people (and especially their friends) might deem unbearable: six males and two females pledged not to use deodorant or antiperspirant for an entire month. Specifically, four subjects stopped using their deodorants and the other four stopped using their antiperspirant deodorants. (Most antiperspirants are also deodorants. See image below for an example.) The ninth subject, a control who did not regularly use either, was asked to use deodorant for a month. The duration was chosen because it takes approximately 28 days for a new layer of skin cells to form.
The researchers analyzed the diversity and abundance of subjects' armpit bacteria at various time points before they stopped using antiperspirant, during the period of abstaining, and for a few weeks after resuming use. Switching hygiene habits plainly altered the armpit bacterial communities of every subject. Since no two armpits and their resident bacteria are identical, it was difficult to pinpoint the precise changes brought about by deodorants or antiperspirants, but one trend did materialize: antiperspirant use produced a clear increase in Actinobacteria.
You might not recognize the name Actinobacteria, but chances are, you've smelled them. Dominated by the genus Corynebacterium, they are the major instigators of noxious armpit odor. Other microbes that inhabit the armpit, such as the Firmicutes (a group that includes Staphylococcus), don't produce odors as quickly, nor are those odors nearly as pungent.
Callewaert believes the aluminum compounds in antiperspirants may be to blame, killing off "good," less smelly bacteria and allowing "bad" bacteria to dominate. His study found that deodorants which lack these sweat-blocking antiperspirant compounds are actually linked to a slight decrease of stinky Actinobacteria. (Below: An example of the bacterial community of one subject. When using an antiperspirant, Firmicutes [diamonds] and Actinobacteria [dashes] dominate. Without antiperspirant, Firmicutes are more abundant and Actinobacteria are hardly present.)
Though antiperspirants and deodorants are widely used, they are only a temporary fix.
"The measures we utilize today do not take away the initial source: the odor causing bacteria," Callewaert told RealClearScience. "Deodorants only mask unpleasant odors. We can do better than that. The follow up of this research is finding better solutions."
And Callewaert is already working on one: "armpit bacterial transplantation."
"We take away the bad bacteria from the armpit of somebody with a body odor, and replace it with the good bacteria of a relative who doesn't have a body odor," he explained.
"So far we have helped over 15 people. For most subjects it brings immediate improvements. Most of them on a permanent time scale, although there are also people who suffer again from a body odor after some months."
For now, this approach seems rather extreme, but perhaps it could serve as a last resort for the sort of person you can smell from the other side of the room.
The big limitation of the current study is its sample size. Just nine subjects took part.
"The sample size is rather small," Callewaert admits. "However, we see consistent outcomes."
Callewaert also says that this is the first study to specifically examine the effect of deodorant on the diversity and abundance of armpit bacteria.
"We barely know what lives in our armpits, on our clothes, in our laundry machines, and what causes all these unpleasant odors."
While the current study strongly suggests that antiperspirants can make their users smell worse by encouraging the growth of Actinobacteria, it does not directly assess body odor. As a follow-up, Callewaert could recruit subjects to use antiperspirants and employ methods like gas chromatography to directly measure the stinky volatile organic compounds emanating from their armpits. Professional smellers could also be put to work... at their own peril.
Source: Chris Callewaert, Prawira Hutapea, Tom Van de Wiele, Nico Boon. "Deodorants and antiperspirants affect the axillary bacterial community." Arch Dermatol Res. 2014 Jul 31.
(Images: Shutterstock, Callewaert et al.)
Last week NASA released results of a test on a new space engine design. It seems to produce thrust without burning fuel. Is the impossible science fiction of the future now possible?
A simple picture of the proposed idea reveals its fundamental absurdity. Grab a friend and hop into a car with a back seat (not like that). You push on the windshield repeatedly, while they push on the rear window just after every time you push on the front. Now, does the car roll forward? NASA is claiming that it might.
Peer a little deeper into the story and a forest of red flags starts to appear. The design of the engine is disarmingly simple, but conceptually it doesn't make sense. There are lots of equations to confuse the average reader. NASA didn't actually build the engine; it was delivered fully built, based on a design that physicists have criticized, refuted, discredited, and described as a fraud.
NASA's test? Conducted by a guy who believes in warp drives. The second test of validity? A study published by an unknown researcher in a fourth-rate academic journal and never cited by anyone else.
Even worse, the control engine that was supposed to produce no thrust produced the same thrust as the test engine. When a "null" experimental control doesn't produce a null result... well, that's very bad.
Looks like NASA got duped.
According to Roger Shawyer and Guido Fetta, the peddlers of the EmDrive and Cannae drive, respectively, the engine works by bouncing microwaves back and forth within a metal container. The claim is that by making one side of the container bigger than the other, more thrust is deposited on that wall. Face that wall to the back and the engine pushes forward. What's the flaw?
The momentum carried by the microwaves doesn't change, aside from dissipation, as they bounce back and forth. Translation: the waves push on the front of the engine exactly as hard as they push on the back, whatever the shape of the walls. You cannot get net thrust out of a closed system without expelling something; conservation of momentum forbids it. In reality, the internal forces cancel, producing zero thrust.
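The cancellation can be illustrated with a toy calculation. This is a deliberately minimal sketch, not a physical model of the cavity: a single "photon" bounces between the front and back walls, each reflection delivering an impulse of twice its momentum to the wall it strikes, in alternating directions.

```python
# Toy model: one photon reflecting between the two end walls of a sealed
# cavity. Each reflection transfers an impulse of 2p to the struck wall,
# but successive impulses point in opposite directions, so they cancel.
# Note the wall sizes never enter: momentum conservation ignores geometry.

p = 1.0  # photon momentum, arbitrary units

net_impulse = 0.0
for bounce in range(1_000_000):
    if bounce % 2 == 0:
        net_impulse += 2 * p   # photon strikes the back wall: push one way...
    else:
        net_impulse -= 2 * p   # ...then the front wall: equal push the other way

print(net_impulse)  # 0.0 -- no net thrust on the cavity
```

However asymmetric the container, every rearward shove on one wall is paid for by an equal forward shove on the other; only throwing something out of the cavity could break the tie.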
Once the basic idea is busted, the creators resort to true BS. First, they claimed relativistic electrodynamics explained the device's power. Then they switched tacks and claimed that vacuum quantum energy is the key. Physicists have judged both explanations to be somewhere between implausible and nonsensical. What other extraordinary evidence can we look for to support this extraordinary claim?
How about the only other test of the engine? Great scientific work is not always published in top journals, and sometimes fraudulent work is. In general, though, stronger work from stronger researchers tends to appear in a select few well-respected journals. This academic test was not published in one of the 10 best journals in physics, nor one of the 50 best, nor even one of the 500 best. It was published in a journal ranked 688th in the field, a place where weak research findings go to quietly die. The publication isn't even translated into English, the universal language of science.
Finally, common sense can be a last check: the smell test. Extracting free energy with no loss of fuel? Does this sound plausible? I leave that to you. Personally, I'd bet my salary against it. Not that that's much money, mind you.