Scarcely a week goes by without some public discussion of superbugs, new contagions, or disease outbreaks. Humans fear the insidious threats we cannot see. But we are not the only species to experience devastation at the hands of an infectious foe. Here are four horrifying animal diseases:
Hendra Virus: In Australia, it's hard to find a veterinarian who doesn't subtly wince at the word "Hendra." Unassuming on paper and fragile outside a host, Hendra virus is anything but harmless in horses. The initial signs of infection are a slight wobble in an equine's step coupled with listlessness in its demeanor, followed by rapid deterioration and death. Hendra kills 80 percent of the horses it infects, and in rare cases it can even cross over into humans, where it kills with similar efficiency. It's for this reason that many veterinarians will not treat a horse with symptoms of Hendra if the animal has not been previously vaccinated.
Chytridiomycosis: According to at least one group of scientists, it's "the worst infectious disease ever recorded among vertebrates in terms of the number of species impacted, and its propensity to drive them to extinction." Chytridiomycosis is caused by two types of chytrid fungi, and it has infected about 30% of the world's amphibian species. Three percent of all frog species have gone extinct because of it. Chytridiomycosis affects the skin of amphibians, often causing it to thicken. Having "thick skin" may be seen as beneficial to humans, but to amphibians, it's deadly, as many species absorb nutrients, release toxins, and even breathe through their skin.
White-Nose Syndrome: A dusty cap of white fungus resembling a dab of powdered sugar is the hallmark sign of this syndrome, which began infecting North American bats back in 2006. Since then, white-nose syndrome has laid waste to bat populations in 33 U.S. states and seven Canadian provinces, sweeping menacingly across the continent from its origin in New York. The aptly named Pseudogymnoascus destructans is the architect of the deadly disease. The fungus erodes the skin of bats, particularly on the wings, leaving the little critters with open wounds. Infection eventually results in weight loss, dehydration, electrolyte imbalances, and death. At least 5.7 million bats had died as of 2012. That number has surely climbed into the tens of millions by now.
Mercury is a mesmerizing element. Liquid at room temperature, yet dense enough that coins will float atop its surface, it has wowed a great many students in high school science classes over the decades.
Unfortunately, it is also incredibly toxic, releasing vapors that can acutely destabilize the central nervous system and, over time, damage the brain, kidneys, and lungs. Symptoms include tremors and impairments of vision, hearing, and speech; severe exposure can be fatal.
Gallium closely mimics mercury's otherworldly appearance, though it unfortunately lacks its prodigious density. But what gallium lacks in mass, it makes up for with other fascinating properties. Its melting point is roughly 85.7 degrees Fahrenheit, meaning it can be melted in the palm of one's hand, molded to form a shape, then quickly returned to solid form in a refrigerator. Gallium is also one of the few materials that expand when they solidify, a trait shared only with water, silicon, germanium, antimony, bismuth, and plutonium.
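The melt-in-the-hand trick is easy to sanity-check with a quick conversion. A minimal sketch (my own illustration; the skin-temperature figure is an assumed round number, not from the article):

```python
# Convert gallium's cited melting point to Celsius and compare it with
# a rough skin-surface temperature to see why it melts in the hand.

def fahrenheit_to_celsius(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

gallium_mp_f = 85.7   # melting point cited in the text, deg F
skin_temp_c = 33.0    # assumed typical skin-surface temperature, deg C

gallium_mp_c = fahrenheit_to_celsius(gallium_mp_f)
print(f"Gallium melts at about {gallium_mp_c:.1f} deg C")
print(f"Warm enough to melt in the hand: {skin_temp_c > gallium_mp_c}")
```

Since the palm sits a few degrees above gallium's roughly 29.8 °C melting point, body heat alone is enough to liquefy it.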
Over the last three years, Theranos has toppled from a $9 billion company to bankruptcy. As Wall Street Journal investigative reporter John Carreyrou exposed, the company's proprietary blood-testing technology never really worked. Founder Elizabeth Holmes, who was heralded as the next Steve Jobs, raised close to a billion dollars from investors by sharing her dream of replacing needle blood draws with less painful fingerpricks for medical testing, but that dream was never fully realized. In touting that figment of her imagination for over a decade, she defrauded investors, misled regulators, and put patients at risk. For her misdeeds, Holmes has been fined $500,000, barred from leading a public company for ten years, and recently indicted on charges of wire fraud and conspiracy.
One of Theranos' most appalling missteps was launching its blood-testing technology to the public in dozens of Walgreens stores even though many at the company knew it produced inaccurate results. Holmes' hasty and haphazard actions exposed as many as 176,000 customers to compromised test results. Some patients were diagnosed with life-threatening conditions they never actually had, prompting them to receive unneeded medical tests, incur unnecessary costs, and experience psychological trauma.
Having lied to and endangered consumers, Theranos deserved its fall. Another, comparable industry deserves to share that fate: homeopathy.
At face value, Theranos and homeopathy may seem worlds apart. One is a medical testing company born in Silicon Valley. The other is an alternative medicine that claims toxic compounds diluted to obscurity in water can cure the ailments they might otherwise cause. But at their cores, both homeopathy and Theranos aim to shake up the evidence-based medical establishment with wild, false claims that put the public at risk. Homeopathic products are physiologically implausible and do not work, yet there are "remedies" available for potentially life-threatening conditions like asthma attacks, influenza, and cancer. Studies show that alternative medicine like homeopathy dissuades cancer patients from pursuing evidence-based treatments. People who choose alternative medicine to treat their cancers experience rates of death between two and five times higher than patients electing conventional therapies.
Earlier this year, a young American veteran of the War in Afghanistan became the first person to receive a successful total penis and scrotum transplant. A medical team at Johns Hopkins Hospital performed the complicated, 14-hour procedure.
To date, five individuals have received penis transplants: one in China, two in South Africa, and two in the United States. Each event is widely reported in the popular press, but as is typical, the coverage often contains more cheerleading than context.
In a recent article published in the journal European Urology, Johns Hopkins University urologist Hiten D. Patel provides that missing context, along with some skepticism.
"In the wake of the world's fifth human penile allotransplantation, an important question resurfaces—what in the world are we doing?" he writes.
Space is brutally inhospitable to human life, so it's a wonder that of the 561 people who have ventured beyond the safety of Earth, only three have died there. Five times as many have perished in crashes or explosions while rocketing away from our planet or re-entering its atmosphere.
The three brave spacefarers who lost their lives in space were cosmonauts Georgy Dobrovolsky, Vladislav Volkov, and Viktor Patsayev. All three died on the Soyuz 11 mission of June 1971.
While Soyuz 11 ended in sadness, the vast majority of the mission progressed gloriously. Dobrovolsky, Volkov, and Patsayev lived aboard the Soviet Union's Salyut 1, the very first space station, for twenty-three days, setting the record at the time for the longest stay in outer space. During their mission, the bold cosmonauts wowed audiences back on Earth with live television broadcasts, projecting hope and depicting a bright future. Patsayev also became the first person to operate a telescope in space. The spectrograms he produced of the stars Vega and Beta Centauri with the station's ultraviolet telescope were later published in the journal Nature.
On March 16, 1926 in Auburn, Massachusetts, American engineer Robert Goddard launched the first liquid-fueled rocket. The flight lasted a mere 2.5 seconds and ended anticlimactically 181 feet away in a snow-covered cabbage field, but it would prove to be one of the most significant flights in history.
Ninety-two years later, liquid-fueled rockets are the norm for spaceflight. Towering, explosive behemoths standing sixty times taller than Goddard's original rocket blast humans beyond the boundaries of Earth's atmosphere. Each launch is a true spectacle, offering testament again and again to humankind's collective potential to transcend barriers and reach new heights through brains and cooperation.
But will rockets remain our primary transportation to space into the far-flung future? Or will they eventually be replaced by new methods and technologies?
Rockets, after all, are far from perfect. Fourteen astronauts have died during launch or re-entry. By chemical engineer Don Pettit's calculation, "sitting on top of a rocket is more dangerous than sitting on a bottle of gasoline!" He ought to know; he's done it a few times. Pettit has flown several missions to the International Space Station and has tallied 369 days, 16 hours, and 41 minutes in space. At age 62, he's NASA's oldest active astronaut.
Today, creating ice is as easy as placing water in your freezer, but how would you accomplish the same phase-altering feat without an energy-guzzling appliance?
What may seem unfathomable at first thought was regularly accomplished more than 2,000 years ago, and in the desert of all places!
In the early evening hours, Persians and other ancient peoples of the Middle East would pour water into long, shallow stone pools no more than a foot or two deep. They would return to the pools just before first light the following morning to find the water frozen over. They'd then collect the ice and store it inside a yakhchāl, or "ice pit" (pictured above). Within these hollow, insulated domes were deep, subterranean holes where ice could be stored for months.
Okay, "What's the big deal?" you might be thinking. After all, one could easily replicate this process in frigid climes where ambient temperatures dip below freezing. But what's amazing here is that nighttime desert temperatures rarely dipped below freezing, yet ancient Middle Easterners managed to create ice nonetheless!
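How can water freeze when the air never does? The likely key is radiative cooling: on a clear, dry night, a shallow pool radiates heat to a sky that is effectively far colder than the surrounding air. A back-of-envelope sketch (entirely my own estimate, with assumed values for sky temperature, emissivity, and pool depth; real yakhchāl pools also relied on shading, insulating walls, and evaporation) suggests the overnight timescale is plausible:

```python
# Rough Stefan-Boltzmann estimate of how long a clear desert sky would
# take to freeze a shallow layer of water, ignoring conduction and
# convection from the surrounding air.

SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.95      # approximate emissivity of water
LATENT_HEAT = 334_000  # latent heat of fusion of water, J/kg

water_temp_k = 273.0   # water already cooled to the freezing point
sky_temp_k = 230.0     # assumed effective temperature of a dry night sky

# Net radiative power lost per square meter of pool surface
net_flux = EMISSIVITY * SIGMA * (water_temp_k**4 - sky_temp_k**4)

# Energy needed to freeze a 1 cm-deep layer (10 kg of water per m^2)
depth_m = 0.01
energy_per_m2 = 1000 * depth_m * LATENT_HEAT

hours = energy_per_m2 / net_flux / 3600
print(f"Net radiative loss: {net_flux:.0f} W/m^2")
print(f"Time to freeze a 1 cm layer: {hours:.1f} hours")
```

Under these assumptions the pool sheds on the order of 150 watts per square meter, enough to freeze a centimeter of water well within a single night.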
The rapid rise of e-cigarettes has left many worried that vaping will undo society's progress against the public health dangers of smoking. Underlying these concerns are three key questions. Today, we answer them:
1. Are e-cigarettes safer than normal cigarettes?
Yes. After years of research, there's little doubt that e-cigarettes are far safer than normal cigarettes. Joining the consensus are organizations like the National Academies of Sciences, Engineering, and Medicine, Cancer Research UK, and the British Medical Association. Because e-cigarettes replace the combustion of tobacco with the vaporization of nicotine salts and chemical flavorings, they greatly reduce long-term health risks. Cancer risk from vaping could be as little as one percent of the cancer risk from smoking. Passive exposure to e-cigarette vapor is far less harmful than secondhand smoke. One study found that 3.5 years of daily e-cigarette use did not appear to damage the lungs of vapers in their twenties and thirties.
While e-cigarettes are unquestionably safer than cigarettes, they are not without health risks. Potential side effects include coughing, dehydration, sore throat, headaches, nausea, stomachaches, and, of course, nicotine addiction. Regular nicotine use can hinder brain development in adolescents and is especially harmful to developing fetuses.
Students of history in America are aware of the Ancient Egyptians, the Incas of Peru, and the Aztecs of Mexico, but fewer are familiar with a great civilization that spread across the eastern United States from approximately 800 to 1600 CE. Meet the Mississippians.
Long before European settlers planted the seeds of modern civilization in North America, the Mississippian culture spread from the Florida Panhandle all the way to southern Minnesota. Defining the dozens of discovered settlements are distinct earthwork mounds that resemble pyramids of dirt. Various structures were regularly constructed atop these mounds. Chiefs presided over individual settlements, and were thought to regulate trade, particularly of maize, which archaeologists believe was the primary staple crop.
The rise of centralized agriculture is the most widely accepted explanation for the evolution of the Mississippian culture. Settlements were set up near rivers to take advantage of fertile farmland. Food was grown and shared under the watch of the settlement chief.
With a reliable source of food, the Mississippians could pursue other pastimes. Craftsmen fashioned stone tools for farming and etched ornate copper plates for adornment. Artists crafted necklaces and pottery out of riverine shells. Spectators watched athletes compete in a game known as chunkey, in which players tried to hurl a spear closest to a rolled, disc-shaped stone.
The climate is currently in a phase of rapid, human-caused change. Concentrations of carbon dioxide in the atmosphere are increasing. Global average temperature is climbing. Arctic ice is melting. Sea levels are rising.
While the overarching fact of manmade climate change is no longer debatable, a few facets are open to legitimate questioning. How bad will global warming really be? Are climate models really all that accurate? Is it better to mitigate or adapt to climate change? Will climate change actually produce net benefits?
The last question is perhaps the hardest to quantify, but it's also the most interesting. I'll broadly attempt to tackle it today.
Climate change will not be uniformly bad. Some parts of the world and their inhabitants will see benefits, others harms. Farmers in Denmark and Canada will enjoy longer and more productive growing seasons. Cold weather deaths will fall, particularly in northern latitudes. On the other hand, the average amount of time it takes for ecosystems to recover from drought is increasing. Moreover, many island nations may be essentially uninhabitable or even completely underwater due to rising sea levels.
The Messel Pit, located about 22 miles southeast of Frankfurt, Germany, isn't much to look at – no more than a dull splotch of gray amidst a serene, earth-toned landscape of rolling hills and densely packed trees. Locked within, however, is a menagerie of exquisitely fossilized life dating back 47 million years.
At this time, during what's called the Eocene Epoch, the pit was a small yet surprisingly deep lake within a tectonically active landscape. Scientists hypothesize that shifting earth intermittently triggered the release of concentrated gases that seeped out of the lake and enveloped nearby fauna. Rendered unconscious, these animals would fall in and slowly drift down through the oxygen-deprived waters to the muddy floor below, where compacting soil and vegetation slowly preserved them over millions of years.
Today, that soil and vegetation is petrified oil shale. More than 150 years ago, this shale drew the attention of miners. Organized operations began in the 1900s, turning up a wealth of fossil finds in the process. When shale mining became uneconomical in the late 1960s, operations halted. The site was briefly considered for a landfill, but scientists and citizens spoke out loudly against it. Messel became a UNESCO World Heritage site in 1995.
Now safely reserved for scientists and the general public, Messel Pit has yielded tens of thousands of fossils in the past couple decades, including pygmy horses, insects, early primates, cat-like predators, rodents, and lots of fish, with some specimens so well preserved that you can make out colors and even fur. Here is a selection:
In 2013, then-congressman Jim Bridenstine, a Republican from Oklahoma, stepped onto the floor of the House of Representatives and said these words:
Mr. Speaker, global temperatures stopped rising 10 years ago. Global temperature changes when they exist correlate with Sun output and ocean cycles. During the Medieval Warm Period, from 800 to 1300 AD, long before cars, power plants, or the Industrial Revolution, temperatures were warmer than today. During the Little Ice Age, from 1300 to 1900 AD, temperatures were cooler. Neither of these periods were caused by any human activity. Even climate change alarmists admit that the number of hurricanes hitting the U.S. and the number of tornado touchdowns have been on a slow decline for over a hundred years. But here's what we absolutely know: We know that Oklahoma will have tornadoes when the cold jet stream meets the warm Gulf air. And we also know that this president spends 30 times as much money on global warming research as he does on weather forecasting and warning.
This single, misleading statement, spoken in the wake of a deadly, devastating tornado outbreak in Oklahoma, was crafted from truths and mistruths. Perhaps it was uttered out of genuine misunderstanding of the scientific evidence on manmade climate change? Perhaps it was said out of ideologically driven political expediency? Regardless, it earned Bridenstine the brand of "climate denier," an indelible mark that has been with him ever since, and it made his recent confirmation as NASA administrator one of the most dramatic and controversial ever. For why should a science denier lead one of the world's leading scientific institutions?
But now, just six weeks into his tenure as NASA administrator, Bridenstine says he has "evolved" on climate change.
In 1894, the U.S. Department of Agriculture published its first dietary guidance. Chemist Wilbur Olin Atwater simply suggested diets based on protein, carbohydrate, fat, and "mineral matter." He later wrote:
Unless care is exercised in selecting food, a diet may result which is one-sided or badly balanced–that is, one in which either protein or fuel ingredients (carbohydrate and fat) are provided in excess.... The evils of overeating may not be felt at once, but sooner or later they are sure to appear–perhaps in an excessive amount of fatty tissue, perhaps in general debility, perhaps in actual disease.
He pointed out that Americans consume fat and sugar in far greater quantities than Europeans, and that overeating these mostly empty calories could be our undoing.
Some of Atwater's ideas about nutrition are now obsolete, but in his original, basic dietary instructions, there is a blueprint that the federal government of today should consider returning to: It doesn't make sense to prescribe a fairly specific diet to a broad, diverse populace.
Acupuncture, the practice of inserting thin needles into the skin to stimulate vital points of the body, has been employed in traditional Chinese medicine for thousands of years to heal a variety of maladies. Skeptics of the time-tested treatment have long derided it as lacking in evidence, but that's just wrong. Numerous systematic reviews of published trials show acupuncture to be effective at treating low-back pain, headache, and depression. Beyond alleviating these more psychological ailments, acupuncture can also treat physiological problems as well. A meta-analysis published in April showed that acupuncture is more effective than laxative drugs at treating constipation. And a review of 31 studies concluded that acupuncture is an effective treatment for obesity. Skeptics of acupuncture may claim to have evidence on their side, but in fact, the published literature proves them to be the true science deniers.
Now, here's why all of the above is complete crap.
Acupuncture is a pseudoscience, in that it features all the trappings of a legitimate, tested medical treatment without actually being one. Scientists and science aficionados can readily distinguish between science and pseudoscience, but for laypersons, the distinction can be incredibly difficult to make.
Take the opening paragraph, for example. Everything written has a passing attachment to fact, what Stephen Colbert might call "truthiness," and someone could easily find the linked studies and come to the conclusion that acupuncture really and truly works. Yet it is only when you dig into the meat of the research that you find it to be not all that impressive.
I found myself turning to my "Bible" last week, but not the Old Testament nor the New. No, the closest thing to a holy book I adhere to is The Demon-Haunted World, Carl Sagan's enduring treatise on science and skepticism. As I picked through its pages, slowing my skimming here and there to absorb some of my favorite passages, I wondered how the lessons Sagan laid down in his book would compare to the biblical Ten Commandments put forth thousands of years prior. I distilled eight, but perhaps a few Sagan fans could provide more?
Here they are, with supporting quotes from Sagan written underneath:
Thou shalt show humility in all things.
“We have not been given the lead in the cosmic drama. Perhaps someone else has? Perhaps no one else has? In either case, we have good reason for humility.”
Low-carbohydrate diets are all the rage in the Western world, and it seems they always have been. While the foundational claim – eat fewer carbs – hasn't changed over the past few decades, the marketing has shifted. "Atkins," "Stillman," "Paleo," "Keto": pick what sounds most appetizing and eat your way to health nirvana, or so devotees proclaim.
And any of these diets could very well prove successful for an ardent follower, but not for the reasons that are regularly touted. Our bodies are not evolutionarily predisposed to thrive on a "Paleolithic" diet. The ketosis brought on by eating extremely low amounts of carbohydrates is not metabolic magic. Rather, people who switch to these diet regimens likely see improvements simply because they are transitioning from a haphazard, unbalanced diet to one that's actually nourishing. If they experience significant weight loss, it's because they reduced calories as a side effect of the diet change.
Yale University neurologist Steven Novella provided a wonderful analogy to explain this on his podcast The Skeptics' Guide to the Universe. Controlling the body's weight is a lot like adjusting the speed of a car, he said: "It’s basically a function of how much you’re applying the gas versus the brake." In other words, how many calories are you eating versus burning?
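The gas-versus-brake analogy boils down to simple arithmetic. A toy sketch (my own illustration, not Novella's; the roughly 7,700 kcal-per-kilogram figure is a common rule of thumb, not a precise physiological constant) of how a sustained calorie gap translates into weight change:

```python
# Estimate weight change from a sustained daily calorie surplus or
# deficit, using a common rule-of-thumb energy density for body fat.

KCAL_PER_KG = 7700  # rough energy content of a kilogram of body fat

def weight_change_kg(intake_kcal_per_day, burn_kcal_per_day, days):
    """Estimate weight change (kg) from a sustained daily calorie gap."""
    return (intake_kcal_per_day - burn_kcal_per_day) * days / KCAL_PER_KG

# A 500 kcal/day deficit held for 30 days, whatever the diet's branding
print(f"{weight_change_kg(2000, 2500, 30):+.1f} kg")
```

The point of the sketch is that the deficit, not the diet's label, drives the outcome: any regimen that trims the same number of calories yields roughly the same result.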
Low-carb purveyors commonly point to studies (almost always conducted on rodents) suggesting that their chosen diets alter metabolic pathways, boost certain hormones, lower your appetite, and burn more fat. That may all be true, Novella says, but it's mostly meaningless when you examine effects on the whole human body.
Modern scientific publishing has a lot of problems. For starters, it perpetuates the "publish or perish" mindset, which all too often encourages quantity (more citations) over quality (better-controlled research). The status quo also sees the latest research locked behind exorbitant paywalls, making most studies inaccessible to the general public and even a fair number of scientists. The open-access movement sprang up to fix this injustice, but hundreds of open journals are now "predatory," publishing all sorts of poor-quality research papers provided the authors pay the up-front costs. And this has fed a new problem: there's such a glut of research these days that many scientists can't keep up with their fields and have difficulty determining which studies merit their limited time and which do not.
Unfortunately, these are all multifaceted problems with no easy fixes. There's another problem, however, which is easily remedied and could improve the quality of published research. Right now, the overwhelming majority of peer reviewers, the scientists who scrutinize the latest studies, aren't paid for their labor. This is completely ridiculous. Peer review may be the most important part of the scientific enterprise, and it is not incentivized monetarily.
According to one study, the free labor scientists provide for academic journals is worth as much as $2.57 billion each year. Considering that scientists and their institutions are making such a generous donation of time to publishers, one might think that publishers would return the favor with lower costs to access the latest research. But no, Elsevier, a behemoth amongst publishing companies, maintains margins of nearly 37% in the scientific, technical, and medical sector. In 2017, that translated to $1.23 billion in profit on revenues of $3.35 billion.
Despite this unmistakable inequity, many scientists see peer review as an obligation to science and view recognition in journals as enough reward for their work. Mick Watson, a professor of bioinformatics at the University of Edinburgh, thinks this is naive, however.
Today, the Big Bang model of cosmology is pretty much taken as gospel, and for good reason. For more than fifty years, evidence gathered from all manner of sources has supported the notion that the Universe as we know it expanded from an infinitely dense singularity.
But things didn't always look so certain for the Big Bang. In its most nascent form, the idea was known as the hypothesis of the primeval atom, and it originated from an engineer turned soldier turned mathematician turned Catholic priest turned physicist by the name of Georges Lemaître. When Lemaître published his idea in the eminent journal Nature in 1931, a response to observational data suggesting that space was expanding, he ruffled a lot of feathers. As UC-San Diego professor of physics Brian Keating wrote in his recent book Losing the Nobel Prize, "Lemaître's model... upset the millennia-old orthodoxy of an eternal, unchanging cosmos. It clearly implied that everything had been smaller and denser in the past, and that the universe must itself have had a birth at a finite time in the past."
Besides questioning the status quo, Lemaître's primeval atom also had some glaring problems. For starters, there were hardly any means of testing it, a must for any would-be scientific theory. Moreover, it essentially suggested that all the matter in the Universe came from nothing, a flabbergasting claim. It also violated an accepted notion known as the perfect cosmological principle, which suggested that the Universe looks the same from any given point in space and time.
For these reasons, English astronomer Sir Fred Hoyle gathered with a few colleagues to formulate the Steady State theory of the cosmos. The idea kept the observable universe essentially the same in space and time, and it accounted for evidence suggesting that the universe is expanding by hypothesizing that matter is instead being created out of the fabric of space in between distant galaxies. Steady State didn't have the problems inherent to the notion of a primeval atom, and, as Keating wrote, "it sure as hell didn't look like the creation narrative in Genesis 1:1."
Around 12,900 years ago, Earth's climate abruptly changed, and the upheaval was particularly felt in North America. Over the span of just a decade, temperatures fell between 3.6 and 10.8 degrees Fahrenheit on average. Glaciers that had been gradually receding through Canada reversed course and crept southward. The air dried, and droughts became frequent. Megafauna like mammoths, camels, and giant bears couldn't adapt to the sudden changes and died out in droves, their extinction abetted by hungry humans on the hunt.
For years, the accepted explanation for the Younger Dryas, as this period is called (named for the tundra-loving wildflower pictured above), was that glacial meltwater severely slowed or even halted the North Atlantic "conveyor" currents that carry warm waters originating closer to the equator northward, thus depriving the air and nearby continents of heat from the ocean. But over the past decade, accumulating evidence has gradually dislodged this narrative. Taking its place is a story that's decidedly more explosive.
Dozens, and perhaps hundreds, of scientists from all across the world now subscribe to the notion that some sort of cometary impact triggered the Younger Dryas cooling and extinctions. They cite as evidence the presence of nanodiamonds and microscopic grains associated with impact events, found at sites across the globe in layers around 12,900 years old, the time the Younger Dryas began.
But critics have countered their claims every step of the way, questioning the dating techniques used, disputing the origin of the nanodiamonds, and pointing out that no large impact craters linked to the Younger Dryas have yet been found. Their fierce skepticism is justified. Supporters of a Younger Dryas impact hypothesis are making a bold claim. It is their responsibility to back it up.
A sperm's quest to fertilize an egg is not easy. Of the roughly 250 million sperm ejaculated into the human vagina during intercourse, fewer than one in a hundred will survive the perilous trek up the hostile, acidic chamber to the cervix. If the female is set to ovulate soon, the cervix will allow the frisky, foreign interlopers inside. If not, they'll "drown" in a thick flow of cervical mucus. Sperm that enter the cervix must next decide to go left or right – for a human female has two fallopian tubes, each with an ovary at its far end, and usually just one releases an egg. Perhaps a hundred sperm will enter a fallopian tube, and maybe one or two will actually reach the egg for a chance at fertilization.
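The attrition described above can be laid out as a simple chain of survival fractions, using the article's own ballpark figures (these are illustrative round numbers, not precise measurements):

```python
# Each stage's count is a rough figure from the text above; the final
# fraction shows just how steep the overall attrition is.

stages = [
    ("ejaculated", 250_000_000),
    ("reach the cervix (fewer than 1 in 100)", 2_500_000),
    ("enter the correct fallopian tube", 100),
    ("reach the egg", 2),
]

start = stages[0][1]
for name, count in stages:
    print(f"{name}: {count:,} ({count / start:.1e} of the original)")
```

By these numbers, fewer than one sperm in a hundred million ever gets a chance at fertilization.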
In this arduous race, all of the sperm independently race towards the same goal: reproduction. But what if the competition was more cutthroat?
In the 1990s, biologist Robin Baker put forth the idea that a significant proportion of human sperm are not actually capable of fertilization and instead only serve to block or attack sperm from rival males potentially present inside the female. These "kamikaze" sperm probably also exist in other primates as well as many other promiscuous animal species, he hypothesized.
But an experiment published in 1999 countered Baker's claims. Scientists Harry Moore and Tim Birkhead mixed sperm samples in settings similar to the female reproductive tract and watched to see if the individual cells went to war. The sperm flitted about furiously in search of an egg, but there were no signs of microscopic violence. The findings shut the door on the "kamikaze sperm" hypothesis in humans. It remains open for other species, however.