I recently adopted a puppy. She's a cute dog – people say she resembles a fox or a little auburn wolf.
The latter description is one that many dog owners have taken to heart with their own canine companions. Swayed by marketing, influential trainers on television, and online blogs, they've decided that treating their slobbering pooches like wolves is a good idea.
Wolves and dogs diverged from their last common ancestor between roughly 11,000 and 41,000 years ago. Though tens of thousands of years and numerous genetic mutations separate them, they still share 99.9% of their DNA. Citing these intimate links, some suggest that you should train your dog like a wolf. Chiefly, you should be your dog's "alpha" or pack leader. This entails getting your dog to 'submit' when he or she steps out of line or misbehaves. The idea is that their missteps are truly attempts to gain dominance over you. This means you should respond with strategies like rolling them onto their backs or forcing their heads into the ground. Unfortunately, all these techniques accomplish is instilling fear in your dog. Your pet learns that humans can be harsh creatures, and their touch, a scary thing to be avoided.
While dogs can sometimes look like wolves, their behavior differs from wolves' just as our behavior differs from chimpanzees'. Sure, humans share 98.8% of our DNA with chimps, but you wouldn't eat the lice out of your family members' hair, would you?
In 2014, researchers asked men to estimate the size of the average erect penis. Their guess? 6.2 inches (15.8 centimeters). That's actually in line with what numerous scientific studies have reported. But guess what? Those studies are wrong.
Clemson University Professor Bruce M. King, senior author of the textbook Human Sexuality Today, drew attention to this issue in a recent review published in the Journal of Sex and Marital Therapy.
For years, researchers asked men to self-report the length of their erect penises by measuring along the top from the abdomen to the tip, and over that time, men consistently informed researchers that their members ranged from roughly 6.1 to 6.5 inches.
Can you see the problem with this procedure? Asking men to accurately report the size of their penises is like trying to eclipse the speed of light in a junker car: it's not gonna happen.
America is once again awash in conspiracy theories, and it's easy to understand why.
"Studies suggest that conspiracy theories flourish when people feel anxiety, alienation, paranoia, or loss of control," political scientists Joseph Uscinski and Joseph Parent wrote in their seminal book American Conspiracy Theories.
A once-in-a-lifetime global pandemic, an unprecedented economic recession, or widespread civil unrest: any one of these would be enough on its own to trigger all of those feelings. Americans are facing all three simultaneously.
That's why some stressed people have linked the COVID-19 pandemic to the spread of 5G cellular technology, insist that the virus was intentionally created and unleashed, accuse Bill Gates of trying to depopulate the world through vaccinations, theorize that mass protests were meant to start a race war, or contend that face masks are killing people.
Hilary can't be sure whether it was she or one of her sisters who absorbed their fellow fetus when they were in their mother's womb, but she enjoys jokingly taking the blame when the matter comes up at family gatherings.
Hilary is a triplet, part of a rarity that occurs roughly once every thousand or so births. But she could have been a quadruplet. At some point very early in her and her sisters’ development, there were four fetuses present. Then there weren’t.
Such a disappearance is termed "vanishing twin syndrome". According to the American Pregnancy Association (APA), "This occurs when a twin or multiple disappears in the uterus during pregnancy as a result of a miscarriage of one twin or multiple. The fetal tissue is absorbed by the other twin, multiple, placenta or the mother."
And it's surprisingly common, occurring in one out of every eight multifetus pregnancies, but perhaps in as many as three in ten. In the vast majority of instances, vanishing twin syndrome happens so early, usually within the first seven weeks of pregnancy, that it goes entirely unnoticed. Rarely, however, the loss occurs in the second or third trimester, potentially placing the surviving fetuses and the mother at higher risk, as well as causing understandable emotional distress to the mother, father, and other family members.
Consciousness is a paradox – both intimately knowable and nearly impossible to pin down. As humans, we know we have it but haven't a clue how it arises. It's a facet of intelligent life so nebulous that it stretches both science and philosophy to their limits.
When something is this maddening, it helps to break it up into simpler parts. Christopher Tyler, a visual neuroscientist at the Smith-Kettlewell Brain Imaging Center and a Professor at the City University of London, outlined ten properties of human consciousness in a recent paper published in the journal Frontiers in Psychology.
According to Tyler, consciousness' first property is privacy. Simply put, there is no way (outside of science fiction) for any conscious being to completely share the conscious experience of another.
The second property is unity. As Tyler wrote, consciousness must "occur either in a single brain site or in a unified neural net of some kind in the brain, rather than in multiple independent brain sites."
When one thinks of where dinosaurs lived, the most salient image that comes to mind is Jurassic Park. The setting is tropical, with ferns erupting from the ground, enormous trees reaching into the sky, and insects buzzing through the thick, humid air.
But just like today, Earth of the past was a big, ecologically diverse place. And dinosaurs, which dominated the land for over 100 million years, dwelled in almost every corner, even the colder parts.
So instead imagine this scene, which could have played out in the late Cretaceous period between 66 and 100 million years ago: Hundreds, perhaps thousands of elephant-sized Edmontosaurus ambling in a great herd through a snow-covered landscape dotted with coniferous trees, migrating in search of food. Suddenly, out of one of the patches of trees, a predator emerges, perhaps the raptor-like Troodon or the T. rex-related Nanuqsaurus (pictured top). Nanuqsaurus, whose name means "polar bear lizard," might have been blanketed in white, fluffy feathers for warmth and camouflage. It charges the herd, scattering some of the members in panic. Finally, it zeroes in on a target and lunges, clamping its powerful jaws on the Edmontosaurus' neck, delivering an incapacitating blow.
Any student of nature documentaries knows that life and death scenarios like this occur regularly in the Arctic today, most notably with ravenous wolves and migrating caribou. It's fascinating to think that such chases have been happening for millions of years, albeit with different characters.
Humans suffer from all sorts of biases. There's the default effect, "when given a choice between several options, the tendency to favor the default one." There's stereotyping, "expecting a member of a group to have certain characteristics without having actual information about that individual." There's confirmation bias, "the tendency to search for, interpret, focus on, and remember information in a way that confirms one's preconceptions." This is a mere sampling of biases; there are hundreds more.
But there's a strong case to be made that the ultimate human bias may be what's called the "bias blind spot". Simply put, it's the bias that you are unbiased, or at least not as biased as everyone else.
Stanford University researchers Emily Pronin, Daniel Y. Lin, and Lee Ross coined the term back in 2002, but the bias first went by a different name, "the illusion of objectivity", in psychologist David Alain Armor's 1999 dissertation, in which he conducted five experiments on more than 800 individuals.
"Across these studies, approximately 85% of participants indicated that they were more objective than the average member of the group from which they were drawn, a state of affairs that cannot be true," he wrote.
Over the past few decades, hundreds of potential treatments for Alzheimer's disease have failed to show clinical benefits in human trials, crashing and burning with a consistency that Stat News' Damian Garde called "metronomic." That's terrible news for the more than five million Americans who currently have Alzheimer's and the more than fourteen million who will be living with it by 2060.
It seems that the primary hypothesis for the cause of Alzheimer's – that a buildup in the brain of a protein called beta-amyloid is responsible for cognitive decline – is wrong. Drugs that reduce beta-amyloid don't resolve Alzheimer's patients' crippling dementia.
But despite a landscape of pharmaceutical solutions that's largely devoid of hope, there does exist a widely available therapeutic that has proven highly effective at preventing Alzheimer's: exercise.
According to the Alzheimer's Society, "Several prospective studies have looked at middle-aged people and the effects of physical exercise on their thinking and memory in later life. Combining the results of 11 studies shows that regular exercise can significantly reduce the risk of developing dementia by about 30 percent. For Alzheimer's disease specifically, the risk was reduced by 45 percent."
This past December, I lost my mother. She was 72.
While vacationing for the holidays, she fell and hit her head. What might have been a minor knock for someone far younger resulted in a subdural hematoma – bleeding from the brain leading to a pooling of blood – owing to her age and history with stroke. She was taken to a nearby hospital, anesthetized, intubated, and placed on a ventilator. Doctors performed emergency surgery to relieve pressure on her brain. The surgery was technically successful – the bleeding stopped and the pressure abated – but my mom was unresponsive.
This was the state in which I found her when my brother and I finally arrived to be by her side in the hospital. A situation as unfamiliar to us as it would be to most people, it was doubly so for my mom, who was strong and outspoken throughout her life and career as a college professor. Breathing with a ventilator but not sedated, she should have been awake, but the trauma that wracked her brain had apparently caused her cognitive processes to go dark. Talking to her with raised voices yielded no response. Only the most painful and annoying stimuli, a hard pinch from the doctor or a prick with a needle, would prompt her eyes to open, and only fleetingly. We took those treasured opportunities to tell her how much we loved her and wanted her back with us. During the interminable hours, our eyes were glued to the various medical monitors, riding the ups and downs of her blood pressure and body temperature as she battled a post-surgery fever.
Unfortunately, each day in the hospital, she faded a little more. The brief gazes and slight movements grew harder to rouse. She was slipping away. My father, brother, and I were bluntly yet kindly given the prognosis and our options: Our mom may wake up again, but more than likely she won't. If she did, she would probably be completely care-dependent, unable to feed, move, or clean herself. The ventilator and feeding tube that were keeping her alive could do so indefinitely. With more time on them, her brain might stabilize enough for her to live without them, but this "recovery" could leave her locked in, aware of the world around her yet unable to interact with it in any meaningful way. We all agreed this prospect would be torturous for her, but the alternative, removing her ventilator and feeding tube, felt like giving up. The doctors framed it in a different way: rather than forcing her to stay alive, withdrawing life support would allow her to choose her fate. After some consideration, we decided that's what she would want.
Anyone who has seen an image of Earth from space can instantly recall our planet's fiercely blue oceans, resplendent green forests, and splotchy brown deserts. This work of art is our wondrous home.
But between 3 and 3.8 billion years ago, Earth may have been unrecognizable, its modern medley of colors instead dominated by one: purple.
University of Maryland-Baltimore Professor Shiladitya DasSarma originally painted this picture at 2007's American Astronomical Society Meeting. His 'Purple Earth' hypothesis has remained a speculative delight ever since and now has growing evidence to support it.
The hypothesis is that Earth's earliest microbes were phototrophs, capturing photons from sunlight to produce energy for themselves. This isn't too surprising – the plants that dominate Earth today are also phototrophs. However, unlike modern plants, which utilize the pigment chlorophyll to capture light, these ancient microbes might have used a pigment called retinal. Chlorophyll absorbs red and blue light while reflecting green. Retinal does the opposite, meaning microbes making use of it would appear purple.
Beloved astrophysicist and science communicator Carl Sagan published The Demon-Haunted World: Science as a Candle in the Dark twenty-five years ago. An accessible explainer on the scientific method and critical thinking, the book remains just as relevant today as it was then. The world is full of misinformation, biases, and downright falsehoods that can mire us in darkness. Science shines a light that can lead us through.
Unfortunately, humans all too often ignore science. Most of the time that's just because we don't see it. Popular media sources regularly feature diverting nonsense, broadcast dogmatic screeds, or even depict brutal violence. Not featured enough are stories of discovery, debates with intellectual humility, and tales of sussing out the truth with evidence and logic.
In one section of The Demon-Haunted World, Sagan wondered what an intelligent alien might think after perusing everything that's available in the media.
"An extraterrestrial being, newly arrived on Earth – scrutinizing what we mainly present to our children in television, radio, movies, newspapers, magazines, comics, and many books – might easily conclude that we are intent on teaching them murder, rape, cruelty, superstition, credulity, and consumerism. We keep at it, and through constant repetition many of them finally get it."
In 2018, the latest year for which CDC data are available, 46,802 Americans died from opioid overdoses. This painful toll has been exacted regularly in recent years, the price of rampant opioid overprescription and profit-hungry pharmaceutical companies.
Preventing these deaths means finding an effective way to treat opioid addiction. Somewhere around two million Americans suffer from opioid-related substance use disorder. Treatments like buprenorphine and methadone calm the brain circuits affected by opioids, reducing cravings and withdrawal. In conjunction with counseling, these medications can gradually ferry addicted individuals back to normalcy. Unfortunately, medications are underutilized and states generally lack the resources to provide them to all afflicted individuals.
It is into this quagmire that some have suggested inserting a new, surprising treatment: a powerful psychedelic drug called ibogaine.
Derived from the root or bark of a West African shrub called Tabernanthe iboga, ibogaine has been used in the Bwiti spiritual discipline of the forest-dwelling Punu and Mitsogo peoples of Gabon for generations. Unforgettable to those who have taken it, a high dose of ibogaine induces an "oneirogenic" waking dream-like state for as long as 36 hours, with introspective effects that can last for months afterwards, supposedly permitting takers to conquer their fears and negative emotions.
In the mid-20th century, under Joseph Stalin's regime in the Soviet Union, Trofim Lysenko pushed an ideological system of agriculture that, among many questionable planks, contended that various crop plants could be physically reshaped and their new characteristics passed on, thus producing more food. Lysenkoism became state dogma amidst Stalin's war on 'western' genetics, and the effects were predictably tragic: crops failed and tens of millions starved.
Lysenko's notion that acquired traits can be inherited did not originate with him. Rather, he repurposed ideas expounded upon by French biologist Jean-Baptiste Lamarck, who himself repurposed commonly held ideas.
"[Lamarck] merely endorsed a belief which had been generally accepted for at least 2,200 years before his time and used it to explain how evolution could have taken place," historian of science Conway Zirkle wrote.
In Lamarck's book Philosophie Zoologique, published in 1809, he described two laws to explain biological evolution, constituting the first cohesive theory to do so, later dubbed 'Lamarckism'. Bryan M. Turner, a Professor of Experimental Genetics at the University of Birmingham in the U.K., described them in a 2013 article.
A single virus particle – or virion – of SARS-CoV-2 is just 50 to 200 nanometers in diameter. Though diminutive in size, this virus has now upended human life on Earth. Travel is essentially shut down, millions are out of work, and hundreds of thousands are dead.
More biological machines than living entities, viruses number in the nonillions. (There are roughly 10^31 individual viruses, or virions, on the planet.) A virus is composed of an RNA or DNA genome and a protein shell called a capsid. Some even have basic membranes – like outer skin – called envelopes. That's essentially it. A virus' sole drive is to replicate itself, and it can only do so inside living cells, which unfortunately usually results in the death of those cells.
Certain viruses target human cells. A few have grown so adept at invading our cells that they infect the majority of humans on Earth. It's a near certainty that you are, or have been, infected by one of these viruses:
1. Epstein–Barr virus. Spread through saliva, this virus (pictured top) is the primary cause of the mild yet protracted disease mononucleosis, commonly known as mono. In the United States, about 90% of adults show evidence of previous active infection. The term "active" is needed because once the initial infection is beaten back, the Epstein–Barr virus lies dormant in the individual's B cells, a type of white blood cell, for the rest of their life. In this form, the virus is harmless, but it can reactivate when the immune system is stressed and cause illness once again.
This week, I joined a rapidly growing group of 14,183 people via the organization 1 Day Sooner in volunteering for human challenge trials (HCTs) to test promising vaccines against the novel coronavirus. If selected, I would be administered a candidate vaccine or a placebo vaccine then deliberately exposed to the SARS-CoV-2 coronavirus.
I did not sign up for this endeavor seeking heroism or notoriety, but because I weighed my own personal risks against the benefits to global society. In this calculation, there is an overwhelming net benefit. As a white, healthy, physically active 32-year-old with no underlying health conditions, my risk of death from COVID-19 could be as high as 1 in 1,200, but more likely as low as 0.014 percent, roughly 1 in 7,400. Participants in HCTs would be given small, controlled doses of virus and quarantined with excellent medical care, so that rate could be even lower. Still, we can estimate that if 20,000 people took part in HCTs, between two and seventeen participants could die.
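For readers who want to see where the "between two and seventeen" figure comes from, here is a minimal back-of-the-envelope sketch. It assumes only the two bounding fatality risks quoted above (1 in 7,400 and 1 in 1,200) and multiplies each by the hypothetical 20,000 participants; the function name is illustrative, not from any trial protocol.

```python
def expected_deaths(participants, risk_per_person):
    """Expected number of deaths given a per-person fatality risk."""
    return participants * risk_per_person

participants = 20_000

# Optimistic estimate: ~1 in 7,400 risk of death
low = expected_deaths(participants, 1 / 7_400)

# Pessimistic estimate: ~1 in 1,200 risk of death
high = expected_deaths(participants, 1 / 1_200)

print(f"Expected deaths: {low:.1f} to {high:.1f}")  # roughly 2.7 to 16.7
```

Rounding outward gives the article's range of two to seventeen potential deaths.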
In the absence of HCTs, thousands of participants would be given experimental vaccines or placebos in phase III trials then asked to go about their lives so they can potentially be exposed to the novel coronavirus. These trials last many months or even years to ensure adequate infection numbers and sample sizes to ascertain efficacy. HCTs could attain more accurate results in a fraction of the time. Roughly 5,000 people are currently dying every day from COVID-19. That number could swell this winter in the event of a predicted second wave. If three months could be shaved off the estimated 12 to 18 months needed to produce an effective vaccine, hundreds of thousands of lives could be saved. Millions more could safely return to livelihoods, visit quarantined loved ones (particularly the elderly), and resume abandoned pastimes that much sooner.
There is presently a debate over whether or not human challenge trials for COVID-19 should be permitted. That debate will grow more pressing as phase I and II trials – meant to gauge safety, preliminary efficacy, optimal composition, and dose – for some of the dozens of vaccine candidates near completion this summer. The two main arguments against HCTs are ethical and practical. Ethically, given that we still know comparatively little about SARS-CoV-2, can there truly be informed consent for trial volunteers?
Western Europe hosted the most fatal plague pandemic in history – the Black Death killed over 50 million in the mid-14th century. Today, however, plague is essentially extinct in that part of the world. Across the Atlantic, the United States still sees between one and seventeen cases of the infamous bacterial disease each year. At least 106 people have been infected since 2000, with twelve deaths.
Plague is caused by the bacterium Yersinia pestis, which most often spreads from the bite of infected fleas. When the bacterium infects humans or other mammals, it usually multiplies quickly, causing fever, weakness, headache, and a variety of other symptoms depending upon the part of the body the pathogen attacks. Lymph nodes swollen into the characteristic buboes hint that the bacterium has entered the lymphatic system, while blackened skin indicates that plague has reached the blood.
Thankfully, the disease is readily treatable today with early administration of antibiotics. Its rarity in the U.S. can result in misdiagnosis, however, which contributes to the roughly ten percent death rate.
In the U.S., plague became a harsh reality rather than a distant piece of history relegated to Europe in 1900, when steamships bearing immigrants, goods, and infected rodents arrived on the west coast. California experienced almost all of the resulting 280 cases and 172 deaths over the next eight years. Many politicians, officials, and newspapers initially covered up the outbreak, worried that it would devastate the state's lucrative and growing agriculture industry.
Mice are not humans. This obvious truth, coupled with issues like poor methodology, reporting bias, and sloppy statistics, explains why – historically – studies conducted on mice have rarely translated to us.
Those latter problems always seem to plague scientific research, no matter how hard scientists try to weed them out. Perhaps that's why in the early 2000s, researchers began working in earnest to re-engineer laboratory mice altogether. The goal? Make them a little more human.
"Humanized models – mice expressing human transgenes or engrafted with functional human cells or tissues – can provide important tools to bridge the gap between animals and humans in preclinical research," wrote Monica J. Justice, Program Head and Senior Scientist in Genetics and Genome Biology at The Hospital for Sick Children.
Mice and humans share roughly 97.5% of their DNA, so one might think that would make us near perfect stand-ins for each other when it comes to studying pharmaceutical treatments and modeling disease. However, the slight difference in our biological coding means that mice are not susceptible to various infections like HIV, Epstein-Barr virus, or Ebola. Moreover, they metabolize drug compounds quite differently.
Dogs are regularly infected with H3N8 and H3N2 canine influenzas, as well as a variety of other strains. Cats often catch dangerous respiratory infections from viruses like feline calicivirus and Felid alphaherpesvirus. Both also carry various bacteria and parasites that can pass to humans. Yet despite all the diseases affecting these animals and the countless, adorable interactions we share, there is not a single known instance in which one of our beloved canine or feline companions has triggered a global pandemic in humans. Why not? And could it ever happen?
To explore the first question, it might help to compare our pets to the animal presently synonymous with emerging viruses: bats. Deadly infectious diseases like Ebola, Marburg, Nipah, SARS, Lassa, and of course COVID-19 have all been linked to these furry, flying mammals. Unlike dogs, cats, and most other mammals, bats have mutations that blunt their immune responses. Somewhat paradoxically, that seems to be a benefit – their immune systems keep viruses at bay while not overreacting, which can cause harmful collateral damage to bodily systems. As a side effect, this means that viruses and bats can co-exist – bats provide homes to various viruses while suffering few ill effects.
Unfortunately for other animals, that makes bats a breeding ground for dangerous diseases that can mutate and jump species. Many of the 1,200 bat species worldwide live in large colonies, which range in size from dozens of individuals to hundreds of thousands. These colonies are dense and intimate, making viral transmission a fact of life, and in such a melting pot of viruses, mutations abound. Bats can then come into contact with other animals or humans via predation, co-habitation, or hunting and share their viral interlopers.
Moving the discussion back to our prized pets, there are a few apparent factors that make it less likely that dogs or cats will spark a global infectious disease pandemic in humans, even though we regularly interact. For one, they are fairly separated from others of their species, making viral transmission more difficult and mixing less common. Nor do they interact very often with other animal species, provided they are kept on leash or indoors. Moreover, between grooming appointments, veterinary visits, and mandatory vaccinations, we regularly keep them clean and cared for.
Mount Wingen in New South Wales, Australia is commonly known as Burning Mountain, partly for the red regolith that colors its summit, but primarily because an actual fire smolders one hundred feet below its surface, and has done so for at least 6,000 years! This is the oldest-known natural coal fire.
There are a couple good reasons why – for centuries – humans have harvested coal to generate energy: it's copious and quite combustible. Over millions of years, simmering heat and crushing pressures transformed dead plant matter into this sedimentary rock, which is composed of carbon and smaller amounts of hydrogen, sulfur, oxygen, and nitrogen. It sometimes takes only a little heat to roil coal into flaming life. Just as this is true when grilling on a summer's day, so is it true within the Earth.
An underground coal bed can ignite from a lightning strike, wildfire, or a mere jostling of the topsoil, enough to permit a slight but steady stream of oxygen to reach the rocky fuel below. Oxygen is a highly reactive element that will oxidize the coal, pilfering electrons and spurring a release of heat. Once a fire starts, it can endure for a very long time.
Scientists have unearthed remnants of coal fires that burned millions of years ago, so they've been occurring naturally for some time. Humans, however, have set alight a lot more in a comparatively short timespan. Why? Mining. Explosions, digging, and drilling at mines set numerous coal fires each year, so much so that in 2010, it was estimated that coal fires accounted for as much as 3% of global carbon emissions.
In 2008, the National Academy of Sciences put forth a textbook-ish definition of "science": "The use of evidence to construct testable explanations and predictions of natural phenomena, as well as the knowledge generated through this process."
That's fine, well, and accurate, but it's also a tad dull. Over the years, influential scientists and philosophers have taken their own stabs at describing the discipline to which they have dedicated their lives. What they've written has been elucidating and occasionally odd.
Philip Morris Hauser, a demographer and pioneer in urban studies who was a president of the American Sociological Association, expressed that science is "the most subversive thing that has ever been devised by man. It is a discipline in which the rules of the game require the undermining of that which already exists, in the sense that new knowledge always necessarily crowds out inferior antecedent knowledge."
Nobel Prize-winning physicist Richard Feynman agreed, but he focused instead on out-of-date fountains of knowledge, saying, "Science alone of all the subjects contains within itself the lesson of the danger of belief in the infallibility of the greatest teachers in the preceding generation... As a matter of fact, I can also define science another way: Science is the belief in the ignorance of experts."