Make no mistake, human-caused climate change is an urgent global threat, one whose consequences vastly outweigh any benefits. Sea levels are rising, causing entire islands to disappear and coastal cities to flood. Warmer waters are intensifying hurricanes and increasing their rainfall. Droughts are growing more common and lasting longer. Agricultural yields will likely fall in many of the globe's breadbasket regions. These impacts, along with many others, are predicted to cost the world economy trillions of dollars over the next thirty years.
But while climate change on a global scale is decidedly damaging, not all areas of the planet will experience these negative effects equally. In fact, some areas may actually benefit. As Earth's climate changes, here are three regions that could be big winners.
1. Northern Minnesota and Michigan's Upper Peninsula. While climate change is commonly characterized by extremes in temperature and weather, the northernmost parts of Minnesota and Michigan may actually end up with more moderate temperatures and weather patterns, according to University of Illinois economist David Albouy. Officials in Duluth, Minnesota, a city of roughly 86,000 people along the shores of Lake Superior, have even considered the slogan "climate-proof Duluth".
"We’re not seeing worse heat waves or longer heat waves or more of those long nights that don’t fall below 75 degrees,” Dr. Kenneth Blumenfeld, a senior climatologist at the Minnesota Department of Natural Resources, told the New York Times. "Instead, what we’re seeing is warmer winters, fewer days during winter where we get to negative 30 Fahrenheit."
In 2006, Dr. James Fallon found out he had the brain imaging pattern and genetic makeup of a "full-blown psychopath".
He was surprised, to say the least.
As a happily married family man and a successful neuroscientist at the University of California-Irvine, Fallon didn't exactly fit the malevolent stereotype of a psychopath, but there it was on a brain scan: drastically diminished activity in specific areas of the frontal and temporal lobes linked to empathy, morality and self-control. So he asked his wife, kids, grandchildren, and colleagues for their thoughts on his apparent diagnosis.
"Big mistake," he later recalled.
Of all human bodily organs, the eye is especially alluring. Seemingly alive as it twitches to and fro within the socket, the eye permits sighted interaction with the physical world and allows us – in a superficial, yet meaningful way – to gaze into the minds of others.
The human eye's hallmark trait is its medley of colors. The iris, which surrounds the pupil, can appear blue, green, gray, hazel, brown, and even red. Differences in levels of the pigment melanin primarily account for the varying hues – more melanin renders the eyes darker, while less leaves eyes reflecting light blue. How much melanin dwells within the iris depends on the expression of around a dozen different genes, and perhaps more. The two most important by far are OCA2 and HERC2. OCA2 produces a protein that controls the maturation of melanin-producing melanosomes. HERC2 controls the expression of OCA2.
When humans arose in the Horn of Africa at least a quarter of a million years ago, human eyes were extremely dark brown or nearly black. That's because OCA2 was expressed at high levels, in turn leading to the production of more melanin, which colored skin dark brown and, as a side effect, darkened irises. Brown skin is less likely to be sunburned or to develop skin cancer, benefits which served humans well in Africa's sunny, equatorial climate.
But when humans started migrating out of Africa between 50,000 and 70,000 years ago, the selective pressures that drove heightened melanin production disappeared. In Northern Europe, where sunlight can be a scarce commodity in winter, lighter skin tones became advantageous, as they allow for more vitamin D absorption from sunlight. This meant less melanin in the body, which permitted eye color to diversify as other genes that more subtly affected eye color mutated, their influence becoming more apparent.
Even a thousand years later, it's an unsettling sight: A medieval man's skeleton, bearing signs of repeated stabs to the sternum, lying face-down in a shallow grave. Who was this man buried in Sicily? And why was his body arranged in such a deviant manner?
Archaeologists' best guess is that he was an exiled outlaw. Even in death, which apparently came by execution, he was to be separated from the rest of civil society; hence the prone positioning, while others had their bodies oriented skyward.
Face-down burial in medieval Europe was exceedingly rare. During the Middle Ages, which stretched from roughly the 5th to 15th centuries, less than one percent of all burials were in the prone position.
The reasons for this rare arrangement of remains varied widely. Between 950 and 1300, some high-ranking nobles and priests were apparently buried prostrate as a sign of humility before God, their "pious" positions somewhat undercut by the presence of fine clothes and jewels placed alongside them. Other prone burials were found towards the edge of cemeteries, separated from the bulk of graves. Most were shallow and lacked coffins. The people laid to rest in this manner may have been outcasts, criminals, or seen as cursed. Still other remains were found in prone positions within coffins, buried alongside many normal, upward-facing skeletons.
There are a variety of ways the human race might go extinct. The ozone layer could collapse, exposing Earth's surface to dangerous levels of ultraviolet radiation. Supervolcanoes could erupt across the globe, depriving food crops of essential sunlight. Anarchists could concoct a lethal, highly contagious supervirus. A planet-killer asteroid could strike the Earth, wiping it clean of all life.
This nightmarish list prompts an accounting question: How much money should we spend to prevent our own extinction?
You might think that your life and the existence of the human race are priceless, and therefore worth any amount of money and effort. On the other hand, you might rightly recognize that these cataclysmic events are exceedingly unlikely, so improbable in fact that you round their chances down to zero and go about your life relatively free from existential dread and financial burden.
Considering all of the distracting day-to-day problems most of us face, the vast majority of humanity understandably adopts the latter view.
Jesse and Sam both start kindergarten this year. Jesse was born on September 4th, 2014. Sam was born August 24th, 2015. Jesse is roughly 20% older than Sam on the day they start school. While this relative difference in maturity will dwindle as Jesse and Sam grow up, studies suggest that it could give Jesse an advantage in sports, academics, and even politics that will persist for the rest of their lives.
This is the relative age effect. It's spawned from the creation of cohorts, particularly for school and sports. Since human births span fairly evenly across a calendar year, it means that some kids in – for example – sixth grade or a ten-year-old sports team will be older than their peers. Due to this age disparity, they might be stronger and faster, with more developed brains. These advantages can result in better athletic and academic performance early on, which can lead to additional benefits. Relatively older kids might get promoted to better sports teams or placed into "gifted" academic programs. What started as a simple difference in age can snowball into better life outcomes compared to their younger peers, even as the relative age difference disappears.
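The arithmetic behind Jesse and Sam's roughly 20% gap is easy to check. A minimal sketch, assuming a hypothetical first day of school of September 1st, 2020 (the article doesn't specify the exact date):

```python
from datetime import date

def relative_age_gap(older_dob, younger_dob, on_date):
    """Percent by which the older child's age exceeds the younger's on a given date."""
    older_days = (on_date - older_dob).days
    younger_days = (on_date - younger_dob).days
    return 100 * (older_days - younger_days) / younger_days

# Birth dates from the article; the school start date is an assumption
jesse = date(2014, 9, 4)
sam = date(2015, 8, 24)
start = date(2020, 9, 1)

print(round(relative_age_gap(jesse, sam, start), 1))  # → 19.3 (percent)
```

A fixed gap of about 375 days shrinks in relative terms as both children age, which is why the effect is strongest in the earliest school years.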
A 2006 study looked at what this means academically.
"The youngest members of each cohort score 4–12 percentiles lower than the oldest members in grade four and 2–9 percentiles lower in grade eight," economists at UC-Santa Barbara reported. "In fact, data from Canada and the United States show that the youngest members of each cohort are even less likely to attend university."
Between 1998 and 2007, along a five-mile stretch of the Kali River in India, three people were reportedly dragged underwater, never to be seen again. On an episode of the Animal Planet show River Monsters, biologist Jeremy Wade speculated that the culprit might have been a gigantic goonch catfish, weighing as much as 200 to 300 pounds, more than five times larger than the average goonch.
Though this particular explanation for the disappearances is unlikely, it showcases how wild stories of human-eating animals capture our attention. Some of them are actually true! While catfish may not prey upon humans, other animals do. These are the most likely perpetrators:
1. Lions. As a large, apex predator that hunts animals weighing up to 1,000 pounds, a lion is more than capable of having a human for lunch. And they do. Lions kill between 20 and 250 people each year worldwide. In Tanzania, home to Africa's largest lion population, 593 people died from lion attacks between 1990 and 2004. As humans encroach more and more upon lions' habitats, it grows increasingly likely that we will be on the menu for these big cats.
2. Tigers. At least 373,000 people may have died from tiger attacks between 1800 and 2009, most likely due to the burgeoning human population in India. While humans are not often a preferred meal, we are relatively easy prey, making us prime targets for older or infirm tigers. History is replete with tales of tigers known to have killed numerous humans. Most recently, a tiger in the Indian state of Maharashtra was hunted down and killed after eating thirteen people over a two-year timespan.
Four and a half months ago, the scenes and stories emerging from New York were downright apocalyptic. Chaos and death reigned inside overwhelmed hospitals, while outside an eerie quiet overtook the normally bustling streets.
The initial onslaught of COVID-19 claimed more than 30,000 lives, but through grit and dedication New Yorkers endured. Since June 4th, the state has seen no more than 100 deaths from COVID-19 in a single day. Since July 23rd, that number hasn't eclipsed twenty. New cases of coronavirus have fallen drastically and remained low.
Here at RealClearScience, a lazy blogging day can prompt a torrent of laughter! That's because we occasionally return to the well of humor available at a crudely-named subreddit of the popular website Reddit to bring you "hilariously stupid science questions". Be prepared to drown in terrible puns, painful fallacies, and poor logic. Should you survive (and somehow enjoy the experience), you can check out some of the other installments in this recurring series.
If we lose net neutrality, will the net become acidic or basic?
If global warming was real, wouldn't the ice wall melt and let the oceans out of the flat earth? So then why is the sea level rising?
Why do meteorites always land in craters?
'Politically-incorrect' comedian Bill Maher holds a number of kooky, anti-science views, but in a recent monologue on his popular HBO talk show Real Time, he shared a candid, evidence-based truth: obesity in America is a major driver of COVID-19 illness and death.
"America fighting COVID is like a boxer who went into the ring, out-of-shape and is taking a beating for it... I think so many lives could have been saved at the very beginning of this crisis if the medical establishment had made a more concerted effort to tell Americans, 'while you're in lockdown, getting free money for not working, you need to do something, too... A national campaign to get in shape would have dramatically improved our chances against this disease..."
America's obesity rate currently stands at an astounding 42.4%, up from 30.5% just twenty years ago. Obese individuals have a body mass index (BMI) of 30 or greater. To fall into this category, someone who stands 5'9" (the average height for an American male) must weigh at least 203 pounds. Similarly tall individuals tipping the scales at 237 pounds have a body mass index of 35. People in this category suffer vastly more from the effects of COVID-19. A study in France found that they are twice as likely to require mechanical ventilation after entering the intensive care unit compared to healthy-weight individuals. A systematic review discovered that obese COVID-19 patients were 2.3 times more likely to have severe disease. Public Health England recently shared the most glaring data: "For people with a BMI of 35 to 40, risk of death from COVID-19 increases by 40% and with a BMI over 40 by 90%, compared to those not living with obesity."
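The weight thresholds above follow directly from the standard BMI formula. A quick sketch using the CDC's imperial-unit conversion (weight in pounds times 703, divided by height in inches squared):

```python
def bmi(weight_lb, height_in):
    """Body mass index from imperial units, using the standard 703 conversion factor."""
    return 703 * weight_lb / height_in ** 2

height = 69  # 5'9" in inches

print(round(bmi(203, height), 1))  # → 30.0, the obesity threshold
print(round(bmi(237, height), 1))  # → 35.0, the higher-risk category
```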
Johns Hopkins cardiologist David Kass offered three explanations for why obesity significantly worsens the effects of COVID-19.
I recently adopted a puppy. She's a cute dog – people say she resembles a fox or a little auburn wolf.
The latter description is one that many dog owners have taken to heart with their own canine companions. Swayed by marketing, influential trainers on television, and online blogs, they've decided that treating their slobbering pooches like wolves is a good idea.
Wolves and dogs diverged from their last common ancestor between roughly 11,000 and 41,000 years ago. Though tens of thousands of years and numerous genetic mutations separate them, they still share 99.9% of their DNA. Citing these intimate links, some suggest that you should train your dog like a wolf. Chiefly, you should be your dog's "alpha" or pack leader. This entails getting your dog to 'submit' when he or she steps out of line or misbehaves. The idea is that their missteps are truly attempts to gain dominance over you. This means you should respond with strategies like rolling them onto their backs or forcing their heads into the ground. Unfortunately, all these techniques accomplish is instilling fear in your dog. Your pet learns that humans can be harsh creatures, and their touch, a scary thing to be avoided.
While dogs can sometimes look like wolves, their behavior differs from wolves' just as our behavior differs from chimpanzees'. Sure, humans share 98.8% of our DNA with chimps, but you wouldn't eat the lice out of your family members' hair, would you?
In 2014, researchers asked men to estimate the size of the average erect penis. Their guess? 6.2 inches (15.8 centimeters). That's actually in line with what numerous scientific studies have reported. But guess what? Those studies are wrong.
Clemson University Professor Bruce M. King, senior author of the textbook Human Sexuality Today, drew attention to this issue in a recent review published in the Journal of Sex and Marital Therapy.
For years, researchers asked men to self-report the length of their erect penises by measuring along the top from the abdomen to the tip, and over that time, men consistently informed researchers that their members ranged from roughly 6.1 to 6.5 inches.
Can you see the problem with this procedure? Asking men to accurately report the size of their penises is like trying to eclipse the speed of light in a junker car: it's not gonna happen.
America is once again awash in conspiracy theories, and it's easy to understand why.
"Studies suggest that conspiracy theories flourish when people feel anxiety, alienation, paranoia, or loss of control," political scientists Joseph Uscinski and Joseph Parent wrote in their seminal book American Conspiracy Theories.
On its own, any one of these, a once-in-a-lifetime global pandemic, an unprecedented economic recession, or widespread civil unrest, would be enough to trigger all of those feelings. Americans are facing all of these events simultaneously.
That's why some stressed people have linked the COVID-19 pandemic to the spread of 5G cellular technology, insist that the virus was intentionally created and unleashed, accuse Bill Gates of trying to depopulate the world though vaccinations, theorize that mass protests were meant to start a race war, or contend that face masks are killing people.
Hilary can't be sure whether it was she or one of her sisters who subsumed their fellow fetus when they were in their mother’s womb, but she enjoys jokingly taking the blame when the matter comes up at family gatherings.
Hilary is a triplet, part of a rarity that occurs roughly once every thousand or so births. But she could have been a quadruplet. At some point very early in her and her sisters’ development, there were four fetuses present. Then there weren’t.
Such a disappearance is termed "vanishing twin syndrome". According to the American Pregnancy Association (APA), "This occurs when a twin or multiple disappears in the uterus during pregnancy as a result of a miscarriage of one twin or multiple. The fetal tissue is absorbed by the other twin, multiple, placenta or the mother."
And it's surprisingly common, occurring in one out of every eight multifetus pregnancies, but perhaps in as many as three in ten. In the vast majority of instances, vanishing twin syndrome happens so early, usually within the first seven weeks of pregnancy, that it goes entirely unnoticed. Rarely, however, the loss occurs in the second or third trimester, potentially placing the surviving fetuses and the mother at higher risk, as well as causing understandable emotional distress to the mother, father, and other family members.
Consciousness is a paradox – both intimately knowable and nearly impossible to pin down. As humans, we know we have it but haven't a clue how it arises. It's a facet of intelligent life so nebulous that it stretches both science and philosophy to their limits.
When something is this maddening, it helps to break it up into simpler parts. Christopher Tyler, a visual neuroscientist at the Smith-Kettlewell Brain Imaging Center and a professor at the City University of London, outlined ten properties of human consciousness in a recent paper published in the journal Frontiers in Psychology.
According to Tyler, consciousness' first property is privacy. Simply put, there is no way (outside of science fiction) for any conscious being to completely share the conscious experience of another.
The second property is unity. As Tyler wrote, consciousness must "occur either in a single brain site or in a unified neural net of some kind in the brain, rather than in multiple independent brain sites."
When one thinks of where dinosaurs lived, the most salient image that comes to mind is Jurassic Park. The setting is tropical, with ferns erupting from the ground, enormous trees reaching into the sky, and insects buzzing through the thick, humid air.
But just like today, Earth of the past was a big, ecologically diverse place. And dinosaurs, which dominated the land for over 100 million years, dwelled in almost every corner, even the colder parts.
So instead imagine this scene, which could have played out in the late Cretaceous period between 66 and 100 million years ago: Hundreds, perhaps thousands of elephant-sized Edmontosaurus ambling in a great herd through a snow-covered landscape dotted with coniferous trees, migrating in search of food. Suddenly, out of one of the patches of trees, a predator emerges, perhaps the raptor-like Troodon or the T. rex-related Nanuqsaurus (pictured top). Nanuqsaurus, whose name means "polar bear lizard," might have been blanketed in white, fluffy feathers for warmth and camouflage. It charges the herd, scattering some of the members in panic. Finally, it zeroes in on a target and lunges, clamping its powerful jaws on the Edmontosaurus' neck, delivering an incapacitating blow.
Any student of nature documentaries knows that life and death scenarios like this occur regularly in the Arctic today, most notably with ravenous wolves and migrating caribou. It's fascinating to think that such chases have been happening for millions of years, albeit with different characters.
Humans suffer from all sorts of biases. There's the default effect, "when given a choice between several options, the tendency to favor the default one." There's stereotyping, "expecting a member of a group to have certain characteristics without having actual information about that individual." There's confirmation bias, "the tendency to search for, interpret, focus on, and remember information in a way that confirms one's preconceptions." This is a mere sampling of biases; there are hundreds more.
But there's a strong case to be made that the ultimate human bias may be what's called the "bias blind spot". Simply put, it's the bias that you are unbiased, or at least not as biased as everyone else.
Stanford University researchers Emily Pronin, Daniel Y. Lin, and Lee Ross coined the term back in 2002, but the bias first went by a different name, "the illusion of objectivity", in psychologist David Alain Armor's 1999 dissertation, in which he conducted five experiments on more than 800 individuals.
"Across these studies, approximately 85% of participants indicated that they were more objective than the average member of the group from which they were drawn, a state of affairs that cannot be true," he wrote.
Over the past few decades, hundreds of potential treatments for Alzheimer's Disease have failed to show clinical benefits in human trials, crashing and burning with a consistency that Stat News' Damian Garde called "metronomic." That's terrible news for the more than five million Americans who currently have Alzheimer's and the more than fourteen million projected to be living with it by 2060.
It seems that the primary hypothesis for the cause of Alzheimer's – that a buildup in the brain of a protein called beta-amyloid is responsible for cognitive decline – is wrong. Drugs that reduce beta-amyloid don't resolve Alzheimer's patients' crippling dementia.
But despite a landscape of pharmaceutical solutions that's largely devoid of hope, there does exist a widely available therapeutic that has proven highly effective at preventing Alzheimer's: exercise.
According to the Alzheimer's Society, "Several prospective studies have looked at middle-aged people and the effects of physical exercise on their thinking and memory in later life. Combining the results of 11 studies shows that regular exercise can significantly reduce the risk of developing dementia by about 30 percent. For Alzheimer's disease specifically, the risk was reduced by 45 percent."
This past December, I lost my mother. She was 72.
While vacationing for the holidays, she fell and hit her head. What might have been a minor knock for someone far younger resulted in a subdural hematoma – bleeding from the brain leading to a pooling of blood – owing to her age and history with stroke. She was taken to a nearby hospital, anesthetized, intubated, and placed on a ventilator. Doctors performed emergency surgery to relieve pressure on her brain. The surgery was technically successful – the bleeding stopped and the pressure abated – but my mom was unresponsive.
This was the state in which I found her when my brother and I finally arrived to be by her side in the hospital. The situation was as unfamiliar to us as it would be to most people, and doubly so for my mom, who was strong and outspoken throughout her life and career as a college professor. Breathing with a ventilator but not sedated, she should have been awake, but the trauma that wracked her brain had apparently caused her cognitive processes to go dark. Talking to her with raised voices yielded no response. Only the most painful and annoying stimuli, a hard pinch from the doctor or a prick with a needle, would prompt her eyes to open, and only fleetingly. We took those treasured opportunities to tell her how much we loved her and wanted her back with us. During the interminable hours, our eyes were glued to the various medical monitors, riding the ups and downs of her blood pressure and body temperature as she battled a post-surgery fever.
Unfortunately, each day in the hospital, she faded a little more. The brief gazes and slight movements grew harder to rouse. She was slipping away. My father, brother, and I were bluntly yet kindly given the prognosis and our options: Our mom might wake up again, but more than likely she wouldn't. If she did, she would probably be completely care-dependent, unable to feed, move, or clean herself. The ventilator and feeding tube that were keeping her alive could do so indefinitely. Given enough time on them, her brain could stabilize and she might be able to live without them, but this "recovery" could leave her locked in, aware of the world around her yet unable to interact with it in any meaningful way. We all agreed this prospect would be torturous for her, but the alternative option, removing her ventilator and feeding tube, felt like giving up. The doctors framed it in a different way: rather than forcing her to stay alive, withdrawing life support would allow her to choose her fate. After some consideration, we decided that's what she would want.
Anyone who has seen an image of Earth from space can instantly recall our planet's fiercely blue oceans, resplendent green forests, and splotchy brown deserts. This work of art is our wondrous home.
But between 3 and 3.8 billion years ago, Earth may have been unrecognizable, its modern medley of colors instead dominated by one: purple.
University of Maryland-Baltimore Professor Shiladitya DasSarma originally painted this picture at 2007's American Astronomical Society Meeting. His 'Purple Earth' hypothesis has remained a speculative delight ever since and now has growing evidence to support it.
The hypothesis is that Earth's earliest microbes were phototrophs, capturing photons from sunlight to produce energy for themselves. This isn't too surprising – the plants that dominate Earth today are also phototrophs. However, unlike modern plants, which utilize the pigment chlorophyll to capture light, these ancient microbes might have used a pigment called retinal. Chlorophyll absorbs red and blue light while reflecting green. Retinal does the opposite, meaning microbes making use of it would appear purple.