Why the Public Should Mistrust Science

This weekend RealClearScience ran an editorial bemoaning the public’s distrust of science. The author explains away the problem by insisting that the public is naive, rooked by Republican "pandering," and uninterested in a scientific establishment “too old, too male,” and not "representative.”

In essence, she answers her own question. Reasonable people distrust science because of the way scientists treat them.

The truth is that many scientists pick political sides, look down their noses at the public, and play politically correct games. I think the public is on to something. I wouldn’t trust us either.

The condescension is the worst part. Many research scientists who fret and wail about public ignorance live off the public dime. Our microscopes, our labs, our pipettes and our particle colliders are bought with taxpayers’ money. They pay our salaries too.

So when we tell them how stupid they are, how ignorant and backward and wrong they are, why shouldn’t they be angry? Belittling someone in an argument never wins his support. How much more arrogant and foolish is it to belittle the people who write your paycheck?

If the public doesn’t understand why we believe certain things, that’s our fault. We owe them better communication. We need to demonstrate the value of our work and ideas, not demean theirs.

The other big problems in scientist-citizen relations arise from politics.

Scientists have increasingly become overtly political rather than maintaining distance. Worse, they usually fight for one side only. Then they turn around and act shocked when the enemies they have unnecessarily made don’t trust them! There’s condescension here too: how dare these ignorant little people not fall in line behind their intellectual superiors!

The public trusts those whose strong moral codes sit in a plane above politics. Four of the five most trusted institutions in the United States -- the military, the police, the church, and medicine -- are supported by apolitical moral backbones.

Conversely, we distrust those with no morals above the political. Among the least popular institutions in this country are Congress, TV news, organized labor, and newspapers.

Every time science picks sides in politics, it slips away from the trusted group and sinks toward the disreputable one. Science is selling away its considerable moral stock by choosing to fight for the momentary goals of its political favorites. That’s a terrible, shortsighted bargain.

Just as science needs a system of morals outside of politics, it needs a system of discerning merit outside of politics too. Scientists should be choosing and rewarding their leaders and workers by quality of output and scientific ability. When we trade this honest system of merit for politically correct affirmative action, we throw away the confidence of a huge swath of the public.

“Too old, too male, out of touch.” These are politically correct cries that endear scientists to the press, to the bureaucracy, and to a subset of the public. But they create many more needless enemies. Here again we sell out our moral stock for a shallow bit of momentary political and media affection.

If the public does not trust science, I say many scientists have given them good reason. Scientists need to be more trustworthy. We need to stop looking down on the public. We need to stop playing politics and taking up the banner of political correctness. Otherwise we become just another snooty political faction, trusted fleetingly by our friends of convenience but permanently loathed by our growing legion of self-inflicted enemies.

[Image: AP]

The Idea That the Big Bang Destroyed

What happened before the Big Bang? From a cosmic perspective, it's impossible to know for sure (for now, at least). But there is a similar question we can answer! What was there before the Big Bang theory?

For half a century, the Big Bang model of the Universe has stood as the dominant theory for how everything came to be. But before rising to prominence, the Big Bang struggled to take hold in the minds of cosmologists. Many instead preferred the Steady State theory. Championed by English astronomer Sir Fred Hoyle, the theory states that the observable Universe remains essentially the same at all times and in all places. Stars and planets form, die, and form again, but the density of matter does not change. This means that the Universe had no beginning. It also means it will have no end. The Universe simply is.

During Steady State's heyday in the 1940s and 50s, Hoyle and his contemporaries defended the theory with intellectual vigor, winning over a significant portion of the scientific community. To reconcile their idea with Edwin Hubble's observations that the Universe was expanding, they suggested that matter was created in between distant galaxies. As long as a single atom of hydrogen per cubic meter were squeezed out of the stretching fabric of space every ten billion years, their theory would check out with observational data.
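
A quick back-of-the-envelope check shows why that figure is so modest. Keeping the hydrogen density constant while space expands at the Hubble rate H requires creating matter at a rate of 3 x H x n per cubic meter per second, where n is the average hydrogen density. The sketch below is illustrative only, using rough present-day values rather than numbers from Hoyle's era or from Siegel's book:

```python
# Back-of-the-envelope check of the Steady State creation rate.
# Assumed inputs (not from the article): H ~ 70 km/s/Mpc and an average
# density of ~0.25 hydrogen atoms per cubic meter.

KM_PER_MPC = 3.086e19
H = 70.0 / KM_PER_MPC            # Hubble rate in 1/s
n_hydrogen = 0.25                # average hydrogen atoms per cubic meter (assumed)

# Holding density constant in an expanding universe needs a creation rate of 3*H*n.
rate_per_m3_per_s = 3.0 * H * n_hydrogen

TEN_BILLION_YEARS = 10e9 * 3.156e7   # seconds
print(rate_per_m3_per_s * TEN_BILLION_YEARS)
# -> roughly 0.5 atoms per cubic meter per ten billion years, the same
#    order of magnitude as the figure quoted above
```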

"If matter were spontaneously created at this meager rate, then each and every location in the Universe -- on average -- would always contain the same number of galaxies, the same population of stars, and the same abundance of elements," astrophysicist Ethan Siegel described in his recent book Beyond the Galaxy.

Compared to the notion of a fiery, primeval creation event from which the entire Universe sprouted -- what Hoyle snarkily dubbed the "Big Bang" -- this was not that far-fetched. Hoyle derided the Big Bang model as an "irrational process" that "can't be described in scientific terms," akin to creationism.

But in 1965, the Steady State theory lost a decisive scientific battle. Radio astronomers Arno Penzias and Robert Woodrow Wilson realized that the faint, blanketing noise picked up by their antenna at Bell Labs originated from the Universe itself. It was the afterglow of the Big Bang: the cosmic microwave background radiation! Hoyle and his fellow Steady State proponents scrambled to account for the new observations.

"Perhaps this was not radiation left over from the Big Bang, but rather was very old starlight, emitted from stars and galaxies strewn across the Universe," Siegel described. This light could have been scattered and re-emitted by the atoms constantly popping into existence.

Alas, further observations discounted this explanation, and the Big Bang model swiftly ascended to take its place as the leading theory accounting for the Universe.

Like the slow, creeping heat death of the universe predicted by the Big Bang model, in which the Universe expands forever until what's left is too devoid of energy to sustain life, the Steady State theory gradually fizzled out in scientific circles. Hoyle, however, defended his theory until his dying day in 2001, never accepting the overwhelming evidence for the Big Bang.

(Image: Fabioj)

What Is the Earliest Evidence for Life on Earth?

For the first 600 million years of Earth's 4.54 billion-year history, our planet was a hellish place. The rampant volcanism and frequent collisions that wracked our world rendered the surface unforgiving and purportedly inhospitable to life. While water was probably present, the oceans of the time may instead have been rolling seas of magma. The name for this period, the Hadean, is borrowed from Hades, the Greek god of the underworld. The moniker's meaning is obvious: early Earth was a place of death.

Yet it was on this comparatively cursed landscape that -- against all odds -- life might have emerged. The controversial clue to this incredible notion was made public last fall. Scientists from UCLA showed off apparently biogenic carbon that had been locked away inside a near-impenetrable crystal for 4.1 billion years.

The oldest rocks on Earth don't even date back that far, but peculiar minerals called zircons do. The oldest-known zircons, discovered in the Jack Hills of Western Australia, originally crystallized 4.4 billion years ago! It was within one of these zircons that geochemist Elizabeth Bell and her team discovered the carbon they think was produced by life. Life that old, whatever it was, would not have had bones, or even a clearly defined shape, so a true fossil find will probably never be unearthed. Instead, whatever carbon-based life existed back in the Hadean would simply leave traces of itself in the form of carbon. Bell's co-author, Mark Harrison, referred to the stuff as "the gooey remains of biotic life."

But not all carbon comes from living organisms, so how did Bell and Harrison conclude that the carbon originated from life? Well, biological processes preferentially concentrate the lighter, more common form of carbon, an isotope called carbon-12, and take up less of the heavier isotope carbon-13. The carbon they discovered within the zircon contained far more carbon-12, relative to carbon-13, than one would expect from typical inorganic stores of the element. (Figure below: The arrows indicate the isotopic composition of Bell's carbon compared to that of certain life forms. Note that it has far less carbon-13.)
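
Geochemists express this depletion as δ13C: the per-mil deviation of a sample's carbon-13 to carbon-12 ratio from a reference standard, with biogenic carbon typically landing around -25 per mil. Here is a minimal sketch of that bookkeeping; the standard ratio is approximate and the sample value is invented for illustration, not taken from Bell's paper:

```python
# Sketch of how carbon isotope depletion is reported as delta-13C (per mil).
R_STANDARD = 0.0112   # approximate 13C/12C ratio of the reference standard

def delta13C(ratio_sample: float, ratio_standard: float = R_STANDARD) -> float:
    """Per-mil deviation of a sample's 13C/12C ratio from the standard."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

# A sample whose 13C/12C ratio sits ~2.5% below the standard (i.e., enriched
# in carbon-12) comes out near -25 per mil, the range typical of life.
print(round(delta13C(0.0112 * 0.975), 1))   # -> -25.0
```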

"The simplest explanation for this is that we had biogenic carbon at or before 4.1 billion years," Bell said in a talk earlier this year at the SETI Institute. The second-most likely explanation, she says, is that the carbon came from the meteorites that supposedly peppered the planet around the same time.

Astrobiologists are big fans of Bell's finding. If true, it serves as a testament to the hardiness of life, and fuels hopes that we may find it elsewhere in our very own solar system.

Bell and her team have now set about searching for carbon in more ancient zircons from the Jack Hills, raising the possibility that they may find even earlier signs of life on Earth. 

(Images: MarioProtIV, Elizabeth Bell)

The LHC Is at the End of an Era

For the past eight months, physicists held their breath anticipating further news about the "bump" seen at the Large Hadron Collider (LHC). We just found out that this bump was a statistical anomaly, but it’s still of great consequence to the particle physics community. The future of the field depends on bumps like these. The departure of this bump back to the beyond brings us closer to the decision looming at the end of the accelerator physics road.

We can argue about how many TeV we would need to have a shot at seeing some hypothetical supersymmetric particle. That's missing the forest for the trees. The scientific motivation is far too weak to justify spending tens of billions of dollars. High energy particle physics has been wildly successful, but things are now changing.

For half a century we beat down the path of experimental particle physics. Along the way, we verified that the field’s leading theory was indeed guiding us in the right direction. The Standard Model map was validated to stunning precision by the detection of bump after bump in the road, right about where they all should have been. The Higgs Boson, found in 2012, was the crowning achievement. It was also the last marked destination on that road.

Now we’re at a dead-end; the road has flattened out. String Theory—the new roadmap—has so far proved an utter failure. The excitement over this possible new detection was that it might signal some big discovery, maybe even open a whole new uncharted land of physics to explore. Unfortunately we weren’t so lucky. We’re back to being lost.

There's always one good reason to build big experiments: they may spot something totally new and unexpected. But that's a tough sell when there may be nothing new to see with a $10 billion machine and your next pitch is for a $20 billion-plus machine. Perhaps someone will pick up this tremendous tab, but without any solid theoretical predictions -- or astounding unexpected discoveries -- I wouldn't bet on it. There probably won't be any more LHCs.

So, what do we do with what we've got?  Two purposes come to mind for the future of the colossal machines built to bushwhack into uncharted fundamental physics.

Science shouldn't be mental masturbation. The ultimate goal of burning 10 figures' worth of public dollars is to find facts that can be used to better everyone's lives. When you've got facilities as powerful as the LHC, you can focus their enormous capabilities on the problems of medicine, energy, and designing new materials.

Many fundamental physics machines ply this trade. Well-designed, well-built, and well-run accelerator facilities can live on into their golden years by finding new uses to bend their beams to.

The machines themselves aren't the sole accomplishments. America has built its commanding position in the scientific world with the help of thousands upon thousands of highly trained and exceptionally skilled scientists, many of whom have worked at places like the LHC, SLAC, Fermilab's Tevatron, Berkeley's Bevatron, the failed SSC project, and others.

It’s hard to directly appraise this resource but its secondary value is incalculable. You can’t buy this kind of talent; you can only develop it.

Directing its energy toward applied science while developing generations of new talent looks like the foreseeable future of America’s particle physics community. CERN and the Large Hadron Collider should now follow that path. Unless another bump comes along.

[Image: BEBC Experiment]

 

Building a Mountain on a Pinhead

Richard Feynman imagined writing the Encyclopedia Britannica on the head of a pin. A decade ago scientists inscribed the entire Hebrew text of the Old Testament onto a silicon pinhead. Digging holes only one ten-thousandth of an inch across is easy with today's technology.

A study published in Applied Physics Letters this week describes a different process: building a tiny Matterhorn on a pinhead.

Here's a beautiful image of their creation.

A team of German scientists led by Gerald Göring used the concept of 3-D printing, miniaturized to the nanometer scale, to build tiny sculptures. But they aren't doing it for art connoisseurs. They made needle-point tips for an atomic force microscope (AFM).

An AFM drags its needle tip across a surface, feeling for contours. When it hits a bump as small as a single nanometer, it is deflected. A system watching the reflection of a laser off of the needle tip records the amount of tip movement from the change in the angle of the laser reflection. The tip gently swipes back and forth across the entire sample like the beam in an old CRT television, measuring the height along the way. In this manner it builds up a topographic map of the surface.
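
The scanning loop itself is conceptually simple. Here is a tiny, idealized sketch of that raster pattern; it is purely illustrative and not the study's software:

```python
# Idealized raster scan: sweep a probe across a grid, record a height at each
# point, and assemble the rows into a topographic map, AFM-style.

def raster_scan(measure_height, rows: int, cols: int):
    """Return a rows x cols height map, sweeping back and forth like a CRT beam."""
    height_map = []
    for y in range(rows):
        xs = range(cols) if y % 2 == 0 else reversed(range(cols))  # serpentine sweep
        samples = {x: measure_height(x, y) for x in xs}
        height_map.append([samples[x] for x in range(cols)])
    return height_map

def bumpy_surface(x, y):
    """A fake sample: flat everywhere except a single one-nanometer bump."""
    return 1.0 if (x, y) == (2, 1) else 0.0

for row in raster_scan(bumpy_surface, rows=3, cols=5):
    print(row)
```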

The goal of this incredibly delicate instrument is to map surfaces far smaller than any optical microscope can see. The smallest feature an AFM can resolve is set by the size of its needle tip.

Hence, the German researchers used their miniaturized 3-D printer to produce tips of a wide variety of shapes and sizes. They made tips that are extremely long and thin, for probing deep and narrow canyons. They made tips with extremely precise shapes so that they could factor the shape back out of a surface map more accurately.

They also experimented with printing modifiable tips. Most AFM tips need to vibrate at a certain frequency to perform their scans. By building and then knocking out tiny struts along the AFM tip, the researchers can tune the tip's vibration, much as tightening a string tunes a guitar.
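
The guitar-string analogy maps onto a simple formula: a tip behaves roughly like a mass on a spring, with resonant frequency f = sqrt(k/m) / (2*pi). Removing a strut lowers the stiffness k, and the frequency drops with its square root. The numbers below are invented purely to show the scaling:

```python
import math

def resonant_frequency(stiffness_n_per_m: float, mass_kg: float) -> float:
    """Resonant frequency of a simple mass-on-a-spring oscillator, in Hz."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2.0 * math.pi)

k, m = 40.0, 1.0e-11   # invented stiffness (N/m) and effective tip mass (kg)
print(f"intact tip:               {resonant_frequency(k, m) / 1e3:.0f} kHz")
print(f"strut removed (k halved): {resonant_frequency(k / 2, m) / 1e3:.0f} kHz")
# Halving the stiffness lowers the resonant frequency by a factor of sqrt(2).
```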

Building a mountain on a pinhead is an exercise in incredible control of tiny things. It's also a gorgeous demonstration of how far miniaturized technology has progressed and expanded in new directions.

(Image: AP)

Are Religious People More Moral Because They Fear God's Punishment?

Is religion necessary for morality? The answer is almost certainly "no". But slightly to the chagrin of passionate atheists, religious people do seem to behave more morally than irreligious people. Through population level surveys and laboratory studies designed to gauge prosocial behavior, psychologists have found a robust link between religiosity and salubrious actions like helping, sharing, donating, co-operating, and volunteering.

Not to be upstaged in this foundational philosophical feud, the irreligious have fired back, arguing that religious morality arises not from altruism, but from selfishness. In this view, religious followers are more inclined to do good because they are tempted by divine rewards and threatened with divine punishments.

Turns out, atheists may have a point.

Psychologists James Saleam and Ahmed Moustafa of Western Sydney University scoured the psychological literature in search of evidence for and against this controversial contention. They just published their findings in the journal Frontiers in Psychology.

Saleam and Moustafa begin by acknowledging that the balance of empirical evidence shows that religious people do tend to behave more prosocially than irreligious people. However, they note that the evidence is rarely accompanied by explanations.

"If there is a link between religious belief and prosociality, then there must be an underlying reason for this," they write.

The attempt to uncover this reason has led a small number of psychologists to focus on two intertwining ideas, the "supernatural monitoring hypothesis" and the "supernatural punishment hypothesis". Taken together, the notions suggest that religious people behave more prosocially because they feel they are being watched by divine entities and that their actions will be punished or rewarded by those entities. 

A number of studies support this controversial idea. In one, religious people primed with notions of salvation and Heaven were more generous when participating in the Dictator game, a classic psychology experiment in which one subject gets to decide how much money to share with another subject. In another study focused on the Muslim population, subjects who read passages of the Qur'an related to divine punishment showed more prosocial actions than subjects who read more neutral passages focused on Allah's mercy. In a third study conducted in Canada, participants who viewed God as "forgiving" and "gentle" were more likely to cheat on a math test than subjects who viewed God as "vengeful" and "terrifying".

It will take a significant number of studies conducted across the world to truly establish the link between moral behavior and religious incentives. After all, human belief is just as diverse and nuanced as human beings themselves. To say that all believers are more moral than nonbelievers out of fear and selfishness is a gross oversimplification.

"It is unlikely that any single hypothesis will provide a comprehensive account of the religion-prosociality link," Saleam and Moustafa write, "...but now there is a growing body of empirical literature supporting the notion that... divine incentives do influence prosociality."

Source: Saleam J and Moustafa AA (2016) The Influence of Divine Rewards and Punishments on Religious Prosociality. Front. Psychol. 7:1149. doi: 10.3389/fpsyg.2016.01149

(Image: AP)

I'm Here Because We Bombed Hiroshima

Saturday is the 71st anniversary of the atomic bombing of Hiroshima. The culmination of the greatest science project in the history of the human race incinerated tens of thousands of Japanese and left thousands more with fatal radiation poisoning.

Widely considered at the time a necessary measure to end the war with Japan, the bombing is viewed with far more ambivalence today. With the brutal reality of all-out global conflict a distant memory, many now see it as a tragic mistake.

I'm writing today because President Truman made the right decision in 1945. Millions of other so-called millennials can count the same blessing.

Historical facts strongly support the decision to bomb. The two atomic bombs killed roughly 200,000 people. Japanese military officers estimated that as many as 20,000,000 Japanese would have lost their lives in defense of the Japanese mainland. America expected its own dead to number in the hundreds of thousands. The wounded would have numbered in the millions on both sides.

But there is another angle to this story. It's very personal to me.

Beside my desk is an old leather case with leather handles and a brass catch. Gently easing open the worn top reveals a large pair of metal binoculars. The lenses are still clear; a careful focusing procedure brings into relief tiny reticles etched onto the glass for measuring distances. They are painted a dark green color, maybe to provide some meager camouflage in the high boughs of a tree.

My grandfather was a forward observer. His job was to go in first, climb a tree, and call in directions for the artillery that would bombard the defenses at the Japanese landing beach. Ahead of the invading army, these binoculars were meant to pick out targets for the opening artillery barrages that would soften its arrival on the beachhead. Picture the opening scene of Saving Private Ryan; the planned invasion of Japan would have been an amphibious assault on the scale of D-Day.

Artillery spotters like my grandfather had just about the lowest life expectancy of any troops in ground combat. He very likely would have died up in that tree, calling artillery directions into his radio.

Thankfully, he never had to go in first, to face the 2.3 million defending Japanese. The US command at the time made the right decision: finish the war as quickly as possible, with the fewest deaths on both sides.

All-out war is often a choice between something horrific and something even more horrific. Making these decisions surely weighs upon the consciences of wartime leaders. Harry Truman struggled with the decision but ultimately believed he had made the correct call. Oppenheimer, the physicist who headed the weapon design lab at Los Alamos during the war, was a staunch leftist who later became a peace activist. Yet he too went to his grave supporting the creation of the bomb.

Today, looking back at history from a privileged position as beneficiaries of the bombing, many who grew up only in its aftermath wish that Hiroshima and Nagasaki had been spared. This is a fantastically naïve thought; yet it is held by even some of our most prominent leaders today.

Earlier this summer our President chose his words carefully alongside the Japanese Prime Minister at Hiroshima. While stopping short of apologizing, he expressed sadness about the event. How quickly we forget our history and our blessing never to have faced such a difficult choice.

Fortunately, we made the right decision. Without that momentous blast shaping human history I might not be alive today. Millions of you can count the same blessing. Remember that.

There's an Amazing Reason Why Races Are Started With Gunshots

Over the next few weeks, sprinters from all over the world will gather in Rio to show their speed on the sport's grandest stage: the Summer Olympics.

Sometimes glossed over whilst watching the galloping strides of the world's top athletes is a simple truth: Sprinting events are not only tests of speed; they are also tests of reaction. To have a chance of winning the 100-meter dash, modern sprinters must finish in less than ten seconds. To win the 200 meters, they have to finish in under twenty.

The overwhelming majority of an athlete's race time is spent sprinting, but a meaningful fraction is reserved for reaction time. Races are traditionally started with gunshots. At the sound of a bang, sprinters leap off the blocks and accelerate down the track at breathtaking speeds. But the seemingly small gap between the gunshot and the sprinters' first movement is actually sizable. As Stanford neuroscientist David Eagleman explained in his recent PBS documentary The Brain:

"[Sprinters] may train to make this gap as small as possible, but their biology imposes limits. Processing that sound, then sending out signals to the muscles to move will take around two-tenths of a second. And that time really can't be improved on. In a sport where thousandths of a second can be the difference between winning and losing, it seems surprisingly slow."

In light of this situation, Eagleman posed an intriguing question: Why not start races with a flash of light? After all, light travels roughly 874,029 times faster than sound. By starting races with a gunshot, we may actually be handicapping sprinters!

To answer the question, Eagleman lined up a group of sprinters. In one case, they were triggered by a flash of light. In another case, they were triggered by a gunshot. Their starts were filmed in slow motion and compared. (Note: In the image below, the gun fired and the light flashed at the same time.)

As you can clearly see, when sprinters are triggered by a gunshot they actually react faster, despite the fact that light reaches their eyes well before sound reaches their ears! There's an incredible reason why this is the case.

"[Visual cues] takes forty milliseconds longer to process," Eagleman explained. "Why? Because the visual system is more complex. It's bigger, and involves almost a third of the brain. So while all of the electrical signals inside the brain travel at the same speed, the ones related to sight go through more complex processing, and that takes time."

As you sit down with friends and family to watch sprinting during this summer Olympics, feel free to enlighten them with this fascinating factoid!

(Images: AP, The Brain with David Eagleman)

The Scientific Study Condemned by Congress

On July 30th, 1999, the U.S. House of Representatives passed House Concurrent Resolution 107 by a vote of 355 to zero. The vote was staggering, but not for its bipartisan appeal in an era of hyperpartisanship. It was staggering because it was the first and only time that a scientific study was condemned by an act of Congress.

"Resolved by the House of Representatives (the Senate concurring), That Congress... condemns and denounces all suggestions in the article 'A Meta-Analytic Examination of Assumed Properties of Child Sexual Abuse Using College Samples'' that indicate that sexual relationships between adults and 'willing' children are less harmful than believed and might be positive for 'willing' children..."

With wording like that, it's plain to see why the resolution passed as it did. A vote against would be seen as a vote for pedophilia! Understandably, no politician would ever want that on their voting record.

Representative Brian Baird was one of the few representatives who opposed the resolution, electing to do so by voting "present." As a licensed clinical psychologist and the former chairman of the Department of Psychology at Pacific Lutheran University, Baird knew that the study had been blatantly mischaracterized. But one didn't need a doctorate in clinical psychology (like Baird) to recognize that; one simply needed to read the study.

Any politician who bothered to do so would have realized that the paper did not endorse pedophilia at all. In her recent book, Galileo's Middle Finger, historian of science Alice Dreger described the paper and its findings:

"Bruce Rind, Philip Tromovitch, and Robert Bauserman performed a meta-analysis of studies of childhood sexual abuse... They took a series of existing studies on college students who as children had been targets of sexual advances by adults and looked to see what patterns they could find."

The authors uncovered a plethora of patterns, but none more controversial than the finding that childhood sexual abuse did not always inflict lasting harm upon the victim. The degree of harm seemed to depend on a variety of factors, two of the most notable being whether the abuse was forced and whether it involved incest. As Dreger recounted:

"Rind, Tromovitch, and Bauserman were saying something very politically incorrect: some people grow up to be psychologically pretty healthy even after having been childhood sexual abuse victims. In fact, Rind and company... [suggested] that the term childhood sexual abuse seemed to imply that child-adult sex always led to great and lasting harm, whereas the data seemed to show it did not in a surprising proportion of cases."

Aware that their findings would be controversial, the authors concluded their study by saying that they in no way advocate pedophilia.

"The findings of the current review do not imply that moral or legal definitions of or views on behaviors currently classified as childhood sexual abuse should be abandoned or even altered. The current findings are relevant to moral and legal positions only to the extent that these positions are based on the presumption of psychological harm."

Their erudite invocation of nuance did not work. The North American Man/Boy Love Association quickly praised the paper, which, in turn, prompted righteous condemnation from religious, conservative, and family groups. Conservative radio host Dr. Laura Schlessinger even went so far as to assert "the point of the article is to allow men to rape male children." (It wasn't.) Representative Joseph Pitts claimed that "the authors write that pedophilia is fine... as long as it is enjoyed." (They wrote no such thing.)

With the fires of controversy blazing and ideological battle lines drawn, the paper soon landed on the desks of Republican congressmen Tom DeLay and Matt Salmon, who swiftly moved to score political points. Salmon drafted the aforementioned resolution condemning the study and DeLay pushed it through the House.

Laypersons and politicians weren't the only people to criticize the study; other scientists did, too, particularly questioning the methodology and the manner in which the findings were reported. However, the American Association for the Advancement of Science commented that the paper did not demonstrate questionable methodology or author impropriety.

To anyone even remotely connected to the scientific enterprise, the "Rind et al. controversy" (as it is called) offers a chance not only for ample face-palming but also for reflection. Science sometimes questions common sense and does not adhere to political correctness. It deals only in what is true and what isn't. All that's left to us is to decide whether to accept that truth and try to understand it... or deny it.

As Dreger noted in her book, the truth presented in Rind's study is not as alarming as the caricature drawn of it more than fifteen years ago.

"It seemed to me that the Rind paper contained a bit of good news for survivors, namely that psychological devastation need not always be a lifelong sequela to having been sexually used as a child by an adult in search of his own gratification."

"But simple stories of good and evil sell better," she added.

Primary Source: Galileo's Middle Finger: Heretics, Activists, and One Scholar's Search for Justice, by Alice Dreger. 2015. Penguin Books.

(Image: Public Domain)

The "Neglected" Archaeological Mystery on the World's Largest Lake Island

Lake Huron's Manitoulin Island is the largest lake island in the world. It's so large, in fact, that it has 108 lakes of its own. Three of Manitoulin's largest lakes -- Lake Manitou, Lake Kagawong and Lake Mindemoya -- even have islands of their own, some of which have ponds of their own.

Pockmarked, picturesque Manitoulin is also home to an enduring archaeological mystery. In 1951, Thomas Edward Lee, an archaeologist for the National Museum of Canada, discovered an incredible array of artifacts on a rocky hill near the island's northeastern shore. The site was dubbed Sheguiandah, after the small village nearby.

Four years of excavations would eventually unearth rudimentary scrapers, drills, hammerstones, and projectile points, apparently constructed out of the quarried bedrock jutting from the hill. Blades were in particular abundance. Many thousands of all shapes and sizes were found, some preserved in the cool, forest soils, others drowned in peat bogs, and even more littering the open ground. Documenting the expedition in 1954, Lee imagined prehistoric "quarrying operations of staggering proportions." Clearly, Sheguiandah was a hub of industry. 

“It would not be surprising that people who carried out the enormous amount of work done here should also have lived on the site," Lee wrote.

But while Lee and his colleagues uncovered a great many blades and other objects, they didn't turn up a single human bone. Signs of human activity were etched, scraped, and smashed in stone (quartzite, to be specific), but there were no human remains. The men and women who worked the quarry at Sheguiandah may have come from more permanent settlements somewhere else. If that's the case, those settlements have yet to be unearthed.

It is partly due to the lack of remains that dating Sheguiandah has proven extremely difficult. When Lee concluded his excavations in the 1950s, he created quite a buzz by insisting that some of the artifacts were found in deposits dating back to the Ice Age, as many as 30,000 years ago! Suddenly, Manitoulin Island may have been home to the oldest traces of humankind in North America!

The story Lee wove was fascinating. As summarized by his son:

"Early peoples lived and left their stone tools on the Sheguiandah hilltop in a warm period before the last major glacial advance. The returning glaciers caught up and moved those tools - but only a few yards or tens of yards. The tools stayed locked up under the ice for tens of thousands of years, until it melted away. Then a succession of Paleo-Indian and Archaic groups migrated along the north shore of a subarctic Great Lake. Each stopped briefly at Sheguiandah, leaving a scattering of spearpoints and other stone tools on top of the glacial till deposits. About 5000 years ago, the lake basin filled again, temporarily turning the hill into an island in Great Lakes Nipissing. A tremendous stone-quarrying industry sprang up, covering parts of the site at least five feet deep in broken rock."

But the archaeological community generally disagreed. Lee's timeline challenged popular theories insisting that humans didn't reach the heart of North America until after the Laurentide Ice Sheet receded from what is now the northern United States. This glacial retreat heralded the end of the Ice Age roughly 20,000 years ago. Soon, major journals refused to publish Lee's controversial papers on Sheguiandah, and as the discussion faded from public forums, the site itself was slowly forgotten. As Lee's son Robert noted in 2005, "In 1987 neutral observers characterized it as 'Canada's most neglected major site of the past 30 years.'"

In 1991, archaeologists Patrick Julig of Laurentian University and Peter Storck of the Royal Ontario Museum in Toronto led new expeditions to Sheguiandah and, armed with advances in dating techniques, re-evaluated Lee's claims. They determined that the site was roughly 10,500 years old, envisioning Paleo-Indians as the site's first and only inhabitants, arriving after the conclusion of the Ice Age. At the time, Manitoulin Island was almost certainly connected to what is now mainland Ontario, and the inhabitants likely traded their stone tools far and wide.

Manitoulin Island is today a quiet place, home to roughly 12,600 permanent residents who adore the island's untrammeled wilderness and timeless, glacier-worn topography. The controversy surrounding Sheguiandah has settled down. Now, tourists can take guided walks through the site and still find scores of artifacts strewn across the landscape. We may never know who they once belonged to.

(Images: NASA, Thomas E. Lee, American Antiquity, Jhapk)

How Do We Know What Is True?

How do we know if something is true?

It seems like a simple enough question. We know something is true if it is in accordance with measurable reality. But just five hundred years ago, this seemingly self-evident premise was not common thinking.

Instead, for much of recorded history, truth was rooted in scholasticism. We knew something was true because great thinkers and authorities said it was true. At the insistence of powerful institutions like the Catholic Church, dogma was defended as the ultimate source of wisdom.

But by the 1500s, this mode of thinking was increasingly being questioned, albeit quietly. Anatomists were discovering that the human body did not function as early physicians described. Astronomers were finding it hard to reconcile their measurements and observations with the notion that the Sun revolves around the Earth. A select few alchemists were starting to wonder if everything really was composed of earth, water, air, fire, and aether.

Then a man came along who refused to question quietly. When Italian academic Galileo Galilei looked through his homemade telescope and saw mountains on the Moon, objects orbiting Jupiter, and a Venus that went through phases as it reflected the Sun's light -- all sights that weren't in line with what the authorities were teaching -- he decided to speak out, regardless of the consequences.

In The Starry Messenger, published in 1610, Galileo shared his initial astronomical discoveries. He included drawings and encouraged readers to gaze up at the sky with their own telescopes. Thirteen years later, in The Assayer, Galileo went even further, directly attacking ancient theories and insisting that it was evidence wrought through experimentation that yielded truth, not authoritarian assertion. Finally, in 1632, Galileo penned the treatise that would land him under house arrest and brand him a heretic. In Dialogue Concerning the Two Chief World Systems, Galileo cleverly constructed a conversation between two fictional philosophers concerning Copernicus' heliocentric model of the Solar System. One philosopher, Salviati, argued convincingly for the sun-centered model, while the other, Simplicio, stumbled and bumbled while arguing against it. At the time, "Simplicio" was commonly taken to mean "simpleton." Simplicio also used many of the same arguments the Pope had employed against heliocentrism. The Catholic Church was not opposed to researching the topic, but it did object to teaching heliocentrism as truth. Thus, the Vatican banned the book and confined Galileo to house arrest for the rest of his life.

By stubbornly refusing to be silent, Galileo irrevocably altered the very definition of truth. Scientists today forge breakthroughs in all sorts of fields, but their successes can ultimately be attributed to Galileo's breakthrough in thought. In her recent book, Galileo's Middle Finger, historian of science Alice Dreger paid tribute to the legendary astronomer.

"Galileo actively argued for a bold new way of knowing, openly insisting that what mattered was not what the authorities... said was true but what anyone with the right tools could show was true. As no one before him had, he made the case for modern science -- for finding truth together through the quest for facts."

Primary Source: Galileo's Middle Finger: Heretics, Activists, and One Scholar's Search for Justice, by Alice Dreger. 2015. Penguin Books.

There Are Actually Two Puberties

Did you know that humans, mice, monkeys, and other mammals experience two puberties?

It's the second you're no doubt aware of. In a process that initiates in the early teens, boys and girls are flooded with hormones produced by their sex glands. The result is a mental and physical transformation. Boys grow facial hair, develop a deeper voice, and double their skeletal muscle. Girls' breasts enlarge and their hips widen. Both boys and girls grow pubic hair, become sexually fertile, take on an "adult" body odor, and generally annoy their parents. 

But more than a decade before this second puberty, the one chronicled again and again in coming-of-age movies, there is a first puberty, a so-called "mini-puberty." First described back in the 1970s, it is a process scientists are still working to understand today. Here's what they know: roughly one to two weeks after birth, newborn boys and girls endure a rush of hormones, a process known as the postnatal endocrine surge. Luteinizing hormone and testosterone dominate male mini-puberty, while follicle-stimulating hormone and estradiol (an estrogen) mark female mini-puberty. The process lasts around four to six months in boys and a little longer in girls, with hormones then falling back to typical childhood levels.

Exactly why mini-puberty happens is unclear, but the bodily changes it causes are quite clear. In a commentary recently published in the journal Pediatrics, University of Oklahoma pediatricians Kenneth Copeland and Steven Chernausek described a few of them.

"Resultant effects on the reproductive organs include testicular, penile, and prostate growth in boys, uterine and breast enlargement in girls, and sebaceous gland and acne development in both sexes."

According to a recent study, the testosterone surge that boys experience during this phase likely accounts for why adolescent boys are a little taller than adolescent girls. The surge also explains roughly fifteen percent of the height difference between adult men and women.

Research is increasingly showing that mini-puberty is a sensitive period of development, acting as a "stress test" of sorts for the endocrine system, revving it up to ensure lifelong function. It also primes target tissues -- particularly in the reproductive system -- for growth and maturation later in life. Lastly, and more controversially, mini-puberty may be a window in which sexual orientation and behavior are imprinted, with hormones playing a defining role. Multiple studies support this claim in animals, but research in humans is lacking.

Mini-puberty is not nearly as conspicuous as its older, more mature sibling, but its effects may be just as consequential. Further research will undoubtedly reveal its outsized role in human development.

(Sandra J. Milburn/The Hutchinson News via AP)

The Biggest Myth About Your Skin

"The skin is the largest organ in the human body."

It's a factoid that has made the rounds in hundreds of scientific studies, been taught in high school anatomy classes across the country, and even sits nestled within Wikipedia's page on human skin. But that too-cool-to-be-true-so-it-must-be-true "fact" is actually false. No matter how you spin it, the skin is not the largest organ in the human body.

The firm debunking of this persistent myth arrived slightly more than a quarter-century ago. In a lighthearted paper, University of Rochester dermatology professor Dr. Lowell Goldsmith approximated the average skin mass of a 154-pound male. Without resorting to flaying, Goldsmith determined a skin weight of roughly 8.5 pounds, or 5.5% of body weight, placing the skin well behind the body's bones (14% of body weight) and far behind its muscles (40%). The proud dermatologist was humble in defeat.

"Even with my extreme chauvinism for our organ it is logically difficult for me to consider the skin to be an organ and not muscle, blood, and bones. These other three organ systems do not have the natural fuzzy, furry countenance of our organ, but they are larger."

Nor can the skin claim superiority in another category: surface area. As University of Utah dermatology professor Richard Sontheimer explained with a tinge of sorrow, "The human skin surface area is identical to body surface area. The body surface area of the proverbial 70 kg man is 1.7 square meters. By comparison, the gas exchanging surface of the lung’s airways has been estimated to be 70 square meters, which is approximately half the size of a tennis court."
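
The arithmetic behind those rankings is easy to check; this quick sketch simply re-derives the percentages from the figures quoted above:

```python
# Re-checking the figures quoted above.
body_weight_lb = 154.0
skin_lb = 8.5
print(f"Skin:   {skin_lb / body_weight_lb:.1%} of body weight")   # ~5.5%
print(f"Bones:  {0.14:.0%} of body weight")
print(f"Muscle: {0.40:.0%} of body weight")

# Surface area, from Sontheimer's comparison: skin ~1.7 m^2 vs. lungs ~70 m^2.
print(f"The lungs' gas-exchange surface is ~{70 / 1.7:.0f}x the skin's area")
```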

Now, do not fret for the collective ego of dermatologists, and definitely don't look down at your dermis and epidermis, the skin's two primary layers. The age-old adage, "Size doesn't matter" absolutely applies. Your skin is a biological wonder, worthy of whatever moisturizer or fancy skin care product you can slather on! Simultaneously waterproof, yet dotted with sweat glands that serve to excrete waste and keep you cool, the skin is the body's best tool to interact with the world, thanks to sensitive receptors that allow you to touch and feel. It's also home to more than 1,000 species of bacteria! Most of them are harmless, but those that would be harmful are incessantly stymied by the skin, which, by brute force, denies entry to pathogens laying siege to the body.

So say your skin shines! Call it glamorous! Hail it as smooth and silky!

Just don't call it the largest organ...

(Image: Helena Paffen/Wikimedia Commons)

Teach Particle Physics in High School

Modern physics underpins many of the greatest technological discoveries of the past few decades. The cell phones and computers upon which we rely were sewn into reality thanks to a fundamental understanding of the Standard Model of Particle Physics. That's why it's surprising that, in many high school classrooms, physics seems stuck in the past.

Indeed, in curricula across the country, physics class often takes the form of a history lesson, with live demos and labs sprinkled in for occasional hands-on experimentation. Unfortunately, the focus on history can create the impression that physics' greatest discoveries are behind us. While it's simultaneously stimulating and essential to learn about J.J. Thomson's electrons, Ernest Rutherford's protons, Nikola Tesla's alternating current, Isaac Newton's laws of motion, and Albert Einstein's relativity, there also must be room to discuss the big questions challenging physicists today. For that, students must be given the opportunity to wrap their minds around the unimaginably minuscule. They must be taught particle physics!

The atom is not the fundamental building block of all there is. Smaller particles called protons and neutrons pack inside the atom's positively charged nucleus, while electrons flit about in a negatively charged cloud just outside. For most high school physics students, this is where the story ends. They never learn that protons and neutrons are composed of even smaller particles called quarks. They never learn that the electron has five other sibling particles -- electron neutrino, muon, muon neutrino, tau, and tau neutrino -- which are collectively called leptons. They never learn that everything in the universe, from the cereal they eat for breakfast to the galaxies above, is fundamentally composed of just six quarks and six leptons.
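
For reference, here is the full tally alluded to above, written out as a small lookup a teacher could hand to students. It simply restates standard textbook content:

```python
# The twelve matter particles (fermions) of the Standard Model,
# grouped by generation: six quarks and six leptons.
STANDARD_MODEL_FERMIONS = {
    "quarks": [
        ("up", "down"),          # first generation
        ("charm", "strange"),    # second generation
        ("top", "bottom"),       # third generation
    ],
    "leptons": [
        ("electron", "electron neutrino"),
        ("muon", "muon neutrino"),
        ("tau", "tau neutrino"),
    ],
}

for family, generations in STANDARD_MODEL_FERMIONS.items():
    names = [name for pair in generations for name in pair]
    print(f"{family}: {', '.join(names)}")
```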

A fortunate few do become privy to this incredible information. Each year, roughly 10,000 students aged 15 to 19 are excused from school for a day to venture to a nearby university or research center to learn about particle physics. The International Masterclasses, organized by the International Particle Physics Outreach Group, have been running for twelve years now and engage students in 47 different countries.

The Masterclasses are a worthy effort, but ultimately, no student should have to leave school to learn about particle physics. The 14.9 million students enrolled in American public high schools should be taught particle physics in their own classrooms, even if it's just a fleeting introduction. Ensuring the education won't even be that difficult. Lawrence Berkeley National Laboratory maintains and updates an excellent resource -- The Particle Adventure -- easily accessed via the Internet. While the site's layout is dated, the site itself is easy to use. Vitally, the information is readily understandable and regularly updated to advance with the latest discoveries in particle physics.

Particle physics is cloaked in an aura of mystery. Its mere mention addles minds. But I would argue that particle physics' challenging reputation is self-imposed. By shying away from teaching the subject in high school, we cast the shroud ourselves. To enlighten a new generation of curious students and future physicists, perhaps all we need to do is lift the shroud.

(Image: AP)

The Middling Power of the Placebo Effect

Alternative medicine practitioners love the placebo effect. To many of them, it is proof that the mind can heal the body through the power of positive thinking. While that is a tempting narrative, it is very much a tall tale. The placebo effect is not nearly as powerful as it's billed to be, nor can it be controlled with a simple thought.

Commonly witnessed in clinical trials, the placebo effect arises when sugar pills, sham surgeries, or any other fake treatments prompt improvements in patients' health. Such improvements result from an amalgamation of factors. Among these are genuine physiological effects, like the brain releasing "feel-good" hormones such as endorphins or dopamine. But the placebo effect also arises from problems that plague scientific studies. For example, subjects often report improvements in their symptoms out of a desire to please the researchers, or simply because they want to feel better. We humans are notoriously bad at gauging our actual health and wellbeing.

The placebo effect's standing is also inflated through a common misunderstanding about how it is measured. Franklin Miller and Donald Rosenstein, researchers based out of the National Institute of Mental Health, set the record straight back in 2006:

"Suppose that, in an 8-week trial, 50% of the patients respond to the investigational drug and 35% to placebo. The 35% response rate is typically described as the placebo effect deriving from receiving an inactive pill (the placebo) believed to represent a real drug."

But that is not the placebo effect! They continued:

"Just because 35% of patients in our hypothetical example were observed to have a reduction in depressive symptoms... it does not follow that the placebo administered in the trial caused this response rate. Patients who get better after receiving a placebo control may have improved as a result of the natural history of the disorder, natural healing, or the clinical attention they received by virtue of trial participation. In other words, they might have shown the same improvement without taking the placebo pill."

In fact, when researchers gathered 114 randomized trials conducted on forty medical conditions for a large meta-analysis, they found that patients given a placebo generally didn't fare much better than subjects given no treatment whatsoever. Only subjective measures of pain improved with placebos.

"Outside the setting of clinical trials, there is no justification for the use of placebos," the researchers boldly and controversially concluded.

But is it that controversial? We will never be able to rely on a placebo to mend a broken bone or treat cancer. Patients may feel better, but that is not the same as actually being better.

"A great example of this is a study of sham acupuncture versus albuterol inhaler in patients with asthma." Science-Based Medicine's David Gorski wrote. "The results showed that, yes, patients did feel better. They did feel less short of breath. However, the “hard outcome” as measured by spirometry showed absolutely no effect on lung function."

While the placebo effect's power is decidedly overstated, we can still learn a lot from studying it. Again, Gorski:

"The science of placebos is a fascinating topic that might actually have some potential applications in medicine. These applications would at the very minimum include how to design better clinical trials whose results are not confounded by placebo factors. At the most, however, they would involve understanding how neurochemical functions can affect a patient’s perception of his or her symptoms and using that understanding to maximize the effects of science-based interventions."

(Image: Elaine and Arthur Shapiro)

The Biggest Myth About the Big Bang

13.8 billion years ago, the Universe exploded into existence. Or at least that's what most laypeople probably think of the Big Bang. But as astronomically alluring as that image is, it's also a myth. The simple fact is that physicists aren't certain exactly how the Universe began, or even if it did.

After all, the primordial Universe could have counterintuitively "popped" into being from nothing at all. Or perhaps it existed eternally in another nascent form? Maybe it oozed out of some higher dimension? Heck, as science fiction author Douglas Adams imagined, it could easily have been sneezed out of the nose of a being called the Great Green Arkleseizure.

All of these are perfectly cromulent possibilities (though some are certainly less likely than others), owing to a simple fact: Physics' reach is currently limited to roughly one second after the "Big Bang." Everything before then is left to learned speculation and hypothesis.

“We don’t have any idea what happened at the purported moment of the Big Bang," Caltech astrophysicist Sean Carroll recently admitted on Science Friday. "Cosmologists… sometimes exaggerate a little bit about what it means."

That's not to say that cosmologists don't know anything. Boatloads of evidence and observation support the notion that the entire Universe was once unfathomably dense and hot, and confined to a vastly smaller area. Moreover, it expanded and cooled into everything that is today.

"The Big Bang model… the general idea that the universe has been expanding from a hot, dense early state, that’s 100 percent true…" Carroll clarified.

But the "Bang" itself is very much a myth. On Science Friday, Carroll furnished a far more correct, although decidedly less dramatic definition.

“It’s the time at which we don’t understand what the Universe was doing."

(Image: NASA: Theophilus Britt Griswold – WMAP Science Team)

The Surprising Upside of Herpes

Herpesviruses get a bad rap. Their poor reputation isn't entirely undeserved. Widely maligned for causing cold sores, mononucleosis, shingles, chickenpox, and the overly stigmatized genital herpes, the eight herpesviruses that infect humans can't really bemoan their sinister status. One in five adults in the U.S. is infected with genital herpes, typically caused by herpes simplex type 2.

Herpesviruses are also some of nature's most notorious squatters. When their infectious antics are halted by the immune system, they linger on within their human hosts in a latent phase, often for life. Their rent-free stay is almost always innocuous, but the little blighters sometimes flare up at opportunistic moments when the immune system is taxed by illness or bodily stress. One herpesvirus, the cold sore-causing herpes simplex type 1, may slightly increase the risk for Alzheimer's disease. Roughly two-thirds of Americans aged 12-70 have had an active cold sore infection, and likely still host the latent phase of the virus.

Herpesviruses do have a few upsides, however. Scientists have long wondered whether their prolonged stay inside their hosts imparts any beneficial effects on the hosts themselves, and a couple of studies hint that it does! Back in 2007, a team from Washington University School of Medicine in St. Louis infected young mice with a herpesvirus similar to the strain that causes mononucleosis in humans. After the mice beat back their initial infections, the invading viruses entered their latent stage. The team then infected the mice with pathogens that cause encephalitis, meningitis, and plague. It turned out that the mice with herpes showed more resistance to those pathogens than mice without the infection!

Mouse studies are useful, but they don't always translate to humans. Last year, however, a study revealed that a type of herpesvirus called cytomegalovirus (CMV), which infects 50 to 80 percent of all 40-year-olds, enhances the immune response to the influenza virus. Critically, the researchers behind the study achieved the same results in both mice and humans.

Scientists are now doing more than just analyzing the effects of a latent herpes infection; they're actively enlisting the virus in the fight against cancer. Last summer, an international team announced that they engineered the herpesvirus that causes cold sores to instead attack cancer cells. The therapy, called T-VEC, worked wonders in a phase III clinical trial involving 436 patients afflicted with late stage melanoma.

“Patients given T-VEC at an early stage survived about 20 months longer than patients given a different type of treatment," University of Louisville cancer researcher Jason Chesney reported. "For some, the therapy has lengthened their survival by years.”

In T-VEC, the modified herpesviruses cannot replicate in normal cells, but they gleefully infect and destroy cancer cells. What's more, they release antigens that enable the immune system to target cancer cells.

Mere months after the success of T-VEC was announced, the FDA approved the therapy for primetime use. Melanoma patients can now turn to the herpesvirus for some small glimmer of hope in their fight against cancer.

The scientists behind T-VEC are hopeful that their herpesvirus can be further modified to attack all sorts of cancer cells. What a fascinating turn of fate: that such a maligned virus can transform from pariah to potential savior!

(Image: CDC)

Our Favorite Blogger Writes a Great Book

Ethan Siegel is one of our very favorite authors here at RCS. In 2013 we named him our top science blogger. His first book, Beyond the Galaxy, covers a broad range of material right in his wheelhouse: astronomy, astrophysics, and cosmology. You can grab a copy at the world's largest bookstore.

The premise is to teach the state of the Universe, as we've come to know it through the history of astronomy and physics, at the level of an introductory college course. Basically, you're taking an Intro Astronomy for Non-Science Majors course with Professor Siegel. While it's well suited to that task -- if read cover to cover -- I suggest reading it in a slightly different manner.

Siegel's writing here is as entertaining and well-pitched for an enthusiastic layman as ever. The text brims with entertaining historical anecdotes and intuitive explanations; it illuminates the scientific process instead of dryly stating facts.

What Beyond the Galaxy really reads like, however, is a huge collection of articles. Fun, educational, lucid blog posts.

Accordingly, I think the real fun lies in reading it in bits and pieces. Each chapter is divided into a series of sections, most of which stand alone as self-contained stories of a particular discovery, idea, or person. This granularity allows the book to meander into entertaining corners of the history of science and dive into explanations of competing cosmological theories, rather than cutting those details back to streamline the whole story. From my perspective, that characteristic is not a flaw. A trimmed, focused storyline written more like a novel's plot would undercut Siegel's strength at digging into informative details and making them entertaining. It would make the text a lot less fun and probably less educational to read.

One further aspect of this book stands out among popular science volumes. Fans of the prolific imagery in Siegel's blogging will enjoy the many pictures found throughout the text. Open it to nearly any page and you'll find at least one high-quality color image with a full, well-written explanatory caption. These alone offer a wealth of information, even without the main text.

So, I highly recommend keeping this book on hand to read in bits and pieces at leisure, or as a reference for learning particular concepts in astronomy as needed.

Siegel, Ethan. Beyond the Galaxy: How Humanity Looked Beyond Our Milky Way and Discovered the Entire Universe. Singapore: World Scientific, 2015. Print.

We Need to Study Genuine Cancer "Miracles"

Superlatives are far overused when it comes to cancer. Though "breakthroughs," "miracles," "cures," and "marvels" are regularly reported in the popular press, more and more people continue to die of the disease. An estimated 600,000 Americans will succumb this year.

But don't be disheartened. Through the superb efforts of dedicated researchers and hard-working medical professionals, we are slowly but surely winning the "war on cancer." Early detection, preventive measures, and improved treatments have cut the U.S. cancer mortality rate from a peak of 215 deaths per 100,000 people in 1991 to 172 per 100,000 in 2010, a decline of roughly 20 percent, with no "revolutionary miracles" required.

Looking to lower the rate even further, many cancer researchers are calling for a widespread effort to analyze genuine cancer miracles. Yes, they do exist, though in a decidedly less hyped fashion. In a significant portion of cancer drug trials, a few patients exhibit incredible responses to the treatments they receive. While a drug's average effect might be middling overall, these patients take it and experience near-miraculous results; their cancers might even disappear entirely for a time. Such rare survivors are called "exceptional responders."

Over the past decades, these exceptional responders have been mostly ignored, cast aside as amazing, yet irreproducible, anecdotes. Intriguing oddities to be published in case reports, perhaps, but not studied empirically. That may soon change. Last year, the National Cancer Institute (NCI) announced an ambitious plan to transform these cases from anecdotes to evidence, calling for any and all exceptional responses to be reported and rigorously investigated.

"Tissue samples will be obtained and molecularly profiled via whole-exome, transcriptome, and deeper targeted sequencing. All clinical and genomic data will eventually be available to interested investigators through a controlled-access database," Alissa Poh reported in the journal Cancer Discovery.

When researchers have taken steps like this in the past, they've gleaned some remarkable insights. A couple of particularly striking examples involve the drug everolimus. During clinical trials, two different patients saw their cancers almost entirely disappear for 14 and 18 months, respectively, before returning. Subsequent analysis turned up mutations in their tumors that rendered their cancers uniquely susceptible to the drug. With that information, cancer researchers can design clinical trials that specifically target patients whose tumors harbor those mutations.

Examining exceptional responders particularly excites Vivek Subbiah and Ishwaria Mohan Subbiah, a husband-and-wife duo at The University of Texas MD Anderson Cancer Center in Houston.

"Scientists and physicians are detectives at heart," they wrote last year in the journal Future Oncology. "The in-depth analysis of these n-of-1 outlier responders calls for an approach worthy of Sherlock Holmes, where 'the grand thing [is to be able] to reason backward' with the hope of unraveling unique insights into the disease that may help the current patient and future patients with the same disease or same aberration."

They offered a suggestion to make this happen.

"There has to be a real-time, open access online registry that stores the data relating to all of these ‘miracle’ patients and all of the data that has been deposited so that all of this investigative work is accessible and useful."

The Subbiahs' recommendation has just been echoed in an editorial published in Science Translational Medicine. Harvard Medical School's Eric D. Perakslis and Isaac S. Kohane call for establishing an Exceptional Responder Network, complete with clinical sites that provide free testing for verified exceptional responders, a massive online registry, and a policy of open data sharing.

If this approach is widely adopted, researchers may be able to manufacture a bounty of "breakthrough" cancer treatments truly worthy of superlatives.

(Image: AP Photo)

Do We Need to Revise General Relativity?

The idea that our Universe is filled with dark matter has been around for nearly a century. When astronomers noticed that orbital speeds toward the edges of spiral galaxies remain the same or even increase slightly, rather than decrease, they surmised that either there must be some huge unseen mass driving the rotation, or the laws of gravity given by Einstein's General Relativity need to be changed. They opted for the first option.
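To see why flat rotation curves are so puzzling, here is a minimal back-of-the-envelope sketch of the standard Newtonian expectation (a textbook argument, not a calculation from the article):

\[
\frac{v^2}{r} = \frac{G\,M(r)}{r^2}
\quad\Longrightarrow\quad
v(r) = \sqrt{\frac{G\,M(r)}{r}},
\]

where \(M(r)\) is the mass enclosed within radius \(r\). If essentially all the mass sat in the visible disk, \(M(r)\) would level off beyond it and \(v\) should fall as \(1/\sqrt{r}\). A flat curve, \(v \approx \text{const}\), instead requires \(M(r) \propto r\): the enclosed mass keeps growing well past the edge of the starlight, which is exactly the gap dark matter is invoked to fill.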

Over that time, cosmologists have accumulated boatloads of evidence in favor of the notion that this invisible, "dark" matter -- which neither interacts with nor emits light -- comprises roughly 84% of the mass of the Universe. So compelling is this story that millions and millions of dollars have been spent on ingenious experiments to actually detect the stuff, but thus far, the particles have remained elusive.

It is partly because of dark matter's inherent ability to not be found that, in 1983, Israeli physicist Mordehai Milgrom proposed an upstart theory to challenge its dominance. Modified Newtonian dynamics, or MOND for short, dares to go where physicists of the past dared not: it tweaks Newtonian gravity, and with it the low-acceleration limit of Einstein's General Relativity. While the changes are subtle, kicking in only at very low accelerations, the ramifications are massive. General Relativity has remained essentially unscathed for over a century.
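As a rough sketch of the tweak (standard MOND notation, with Milgrom's acceleration constant \(a_0 \approx 1.2 \times 10^{-10}\ \mathrm{m/s^2}\); this is a simplified illustration rather than anything quoted from the article):

\[
a\,\mu\!\left(\frac{a}{a_0}\right) = a_N,
\qquad
\mu(x) \to 1 \ \text{for}\ x \gg 1,
\qquad
\mu(x) \to x \ \text{for}\ x \ll 1 .
\]

At everyday and Solar System accelerations, \(\mu \approx 1\) and Newtonian gravity is recovered exactly. In the deep low-acceleration regime, \(a \approx \sqrt{a_N a_0}\); setting \(v^2/r = \sqrt{G M a_0}/r\) gives \(v^4 = G M a_0\), a rotation speed that goes flat at large radii and is fixed by the visible mass alone.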

And yet MOND matches its audacity with surprising veracity. It accounts for galaxy rotation curves just as well as, and in some cases even a little better than, dark matter does. Moreover, no evidence has come to light that conclusively disproves MOND. That's quite an accomplishment, as the annals of physics are littered with the corpses of theories that challenged General Relativity and failed.

"The idea is sound," cosmologist Ethan Siegel writes in his recent book Beyond the Galaxy. "Surely hypothesizing that 80-85% of the matter in the Universe is of some hitherto undiscovered type... represents a greater leap than making a tweak to our theory of gravity. After all, tweaking our theory of gravity to explain Mercury's orbital motion was what led to General Relativity in the first place!"

But, as Siegel notes, full-fledged cosmological theories built from MOND cannot fully account for many observations that dark matter explains.

"Gravitational lensing, the cosmic web of structure, and cosmic microwave background observations all go unexplained in all the modified gravity theories put forth so far."

Professor Stacy McGaugh, an astronomer and cosmologist at Case Western Reserve University and one of the leading proponents of MOND, admits that the idea isn't perfect.

"A compelling physical basis for MOND is still lacking. But then, it took Newton twenty years to realize there was a good geometric reason for the inverse square law, and centuries to develop our modern understanding of gravity. These things only seem crystal clear with the benefit of hindsight. So it no doubt shall be with MOND, whatever the underlying physics."

As Sabine Hossenfelder reported last year, one potential test of MOND could soon become available. According to one modified gravity theory, an offshoot of MOND, a black hole's shadow should be roughly ten times larger than general relativity predicts. The Event Horizon Telescope aims to image a black hole and its shadow for the first time in 2017.
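For context, the general-relativistic baseline the Event Horizon Telescope will compare against is a standard textbook result (stated here for orientation, not taken from Hossenfelder's report): for a non-rotating black hole of mass \(M\) seen from distance \(D\), the shadow's diameter and angular size are roughly

\[
d_{\text{shadow}} \approx 2\sqrt{27}\,\frac{G M}{c^2} \approx 10.4\,\frac{G M}{c^2},
\qquad
\theta_{\text{shadow}} \approx \frac{2\sqrt{27}\,G M}{c^2 D}.
\]

For Sagittarius A*, with about four million solar masses at roughly 8 kiloparsecs, that works out to around 50 microarcseconds. A shadow ten times larger would be unmistakable at the resolution the telescope is designed to reach.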

Siegel succinctly sums up MOND's current scientific standing in his book.

"MOND remains an attractive avenue of investigation, as it is still more successful at predicting the rotation curves of individual galaxies, overall, than the theory of dark matter is. But its failure to meet the criteria of reproducing the successes of the already-established leading theory means that it has not yet risen to the status of scientifically viable."

When it comes to MOND, McGaugh is a strict adherent of empiricism, but he also has a flair for the philosophical.

"Is our universe an unfamiliar darkness filled with invisible mass, with the 'normal' matter of which we are composed no more than a bit of queer flotsam in a vast sea of dark matter and dark energy? Or is our inference of these dark components just a hint of our ignorance of some deeper theory?"

"Ultimately, what we want is irrelevant. Science is not a consensus endeavor: the data rule."

Primary Source: Ethan Siegel. Beyond the Galaxy: How Humanity Looked Beyond Our Milky Way and Discovered the Entire Universe. 2015. World Scientific.

(Image: NASA)