Teach Particle Physics in High School

Modern physics underpins many of the greatest technological discoveries of the past few decades. The cell phones and computers upon which we rely were sewn into reality thanks to a fundamental understanding of the Standard Model of Particle Physics. That's why it's surprising that, in many high school classrooms, physics seems stuck in the past.

Indeed, in curricula across the country, physics class often takes the form of a history lesson, with live demos and labs sprinkled in for occasional hands-on experimentation. Unfortunately, the focus on the past can create the impression that physics' greatest discoveries are in the past. While it's simultaneously stimulating and essential to learn about J.J. Thomson's electrons, Ernest Rutherford's protons, Nikola Tesla's alternating current, Isaac Newton's laws of motion, and Albert Einstein's relativity, there also must be room to discuss the big questions challenging physicists today. For that, students must be given the opportunity to wrap their minds around the unimaginably miniscule. They must be taught particle physics!

The atom is not the fundamental building block of all there is. Smaller particles called protons and neutrons pack inside the atom's positively charged nucleus, while electrons flit about in a negatively charged cloud just outside. For most high school physics students, this is where the story ends. They never learn that protons and neutrons are composed of even smaller particles called quarks. They never learn that the electron has five other sibling particles -- electron neutrino, muon, muon neutrino, tau, and tau neutrino -- which are collectively called leptons. They never learn that everything in the universe, from the cereal they eat for breakfast to the galaxies above, is fundamentally composed of just six quarks and six leptons.

A fortunate few do become privy to this incredible information. Each year, roughly 10,000 students aged 15 to 19 are excused from school for a day to venture to a nearby university or research center to learn about particle physics. The International Masterclasses, organized by the International Particle Physics Outreach Group, have been running for twelve years now and engage students in 47 different countries.

The Masterclasses are a worthy effort, but ultimately, no student should have to leave school to learn about particle physics. The 14.9 million students enrolled in American public high schools should be taught particle physics in their own classrooms, even if it's just a fleeting introduction. Ensuring the education won't even be that difficult. Lawrence Berkeley National Laboratory maintains and updates an excellent resource -- The Particle Adventure -- easily accessed via the Internet. While the site's layout is dated, the site itself is easy to use. Vitally, the information is readily understandable and regularly updated to advance with the latest discoveries in particle physics.

Particle physics is cloaked in an aura of mystery. Its mere mention addles minds. But I would argue that particle physics' challenging reputation is self-imposed. By shying away from teaching the subject in high school, we cast the shroud ourselves. To enlighten a new generation of curious students and future physicists, perhaps all we need to do is lift the shroud.

(Image: AP)

The Middling Power of the Placebo Effect

Alternative medicine practitioners love the placebo effect. To many of them, it is proof that the mind can heal the body through the power of positive thinking. While that is a tempting narrative, it is very much a tall tale. The placebo effect is not nearly as powerful as it's billed to be, nor can it be controlled with a simple thought.

Commonly witnessed in clinical trials, the placebo effect arises when sugar pills, sham surgeries, or any other fake treatments prompt improvements in patients' health. Such improvements result from an amalgamation of factors. Among these are genuine physiological effects, like the brain releasing "feel-good" hormones such as endorphins or dopamine. But the placebo effect also arises from problems that plague scientific studies. For example, subjects often report improvements in their symptoms out of a desire to please the researchers, or simply because they want to feel better. We humans are notoriously bad at gauging our actual health and wellbeing.

The placebo effect's standing is also inflated through a common misunderstanding about how it is measured. Franklin Miller and Donald Rosenstein, researchers based out of the National Institute of Mental Health, set the record straight back in 2006:

"Suppose that, in an 8-week trial, 50% of the patients respond to the investigational drug and 35% to placebo. The 35% response rate is typically described as the placebo effect deriving from receiving an inactive pill (the placebo) believed to represent a real drug."

But that is not the placebo effect! They continued:

"Just because 35% of patients in our hypothetical example were observed to have a reduction in depressive symptoms... it does not follow that the placebo administered in the trial caused this response rate. Patients who get better after receiving a placebo control may have improved as a result of the natural history of the disorder, natural healing, or the clinical attention they received by virtue of trial participation. In other words, they might have shown the same improvement without taking the placebo pill."

In fact, when researchers gathered 114 randomized trials conducted on forty medical conditions for a large meta-analysis, they found that patients given a placebo generally didn't fair much better than subjects given no treatment whatsoever. Only subjective measures of pain improved with placebos. 

"Outside the setting of clinical trials, there is no justification for the use of placebos," the researchers boldly and controversially concluded.

But is it that controversial? We will never be able to rely on a placebo to mend a broken bone or treat cancer. Patients may feel better, but that is not the same as actually being better.

"A great example of this is a study of sham acupuncture versus albuterol inhaler in patients with asthma." Science-Based Medicine's David Gorski wrote. "The results showed that, yes, patients did feel better. They did feel less short of breath. However, the “hard outcome” as measured by spirometry showed absolutely no effect on lung function."

While the placebo effect's power is decidedly overstated, we can still learn a lot from studying it. Again, Gorski:

"The science of placebos is a fascinating topic that might actually have some potential applications in medicine. These applications would at the very minimum include how to design better clinical trials whose results are not confounded by placebo factors. At the most, however, they would involve understanding how neurochemical functions can affect a patient’s perception of his or her symptoms and using that understanding to maximize the effects of science-based interventions."

(Image: Elaine and Arthur Shapiro)

The Biggest Myth About the Big Bang

13.8 billion years ago, the Universe exploded into existence. Or at least that's what most laypeople probably think of the Big Bang. But as astronomically alluring as that image is, it's also a myth. The simple fact is that physicists aren't certain exactly how the Universe began, or even if it did.

After all, the primordial Universe could have counterintuitively "popped" into being from nothing at all. Or perhaps it existed eternally in another nascent form? Maybe it oozed out of some higher dimension? Heck, as science fiction author Douglas Adams imagined, it could easily have been sneezed out of the nose of a being called the Great Green Arkleseizure.

All of these are perfectly cromulent possibilities (though some are certainly less likely than others), owing to a simple fact: Physics' reach is currently limited to roughly one second after the "Big Bang." Everything before then is left to learned speculation and hypothesis.

“We don’t have any idea what happened at the purported moment of the Big Bang," Caltech astrophysicist Sean Caroll recently admitted on Science Friday. "Cosmologists… sometimes exaggerate a little bit about what it means."

That's not to say that cosmologists don't know anything. Boatloads of evidence and observation support the notion that the entire Universe was once unfathomably dense and hot, and confined to a vastly smaller area. Moreover, it expanded and cooled into everything that is today.

"The Big Bang model… the general idea that the universe has been expanding from a hot, dense early state, that’s 100 percent true…" Carroll clarified.

But the "Bang" itself is very much a myth. On Science Friday, Carroll furnished a far more correct, although decidedly less dramatic definition.

“It’s the time at which we don’t understand what the Universe was doing."

(Image: NASA: Theophilus Britt Griswold – WMAP Science Team)

The Surprising Upside of Herpes

Herpesviruses get a bad rap. Their poor reputation isn't entirely undeserved. Widely maligned for causing cold sores, mononucleosis, shingles, chickenpox, and the overly stigmatized genital herpes, the eight herpesviruses that infect humans can't really bemoan their sinister status. One in five adults in the U.S. is infected with genital herpes, typically caused by herpes simplex type 2.

Herpesviruses are also some of nature's most notorious squatters. When their infectious antics are halted by the immune system, they linger on within their human hosts in a latent phase, often for life. Their rent-free stay is almost always innocuous, but the little blighters sometimes flare up at opportunistic moments when the immune system is taxed by illness or bodily stress. One herpesvirus, the cold sore-causing herpes simplex type 1, may slightly increase the risk for Alzheimer's disease. Roughly two-thirds of Americans aged 12-70 have had an active cold sore infection, and likely still host the latent phase of the virus.

Herpesviruses do have a few upsides, however. Scientists have long wondered whether their prolonged stay inside their hosts imparts any beneficial effects on the hosts themselves, and a couple studies hint that it does! Back in 2007, a team from Washington University School of Medicine in St Louis infected young mice with a herpesvirus similar to the strain that causes mononucleosis in humans. After the mice beat back their initial infections, the invading viruses entered their latent stage. The team then infected the mice with pathogens that cause encephalitis, meningitis, and plague. Turned out, the mice with herpes showed more resistance to the bacteria than mice without the infection!

Mouse studies are useful, but they don't always translate to humans. Last year, however, a study revealed that a type of herpesvirus called cytomegalovirus (CMV), which infects 50 to 80 percent of all 40-year-olds, enhances the immune response to the influenza virus. Critically, the researchers behind the study achieved the same results in both mice and humans.

Scientists are now doing more than just analyzing the effects of a latent herpes infection; they're actively enlisting the virus in the fight against cancer. Last summer, an international team announced that they engineered the herpesvirus that causes cold sores to instead attack cancer cells. The therapy, called T-VEC, worked wonders in a phase III clinical trial involving 436 patients afflicted with late stage melanoma.

 “Patients given T-VEC at an early stage survived about 20 months longer than patients given a different type of treatment," University of Louisville cancer researcher Jason Chesney reported. "For some, the therapy has lengthened their survival by years. ”

In T-VEC, the modified herpesviruses cannot replicate in normal cells, but they gleefully infect and destroy cancer cells. What's more, they release antigens that enable the immune system to target cancer cells.

Mere months after the success of T-VEC was announced, the FDA approved the therapy for primetime use. Melanoma patients can now turn to the herpesvirus for some small glimmer of hope in their fight against cancer.

The scientists behind T-VEC are hopeful that their herpesvirus can be further modified to attack all sorts of cancer cells. What a fascinating turn of fate: that such a maligned virus can transform from pariah to potential savior!

(Image: CDC)

Our Favorite Blogger Writes a Great Book

Ethan Siegel is one of our very favorite authors here at RCS. In 2013 we named him our top science blogger. His first book, Beyond the Galaxy, covers a broad range of material right in his wheelhouse: astronomy, astrophysics, and cosmology. You can grab a copy at the world's largest bookstore.

The premise is to teach the students of an introductory college course the state of the universe as we've discovered it through the history of astronomy and physics. Basically, you're taking an Intro Astronomy for Non-Science Majors course with Professor Siegel. While it's well suited -- if read cover to cover -- for that task, I suggest that you read it in a slightly different manner.

Siegel's writing here is as entertaining and well-pitched for an enthusiastic layman as ever. The text brims with entertaining historical anecdotes and intuitive explanations; it illuminates the scientific process instead of dryly stating facts.

What Beyond the Galaxy really reads like, however, is a huge collection of articles. Fun, educational, lucid blog posts.

Accordingly, I think the real fun lies in reading it in bits and pieces. Each chapter is divided into a series of sections. Most of these stand alone as self-contained stories of a particular discovery, idea, or person. This granularity allows the book to meander into many entertaining corners of the historical progression of science and dive into explanations of competing cosmological theories rather than cutting back these details to streamline the whole story. From my perspective that characteristic is not a flaw. A trimmed, focused storyline written more like a novel's plot would weaken Siegel's strength at digging into informative details and making them entertaining. It would make the text a lot less fun and probably less educating to read.

One further aspect of this book stands out among popular science volumes. Fans of the prolific images in Siegel's blogging will enjoy the many pictures found throughout the text. Open it to nearly any page and you'll find at least one high quality colored print with a full, well-written explanatory caption. These alone are a wealth of information even without the main text.

So, I highly recommend keeping this book to read in bits and pieces at leisure or as a reference to learn particular concepts in astronomy as needed.

Siegel, Ethan. Beyond the Galaxy: How humanity looked beyond our Milky Way and discovered the entire Universe. Singapore: World Scientific, 2015. Print.

We Need to Study Genuine Cancer "Miracles"

Superlatives are far overused when it comes to cancer. Though "breakthroughs", "miracles", "cures", and "marvels" are regularly reported in the popular press, more and more people continue to die of the disease. An estimated 600,000 will succumb this year.

But don't be disheartened. Through the superb efforts of dedicated researchers and hard-working medical professionals, we are slowly but surely winning the "war on cancer". Early detection, preventative measures, and improved treatments have reduced the cancer mortality rate from a peak of 215 deaths per 100,000 people in 1991 to 172 deaths per 100,000 people in 2010, no "revolutionary miracles" required.

Looking to lower the rate even further, many cancer researchers are calling for a widespread effort to analyze genuine cancer miracles. Yes, they do exist, but in a decidedly less hyped fashion. In a significant portion of cancer drug trials, there are patients who exhibit incredible responses to the treatments they receive. While the average effect of a certain drug might be middling overall, these patients will take the drug and experience miraculous results -- their cancers might even disappear entirely for a time. Such rare survivors are called "exceptional responders."

Over the past decades, these exceptional responders have been mostly ignored, cast aside as amazing, yet irreproducible, anecdotes. Intriguing oddities to be published in case reports, perhaps, but not studied empirically. That may soon change. Last year, the National Cancer Institute (NCI) announced an ambitious plan to transform these cases from anecdotes to evidence, calling for any and all exceptional responses to be reported and rigorously investigated.

"Tissue samples will be obtained and molecularly profiled via whole-exome, transcriptome, and deeper targeted sequencing. All clinical and genomic data will eventually be available to interested investigators through a controlled-access database," Alissa Poh reported in the journal Cancer Discovery.

When researchers have taken steps like this in the past, they've gleaned some remarkable insights. A couple particularly glowing examples are tied to the drug everolimus. During clinical trials, two different patients saw their cancers almost entirely disappear for 14 and 18 months before returning. Subsequent analysis turned up mutations in their tumors which rendered their cancers uniquely susceptible to the drug. With that information, cancer researchers can design clinical trials that specifically target patients whose tumors have those mutations.

Examining exceptional responders particularly excites Vivek Subbiah and Ishwaria Mohan Subbiah, a husband and wife duo at The University of Texas MD Anderson Cancer Center in Houston.

"Scientists and physicians are detectives at heart," they wrote last year in the journal Future Oncology. "The in-depth analysis of these n-of-1 outlier responders calls for an approach worthy of Sherlock Holmes, where 'the grand thing [is to be able] to reason backward' with the hope of unraveling unique insights into the disease that may help the current patient and future patients with the same disease or same aberration."

They offered a suggestion to make this happen.

"There has to be a real-time, open access online registry that stores the data relating to all of these ‘miracle’ patients and all of the data that has been deposited so that all of this investigative work is accessible and useful."

The Subbiah's recommendation has just been mirrored in an editorial published to Science Translational Medicine. Harvard Medical School's Eric D. Perakslis and Isaac S. Kohane call for establishing an Exceptional Responder Network, complete with a network of clinical sites that provide free testing for verified exceptional responders, a massive online registry, and a policy of open data sharing.

If this approach is widely adopted, researchers may be able to manufacture a bounty of "breakthrough" cancer treatments truly worthy of superlatives.

(Image: AP Photo)

Do We Need to Revise General Relativity?

The idea that our Universe is filled with dark matter has been around for nearly a century. When astronomers noticed that orbital speeds towards the edges of spiral galaxies remain the same or even increase slightly, rather than decrease, they surmised that either there must be some huge unseen mass driving the rotation, or that the laws of gravity given by Einstein's General Relativity need to be changed. They elected the first option.

Over that time, cosmologists have accumulated boatloads of evidence in favor of the notion that this invisible, "dark" matter -- which neither interacts with nor emits light -- comprises roughly 84% of the mass of the Universe. So compelling is this story that millions and millions of dollars have been spent on ingenious experiments to actually detect the stuff, but thus far, the particles have remained elusive.

It is partly because of dark matter's inherent ability to not be found that, in 1983, Israeli physicist Mordehai Milgrom proposed an upstart theory to challenge its dominance. Modified Newtonian dynamics, or MOND for short, dares to go where physicists of the past dared not: It slightly tweaks the laws of gravity put forth by Einstein's General Relativity. While the changes are subtle, only affecting Einstein's equations at very low accelerations, the ramifications are massive. General Relativity has remained essentially unscathed for over a century.

And yet MOND matches its audacity with surprising veracity. It successfully accounts for galaxy rotation curves just as well, and in some cases, even a little bit better than dark matter. Moreover, no evidence has come to light that conclusively disproves MOND. That's quite an accomplishment, as the annals of physics are littered with the corpses of theories that challenged General Relativity and failed.

"The idea is sound," cosmologist Ethan Siegel writes in his recent book Beyond the Galaxy. "Surely hypothesizing that 80-85% of the matter in the Universe is of some hitherto undiscovered type... represents a greater leap than making a tweak to our theory of gravity. After all, tweaking our theory of gravity to explain Mercury's orbital motion was what led to General Relativity in the first place!"

But, as Siegel notes, full-fledged cosmological theories built from MOND cannot fully account for many findings arising from the theory of dark matter.

"Gravitational lensing, the cosmic web of structure, and cosmic microwave background observations all go unexplained in all the modified gravity theories put forth so far."

Professor Stacy McGaugh, an astronomer and cosmologist at Case Western Reserve University and one of the leading proponents of MOND, admits that the idea isn't perfect.

"A compelling physical basis for MOND is still lacking. But then, it took Newton twenty years to realize there was a good geometric reason for the inverse square law, and centuries to develop our modern understanding of gravity. These things only seem crystal clear with the benefit of hindsight. So it no doubt shall be with MOND, whatever the underlying physics."

As Sabine Hossenfelder reported last year, one potential way to test MOND could soon become available. According to the modified gravity theory, an offshoot of MOND, a black hole's shadow should be ten times larger compared to what general relativity predicts. The Event Horizon Telescope aims to image a black hole and its shadow for the first time in 2017.

Siegel succinctly sums MOND's current scientific standing in his book.

"MOND remains an attractive avenue of investigation, as it is still more successful at predicting the rotation curves of individual galaxies, overall, than the theory of dark matter is. But its failure to meet the criteria of reproducing the successes of the already-established leading theory means that it has not yet risen to the status of scientifically viable."

When it comes to MOND, McGaugh is a strict adherent to empiricism, but he also has a flair for the philosophical.

"Is our universe an unfamiliar darkness filled with invisible mass, with the 'normal' matter of which we are composed no more than a bit of queer flotsam in a vast sea of dark matter and dark energy? Or is our inference of these dark components just a hint of our ignorance of some deeper theory?"

"Ultimately, what we want is irrelevant. Science is not a consensus endeavor: the data rule."

Primary Source: Ethan Siegel. Beyond the Galaxy: How Humanity Looked Beyond Our Milky Way and Discovered the Entire Universe. 2015. World Scientific.

(Image: NASA)

The Fake Disease That Plagued Darwin, and Other Illnesses That Never Actually Existed

As medicine has advanced over the centuries, diseases have come and gone, but not always because they've been eradicated. Many times, widely diagnosed maladies -- some of them supposedly debilitating or deadly -- turned out not to exist when new technologies allowed a closer look. Other times, diseases simply vanished when rigorous skepticism was dutifully applied.

Chronic Lyme disease, chronic candidiasis, and non-celiac gluten sensitivity are a few questionable conditions that persist today. Time and evidence will likely "cure" them for good.

Here are five historical diseases that were eliminated by scientific scrutiny.

Suppressed Gout. Throughout Charles Darwin's adult life, he was plagued by sometimes debilitating health issues. Turning to a variety of doctors for help, he received a menagerie of diagnoses. One of these diagnoses was "suppressed gout." Gout, of course, is a genuine condition, characterized by severe pain, redness, and swelling in joints, often the joint at the base of the big toe. Suppressed gout, however, was completely fabricated. In the 19th century, many doctors believed gout was caused by an accumulation of toxic substances. Some further blamed these substances for causing a host of other discomforting symptoms. Suppressed gout thus became a diagnosis of convenience.

Joseph Dalton Hooker, a legendary British botanist and Darwin's closest friend, was highly skeptical of suppressed gout. In a correspondence with Darwin from January 1865, he wrote:

"What the devil is this 'suppressed Gout' upon which doctors fasten every ill they cannot name? If it is suppressed how do they know it is gout? If it is apparent, why the devil do they call it suppressed? I hate the use of cant terms to cloak ignorance."

Railway Spine. Train travel was common in the 19th century, as unfortunately, were train collisions. Rickety rails, coupled with shoddy construction of passenger cars, rendered rail travel a decidedly more hazardous form of transportation compared to today. This situation resulted in a number of injuries, but it also brought forth opportunists hunting for an easy profit. Doctors all over the world found themselves listening to patients claiming they had been injured in crashes, yet showing no signs of actual ailment. The term "railway spine" was created for these cases. Railroad companies adamantly denied the malady's existence, yet were forced to pay thousands of dollars to supposed sufferers. Railway spine created quite a controversy in certain sects of the medical community. Now defunct, the disorder could very well have existed. Today the symptoms might be classified as whiplash or PTSD.

Status Lymphaticus. In the early 1900s, status lymphaticus reportedly killed thousands of children and was even regarded as "the most important problem in medicine." Today, most doctors have never heard of it, and that's for a good reason: it never existed.

The supposedly deadly disease was linked to a tiny gland nestled near the heart and lungs: the thymus. Now known as a key part of the immune system, the gland was not always held in such high regard. As the cause of status lymphaticus, the thymus was thought to occasionally grow out of control, pressing upon the heart and the lungs until the victim suffocated from the inside. Closer, skeptical scrutiny eventually disproved the condition in 1931.

Ovariomania. Commonly called "Old Maid's Insanity", ovariomania was a condition usually diagnosed amongst women at the early stages of menopause, although it was sometimes diagnosed even earlier. Some doctors believed that tumors in the ovaries prompted bouts of insanity. As influential Scottish psychiatrist Sir John Batty Tuke described:

"Women who for years have been carrying tumours, when they arrive at the change of life develop aberration of intellect, and not unfrequently the character of their illusion is marked by sexuality and erotomania; they think that they are pregnant, or that they are visited at night by men."

These tumors never actually existed, but that didn't stop 19th century surgeons from conducting as many as 150,000 oophorectomies on women, involving the removal of the ovaries.

Intestinal Autointoxication. There's poop inside you. Right now. But though disgusting when outside the body, feces are fairly benign inside. Tucked away within the colon awaiting excretion, there's little harm that poop can do. Physicians of the past weren't so sure however.

Dating back to ancient Egypt, medical "professionals" once entertained the notion that putrefaction of feces inside the body causes disease. The toxins produced supposedly shortened lifespan and sparked a host of maladies. While this hypothesis was firmly debunked decades ago, the premise still fuels a cornerstone of the natural health industry: the colon "cleanse."

(Image: Maull and Polyblank)

Pre-Meds Should Suffer Through 'Hunger Games'

Occupied by re-tweeting an emoji in response to a selfie posted on Instagram, a distracted driver rams your car off the road. You awake in the hospital that night and meet the doctor in charge of repairing your fractured legs and piecing together your damaged spine. Whose eyes would you prefer to look up into from your hospital bed?

Doc A: received top-of-the-class A's in biology, chemistry, physics, biochemistry, organic chemistry, medical imaging, electronic devices, and neuroscience. He overcame coursework pressure and scored higher on science-based entrance exams than 98% of other medical school applicants.

Doc B: wrote well-marked sociology papers in Comparative Perspectives on U.S. & European Societies: Inequality, Institutional Underpinnings of the Arts & Media; Sexual Cultures; and Virtual Communities/Social Media. He aced portions of the entrance exam focusing on social studies and psychology to gain admission to medical school.

In a column written for Scientific American, Nathaniel P. Morris explains why he dislikes the pre-med track undergraduate education and wants to tone down the tough science courses in favor of more social studies education. He points out that competition is hard and stressful, basic science is difficult, and who uses that stuff anyway?

Plus, lots of students who want to help other people don't do well enough in science classes and aren't accepted to medical school.

I disagree with his view. The Hunger Games atmosphere for pre-med students is a good thing.

That's not just because I choose Doc A, every time, over Doc B. (So would you -- be honest.) I've also taught pre-meds and watched the pre-med system at work.

Pre-med coursework is much like coursework for any other difficult and technical professional occupation. These students take a broad range of tough classes in which they must excel. They are often graded against their peers and ranked. This can raise the competition to a frenzy and cause some pre-meds to break down and fail. It's heartbreaking to see a student overcome with anxiety and fear for their future after receiving a single low assignment score. It's not necessarily a pretty system.

But if preparing a ten-line calculation for Chemistry I Lab at home is overwhelming for a student, how would they fair in a ten-hour surgery in the operating room? Is that a good environment for anyone but a tremendously driven person who has learned to cope with extreme pressure and make crucial analytical decisions? Those skills are what a pre-med major is really about.

While the level of stress is very high for pre-meds, they aren't going through anything that most other science majors don't face too. There is one major difference between those science majors and pre-med majors. In return for the increased breadth of study and higher competition, pre-meds are relieved of the very hardest parts of a science major: the most advanced courses.

A pre-med major largely consists of a broad survey of introductory level science courses: biology, chemistry, physics. It then includes one or two mid-level chemistry courses including organic chemistry. It often includes calculus as well as biochemistry.

That's well short of the requirements for a full degree in chemistry or biology, much less physics or math. Some pre-meds moan about the difficulty of their physics and math coursework. Yet, they are taking only the classes that are considered the barest low-level introduction to these areas for most scientists. Pre-meds face stiff competition, but they don't face unduly difficult courses.

My observation -- in the lab courses I've taught -- is that the students who will be accepted to medical school buckle down and earn A's on their basic science coursework. They learn to cope with whatever is thrown at them and hone a mentality to rise to the difficulty of their work. The students who aren't the most successful at working hard, under pressure, on problems that are fundamentally scientific in nature don't go on to be doctors.

That's tough love. But reality is tough love. Shouldn't we teach that in college?

(AP photo)

The Many Mysteries of the Thymus

In the early 1900s, a strange disease was killing thousands of infants. Termed "status lymphaticus", it was blamed on the thymus, a small, grayish-pink gland weighing no more than 37 grams nestled between the sternum and the pericardium, the heart's reinforced sac. At the time, doctors could only hypothesize on the thymus' function, but many knew that in some cases, this little organ grew out of control, pressing upon the heart and the lungs until the young victim suffocated from the inside.

Yet, in 1931, status lymphaticus was conclusively shown not to exist. Highlighting new research and boldly admitting the medical community's collective "ignorance of the anatomy of the normal healthy human body", the editors of the respected medical journal The Lancet declared "The End of Status Lymphaticus." The article resigned thousands of infant deaths to uncomfortable mystery and served as yet another defeat in anatomists' attempts to ascertain the function of the thymus.

Dating back to antiquity, the little gland had defied explanation. Despite excising and examining the thymus from a variety of animals, physicians had collectively arrived at just two facts by the turn of the 20th century: It seemed to be involved in the immune system, and it dwindled in size as subjects grew to adulthood. The latter finding was particularly perplexing. Why, as the body grows, would an organ so dramatically shrink?

It wasn't until 1960s that the thymus' function was finally revealed. Australian immunologist Jacques Miller removed the thymus from newborn mice, and found that they had drastically reduced populations of white blood cells called lymphocytes compared to normal controls. The mice also had reduced lymphoid tissues, impaired immune responses, and suffered from higher rates of infection. Six years later, Miller teamed up with Graham Mitchell to discover that the thymus produces lymphocytes called T-cells, which aid in the production of antibodies.

To this day, however, it remains unknown why the thymus shrinks as we age. Active and growing during infancy and childhood, the thymus starts to atrophy during puberty and, over time, much of it converts to fatty tissue. In fact, by the age of fifty, only about fifteen percent of the thymus remains. By age seventy-five, the average human thymus weighs just six grams and is yellow in hue, a sickly-looking shadow of its former self.

The tendency for the thymus to shrink in vertebrates is known as thymic involution. As the process plays out, the immune system weakens, and rates of cancer, infection, and autoimmune disorders increase. Thymic involution is not directly caused by aging, so anatomists are at a loss to explain why the gland decays. A leading hypothesis suggests that thymic involution is an evolutionary trade-off. The immune system is physiologically expensive, so as complex organisms mature, immunity gets deprioritized.

No doubt much remains to be learned about this most mysterious organ.

Source: Liu, D & Ellis, H. "The Mystery of the Thymus." Clin Anat. 2016 Apr 2. doi: 10.1002/ca.22724. [Epub ahead of print]

(Image: LearnAnatomy)

More Hilariously Stupid Science Questions! And This Time, Some of Them Will Make You Think...

Hilariously stupid science questions regularly feature here at RealClearScience. In fact, I've lost count of the times we've shared them. (The simple answer is "too many.")

We label these questions "hilariously stupid," but in truth, a question sparked by scientific curiosity is never stupid. A couple gems contained within today's crop of queries especially demonstrate that though a question may seem daffy at first glance, further consideration can reveal a hidden brilliance.

Do math majors in college graduate with a degree or with a radian?

My kid asked for a Pb and Jam sandwich in his lunch tomorrow. How much lead is appropriate for a 10 year old boy to consume?

If I flip a coin 1,000,000 times, what are the odds of me wasting my time?

Is Black Lives Matter similar to dark matter?

If parallel universes exist, is there a parallel universe in which parallel universes don't exist? (One for the philosophers out there...)

How did humans evolve to fit so perfectly into clothing?

If particles do not exist unless observed, why can't I close my eyes and walk through walls?

How many milligrams are in a telegram?

The Solar System has five Dwarf planets, but why doesn't it have any Elf or Orc planets?

Why can't I weigh the earth by putting a scale upside-down? (Actually, you can! Sort of.)

via Reddit

(Image: Shutterstock)

Three Problems With the Big Bang

Somewhere around 13.8 billion years ago, the Universe began with a bang. In less than a second, the four fundamental forces -- electromagnetism, gravitation, weak nuclear interaction, and strong nuclear interaction -- which initially were joined as a single even more fundamental force, separated. Suddenly, the Universe started to expand at an exponential rate. Cosmic inflation had begun. A little later, but still within this initial second, tiny particles called hadrons formed, and neutrinos ceased to interact with other particles. For the next ten seconds, particles called leptons dominated. Their short-lived rule gave way to photons, which governed for approximately 380,000 years. Newly formed atoms of hydrogen and helium took over next, but it wasn't until around 559 million years later that stars began to shine by fusing them together.

The Big Bang is the best theory we have to explain the birth and existence of the Universe. As astrophysicist Ethan Siegel wrote in his recent book Beyond the Galaxy:

"To this very day, there is no other model that is both consistent with General Relativity and explains the Hubble expansion of the Universe, the abundances of the light elements and the existence and properties of the cosmic microwave background; the Big Bang is the only one."

But while satisfying and substantially supported by the weight of scientific evidence, the defining theory of cosmology is not perfect. There remain three key problems.

The first is the Horizon Problem. If we look far out into space, billions of light years away, we see photons with the same temperature -- roughly 2.725 degrees Kelvin. If we look in another direction, we find the same thing. What a coincidence! In fact, when astronomers look in all directions, no matter how distant, they find that all regions have the same temperature. This is incredibly puzzling, Siegel says, "since these regions are separated by distances that are greater than any signal, even light, could have traveled in the time since the Universe was born." The Big Bang offers no explanation for this fascinating quirk.

Yet another quirk unexplained by the Big Bang is the Flatness Problem. Almost all the evidence collected by cosmologists indicates that the Universe is flat. Like a sheet of paper on a desk, spacetime shows almost no curvature whatsoever. Within the context of the Big Bang, this seems extremely unlikely.

 

Lastly, we arrive at the Monopole Problem. The immense energies produced by the Big Bang should have created a magnetic particle that breaks the mold. All magnets have two poles, a north and a south. Even when a magnet is snapped in half the two poles remain. But this particle would effectively be a magnet with only one pole: a magnetic monopole!

In 2014, researchers created a synthetic magnetic monopole in the laboratory, but physicists have yet to find one of these particles in nature. Blas Cabrera, a researcher at Stanford and the current leader of the Cryogenic Dark Matter Search experiment, detected a candidate monopole back in 1982, but no other experiments have replicated his results.

So what do we do with these three puzzling problems? Do we simply ignore them?

"It is tempting to look past these three problems as not problems at all, but rather as simply the conditions that the Universe started off with..." Siegel writes. "The Big Bang has enough successes that it is easy enough to sweep these problems under the rug and not be bothered by them."

"But that would be a terrible, non-scientific attitude to take... As soon as we convince ourselves that something is a question that science cannot answer, it becomes a self-fulfilling prophecy."

Scientists do not turn away from problems because they are difficult. To the contrary, scientists tackle problems precisely because they are difficult! To the discoverer goes the notoriety, and to the world goes something far more important: knowledge.

Primary Source: Ethan Siegel. Beyond the Galaxy: How Humanity Looked Beyond Our Milky Way and Discovered the Entire Universe. 2015. World Scientific.

(Images: AP Photo/Elise Amendola, NASA / WMAP Science Team, Sbyrnes321)

The Altruistic Beginnings of Big Pharma

The pharmaceutical industry undoubtedly ranks near the top of the world's most vilified businesses. Earning a combined $1 trillion in revenue, companies like Pfizer, Bristol-Myers Squibb, Roche, Novartis, and countless others are easy targets of populist outrage.

Some of it, of course, is warranted. Various companies have been found guilty of fraud under the False Claims Act. Most notably, in 2012, GlaxoSmithKline was ordered to pay $3 billion for failing to report safety data, bribing doctors, and promoting medicines for unlicensed uses. According to numerous reports, these unsavory acts are widespread throughout the industry.

If Edward Robinson Squibb, the pioneering medical inventory who founded and lent his name to pharmaceutical giant Bristol-Myers Squibb, were aware of this unscrupulous situation, he would not be impressed. Squibb died more than a century ago, but he should serve as a role model for the pharmaceutical industry today.

As a young student at Philadelphia's Jefferson Medical College in the 1840s, Squibb became enraptured with the seemingly magical skills of early surgeons, particularly those of his professor: Dr. Thomas Dent Mütter. Squibb watched in awe as Mütter repaired all manner of maladies, despite being handcuffed by inconsistent means of inducing anesthesia. Many times, Mütter would be forced to perform complicated operations on patients who were wide awake, making each surgery a delicate dance between subject and physician. Squibb resolved to remove that handicap.

"His vision was to provide doctors and surgeons with stable, constant chemicals for their work, and thereby make ether surgeries safer, more popular, and even more widely accepted," Cristin O'Keefe Aptowicz wrote in her book Dr. Mütter's Marvels.

Squibb labored for long hours in isolation to both concoct a perfect ether compound to induce anesthesia and create a consistent way to manufacture it. In 1854, after years of work, he finally succeeded, crafting a still that used steam to produce a uniform ether gas. The discovery was guaranteed be a goldmine, but Squibb gave it all away for free.

"Instead of rushing to patent either the process or the still -- both of which were conceived, created, tested, and perfected by Squibb alone -- he gave them to the world for free, publishing an article on the apparatus, including a detailed diagram of the design, in the American Journal of Pharmacy," Aptowicz wrote.

Four years later, Squibb founded the company that would become Bristol-Myers Squibb. But though he was now able to profit from his efforts, Squibb did not lose his altruistic mentality.

"Through his company and through his personal work, Squibb would become an advocate for transparency between patient and health-care provider and between doctor and medicine supplier," Aptowicz recounted. "He was instrumental in launching the movement that produced... the first of a series of consumer protection laws that, among other things, required drugs to be labeled with their active ingredients..."

While many today would undoubtedly agree that the pharmaceutical industry has strayed from Squibb's salubrious ideals, the reality isn't so cut and dry. The modern pharmaceutical industry owes much of its maligned reputation to system and circumstance. Game changing advances that occurred regularly decades ago are today not so readily attained. Breakthrough treatments of the past have essentially eliminated a number of medical conditions, leaving fewer, and more difficult, health problems to tackle. Yet the economic pressures from outside investors remain as strong as ever. This leads pharmaceutical executives to inflate prices and disguise slightly tweaked drugs as innovative new products.

Moreover, pharmaceuticals are a high-risk industry. New drugs cost billions of dollars to research and produce, with no guarantee that they will survive the FDA's rigorous review process. However, critics reasonably counter that average pharmaceutical company profits remain extremely high.

We cannot expect the modern pharmaceutical industry to completely emulate the altruism of its long-dead forefather, but one would hope that the executives behind Big Pharma could learn a thing or two from Squibb's honorable example.  

(Image: Rept0n1x / Wikimedia Commons)

This Diet Is Virtually Guaranteed to Keep You Cancer Free for the Rest of Your Life

Scientists have discovered a diet that ensures you will live cancer free for the rest of your life. It's being hailed as a "miracle," a "marvel," a "breakthrough," and even a "quantum leap in nutrition."

Past studies have indicated that pretty much everything in life causes cancer. To circumvent this unpleasant truth, scientists had to think outside the box. The diet they created is nothing short of revolutionary.

There's no cooking, no grocery shopping, and no annoying delivery drivers. In fact, there's no food or water whatsoever! You don't consume anything!

Nine subjects took part in the study, which was not published in a peer-reviewed journal. All of the participants abstained from eating or drinking for the duration of the study. When examining the results, the researchers were utterly amazed to find that none of the subjects showed any signs of cancer.

"The 'no-nutrient' diet was not associated with any form of cancer," the researchers reported. "Moreover, the three children that took part in the study showed no signs of autism."

Vindicated by the new research is popular health blogger, Vani Hari, also known as "The Food Babe." For years, Hari has urged her followers to avoid all toxins and chemicals (the 'no-nutrient' diet has none), an effort that has provoked harsh scrutiny from the scientific community. That independent scientists have now proven her correct beyond a shadow of a doubt is a delightful piece of irony.

The researchers don't plan to perform a follow-up study, but they do intend to lobby Congress to revise the recently-released 2015-2020 U.S. Dietary Guidelines to include the 'no-nutrient' diet.

"We expect the Guidelines to be updated very quickly in light of our incontrovertible findings," they said in a press release.

The researchers admitted one minor side effect of their diet: All of the subjects passed away after five days.

Author's Note: This article is satire and all of its content is completely fabricated. The author does not actually recommend depriving yourself of food and water. Nor does he seek to trivialize the plight of cancer patients or denigrate those diagnosed with autism. The article is intended to highlight -- in an utterly absurd fasion -- the often ridiculous nature of nutrition science and how it is reported in the popular press.

(Image: Salimfadhley/Wikimedia Commons)

The Strange Glass Born in Nuclear Explosions

5:30 A.M., Monday July 16th, 1945: The day dawned brighter than ever before over the New Mexico desert. But it was not the Sun's soothing rays that set the landscape alight; it was the radiant flash of the very first atomic bomb.

Trinity, the nuclear offspring of the Manhattan Project, detonated with the force of 21,000 tons of TNT. The accompanying fireball reached temperatures of 8,430 degrees Kelvin, hotter than the surface of the sun, and sent a mushroom cloud of smoke and debris soaring more than seven miles into the sky.

That day, every human on the planet was reborn into a nuclear era, one where mankind now held the power to end its existence. Also born that day was an otherworldly, greenish glass, a physical reminder of the cataclysmic explosion. Scientists dubbed the strange material trinitite.

The ghostly glass littered the ground for hundreds of meters around the blast site, though it might be more accurate to say that it "transformed" the ground. The sand, which blanketed the desert the day before, had been replaced by this new material. Walking on it was like setting foot on the surface of an alien world.

Trinity's atomic blast catalyzed the transformation. Amidst destructive turbulence and searing heat, sand was thrown up into the fireball, where it melted, reformed, and rained down upon the ground. Scientists discovered proof for this storm strewn all over in the form of trinitite beads -- molten drops that solidified before they hit the ground. Years later, these pebbles are still surfacing as ants excavate them from their tunnels.

Despite its distinctly eerie appearance, trinitite really isn't that much different from sand. The glass is composed of silicon dioxide, better known as quartz, the second-most abundant mineral in Earth's continental crust. Closer inspection, however, reveals a material tainted with trace amounts of forty different elements, many of them radioactive.

In fact, to this very day, trinitite remains radioactive, buzzing with activity from isotopes of cobalt, barium, europium, uranium, and plutonium. It's safe to handle, but one would be ill advised to make jewelry out of it.

Much of the trinitite created on that fateful July day more than sixty years ago has now been bulldozed and buried, but rare specimens do reside in the hands of collectors. Rarer still, is red trinitite, which gets its color from the presence of copper. When scientists examined samples of red trinitite under a microscope, they found metallic, round blobs within the glass. These "chondrules" were melted pieces of iron and lead from the bomb itself, mementos encased in an atomic glass.

Primary Source & Images: Eby, N., Hermes, R., Charnley, N. and Smoliga, J. A. (2010), Trinitite—the atomic rock. Geology Today, 26: 180–185. doi: 10.1111/j.1365-2451.2010.00767.x

(Top Image: Shaddack)

What It's Like to Actually See an Atomic Explosion

Most everyone has a pretty good idea of what an atomic explosion looks like. Through images and video, we know the flash, the fireball, the mushroom cloud. Seeing it all in person is quite different, however.

One of the few firsthand accounts immortalized to paper comes courtesy of the inimitable Richard Feynman, who was present for the very first detonation of a nuclear weapon. The test, codenamed "Trinity" was carried out on July 16, 1945 in the Jornada del Muerto desert of New Mexico. The 20-kiloton blast was the culmination of years of work by the scientists of the Manhattan Project. One of those scientists, the 27-year-old Feynman, sought to view his handiwork with his own eyes:

They gave out dark glasses that you could watch it with. Dark glasses! Twenty miles away, you couldn't see a damn thing through dark glasses. So I figured the only thing that could really hurt your eyes (bright light can never hurt your eyes) is ultraviolet light. I got behind a truck windshield, because the ultraviolet can't go through glass, so that would be safe, and so I could see the damn thing. 

Time comes, and this tremendous flash out there is so bright that I duck, and I see this purple splotch on the floor of the truck. I said, "That's not it. That's an after-image." So I look back up, and I see this white light changing into yellow and then into orange. Clouds form and disappear again--from the compression and expansion of the shock wave. 

Finally, a big ball of orange, the center that was so bright, becomes a ball of orange that starts to rise and billow a little bit and get a little black around the edges, and then you see it's a big ball of smoke with flashes on the inside of the fire going out, the heat. 

All this took about one minute. It was a series from bright to dark, and I had seen it. I am about the only guy who actually looked at the damn thing--the first Trinity test. Everybody else had dark glasses, and the people at six miles couldn't see it because they were all told to lie on the floor. I'm probably the only guy who saw it with the human eye. 

Actually, Feynman wasn't the only person who chose not to don their safety glasses that day. Ralph Carlisle Smith, the future assistant director of Los Alamos Scientific Laboratory, also observed the explosion with the naked eye. Here's what he saw:

"I was staring straight ahead with my open left eye covered by a welders glass and my right eye remaining open and uncovered. Suddenly, my right eye was blinded by a light which appeared instantaneously all about without any build up of intensity. My left eye could see the ball of fire start up like a tremendous bubble or nob-like mushroom. I Dropped the glass from my left eye almost immediately and watched the light climb upward. The light intensity fell rapidly hence did not blind my left eye but it was still amazingly bright. It turned yellow, then red, and then beautiful purple. At first it had a translucent character but shortly turned to a tinted or colored white smoke appearance. The ball of fire seemed to rise in something of toadstool effect. Later the column proceeded as a cylinder of white smoke; it seemed to move ponderously. A hole was punched through the clouds but two fog rings appeared well above the white smoke column."

There are other accounts, of course, from those who did not actually see an atomic explosion, but felt its effects infinitely more than either Feynman or Smith. Over 100,000 people lost their lives when atomic bombs were dropped on Hiroshima and Nagaski. Here are a few of their stories.

Source: "Surely You're Joking, Mr. Feynman!"

(Image: Jack Aeby)

Will Science Drive Religion Extinct?

Religion is declining in America.

This is actually something fairly new. For decades, religion has been on the wane in developed countries worldwide, with statistical models going so far as to predict its eventual extinction in nine countries: Australia, Austria, Canada, the Czech Republic, Finland, Ireland, the Netherlands, New Zealand and Switzerland. America was pretty much the sole country bucking the trend to nonbelief. No longer.

In 1998, 62 percent of Americans said they were “moderately” or “very” religious. In 2014, that number dropped to 54 percent. According to a recent study, irreligion is particularly pronounced amongst younger Americans.

"Nearly a third of Millennials were secular not merely in religious affiliation but also in belief in God, religiosity, and religious service attendance, many more than Boomers and Generation X’ers at the same age," the authors wrote. "Eight times more 18- to 29-year-olds never prayed in 2014 versus the early 1980s."

In light of the new data, it seems inevitable that as demographics change over a matter of decades, religious practitioners will become a minority group in the United States. What's driving the decline?

While a variety of factors are likely at play, I'd like to focus on what may be the most significant contributor: science.

We are perhaps the first generation of humans to truly possess a factually accurate understanding of our world and ourselves. In the past, this knowledge was only in the hands and minds of the few, but with the advent of the Internet, evidence and information have never been so widespread and accessible. Beliefs can be challenged with the click of a button. We no longer live in closed, insular environments where a single dogmatic worldview can dominate.

As scientific evidence challenges the tenets of religion, so too does it provide a worldview to follow, one that is far more coherent.

Sir James George Frazer, often considered one of the founding fathers of modern anthropology, wrote that -- when stripped down to the core -- religion, science, and magic are similar conceptions, providing a framework for how the world works and guiding our actions. He also noted that humanity moved through an Age of Magic before entering an Age of Religion. Is an Age of Science finally taking hold?

Bemidji State University psychology professor Nigel Barber expands on Frazer's idea.

"[He] proposed that scientific prediction and control of nature supplants religion as a means of controlling uncertainty in our lives. This hunch is supported by data showing that more educated countries have higher levels of non belief and there are strong correlations between atheism and intelligence."

Frazer's hunch is also supported by a recent study published in the journal Personality and Individual Differences. Querying 1,500 Dutch citizens, a team of researchers led by Dr. Olga Stavrova of the University of Cologne found that belief in scientific-technological progress was positively associated with life satisfaction. This association was significantly larger than the link between religion and life satisfaction. Moreover, using the World Values Survey, they extrapolated their findings worldwide. As Ronald Bailey reported in Reason:

Stavrova and company concluded that the "correlation between a belief in scientific–technological progress and life satisfaction was positive and significant in 69 of the 72 countries." On the other hand, the relationship between religiosity and life satisfaction was positive in only 28 countries and actually negative in 5 countries.

"Believing that science is or will prospectively grant... mastery of nature imbues individuals with the belief that they are in control of their lives," Stavrova concluded.

So not only does science dispel religious belief, it also serves as an effective substitute for it. Science will never drive religion completely extinct, but religion may be marginalized to a small minority bereft of influence.

One of science's primary aims is to seek out knowledge that will hopefully better our world and the lives of all who live on it. That's something we all can believe in.

(Image: AP)

Six of the Strangest Cancer "Cures"

At least 14 million people in the United States are currently living with a cancer diagnosis, and around half of them, at one time or another, have pursued alternative therapies for their disease. While some of these therapies can help alleviate the debilitating side effects associated with cancer, none are effective at treating the disease itself, let alone curing it outright.

But facts and evidence haven't stopped snake oil salesmen from pushing their ineffective panaceas. Here are six of the strangest "cancer cures" ever sold:

1. Emu Oil. Some products that FDA regulators examine are genuinely difficult to classify as legitimate or bogus. "Pure emu oil" was not one of them. Its claim to "eliminate skin cancer in days" was particularly specious.

Harvested from the adipose tissue of the emu, a large flightless bird, emu oil may impart some medicinal benefits, but they are thus far unconfirmed.

2. Electrohomeopathy. In the 19th century, Count Cesare Mattei found a way to capitalize on the burgeoning practice of homeopathy. First, add "electro" to its name. Second, sell custom electric devices to bolster traditional homeopathic treatments. The scheme worked brilliantly. The practice itself did not. Despite its ineffectiveness, it is still practiced today, particularly in bastions of naturopathic medicine like India, Pakistan, and Bangladesh.

Homeopathy is bunk. Providing a spark of electricity doesn't change that.

3. The Grape Cure. Grapes make for a delicious, nutritious snack and even produce a remarkable burst of plasma when microwaved! But while the multifaceted fruit has a great many uses, curing cancer isn't one of them.

Tell that to Johanna Brandt, who pioneered a grape-only diet for curing cancer. Dr. Stephen Barrett debunks her quackery:

"There is no scientific evidence that the Johanna Brandt's "Grape Cure" has any value. Even worse, her recommended diet is deficient in most essential nutrients and can cause constipation, diarrhea, cramps, and weight loss that is undesirable for cancer patients."

4. Germanic New Medicine. According to Ryke Geerd Hamer, the founder of Germanic New Medicine, severe diseases like cancer result from shocking events that trigger psychological conflict. This conflict manifests physically as disease. To cure the disease, you simply need to resolve the conflict.

Doubling down on crazy, Hamer claims that evidence-based medicine is a Jewish conspiracy designed to kill non-Jews. The German Cancer Society and the German Medical Association strongly disagree.

5. Zap Away the Parasites. For decades, Hulda Regehr Clark claimed to have "The Cure for All Cancers." The "cure" of which she spoke and wrote so glowingly was a "zapper" device that supposedly removed disease-causing parasites from the body, yielding a claimed 95 percent cure rate. Sales of her books and unfounded treatments earned her millions of dollars.

Of course, her cure has never been substantiated by any sort of evidence, nor could it save her. Clark died of cancer in 2009.

6. Venus Flytrap Extract. Due to its carnivorous nature and quirky looks, the Venus flytrap is one of the few plants that are actively poached, so much so, in fact, that it is at risk of extinction. No doubt contributing to the plant's desirability are dubious claims that it can "eat cancer." Venus flytrap extract is sold in the form of an herbal remedy called Carnivora. Despite its fantastic name, no clinical studies have shown the supplement to be effective in the treatment of cancer.

(Images: AP Photo/Wilmington Star-News, Matt Born, File, djpmapleferryman, Public Domain, Stefano Zucchinali)

Large Comets May Have Liquid Water Cores. Could They Contain Life?

IN 2005, one day before the comet Tempel 1 made its closest approach to the Sun, NASA scientists got a chance to embrace their inner Hulks. Like rambunctious schoolchildren giddy to cause a little mayhem, they smashed an 820-pound impactor into the comet at tremendous speed, and then -- undoubtedly with large grins plastered upon their faces -- watched what happened.

Almost instantly, a massive cloud of dust began spewing from the 72-trillion-kilogram comet. Subsequent analysis from the nearby Deep Impact probe revealed the presence of silicates, carbonates, metal sulfides, amorphous carbon, and hydrocarbons, as well as water ice, within the plume -- in short, the stuff that life is made of. When the enriched dust cloud dissipated, scientists were able to view their handiwork: a crater 328 feet wide and 98 feet deep.
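
How violent was that collision? NASA reported a closing speed of roughly 10 kilometers per second, which makes the energy easy to estimate. Here's a quick back-of-envelope sketch in Python; the impact speed is an approximation and the only figure not drawn from the text above.

    # Rough kinetic-energy estimate for the Deep Impact collision.
    impactor_mass_kg = 820 * 0.4536   # the 820-pound impactor, in kilograms
    impact_speed_m_s = 10_000         # ~10 km/s closing speed (approximate)
    TNT_TON_J = 4.184e9               # joules released per ton of TNT

    kinetic_energy = 0.5 * impactor_mass_kg * impact_speed_m_s ** 2
    print(f"Impact energy: {kinetic_energy:.2e} J")                  # ~1.9e10 J
    print(f"TNT equivalent: {kinetic_energy / TNT_TON_J:.1f} tons")  # ~4.4 tons

Roughly four and a half tons of TNT: modest by planetary standards, but more than enough to excavate a crater the width of a football field in a loosely bound ball of ice and rock.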

In the wake of NASA's Deep Impact mission, interest in comets grew by orders of magnitude. Scientists had their first concrete evidence that the frozen hunks of water, rock, and various gases contained the building blocks of life. No longer mere objects to be charted by astronomers and ogled by sky watchers, comets now demanded an existential reverence.

TRAVEL BACK 4 billion years and you might find yourself in the middle of a storm of cataclysmic proportions. At this time, when the planets of the young Solar System had not yet settled into their current orbits, Uranus and Neptune are theorized to have rammed into a reservoir of icy comets, sending asteroidal and cometary debris raining down on the inner planets. During the Late Heavy Bombardment, as the event is called, the Earth was slammed so relentlessly that, if life existed, the surface may have been sterilized. As many as 22,000 objects rocked our home over a period of 300 million years. However, in subsurface cracks created by the pummeling, life could have been seeded, or given a boost if it already existed. Recent research suggests that impacts of comets containing organic compounds could generate peptides, the building blocks of proteins. The Solar System's most cataclysmic storm could very well have been a drizzle of life.

EVEN MORE ASTOUNDING, some of the comets that struck Earth could already have contained life. The chances are remote, but it is possible. According to recent research published in the journal Astrobiology, large comets with a radius of over 10 kilometers could contain liquid water at their cores. The decay of radioactive isotopes of aluminum or iron could supply the heat necessary to melt the inner ice. Katharina Bosiek, along with her colleagues Michael Hausmann and Georg Hildenbrand, suggests that a thick layer of dust could protect the core's liquid environment from solar radiation, echoing speculation found in prior research. Their findings lend credence to the hopeful words of Nalin Chandra Wickramasinghe, the Cardiff University astrobiologist who was one of the earliest proponents of panspermia.

"Supposing comets were seeded with microbes at the time of their formation from pre-solar material, there would be plenty of time for exponential amplification and evolution within the liquid interior," he wrote in 2009.

In this view, large comets could be seen as enchanting snow globes just waiting to be smashed upon fertile ground, thus releasing the microbial life contained inside. It's not inconceivable. Some of Earth's extremophiles display surprising resilience to the inhospitable conditions of space, and they didn't even evolve there.

Skepticism is called for, however. Given the sometimes transient nature of comets and the harsh conditions of space, it's hard to imagine that life, if it ever existed inside them, could still exist today. Still, the tantalizing notion makes a mission to the Solar System's Kuiper Belt or Oort Cloud, where as many as 100,000 comets reside, that much more tempting.

Reference: Katharina Bosiek, Michael Hausmann, and Georg Hildenbrand. "Perspectives on Comets, Comet-like Asteroids, and Their Predisposition to Provide an Environment That Is Friendly to Life." Astrobiology, March 2016, ahead of print. doi:10.1089/ast.2015.1354.

(Image: NASA)

The Mystery of Right Whale 1334

It seems odd to say that scientists were ecstatic about the opportunity to shoot a critically endangered whale, but that was exactly how Katie Jackson and her colleagues at the North Atlantic Right Whale Program felt when they saw Whale 1334 on a mild February day in 2013 off the coast of Jacksonville, Florida.

The weapon of choice was a harmless one, of course. A bolt from the large crossbow could seriously injure or kill a human, but it would be little more than a pinprick to an animal the size of a school bus, and a valuable pinprick at that. A mechanism at the end of the bolt would collect a tiny piece of blubber from 1334, enough for biologists to sample and study her DNA. When Jackson's partner Tom Pitchford connected with the shot, the duo was elated.

For decades, 1334's genetic information had been prized more than any other right whale's. Over a span of thirty years, she had been the most productive mother of all North Atlantic right whales, giving birth nine times. Yet her comings and goings were puzzling, to say the least. She did not show up in regions where the whales typically congregated, and she disappeared for years at a time. In her recent book Resurrection Science, journalist M.R. O'Connor expounded on the mystery.

"She was first seen off the southern coast back in the early 1980s and reappeared there periodically. But unlike the others, 1334 did not show up in the Bay of Fundy [off the coast of Maine] in the summers with the rest of the right whales. No one saw her again, until she appeared in Florida three years later with a new calf. And then the same thing happened three years later... 1334 gave birth during years when biologists saw calving rates stall and even decline in the general population. In 2000, she was the only right whale to give birth to a calf."

Considering that just five hundred North Atlantic right whales remain in the world, 1334's mysterious yet prolific procreating was instrumental in keeping the species alive. Could there be secrets in her DNA that might prevent their extinction?

As O'Connor described, a right whale pregnancy is a monumental task. Pregnant females must consume as much as 4 million calories a day in the form of miniscule zooplankton. The binging doesn't stop when the calf is born after a yearlong gestation, for that's when the nursing begins, which lasts roughly another year. Due to the great expense of reproduction, female right whales are able to delay pregnancy until they've stored up enough energy in the form of blubber to afford giving birth.

Thus, when Trent University geneticist Brad White started examining 1334's DNA in spring 2014, he had a hunch that her genotype permitted her to birth calves whether her nutrition was good or poor. That hunch is still being explored.

"Nothing has jumped out yet about the DNA profile," he told RCS in an email.

Of course, 1334's success may be attributable more to her behavior than to her hard-wiring. The massive whale goes her own way, and it's entirely possible that she's stumbled upon more hospitable grounds in which to mate and rear a calf. Scientists still don't know exactly where she travels.

That an animal weighing well north of 100,000 pounds can disappear so easily is remarkable, especially considering that right whales were once hunted and killed like cattle.

O'Connor simultaneously laments and appreciates that the mystery of 1334 remains unsolved.

"As much as I want to know where 1334 goes, I cheered her elusiveness and hoped that the ocean is still big enough for her to escape the forces threatening her kind."

Primary Source: M. R. O'Connor. Resurrection Science: Conservation, De-Extinction and the Precarious Future of Wild Things. St. Martin's Press, 2015.

(Image: AP)