RealClearScience Journal Club

Science Figures Interpreted and Analyzed by RealClearScience

This Is How Many Microbes You'll Eat Today

Ross Pomeroy - December 10, 2014

When you have a Big Mac, medium fry, and a medium Coca-Cola for lunch, you're not just consuming 1,070 calories, 64 grams of sugar, 43 grams of fat, and 1,150 milligrams of sodium; you're also eating 238,000 microbes -- mostly bacteria, with a few thousand yeast and mold organisms as well.

That's the finding from a trio of scientists at UC-Davis, who, for the first time, tallied the number and type of microorganisms present in the average American diet.

For their study, which was published Tuesday in PeerJ, researchers Jenna Lang, Jonathan Eisen, and Angela Zivkovic purchased and prepared a full day's worth of meals for three separate diets: an average American diet, a USDA-recommended diet, and a vegan diet. The American diet consisted of food from Starbucks and McDonald's, as well as frozen and packaged food from the grocery store. The USDA diet offered cereal, a variety of vegetables, and a turkey sandwich, among other selections. The vegan diet had lots of vegetables, nuts, and fruits, and was rounded out with tofu, almond milk, and vegetable protein powder. All of the diets tipped the energy scale at a little over 2,200 calories.

After cooking each day's worth of meals, the researchers meticulously weighed everything, then obliterated all of it in a blender for analysis.

The USDA diet contained the most microbes, roughly 1.26 billion. The vegan diet came in a distant second at just over 6 million, while the average American diet lagged behind at 1.39 million. Aerobic and anaerobic bacteria dominated the counts. Yeast and mold constituted a far smaller portion.

The researchers were quick to caution that these are only daily estimates, affected largely by the choice of foods. For example, the USDA diet had three foods -- yogurt, Parmesan cheese, and cottage cheese -- which contained live and active bacterial cultures. These sources immensely inflated the microbe count.

Lang, Eisen, and Zivkovic conducted the study because they noticed a dearth of information concerning the microbes that we eat each and every day.

"Far more attention has been paid to the microbes in our feces than the microbes in our food," Lang wrote.

"There could be interesting relationships between the nutritional content of the foods that we eat, the microbes that associate with those foods, and our gut microbiome, not just because we are “feeding” our gut microbes, but because we are eating them as well."

"Further studies are needed to determine the impact of ingested microbes on the intestinal microbiota, the extent of variation across foods, meals and diets, and the extent to which dietary microbes may impact human health."

Source: Lang JM, Eisen JA, Zivkovic AM. (2014) The microbes we eat: abundance and taxonomy of microbes consumed in a day’s worth of meals for three diet types. PeerJ 2:e659 http://dx.doi.org/10.7717/peerj.659

(Image: AP)

Study: Summer Jobs Reduce Violent Crime Among Disadvantaged Youths

Ross Pomeroy - December 4, 2014

Give a disadvantaged youth a summer job, and he or she will be much less likely to commit a violent crime. That's the conclusion of a randomized controlled trial just published in the journal Science.

University of Pennsylvania criminologist Sara Heller oversaw the study, which took place in 13 schools in high-violence areas of Chicago. In the summer of 2012, 1,634 students participated. Almost all of them were minorities, and more than 90% were on free or reduced-price lunch. 350 students were randomly assigned to 25-hour-per-week summer jobs; another 350 were given 15-hour-per-week jobs along with 10 hours of social-emotional learning classes "aimed at teaching youth to understand and manage the aspects of their thoughts, emotions, and behavior that might interfere with employment"; and the remaining students carried on with their lives as normal. Jobs paid the Illinois minimum wage and lasted 8 weeks.

With the help of the Chicago Police Department, Heller tracked arrest data for the duration of the study and for 13 months afterward. Arrests for violent crime decreased 43% among the two treatment groups compared to the control group. Property and drug-related crimes increased slightly, but the differences were not statistically significant.

Importantly, Heller found that the largest decreases in violent crime rates came months after the jobs program ended, suggesting that it wasn't simply being occupied by the program that prevented crime; the experience may have engendered lasting behavioral change.

There was little difference in effect size between the two treatment groups -- the job alone or the job plus classes -- which suggests that the classes weren't what drove the reduction in violent crime.

So what was? In societal studies like these, the answer is very open to interpretation.

"The experimental design cannot tease out which program element or elements generate the decrease in violence," Heller admitted.

Youth employment programs have been studied in the past with mixed results. Most of the time, the program costs seem to outweigh the societal benefits. In this case, the program cost an estimated $3,000 per student ($1,400 for wages and $1,600 for administrative costs) while yielding around $1,700 in benefits from reduced crime. If the wages are counted as a transfer to the youths rather than a pure cost, however, the benefits did narrowly outweigh the administrative costs.
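A quick sketch of that accounting, using only the figures quoted above (the transfer-versus-cost framing is a standard one in program evaluation, not a calculation taken from the paper itself):

```python
# Per-student accounting for the summer jobs program, using the
# figures quoted above (all amounts in dollars).
wages, admin, benefits = 1400, 1600, 1700

print(benefits - (wages + admin))  # -1300: net loss if wages are a pure cost
print(benefits - admin)            #  +100: net gain if wages are a transfer to youths
```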

However, Heller insists that preventive programs like these are still more cost-effective than after-the-fact punishment like prison.

"The results echo a common conclusion in education and health research: that public programs might do more with less by shifting from remediation to prevention. The findings make clear that such programs need not be hugely costly to improve outcomes for disadvantaged youth; well-targeted, low-cost employment policies can make a substantial difference, even for a problem as destructive and complex as youth violence."

Source: Sara B. Heller. "Summer jobs reduce violence among disadvantaged youth." Science, 5 December 2014, Vol. 346, Issue 6214.

(Images: AP, Science)

Chernobyl Radiation Changed Rodent Hair Color?

Alex B. Berezow - December 1, 2014

When people think of radioactivity, many imagine it converting cute, fluffy animals into scary, green, glowing mutants. But that's just a myth. Radioactivity is invisible. The reason we associate radiation with "glowing green" is that many types of instrument dials (such as clock faces) were painted with radioluminescent paint, a mixture containing a radioactive isotope (often radium) and other chemicals that emit green light in response to the radiation. Similarly, while it is true that some nuclear power plants produce a hauntingly blue glow, this is not because the radioactive fuel itself is glowing, but because of a strange phenomenon known as Cherenkov radiation, in which charged particles moving through a medium faster than light travels in that medium emit photons, generally in the UV-to-blue range.
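For the curious, the emission condition is simple to state; the numbers below assume water's refractive index of roughly 1.33:

$$v > \frac{c}{n} \approx \frac{c}{1.33} \approx 0.75\,c$$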

However, this is not the whole story. The great radiation/color narrative has taken yet another twist. A team of scientists led by Zbyszek Boratyński has reported in the journal Scientific Reports that Chernobyl radiation has changed the hair color of local rodents.

To determine this, the team captured bank voles in the vicinity of the Chernobyl nuclear plant and measured the nearby soil radiation. Digital photographs were taken of the fur on the animals' backs, and the amount of red color was assessed. The authors found that voles exposed to greater levels of radiation had less red pigmentation. (See graph.)

Why this should be the case is not perfectly clear, but the authors propose an intriguing explanation. The production of red pigment (pheomelanin) requires the consumption of antioxidants. Protection of the body from free radicals, which are generated by radiation, also requires the consumption of antioxidants. So, the authors hypothesize that the rodents synthesize less red pigment in order to save antioxidants for a far more crucial purpose.

There is still much work to be done.

First, the correlation coefficient (r = -0.15) is rather weak. This implies that only about 2% of the observed variation in fur color is explainable by the radiation. (The authors did, however, use an alternate method on a sub-sample which showed a greater correlation.) Repeating the experiment with a larger sample may be necessary.
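That 2% figure follows directly from squaring the correlation coefficient:

$$r^2 = (-0.15)^2 = 0.0225 \approx 2\%$$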

Second, the ecological significance is unknown. It is possible that voles with differently colored fur are more/less susceptible to predation, or it could be that slight changes in fur coloration have no effect whatsoever. The authors plan to investigate this further.

Finally, the proposed molecular mechanism needs to be verified.

Still, regardless of the outcome of their subsequent investigations, the authors appear to have gained an interesting insight into how animal life has adapted to the less-than-ideal conditions at the Chernobyl disaster site.

Source: Zbyszek Boratyński et al. "Increased radiation from Chernobyl decreases the expression of red colouration in natural populations of bank voles (Myodes glareolus)." Scientific Reports 4: 7141. Published: 21-November-2014. doi:10.1038/srep07141

(AP photo)

How the Polish Buried Their Vampires

Alex B. Berezow - November 26, 2014

Belief in vampires was once a global phenomenon. Cultures all over the world believed that certain humans wandered the Earth after death, engaging in acts of decidedly anti-social behavior. Though most sophisticated scholars know that wooden stakes and sunlight work best, each culture invented its own methods of dispatching vampires permanently into the afterlife.

For instance, in the 1600s and 1700s in Drawsko Pomorskie, a town in northwestern Poland, a suspected vampire was buried with a sickle across his body and/or a stone in his mouth. If the undead were to attempt a nightly prowl, the sickle would disembowel or decapitate him; if that didn't work, the stone would prevent him from biting anybody. Problem solved. (See photos below.)

But just who were these suspected vampires? What could a person possibly do in life to become such a social outcast that the neighbors just assumed he might return from the grave to do them harm? The prevailing hypothesis is that these unlucky souls were immigrants. As is the case today, Europeans tended to view outsiders unfavorably. Foreigners are strange, and if anyone is going to turn into a vampire, surely it would be one of those oddballs who smelled weird and talked funny.

With this hypothesis in mind, American and Canadian researchers examined skeletons exhumed from a cemetery outside Drawsko Pomorskie in an excavation that began in 2008. Of the 285 inhabitants, six of them were suspected vampires. (Two of them are pictured above. The first photo depicts a skeleton with a sickle around its neck, while the second photo depicts a skull with a stone placed in its mouth.) Interestingly, the potential vampires were not segregated, but buried alongside everybody else.

At that time, northwestern Poland was a region with a large number of immigrants. Therefore, the authors predicted that the six suspected vampires, given their presumed "outsider" status, were likely to be immigrants. To test this hypothesis, they examined strontium isotope ratios (Sr-87/Sr-86) in the skeletons' teeth. Because strontium is a Group II metal with two valence electrons, it chemically behaves like calcium, allowing it to take the place of calcium in bones and teeth. Sr-86 is a stable isotope, but Sr-87 slowly forms from the beta decay of rubidium-87. (The half-life of rubidium-87 is 49 billion years.) Variations in the strontium isotope ratio generally result from differences in geographic location.
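Why geography shows up in the ratio: bedrock accumulates Sr-87 from rubidium decay at a rate set by its age and rubidium content, following the standard radiogenic growth equation (textbook geochemistry, not spelled out in the paper):

$$\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}} \;=\; \left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{0} \;+\; \frac{^{87}\mathrm{Rb}}{^{86}\mathrm{Sr}}\left(e^{\lambda t}-1\right), \qquad \lambda=\frac{\ln 2}{t_{1/2}}\approx 1.4\times 10^{-11}\ \mathrm{yr^{-1}}$$

Local food and water carry the local ratio, which ends up archived in tooth enamel.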

The region around Drawsko Pomorskie has a strontium isotope ratio of about 0.710 to 0.711. If the potential vampires were indeed foreigners, the researchers would have expected to find a different ratio in their teeth.

But they did not. This strongly implies that the skeletons were of locals, not foreigners. So, why did they receive postmortem anti-vampire therapy? The authors conclude:

Individuals ostracized during life for their strange physical features, those born out of wedlock or who remained unbaptized, and anyone whose death was unusual in some way – untimely, violent, the result of suicide, or even as the first to die in an infectious disease outbreak – all were considered vulnerable to reanimation after death.

One possibility the authors entertain is that the suspected vampires were the first victims of a cholera epidemic, but there is little evidence beyond historical speculation to support this.

Unfortunately, it appears that for those souls deemed destined to become vampires, death did not bring peace, but rather a continuation of their earthly tribulations.

Source: Gregoricka LA, Betsinger TK, Scott AB, Polcyn M. "Apotropaic Practices and the Undead: A Biogeochemical Assessment of Deviant Burials in Post-Medieval Poland." PLoS ONE 9(11): e113564. (2014) doi:10.1371/journal.pone.0113564

(Image: Nosferatu)

Team Boosts Silver's Antimicrobial Power 1000x

Ross Pomeroy - November 26, 2014

Indian scientists have paired silver ions with carbonate ions to hugely enhance the metal's antibacterial activity. If adopted, the unexpected breakthrough could save 1,300 tons of silver, worth roughly $766 million, each year.

Silver is already widely used in medicine as well as the fields of food sanitation and water purification to the tune of around 6,000 tons per year, or about a quarter of annual production. A silver concentration of just 50 parts per billion in drinking water kills more than 99% of bacteria. For this reason, silver is used to disinfect drinking water in many developing countries and even onboard the International Space Station.

In an effort to improve the antimicrobial power of silver, the researchers added various anions to water containing 50 parts per billion of silver. Carbonate, composed of one carbon atom and three oxygen atoms, showed the most potential. Upon further experimentation, they found that a mixture of just 25 parts per billion of silver with 20 parts per million of carbonate was enough to reduce E. coli levels in water 100,000 times over. Fifty parts per billion of silver alone reduced E. coli counts only 100 times over.
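In the disinfection literature, those fold-reductions are usually reported on a log scale; a quick conversion using only the numbers quoted above:

```python
# Convert the fold-reductions quoted above into "log reductions,"
# the standard unit in the disinfection literature.
import math

fold_ag_plus_carbonate = 100_000  # 25 ppb silver + 20 ppm carbonate
fold_ag_alone = 100               # 50 ppb silver by itself

print(math.log10(fold_ag_plus_carbonate))  # 5.0 -> a 5-log kill
print(math.log10(fold_ag_alone))           # 2.0 -> a 2-log kill
```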

Further testing elucidated why the pairing of silver and carbonate is so successful. Like a boxer strategically softening up his opponent, carbonate destabilizes bacteria's peripheral membrane proteins, allowing silver to sneak in and deliver its knockout blow with far less resistance.

Each year, diarrhea kills 760,000 children under the age of five. According to the World Health Organization, safe drinking water is one of the best ways to prevent these infections. The researchers are hopeful their discovery will make it easier to spread proper water sanitation throughout the developing world.

"This work leads to a new paradigm in the field of affordable water purification by reducing the cost of antimicrobial treatment, particularly in the developing world, without disinfection by-products."

Environmentalists also have reason to cheer the finding. Silver nanoparticles currently leach into oceans and rivers via waste treatment, and this may be damaging marine ecosystems by killing off the good bacteria that support them. Thus, lower silver concentrations in drinking water may be good news for marine bacteria.

(Image: Alchemist-hp, Swathy et al.)

Source: Swathy, J.R. et al. "Antimicrobial silver: An unprecedented anion effect." Sci. Rep. 4, 7161; doi:10.1038/srep07161 (2014)

Drug Overdose: The Real American Epidemic

Alex B. Berezow - November 24, 2014

Recently, there has been much talk of various "epidemics" in America. The three most commonly mentioned are suicide, gun violence, and drug overdose. A close examination of the data, however, reveals two surprises: First, one of them is not actually an epidemic. Second, one of them is a much bigger epidemic than most people realize. (See chart.)

The "suicide epidemic" (red line) has received the most attention as of late. This is for good reason. At a rate of 12.54 deaths per 100,000 Americans, the suicide rate is at a 25-year high. The CDC, which provides publicly accessible data (via WISQARS) from 1999 to 2012, shows that the suicide rate over that period has increased by nearly 19.7%.

Similarly, school shootings and other mass killings spark highly partisan debates about the "gun violence epidemic." However, the CDC data do not show an epidemic at all. Instead, the homicide-by-firearm rate (purple line) declined from a high of 4.27 per 100,000 in 2006 to 3.76 per 100,000 in 2012, a roughly 12% drop. The average American is more than three times as likely to commit suicide as to be shot and killed.

The most shocking data are the deaths due to unintentional drug poisonings (green line). From 1999 to 2012, deaths by drug overdose increased from 4.00 per 100,000 to 10.54 per 100,000, a whopping 164% increase. While the suicide rate has slowly climbed over the past decade, the death rate from unintentional drug overdoses has skyrocketed. Indeed, the term "epidemic" was invented for trend lines like this. (Note: More detailed information on what exactly constitutes unintentional drug poisoning can be found in the ICD-10 under codes X40-X49.)
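Those percentages are easy to verify from the cited rates; a quick back-of-the-envelope check (my arithmetic, using only the CDC figures quoted in this article):

```python
# Sanity-check the percentage changes quoted above.
# All rates are deaths per 100,000, from the CDC/WISQARS figures cited.
homicide_2006, homicide_2012 = 4.27, 3.76
overdose_1999, overdose_2012 = 4.00, 10.54
suicide_2012 = 12.54

print((homicide_2006 - homicide_2012) / homicide_2006 * 100)  # ~11.9 -> "roughly 12% drop"
print((overdose_2012 - overdose_1999) / overdose_1999 * 100)  # ~163.5 -> "164% increase"
print(suicide_2012 / homicide_2012)                           # ~3.3 -> "more than three times"
```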

It should be noted that accidental drug overdoses include far more people than just celebrities and gangbangers who snort cocaine and guzzle alcohol. Indeed, a substantial proportion of overdoses involve prescription drugs, such as opioids (e.g., OxyContin, Vicodin) and benzodiazepines (e.g., Xanax, Valium, Ativan). Additionally, as reported this week in an article appropriately titled "The Great American Relapse," The Economist notes that heroin is making a comeback. Deaths from heroin overdoses doubled from 2010 to 2012 as opioid painkiller addicts forgo expensive prescription meds for cheaper heroin from the street.

As it turns out, our society's habit of reaching for the medicine cabinet for every (heart)ache and pain is quite literally killing us.

The explosive growth in the number of deaths due to unintentional drug overdoses is nothing short of a national emergency. Yet, the phenomenon gets very little attention in the popular press. Unfortunately for our highly medicated society, this is one problem that we cannot solve by popping more pills.

Source: Centers for Disease Control and Prevention, National Center for Injury Prevention and Control. Accessed: 23-Nov-2014.

(AP photo)

Have Men Reached the Limits of Running?

Ross Pomeroy - November 21, 2014

For decades, mankind's physical abilities have steadily increased. With every broken world record, humans have demonstrated the potential to throw farther, run faster, and jump higher. More recently, scientists, athletes, and spectators alike have wondered if or when we'll reach a ceiling. A new study published in PLoS ONE shows that, as far as distance running is concerned, we may be getting close.

Researchers from the University of Washington and the Mayo Clinic teamed up to analyze the 40 fastest male performances in the 5,000 meters, the 10,000 meters, and the marathon for each year from 1980 to 2013. They also tallied and plotted the number of athletes who turned in elite performances each year, defined as under 13:20 for the 5,000 meters, 27:45 for the 10,000 meters, and 2:10 for the marathon (shown below).

The researchers found that, for the past decade, performance in the 5,000- and 10,000-meter races has leveled off. In fact, there have been no new world records in those events since 2004 and 2005, respectively -- the longest gap between world records since the 1940s.

Men still seem to be making progress in the marathon, however. In fact, the world record (still pending official ratification) was set at the Berlin Marathon in September. Kenya's Dennis Kimetto ran the 26.2 miles in 2:02:57.

"All indices of [marathon] speed show a nearly linear increase in speed with an increased number of elite performances over the three plus decades we sampled," lead researcher Timothy Kruse reported.

Kruse and his colleagues offered three possible interpretations of the data. First, improved drug testing may be preventing athletes from using compounds like erythropoietin to boost the number of red blood cells in the bloodstream and thus improve their aerobic capacity, a practice colloquially known as "blood doping." Second, more lucrative financial incentives for marathons may have drawn elite runners away from the 5,000- and 10,000-meter events. And third, men may be nearing a "physiological upper limit."

As men near the 2-hour mark for the marathon, which would require a blistering average speed of 13.1 miles per hour, many openly wonder whether that barrier will ever be broken. If current trends continue, it's certainly possible.
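To put that in more familiar per-mile terms (my arithmetic, not the paper's):

$$\frac{26.2\ \mathrm{mi}}{2\ \mathrm{h}} = 13.1\ \mathrm{mph} \qquad\Longrightarrow\qquad \frac{120\ \mathrm{min}}{26.2\ \mathrm{mi}} \approx 4.58\ \mathrm{min/mi} \approx 4{:}35\ \mathrm{per\ mile}$$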

Source: Kruse TN, Carter RE, Rosedahl JK, Joyner MJ (2014) Speed Trends in Male Distance Running. PLoS ONE 9(11): e112978. doi:10.1371/journal.pone.0112978

Active Moms May Be the Cure for Obesity

Ross Pomeroy - November 18, 2014

As Americans collectively remain overweight and obese, many scientists have sought to understand precisely why this is so. Most broadly, the answer is tied to socioenvironmental evolution, which encompasses a host of factors. For example, we're eating differently than we used to, we're eating more than we used to, and we're moving a lot less than we used to.

Now that we've gotten ourselves stuck in such a heavy state, scientists are concerned that it will be difficult to dislodge ourselves. Specifically, if studies in mice are any indication, obesity risk may be transferable to offspring via epigenetics. In other words, parental obesity may affect how children's genes are expressed, making kids more likely to store excess calories as fat and more likely to develop diabetes. Considering that the rate of childhood obesity has more than doubled in the past thirty years, this is certainly possible. However, more research needs to be performed, first to convincingly confirm these effects in humans, and then to potentially develop treatments to counter them.

Today, in the journal Mayo Clinic Proceedings, post-doctoral researcher Dr. Edward Archer of the University of Alabama provided a theoretical framework to help guide these efforts. He dubbed his idea the Maternal Resources Hypothesis. Formed via an analysis of the current scientific literature, it states that because mothers are, on average, eating more and exercising less than in years past, they are storing more energy as fat and passing on more nutrients to fetuses in utero, resulting in larger babies with more fat at birth. Larger babies, coupled with a more sedentary lifestyle during childhood, means that these children will be predisposed to obesity. Since the number of fat cells is largely set during childhood, obesity early on can elevate a person's risk for obesity for the rest of their life. If, indeed, these children remain obese into adulthood and have children, the cycle continues.

Archer's hypothesis is intriguing. Now that it's out there, future studies will either confirm or refute it.

If it does prove correct, Archer believes it highlights a key tool for alleviating the current obesity pandemic. He urges women planning on motherhood to be physically active throughout their lives in order to prepare their metabolisms for pregnancy and thus have metabolically healthy children.

"Active moms are the cure," Archer said.

The onus for obesity isn't only on moms, however. Last year, a study showed that children born to obese fathers had discernible differences in gene expression compared to children born to normal weight fathers.

Would-be moms and dads should strongly consider living a lifestyle that involves eating less and moving more. Not just for themselves, but for their kids.

Source: Archer, Edward. "The Childhood Obesity Epidemic as a Result of Nongenetic Evolution: The Maternal Resources Hypothesis." Mayo Clin Proc. Dec 2014. http://dx.doi.org/10.1016/j.mayocp.2014.08.006

Simmer Down: Viruses Not 'Fourth Domain' of Life

Alex B. Berezow - November 17, 2014

Biologists have categorized life into three large domains: Bacteria, Archaea (weird, bacteria-like microbes), and Eukarya (unicellular and multicellular organisms such as fungi, plants, and animals that possess nucleated cells). Under this classification system, viruses are left out in the cold. They certainly are not "alive" in the classical sense because they are not capable of metabolizing or replicating on their own. But it does not feel quite right to classify them as "inanimate," either, because they are built of biological molecules and contain genetic information. Thus, for the most part, viruses languish in the no man's land between the living and the dead.

The debate about how to classify viruses received a jolt with the discovery of extremely large viruses (such as Pandoravirus) that are so gigantic they can be seen with a light microscope and contain more genetic information than some bacteria. It has been proposed, due to some intriguing similarities in DNA sequences -- specifically, in the gene that encodes an enzyme called RNA polymerase -- that such large viruses actually constitute a "fourth domain" of life. If that is the case, then perhaps all viruses should be awarded this new status.

"Sacrebleu!" say French scientists in a recent issue of Trends in Microbiology. Considering viruses to be a fourth domain would unnecessarily complicate evolutionary biology. For instance, the authors indicate that using RNA polymerase to redraw the tree of life presents a gigantic challenge. Large viruses do not all cluster into a single new domain. Instead, classifying life based on RNA polymerase would likely demand the creation of several new domains. (See figure.)

Such a phylogenetic tree is unwieldy, or as evolutionary biologists call it, "non-parsimonious." A foundation of building evolutionary trees is that they ought to be as simple as possible. This is referred to by scientists as the principle of parsimony but is more colloquially known as Occam's Razor. Essentially, a model that requires fewer assumptions (in this case, evolutionary changes) is superior to a model that requires more.
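To make "fewer assumptions wins" concrete, here is a minimal sketch of Fitch's algorithm, the textbook way to count the minimum number of character changes a candidate tree requires. (The taxa, states, and tree shapes are hypothetical, for illustration only; this is not the paper's analysis.)

```python
def fitch(tree):
    """Return (possible_states, min_changes) for a nested-tuple tree
    whose leaves are single character states, e.g. ('A', ('B', 'A'))."""
    if isinstance(tree, str):  # leaf: one observed state, zero changes
        return {tree}, 0
    (left, l_ch), (right, r_ch) = fitch(tree[0]), fitch(tree[1])
    if left & right:           # subtrees agree: no extra change needed
        return left & right, l_ch + r_ch
    return left | right, l_ch + r_ch + 1  # disagreement costs one change

# Two hypothetical groupings of the same four taxa's character states:
tree_a = (("A", "A"), ("B", "B"))  # groups like states together
tree_b = (("A", "B"), ("A", "B"))  # mixes them
print(fitch(tree_a)[1])  # 1 change  -> more parsimonious
print(fitch(tree_b)[1])  # 2 changes -> less parsimonious
```

The tree demanding fewer changes is the more parsimonious one, which is exactly why a fourth-domain tree that forces many extra evolutionary changes is disfavored.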

This is not the only problem with a viral fourth domain. The biggest difference between cells and viruses is their method of replication. All three domains of life replicate by cell division, which implies that this trait was derived from the Last Universal Common Ancestor (LUCA). (In other words, LUCA is the theoretical ancestor of Bacteria, Archaea, and Eukarya.)

Viruses, which do not replicate by cell division, probably evolved independently multiple times, "here, there, and everywhere," as the authors conclude. Some probably evolved before LUCA, and others well after LUCA. Many have likely exchanged genetic material via horizontal gene transfer. Lumping them all into a fourth domain, therefore, makes little sense.

Though the debate over the classification of viruses may at first seem to be purely academic, it touches upon underlying questions that are of much greater significance: What exactly is life, and how did it evolve? The answer to those questions may be partially found within the enigmatic world of the viruses.

Source: Patrick Forterre, Mart Krupovic, and David Prangishvili. "Cellular domains and viral lineages." Trends in Microbiology, 22 (10): 554-558. October 2014. 

Ancient Reptile Is Either Adorable or Terrifying

Ross Pomeroy - November 13, 2014

Around 252 million years ago, a Great Dying occurred. In a geologic blink of an eye, 96% of all marine species and 70% of all terrestrial species went extinct. Volcanoes, rapid climate change, or an asteroid impact may have been to blame; it could have even been all of the above. Regardless of what caused the Great Dying, the effect was akin to hitting a giant "reset" button on Earth. The planet became a blank slate, and new species emerged to write their stories.

It was in this setting that paleontologists believe Garjainia madiba, a newly discovered species of reptile, thrived. G. madiba belonged to the family Erythrosuchidae, a group of apex predators that gained a widespread foothold on the new Earth. Closely related to modern-day crocodilians, the erythrosuchids are known for their distinctively large, deep heads, which make them look kind of "cute." The largest among them was Erythrosuchus, which sported a massive head on a squat frame and measured 16 feet long and 7 feet tall.

G. madiba was big, but it wasn't that big. Estimates put its length at around 6 feet. The smaller size likely suited it just fine, however. In the wake of the Great Dying, many of the bigger land animals had died off, so G. madiba may have been among the largest predators of its time. Geologically, it has been pinned down as the oldest known erythrosuchid in the southern hemisphere. Larger species like Erythrosuchus likely evolved later on.

The current G. madiba fossil was found in South Africa, but another was found in Europe. To the group of paleontologists who described the discovery yesterday in PLoS ONE, that demonstrates "that erythrosuchids became established as the largest terrestrial predators across a broad swath of Pangaea within five million years of the end-Permian mass extinction event."

(Images: Mark Witton, Dmitry Bogdanov)

Source: Gower DJ, Hancox PJ, Botha-Brink J, Sennikov AG, Butler RJ (2014) A New Species of Garjainia Ochev, 1958 (Diapsida: Archosauriformes: Erythrosuchidae) from the Early Triassic of South Africa. PLoS ONE 9(11): e111154. doi:10.1371/journal.pone.0111154
