RealClearScience Journal Club

Science Figures Interpreted and Analyzed by RealClearScience

Suicides in U.S. Are Up, Especially in Rural Areas

Alex B. Berezow - May 19, 2015

Suicide remains a deeply strange phenomenon. Its disturbing nature is likely bolstered by what we perceive to be a grotesque violation of our basic biological drive for survival. Though biomedical science has eradicated many of the ailments that cut human life short in previous generations, it has yet to conquer the demons that haunt our minds. Suicide is a problem in rich and poor countries alike, and perhaps the most paradoxical study of the subject found that suicide rates tend to be highest in places with the highest levels of happiness.

New research has added yet another layer of complexity. The CDC reports that in the United States, suicide rates increased in both metropolitan and nonmetropolitan counties from 2004 to 2013. (See chart.)

There are two trends that stick out: (1) While suicide rates are up everywhere, small cities, towns, and rural areas showed the largest increase. In those places, suicide rates rose by about 20%, while they rose by only about 7% in large cities. (Note: "Large fringe" refers to suburban counties.) (2) Suicide rates are lowest in cities and gradually worsen as counties become more rural. In 2013, the suicide rate was 17.6 per 100,000 people in rural areas, but only 10.3 per 100,000 in large cities.
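
As a quick back-of-the-envelope check, the 2004 rates implied by the figures above can be recovered by dividing the 2013 rates by the quoted increases. The sketch below uses only the numbers cited in this post; the CDC's published 2004 values may differ slightly because of rounding.

```python
# Back-of-the-envelope: recover the implied 2004 suicide rates from the
# 2013 rates and the approximate percent increases quoted above. These are
# the figures cited in this post, not the CDC's exact published values.

rates_2013 = {"rural/nonmetropolitan": 17.6, "large central metro": 10.3}  # per 100,000
pct_increase = {"rural/nonmetropolitan": 0.20, "large central metro": 0.07}

for county_type, rate_2013 in rates_2013.items():
    implied_2004 = rate_2013 / (1 + pct_increase[county_type])
    print(f"{county_type}: 2013 = {rate_2013}, implied 2004 ≈ {implied_2004:.1f} per 100,000")
# rural/nonmetropolitan: implied 2004 ≈ 14.7; large central metro: ≈ 9.6
```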

Another study released in March corroborates these results: young people aged 10-24 living in rural areas are about twice as likely to commit suicide as those living in cities. Again, over time, the rural suicide rate increased at a greater pace.

What is driving this trend? That is hard to say. An analysis in The Atlantic suggests that greater access to guns, feelings of social isolation, fewer doctors, and stigmatization of mental illness all possibly contribute to the higher suicide rate in rural America. Whatever the cause, America's suicide epidemic is getting worse.

Source: Centers for Disease Control and Prevention. "QuickStats: Age-Adjusted Rates for Suicide, by Urbanization of County of Residence — United States, 2004 and 2013." MMWR 64 (14): 401. April 17, 2015.

(AP photo)

Infections Linked to Decrease in Cognitive Ability

Ross Pomeroy - May 18, 2015

Danish researchers have uncovered an association between infection and decreased cognitive ability. The finding was published last Wednesday in PLoS ONE.

A wide range of bacterial and viral infections, from influenza and pneumonia to hepatitis and encephalitis, can seriously stress the immune system. Oftentimes, the battle inflicts collateral damage on systems across the body, even ones not directly involved in fighting the pathogen. For example, scientists are increasingly noting that infections and associated immune responses like inflammation can affect the brain.

This piqued the curiosity of researchers at the National Centre for Register-based Research at Aarhus University in Denmark. So, with a nationwide register of 161,696 young Danish men at their disposal, they decided to investigate whether infections were in any way linked to cognitive ability.

The Danish Conscription Registry tracked almost all males born between 1976 and 1994, collecting a variety of data, including the time, number, and detail of all infections requiring hospitalization. At age 19, all of the men in the register took a 3-hour test designed to measure cognitive ability. In hunting for an association between infection and cognitive ability, the researchers controlled for parental educational level, year of testing, birth order, multiple birth status, birth weight, gestational age, a parental history of infections, parental and individual history of psychiatric disorders, and substance abuse.

Analyzing the data, the researchers discovered that the more infections a subject endured throughout his life, the lower he scored on the test of cognitive ability. Subjects who were hospitalized once for an infection scored 0.83 points lower on a 100-point scale, while subjects who suffered 5 or 6 different infections scored 5.29 points lower. Unsurprisingly, infections of the central nervous system were associated with larger detriments, but respiratory, gastrointestinal, and skin infections were also associated with significant decreases in cognitive ability. (Below: Graph shows the unadjusted decreases in cognitive ability.)

The researchers also found that time factored into cognitive ability scores. Subjects with more recent infections fared worse than those with more distant infections.

What could account for the link between infection and a decrease in cognitive ability?

"The observed associations might be due to a biologically-mediated effect of the infection or associated immune responses causing an acute and possibly transient effect on general cognitive ability," the researchers say. "Inflammation and immune components can directly affect the glutamate, serotonin and dopamine systems that are considered central in cognition."

So does this mean that you get a little duller every time you succumb to an infection? Perhaps. But maybe not. Reverse causality can't be ruled out.

"Lower cognitive ability may be a risk factor for acquiring infections," the researchers admit. "Studies have indicated that immune related genes might be implicated in cognition, and individuals with genetic liability towards a lower general cognitive ability might also be more genetically vulnerable towards infections."

The study's size and extensive controls were key strengths; however, the study could not account for the effects of less severe infections that didn't require hospitalization, and it did not include women.

Source: Benros ME, Sørensen HJ, Nielsen PR, Nordentoft M, Mortensen PB, Petersen L (2015) The Association between Infections and General Cognitive Ability in Young Men – A Nationwide Study. PLoS ONE 10(5): e0124005. doi:10.1371/journal.pone.0124005

(Image: AP)

The World Could Get Rid of Fossil Fuel Electricity in Just 25 Years with Nuclear Power

Ross Pomeroy - May 15, 2015

Fossil fuel electricity could be replaced with nuclear power in just 25 years, cutting worldwide human carbon emissions by half, a new analysis published in PLoS ONE finds.

Climate change is widely recognized as a global threat, but thus far there's been little effective resolve to curtail carbon emissions, which are largely responsible. A world powered by carbon-free renewable energy would undoubtedly be a better place, but getting there is the tricky part. The combined power of solar, wind, geothermal, and hydropower cannot feasibly electrify the world at this time. In a number of years, when energy storage technology improves and solar panel efficiency increases, that could very well change.

But there is a solution available right now that's reliable, clean, safe, and affordable: nuclear energy.

Staffan Qvist, a physicist at Uppsala University in Sweden, and Barry Brook, a Professor of Environmental Sustainability at the University of Tasmania, wondered how quickly nuclear power could be deployed to replace all fossil fuel electricity, which primarily comes from coal and natural gas. So they analyzed the cases of Sweden and France, two countries that successfully completed large-scale expansions of nuclear power.

In the early 1960s, Sweden began a massive project to build nuclear power plants. By 1986, half of the country's power came from nuclear, CO2 emissions per capita had fallen 75% from their 1970 peak, and energy costs were among the lowest in the world.

Beginning in 1973, France embarked on an ambitious path to free itself from foreign oil and generate almost all of its power from nuclear energy. Today, nuclear produces 75% of the country's electricity at the 7th cheapest rate in the European Union.

Qvist and Brook calculate, based on a per capita rate, that if the world built nuclear power plants as fast as Sweden did between 1960 and 1990, all coal and natural gas power plants could be phased out in 25 years. If the world emulated France's historical rate of construction, the phase out would take 34 years.
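
The structure of that extrapolation is straightforward: scale a per-capita rate of new nuclear generation up to world population, then divide the fossil-fuel electricity to be replaced by that annual build rate. The sketch below runs the arithmetic in reverse with rough illustrative inputs (a world fossil-fuel electricity output of roughly 16,000 TWh per year and a population of about 7 billion are my assumptions, not figures from the paper) to show what a 25-year phase-out would demand per person.

```python
# Minimal sketch of the per-capita logic behind the 25-year estimate, run in
# reverse. Assumed inputs (not taken from Qvist & Brook): roughly 16,000 TWh
# of fossil-fuel electricity generated worldwide per year and ~7 billion people.

WORLD_FOSSIL_TWH_PER_YR = 16_000   # assumed, approximate
WORLD_POPULATION = 7.0e9           # assumed, approximate
PHASEOUT_YEARS = 25                # the Sweden-paced scenario from the paper

# Average new nuclear generation that must come online each year
new_nuclear_twh_per_yr = WORLD_FOSSIL_TWH_PER_YR / PHASEOUT_YEARS

# The same build rate expressed per person per year (1 TWh = 1e9 kWh)
per_capita_kwh_per_yr = new_nuclear_twh_per_yr * 1e9 / WORLD_POPULATION

print(f"~{new_nuclear_twh_per_yr:.0f} TWh of new nuclear generation per year")
print(f"~{per_capita_kwh_per_yr:.0f} kWh of new nuclear output per person per year")
# roughly 640 TWh/yr, or on the order of 90 kWh per person per year
```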

"Continued nuclear build-out at this demonstrably modest rate, coupled with an electrification of the transportation systems (electric cars, increased high-speed rail use etc.) could reduce global CO2 emissions by ~70% well before 2050," they write.

One could endlessly speculate on the potential ramifications of such an undertaking. Nuclear weapons proliferation, handling and disposal of radioactive waste, and the chance of a nuclear disaster are all consequences that will have to be measured against the known benefits of nuclear energy. Optimistically, it's entirely possible that in a world dedicated to nuclear power, scientific and technological innovation will rapidly solve all of the current drawbacks.

"Replacement of current fossil fuel electricity by nuclear fission at a pace which might limit the more severe effects of climate change is technologically and industrially possible—whether this will in fact happen depends primarily on political will, strategic economic planning, and public acceptance," the authors conclude.

Source: Qvist SA, Brook BW (2015) Potential for Worldwide Displacement of Fossil-Fuel Electricity by Nuclear Energy in Three Decades Based on Extrapolation of Regional Deployment Data. PLoS ONE 10(5): e0124074. doi:10.1371/journal.pone.0124074

(Image: AP)

Human Gene Expression Changes with the Seasons

Ross Pomeroy - May 12, 2015

Fall is a beautiful time of year, a time of transition. Leaves of deciduous trees turn from green to amber, brown, and red, before finally breaking off and drifting to the ground. In spring, the changes are reversed. Nascent green leaves sprout from branches to absorb the sun's forthcoming summer rays.

Though it may surprise you, humans experience seasonal changes, too. The changes aren't nearly as overt, however, for they mainly occur in the activity of our genes. But these changes can be very consequential for our health, a new study published in Nature Communications reveals.

Most everyone knows that infectious diseases like influenza are seasonal, but fascinatingly, so are autoimmune diseases like arthritis and diabetes, as well as cardiovascular disease and psychiatric illness. This prompted researchers primarily based out of the University of Cambridge to hypothesize that our genetics might factor in.

And so, a team supervised by medical geneticist John Todd and statistician Chris Wallace analyzed blood and adipose (fat) tissue samples from more than 16,000 people across the globe -- in countries including the United States, Iceland, Gambia, England, and Australia -- focusing specifically on patterns of gene expression.

Todd, Wallace, and their team found widespread seasonal gene expression in 5,136 of 22,822 genes tested. Some genes were more active in winter, while others were more active in summer. These patterns tracked the seasons in both the northern and southern hemispheres. In other words, the same genes that were more active in the European summer (between June and September) were also more active in the Australian summer (December, January, and February).

Of particular interest, the ARNTL gene, which is known to suppress inflammation in mice, was more active in summer and less active in winter. Lower expression would make the body more prone to an autoimmune response, which could partially account for the increased severity and rate of diseases like arthritis and multiple sclerosis in winter. Also of note, certain genes known to boost the antibody response from some vaccines were more active in winter, indicating that vaccines may be more effective at certain times of the year.

"These data provide a fundamental shift in how we conceptualize immunity in humans," the researchers say.

Todd and Wallace aren't yet sure how these seasonal variations came to be, but they guess that daylight and temperature serve as vital cues.

This research opens a new realm of study on seasonal immunity in humans. Perhaps, when we learn more, medical treatments could eventually be tailored to the season.

Source: Dopico, X. C. et al. Widespread seasonal gene expression reveals annual differences in human immunity and physiology. Nat. Commun. 6:7000 doi: 10.1038/ncomms8000 (2015).

Study Finds No Evidence for 'Aspartame Sensitivity'

Ross Pomeroy - May 8, 2015

Late last month, Pepsi announced that it will remove the artificial sweetener aspartame from its diet Pepsi products by the end of this year, bowing to consumer concerns that the sweetener poses a number of health risks. Questionable Internet sources have stoked these fears, accusing aspartame of causing cancer, multiple sclerosis, blindness, seizures, memory loss, depression, and birth defects. Alternative medicine guru Dr. Joseph Mercola even called the sweetener "By far the most dangerous substance added to most foods today."

It is no wonder, then, that some individuals worry that they may be "sensitive" to aspartame, and report experiencing symptoms like headache, nausea, dizziness, and congestion after consuming food containing the sweetener. A study published in March in the journal PLoS ONE, however, put their claims to the test and found no evidence of any acute adverse response to aspartame.

Forty-eight self-reported aspartame-sensitive individuals took part in the study, which was conducted primarily by researchers at Imperial College London. These individuals, who were actively avoiding aspartame in their diets on account of their symptoms, were matched by gender and age to 48 non-sensitive individuals.

Subjects in both groups were given a cereal bar laced with 100 mg of aspartame -- roughly the amount in a can of diet soda -- or a normal bar. Subjects provided blood samples before the test and four hours later, as well as urine samples at three different times over the following 24 hours. All symptoms were assessed and monitored for four hours after eating. A week later, the process was repeated so that each subject ate both bars. The study was double-blind, so neither the experimenters nor the subjects knew which bars they were receiving.

Analyzing the results, the researchers found that "none of the rated symptoms differed between aspartame and control bars, or between sensitive and control participants." Moreover, all subjects' blood work remained normal, as did every biomarker in the urine analysis.

The blood work and urine analysis were particularly telling. Aspartame opponents correctly point out that the chemical breaks down into phenylalanine, methanol, and aspartic acid, three chemicals that they insist are extremely toxic. These chemicals can indeed be dangerous, but only in high amounts. As it turns out, the concentration of phenylalanine actually decreased in subjects' blood, and methanol and aspartic acid levels were so minute that neither chemical was even detected. The researchers also noted that methanol is produced in higher concentrations in the body after drinking fruit juice.
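
The "minute" amounts are consistent with simple stoichiometry: aspartame (C14H18N2O5) hydrolyzes into one molecule each of aspartic acid, phenylalanine, and methanol, so the yield from a 100 mg dose can be estimated from molar masses alone. The sketch below is that back-of-the-envelope arithmetic, not a measurement from the study.

```python
# Rough stoichiometry of aspartame hydrolysis for a 100 mg dose (about one
# cereal bar, or roughly a can of diet soda). Aspartame + 2 H2O ->
# aspartic acid + phenylalanine + methanol, one molecule of each per
# molecule of aspartame. Back-of-the-envelope, not data from the study.

MOLAR_MASS_G_PER_MOL = {   # approximate values
    "aspartame": 294.3,
    "aspartic acid": 133.1,
    "phenylalanine": 165.2,
    "methanol": 32.0,
}

dose_mg = 100.0
moles_aspartame = dose_mg / 1000 / MOLAR_MASS_G_PER_MOL["aspartame"]

for product in ("aspartic acid", "phenylalanine", "methanol"):
    yield_mg = moles_aspartame * MOLAR_MASS_G_PER_MOL[product] * 1000  # 1:1 molar ratio
    print(f"{product}: ~{yield_mg:.0f} mg")
# aspartic acid ~45 mg, phenylalanine ~56 mg, methanol ~11 mg -- tiny amounts
# compared with what protein-rich foods and fruit juice deliver every day.
```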

"This independent study gives reassurance to both regulatory bodies and the public that acute ingestion of aspartame does not have any detectable psychological or metabolic effects in humans," the researchers conclude.

It should be noted that the study was conducted over a short timespan, and thus can't rule out the possibility of adverse health effects resulting from long-term consumption of aspartame. However, a considerable number of studies dating back 30 years have already looked at this and concluded that the sweetener is completely safe.

Source: Sathyapalan T, Thatcher NJ, Hammersley R, Rigby AS, Pechlivanis A, Gooderham NJ, et al. (2015) Aspartame Sensitivity? A Double Blind Randomised Crossover Study. PLoS ONE 10(3): e0116212. doi:10.1371/journal.pone.0116212

Video Gamers Have Better Connected Brains

Alex B. Berezow - April 28, 2015

Amongst the wider public, video gamers do not have the best reputation. They are perceived, somewhat unfairly, as socially awkward, bespectacled, pimply-faced geeks. However, new research provides them something of a "Revenge-of-the-Nerds moment": Action video gamers (AVGs) have more gray matter and better connectivity in certain subregions of the brain.

The research team, headed by principal investigator Dezhong Yao, was led to examine brain structure and function among AVGs because of a wealth of previous evidence showing that expert AVGs have superior cognitive abilities compared to amateurs. For instance, expert AVGs possess better attention skills and eye-hand coordination. Furthermore, it was already known that expert AVGs had more gray matter in various brain regions.

Armed with this prior knowledge, the team used functional MRI to examine the brains of 27 expert AVGs (i.e., action video gamers who were regional or national champions) and 30 amateur AVGs (i.e., noobs), focusing specifically on networks within the insular cortex that are associated with attention and sensorimotor function. (See figure.)

The figure depicts brain pathways with enhanced functional connectivity in expert AVGs compared to amateurs. Note that anterior (green), transitional (yellow) and posterior (red) regions of the brain showed greater connectivity in the experts, particularly in the left hemisphere. Subsequent analysis showed that expert AVGs also had more gray matter in the left insular cortex and central insular sulcus. Thus, the authors conclude that action video gaming can increase gray matter volume and integration of networks associated with attention and sensorimotor function.

It turns out that the years I spent playing Wolfenstein, Counterstrike and Unreal Tournament didn't completely go to waste.

Source: Diankun Gong, Hui He, Dongbo Liu, Weiyi Ma, Li Dong, Cheng Luo & Dezhong Yao. "Enhanced functional connectivity and increased gray matter volume of insula related to action video game playing." Scientific Reports 5, Article number: 9763. Published: 16-April-2015. doi:10.1038/srep09763

(AP photo)

Bacteria Killed with Silver Transform into 'Zombies,' Kill Other Bacteria

Ross Pomeroy - April 27, 2015

Researchers have discovered that bacteria killed by silver particles can be used to kill other bacteria, an ability termed the "zombie effect." Their work is published in Nature's Scientific Reports.

Most people are aware of the element silver. The soft, lustrous metal is cherished for its ornamental value. But fewer are aware that silver is extremely effective at killing microorganisms. In a two-pronged assault, silver increases the permeability of the bacteria's cell membrane and interferes with its metabolism. The dual attacks lead to the overproduction of reactive oxygen compounds, and eventually, cell death.

In their study, scientists Racheli Ben-Knaz Wakshlak, Rami Pedahzur, and David Avnir, all based out of the Hebrew University of Jerusalem, killed pathogenic bacteria with silver nitrate, filtered out the dead bacteria, and then placed them in a culture of living bacteria. After six hours, up to 99.999% of the live bacteria had joined the ranks of their zombie brethren beyond the grave.

The silver-killed bacteria aren't turning into zombies, of course. The researchers demonstrated that heat-killed bacteria don't kill their live counterparts, but the leftover silver solution used to kill the first bacteria does. This indicates that silver-killed bacteria are simply carrying silver particles that can be passed on to other bacteria. Thus, it would be more accurate to call them deadly land mines of decaying matter. "Zombies" is a bit more evocative, however.

Could this intriguing effect be harnessed in any way? The researchers still aren't sure. For now, their study serves as an interesting demonstration of a cool process that's already occurring.

Source: Wakshlak, R.B.-K., Pedahzur, R. & Avnir, D. Antibacterial activity of silver-killed bacteria: the "zombie" effect. Sci. Rep. 5, 9555; DOI:10.1038/srep09555 (2015).

(Top Image: Alchemist-hp)

Political Partisanship: In Three Stunning Charts

Ross Pomeroy - April 24, 2015

While most everyone has been saying it, science now supports it: Political partisanship is the worst it's been in over half a century, and it's increasing at an exponential rate.

That's the finding of an interdisciplinary group of academics who analyzed roll call votes to statistically model partisanship in the U.S. House of Representatives dating back to 1949. Their results, published in PLoS ONE, are embodied in three stunning charts.

The first shows the number and proportion of times that representatives from different parties and the same party vote the same way. Notice how the proportion of cross-party pairs and same-party pairs used to overlap, and now they hardly do. The series of graphs shows that fewer and fewer representatives of different parties vote together, but the few that do cross party lines are doing so more often.

 

The second graph's message is transparent enough. Members of opposite parties are voting in contrasting ways on roll call votes more than ever, and the disagreements appear to be increasing exponentially -- by roughly 5% every year, according to the researchers.
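
To put "roughly 5% every year" in perspective, compounding at that rate across the six-plus decades the study covers multiplies the level of disagreement roughly twentyfold. The sketch below is just that compounding arithmetic, assuming the ~5% annual figure applies uniformly over the whole period.

```python
# Illustration of what "increasing exponentially, by roughly 5% every year"
# implies when compounded over the study's time span. Assumes the ~5%/yr
# figure holds uniformly; the actual fitted trend is in the PLoS ONE paper.

ANNUAL_GROWTH = 0.05   # ~5% more cross-party disagreement per year
YEARS = 60             # roughly the span from 1949 to the late 2000s

growth_factor = (1 + ANNUAL_GROWTH) ** YEARS
print(f"After {YEARS} years at {ANNUAL_GROWTH:.0%} per year: "
      f"~{growth_factor:.0f}x the starting level of disagreement")
# about 19x -- small annual changes compound into a dramatic long-run shift
```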

The third chart places Republican (red) and Democratic (blue) representatives on a spectrum of ideology (defined by how often they vote with the rest of their party), then links members of opposite parties who vote together. The links grow larger and darker the more often representatives vote across party lines. The graphs' evolution over time is simply remarkable.

 

The researchers also found -- unsurprisingly -- that partisanship correlates with failure to introduce and pass legislation.

Things aren't looking good for bipartisanship in the United States of America. Compromise is what made our country great, yet it seems to be something we've forgotten how to do. One can't help but wonder when, or if, we'll remember.

Source: Andris C, Lee D, Hamilton MJ, Martino M, Gunning CE, et al. (2015) The Rise of Partisanship and Super-Cooperators in the U.S. House of Representatives. PLoS ONE 10(4): e0123507. doi:10.1371/journal.pone.0123507

(AP photo)

Beware of Possible Cuisine-Drug Interactions

Alex B. Berezow - April 20, 2015

Taking drugs, whether legal or illegal, creates problems. One of them is that drugs can interact with each other, often in bad ways. Frustratingly, some drugs are known to interact even with the various foods we like to eat. From the viewpoint of food-drug interactions, the most problematic food may be the humble grapefruit, which is known to interact with about 85 drugs, ranging from antidepressants and statins to clot-busters and Viagra.

Grapefruit isn't the only troublemaker. Dairy, garlic, and caffeine have also been implicated. This fact led a team of Macedonian researchers to ask a much broader question: Are certain cuisines susceptible to adverse food-drug interactions?

To determine this, the team assembled data on known food-drug interactions and linked it to data on common ingredients found in the world's cuisines. (See figure. The drugs are listed by ATC code, an international classification system. Also, note that the data is presented per mille, not percent.)

Panel A shows that Asian, Latin American, and southern European cuisines will probably have adverse interactions with drug categories B, C, and V, which affect blood/blood-forming organs, the cardiovascular system, and "various" other locations, respectively. The authors single out garlic and ginger as the problematic ingredients.

Panel B shows that drug categories D (dermatologicals), G (drugs affecting the genito-urinary system and sex hormones), and J (systemic anti-infectives, such as antibiotics and antivirals) will likely cause trouble for North Americans and most Europeans. Here, the culprit is milk.

Panel C shows that category R (respiratory system) drugs are irksome throughout much of the world, mostly because of their interaction with coffee, tea, and grapefruit. (However, note that the scale on panel C is much smaller than on panels A and B, meaning there are far fewer possible cuisine-drug interactions for this category.)

Analyzing all the potential food-drug interactions, the authors determine that milk and garlic are the two most problematic ingredients in the world. (See figure. Panel A depicts where milk is commonly used, while Panel B depicts where garlic is commonly used.)

The authors point out that the spread of culture around the world -- due both to globalization and travel -- will bring people into contact with foods with which they have had little prior experience. Since about 70% of Americans are taking at least one prescription drug (and 20% are taking five or more), it would be quite useful to know how various cuisines interact with them.

Source: Milos Jovanovik, Aleksandra Bogojeska, Dimitar Trajanov & Ljupco Kocarev. "Inferring Cuisine - Drug Interactions Using the Linked Data Approach." Scientific Reports 5, Article number: 9346. Published: 20-March-2015. doi:10.1038/srep09346

(AP photo)

The First Ever Video of a Cracking Joint

Ross Pomeroy - April 15, 2015

Scientists based out of the University of Alberta have -- for the first time -- imaged a joint cracking in real time, effectively putting to rest a decades-long debate in the process. They revealed their success in the journal PLoS ONE.

Doubtless you've experienced the physiological wonder that is a cracking knuckle. The audible pop it makes can sometimes be heard across an entire room, making many bystanders wince. But they probably have nothing to cringe about. While joint cracking may sound painful, it's not associated with any adverse health effects -- arthritis, for example.

Everyone knows that bending or stretching a joint is what causes it to crack, but what's going on under the skin? First off, a joint is where two bones meet. At the ends of each bone is soft, cushioning cartilage. Connecting the cartilage -- and thus the bones -- is a synovial membrane that's filled with a thick, lubricating fluid. Bending the joint can cause the membrane to stretch, which in turn causes the pressure inside it to drop and a bubble of dissolved gas to form within the fluid. The whole process is called tribonucleation.

“It’s a little bit like forming a vacuum,” says Professor Greg Kawchuk, the lead researcher. “As the joint surfaces suddenly separate, there is no more fluid available to fill the increasing joint volume, so a cavity is created...”

For decades, prevailing wisdom has held that the popping noise is tied to these bubbles, but scientists have debated whether the sound is caused by the bubble's formation or its collapse.

Thanks to Kawchuk and his team, we now know it's the former. When they watched a volunteer's knuckles crack inside an MRI machine in real time, the pop clearly occurred when the bubble formed. Moreover, the bubble persisted well after the sound was heard.

"This work provides the first in-vivo demonstration of tribonucleation on a macroscopic scale and as such, provides a new theoretical framework to investigate health outcomes associated with joint cracking," the researchers say.

Source: Kawchuk GN, Fryer J, Jaremko JL, Zeng H, Rowe L, Thompson R (2015) Real-Time Visualization of Joint Cavitation. PLoS ONE 10(4): e0119470. doi:10.1371/journal.pone.0119470
