Chewing gum has already been shown to boost cognitive function and suppress appetite. Now scientists have provided another reason to do it.
According to new research in PLoS ONE, gum traps harmful bacteria that can cause dental cavities. When you spit it out, those bacteria are removed along with it.
The researchers calculated that a single piece of typical "tab" gum can trap up to 100,000,000 bacteria, or roughly ten percent of the microbial load in saliva. They further estimated that chewing gum removes about as many bacteria as flossing does, though flossing targets different areas of the mouth.
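A quick back-of-the-envelope check (a sketch using only the rounded figures quoted above, not the paper's methods) shows what those numbers imply about the total microbial load in saliva:

```python
# Figures quoted above: one piece of gum traps up to 1e8 bacteria,
# said to be roughly 10% of the microbial load in saliva.
trapped = 100_000_000   # bacteria trapped by a single piece of gum
fraction = 0.10         # fraction of the salivary load this represents

# Implied total microbial load in saliva: about one billion bacteria
salivary_load = trapped / fraction
```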
For the study, five biomedical engineering students generously donated their time to chewing two standard types of gum for lengths of time ranging from 30 seconds to ten minutes. Afterwards, the gum was spat into a cup of sterile water and analyzed.
The researchers found that the longer a piece of gum is chewed, the more microbial species it captures from the mouth. However, after about thirty seconds of chewing, the gum starts to lose its adhesiveness, so the total number of bacteria it traps declines.
Not all gum may benefit your dental health. Sugar-sweetened gum effectively feeds oral bacteria. When these microbes ferment sugars, the biofilm on your teeth grows more acidic, which, in turn, leads to cavities. Gum with artificial sweeteners, however, does not have this effect. In fact, some artificial sweeteners actually have antimicrobial properties.
The researchers think their study may lead to the production of more advanced gums that vastly improve dental health.
"Our findings that chewing of gum removes bacteria from the oral cavity, may promote the development of gum that selectively removes specific disease-related bacteria from the human oral cavity," the researchers write.
Source: Wessel SW, van der Mei HC, Morando D, Slomp AM, van de Belt-Gritter B, et al. (2015) Quantification and Qualification of Bacteria Trapped in Chewed Gum. PLoS ONE 10(1): e0117191. doi:10.1371/journal.pone.0117191
(Images: AP, Wessel et al.)
Peter Sunde, a co-founder of the file-sharing website The Pirate Bay, recently spent time in prison for aiding and abetting copyright crimes. Last November, he described prison life to The Guardian.
What is most difficult to cope with is the boredom, Sunde says. The days in prison merge into a grey mass, indistinguishable from each other. Sunde has trouble sleeping at night. “You become brain-dead in here,” he says. “A guy who has been here a long time said it best: what I miss most are new memories.”
Sunde's depressing assessment may not be far off. According to a new review published in the journal Frontiers in Psychology, prison does the brain little good.
Reviewers based at VU University Amsterdam in the Netherlands pored through the scientific literature for any studies that compared the cognitive functions of inmates to those of the general population. Their search turned up only seven studies, but those seven told a convincing tale.
"Distinct executive function deficits were found in attention, set-shifting, working memory, problem-solving and inhibition," the researchers report.
Considering that a previous analysis found a strong link between executive function deficits and criminality, the researchers argue that prisons should focus on mending inmates' brains with a more "enriched" environment, instead of simply keeping them locked up. Doing so could very well lower recidivism rates, which currently range between 35 and 67 percent.
"Prison... is... a clear example of an impoverished and sedentary environment... characterized by a lack of demand on self-regulating functions," the researchers say. "For example, prisoners sit or lie on their beds for a striking 9.36 hours per day on average, besides the hours spend sleeping. As we know from animal studies, an impoverished environment has a negative effect on the prefrontal cortex, a brain region crucial for executive functions."
The researchers admit that they can't be sure whether the prison environment is actually deteriorating executive functions. After all, antisocial individuals who commit crimes likely already suffer from impaired mental faculties. It would be easy to find out, however.
"We advise recruiting new detainees for a baseline measurement, and reassessing these new detainees within a certain timeframe, e.g. three or six months," they suggest.
Source: Meijers J, Harte JM, Jonker FA and Meynen G (2015). Prison brain? Executive dysfunction in prisoners. Front. Psychol. 6:43. doi: 10.3389/fpsyg.2015.00043
Zzzzzzap! Ahhh, the gratifying sound of another insect biting the dust on a humid summer night. The hauntingly seductive blue glow of the bug zapper attracts thousands of unsuspecting insects to their untimely demise. Phototactic creepy-crawlies simply cannot resist moving toward the light, and when they arrive, a 2,000-volt wire mesh awaits them.
As handy as these tiny execution chambers are, they are not at all useful for farmers who are trying to protect precious stored grains and other crops from being eaten by insects. Instead, it is typical for farmers to control pests, such as beetles and moths, with insecticides. Because insecticides are toxic to humans, however, finding a substitute would be ideal. Now, such an alternative might exist.
Japanese researchers from Tohoku University describe in the journal Scientific Reports that certain wavelengths of visible light are lethal to certain species of insects. For instance, blue light (wavelength = 467 nm) was nearly 100% lethal to fruit fly pupae, while ultraviolet A light (wavelength = 378 nm) was only about 40% lethal. (See figure; ignore lower case letters.)
The authors went on to show that blue light (467 nm) also had a lethal effect on fruit fly eggs. Any adults that survived the light treatment had greatly reduced lifespans and produced fewer eggs.
Similarly, other insects were killed by visible light, but at different wavelengths. The authors showed that pupae of the London Underground mosquito (Culex pipiens molestus) were killed by violet/indigo light (417 nm), while pupae of the confused flour beetle (Tribolium confusum) were killed by several different wavelengths of light, ranging from violet to blue.
The authors conclude that lethality due to visible light is wavelength-dependent and species-specific. In other words, some kinds of light kill some types of bugs, while other types of light kill other types of bugs. Additionally, previous research suggests that the insects are dying because blue light likely triggers the production of reactive oxygen species, which damage important cellular structures and molecules.
An obvious benefit of their research is that farmers could use this knowledge to kill certain species of pests while leaving friendlier insects unharmed, reducing pesticide use in the process. The downside is that multiple wavelengths of light (and hence, multiple LEDs) would be required to kill all of the pests that threaten farmers. A single source of UVC light, the most lethal type of UV light, would be far more efficient at killing insects; however, UVC kills insects indiscriminately and is toxic to humans and other mammals.
Source: Masatoshi Hori, Kazuki Shibuya, Mitsunari Sato & Yoshino Saito. "Lethal effects of short-wavelength visible light on insects." Scientific Reports 4, Article number: 7383. Published: 09-December-2014. doi:10.1038/srep07383
Last week, the CDC released data that made big news: Six Americans die daily from alcohol poisoning. But, as usual, the most interesting information lay beyond the headlines.
The CDC ranked each U.S. state according to the (age-adjusted) death rate from alcohol poisoning. (Note: DC, DE, HI, ND, and VT were not included because the number of deaths was too low for reliable statistical analysis.) The results are reformatted in Excel and shown below:
Whoa. What is going on in Alaska? The explanation offered by the Boston Globe was that the large Native population in Alaska was to blame for the state's high numbers.
There is a lot of truth to that assertion. By race, the CDC reports that Native Americans have the highest death rate from alcohol poisoning (49.1 per million). No other race comes close. (The second worst rate is for Hispanics at 9 per million.) Alaska and New Mexico, which had the highest rates of alcohol poisoning deaths, are also ranked #1 and #2, respectively, for percentage of population that is Native American, according to census data. Similarly, South Dakota and Oklahoma, which rank #3 and #4 in terms of percent Native population, also have high rates of alcohol poisoning deaths.
But, this does not explain all of the CDC's findings. Montana, which ranks #5 for percent Native population, has an alcohol poisoning death rate that is below the national average of 8.8 deaths per million. And, Hawaii, which has a large Native population (though they are not technically considered "Native Americans"), has so few deaths from alcohol poisoning that the CDC did not consider the state in its analysis.
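The relative risk implied by these rates can be checked with simple arithmetic (a sketch using only the rounded figures quoted above):

```python
# Age-adjusted alcohol-poisoning death rates quoted above, in deaths per million
rates = {
    "Native American": 49.1,
    "Hispanic": 9.0,          # second-highest rate by race
    "national average": 8.8,
}

# Native Americans die of alcohol poisoning at roughly 5.6 times the national rate
relative_risk = rates["Native American"] / rates["national average"]
```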
So, what explains the unusually high alcohol poisoning death rates in Alaska and New Mexico? Perhaps it is sheer boredom. Boredom, as it turns out, is a risk factor for substance abuse.
Consider life in Alaska. It's a beautiful state, but in the wintertime, sunlight is scarce. On the shortest day of the year, Anchorage, the most populous city, receives a mere 5.5 hours of sun; Fairbanks gets less than four hours. It is quite possible that living a substantial portion of one's life in what must feel like never-ending darkness drives some people toward alcoholism.
How about New Mexico? Boredom probably plays a role there, too. In the U.S. West, many Native Americans live on reservations, which are notoriously poverty-stricken. And poverty, which can lead to boredom, is itself linked to substance abuse.
Source: Centers for Disease Control and Prevention. "Vital Signs: Alcohol Poisoning Deaths — United States, 2010–2012." MMWR 63 (53): 1238-1242. January 9, 2015.
Women seeking graduate and doctoral degrees in science no longer fall victim to a "leaky pipeline," a new report in the journal Frontiers in Psychology shows.
According to the decades-old metaphor, women are more likely than men to leave science at multiple points from the beginning of college through attaining academic tenure.
But while the pipeline certainly "leaked" in the 1970s, it appears to have been repaired in the 1990s.
Northwestern University's David Miller and Duke University's Jonathan Wai discerned the finding by analyzing longitudinal data from the National Survey of College Graduates and the Survey of Doctoral Recipients. Breaking the data down by field -- computer and mathematical science, engineering, life science, physical science, and social science -- they found that, in every single one, the percentage of women who earn a bachelor's degree and go on to receive a doctorate is roughly equal to that of men.
The closing of this gap doesn't mean that women scientists don't face bias, of course. A much-talked-about 2012 study found that female scientists applying for a lab manager position were rated less competent and were offered smaller starting salaries than equally experienced male applicants. Another study found that faculty ignored emails from prospective female graduate students more frequently than those from prospective male students. And just last year, a report in PLoS ONE showed that female students are more likely than their male counterparts to be victims of sexual harassment during field experiences.
So even though science still harbors a latent bias against women, the current report shows that it doesn't seem to be stopping female graduates from reaching the upper echelons of scientific achievement.
Source: Miller DI and Wai J (2015). The Bachelor’s to PhD STEM Pipeline No Longer Leaks More Women Than Men: A 30-Year Analysis. Front. Psychol. 6:37. doi: 10.3389/fpsyg.2015.00037
If your parents or grandparents were like mine, you probably heard this as a child heading out to play in the snow: "Put your hat and scarf on. If you don't, you'll catch a cold." Years later, all grown up with a microbiology doctorate hanging on my wall, I know that viruses cause colds, not chilly weather. Their admonitions, while well-intentioned, were based on nothing but folklore and superstition. Right?
Perhaps not. New research published in PNAS shows that colder temperatures may make your immune system more susceptible to rhinoviruses, the most common cause of colds.
It was previously known that rhinoviruses are more likely to infect the nasal cavity than the lungs. The nasal cavity, which is more exposed to the environment, is cooler (33-35 degrees C) than the lungs (37 degrees C), and rhinoviruses replicate more efficiently at those lower temperatures. The authors were able to reproduce this in mouse airway epithelial cells. (See figure.)
As shown above, after about 40 hours, a mouse-adapted rhinovirus (RV-1BM) incubated with mouse cells at 33 degrees (blue circles) replicated 1,000 times more efficiently than the same virus incubated at 37 degrees (red circles). (Note that the Y-axis is logarithmic. Also, ignore RV-1B, which is not mouse-adapted.) Thus, the authors' model confirms previous observations.
Then, the authors turned to the real question: Why? What is it about rhinoviruses that makes them prefer 33 degrees? Previous research has failed to provide an explanation. There appears to be nothing about the virus itself that makes it favor lower temperatures. So, the authors turned their attention to the host's immune response. And here, they found their answer.
Mouse cells infected with rhinovirus produce a weak innate immune response at 33 degrees C. The authors discovered that secretion of interferon-beta, a molecule important in the body's anti-viral response, is much lower at the colder temperature. (See figure.)
Furthermore, the authors found that other cells are less responsive to the presence of interferon-beta at 33 degrees. In other words, the lower temperature diminishes the host's front-line immune system.
This research hints that keeping your nasal cavity warm might be an effective way to prevent rhinovirus infection. Maybe Mom was right, after all.
Source: Ellen F. Foxman et al. "Temperature-dependent innate defense against the common cold virus limits viral replication at warm temperature in mouse airway cells." PNAS. Published online before print. 5-Jan-2015. doi: 10.1073/pnas.1411030112
As the heart of the United States braces for a wicked cold snap next week, there's also chilling news coming out of the journal PLoS ONE. On Wednesday, neuroscientists from the United Kingdom reported that cold is contagious. Yes, just looking at someone who's shivering or experiencing frigid temperatures can cause parts of your own body to become colder.
For the study, the researchers had 36 participants sit in a temperature-controlled room and watch videos of actors placing one of their hands in either visibly steaming water, ice water, or neutral still water. Each subject watched ten total videos, four each featuring warm and cold water with different actors who used either their left or right hand, as well as two control videos. All of the videos lasted two minutes.
Throughout the process, the researchers closely monitored subjects' heart rates and hand temperatures. While watching the warm and neutral videos did not produce any changes in subjects' hand temperature, watching the cold videos caused a small, but unmistakable drop. The temperature of subjects' right hands fell by an average of 0.1 degrees Fahrenheit, and the temperature of their left hands fell 0.4 degrees. There was no change in heart rate.
Why didn't the warm videos prompt a warming effect? The authors suggest that the warmth of the water, as indicated by the steam, may not have been as visible. On the other hand, the ice water was clearly frigid. They also note a prior review which showed that it's easier to elicit a decrease in skin temperature than an increase.
The current study adds a physical dimension to emotional contagion, the tendency for two individuals to mimic each other's expressions and emotional states. Emotional contagion is thought to be mediated by mirror neurons, brain cells that fire both when an animal performs an action and when it observes that action. The study also broadly substantiates an extreme case of human temperature fluctuation documented in 1920 by scientist J.A. Hadfield, who worked with a patient who was able to selectively adjust their right and left hand temperature by as much as 5 degrees Fahrenheit through suggestions of heat or cold.
Source: Cooper EA, Garlick J, Featherstone E, Voon V, Singer T, et al. (2014) You Turn Me Cold: Evidence for Temperature Contagion. PLoS ONE 9(12): e116126. doi:10.1371/journal.pone.0116126
Humans aren't the only species to slur their speech under the influence of alcohol. According to a new study published in PLoS ONE, zebra finches do it, too.
To discern the result, researchers from Oregon Health and Science University served up grape juice and ethanol mixers to a host of thirsty songbirds. The birds were allowed to drink at their leisure. All the while, the researchers closely observed and recorded their vocalizations.
Most of the birds reached an intoxication level equivalent to a human downing roughly 2-3 drinks over two hours. One finch, however, didn't quite know when to quit, imbibing to the level of a binge-drinking human, about 4-5 drinks over two hours.
Compared to the finches that were served only juice, the finches that drank alcohol sang with a lower amplitude and more disorder. The duration, pitch, and frequency modulation of their song also differed, although not at statistically significant levels.
"Overall, alcohol has clear effects on zebra finch song, establishing this species as an informative model to study the effects of alcohol on a cognitive skill with similarities to human speech acquisition," the researchers say.
Believe it or not, scientists still aren't precisely sure how alcohol causes slurred speech. But since we know more about the far less complex zebra finch brain, we might be able to pinpoint the pathways involved and extrapolate to humans.
The current study represents the first published work which characterizes zebra finch drinking songs. First author Christopher Olson previously announced the discovery in 2012 at the Society for Neuroscience meeting.
Source: Olson CR, Owen DC, Ryabinin AE, Mello CV (2014) Drinking Songs: Alcohol Effects on Learned Song of Zebra Finches. PLoS ONE 9(12): e115427. doi:10.1371/journal.pone.0115427
(Images: Niagara Falls Aviary,
Yersinia pestis, the bacterium that causes bubonic plague, also known as the Black Death, is a relatively new species. Research suggests it diverged from its nearest living bacterial ancestor no more than 6,400 years ago. During this transition, the genetics of Y. pestis changed. Most notably, it acquired genes that helped it survive inside fleas.
However, a big mystery remained. The ancestor from which Y. pestis evolved, called Yersinia pseudotuberculosis, is deadly to fleas. The bacterium causes lethal diarrhea in about 40% of them. Now, researchers report in the journal PNAS how a key mutation in Y. pestis fixed this problem and aided the bacterium's ability to spread worldwide.
Relatively limited prior knowledge of the pathogenesis of Y. pseudotuberculosis meant the authors had to go on the molecular biology equivalent of a scavenger hunt. Using a process called cellular fractionation, in which the bacteria were chopped up, separated into various parts, and fed to fleas, the authors found that the toxic protein resided in the bacterial membrane. Then, they further separated the proteins using gel electrophoresis (see image) and identified those whose expression differed between Y. pestis and Y. pseudotuberculosis. Their hunt eventually homed in on protein #12 (see image), which they identified as an accessory protein to an enzyme called urease.
Specifically, the protein accessory is called UreD. And here, the authors struck gold. In Y. pseudotuberculosis, the ureD gene is functional; in Y. pestis, it is mutated and nonfunctional. The result is a broken urease enzyme in Y. pestis. Presumably, urease (which breaks down the chemical urea to produce ammonia) poisons the flea. This would explain, therefore, why Y. pseudotuberculosis causes a lethal diarrhea, while Y. pestis does not.
Driving their point home, the authors restored ureD function in Y. pestis. As predicted, Y. pestis with functional urease killed many of the fleas.
Thus, the authors concluded that there exists strong evolutionary pressure for Y. pestis to lose its urease activity. As microbiologist Stanley Falkow famously said, "The goal of every bacterium is to become bacteria." For Y. pestis, achieving this goal is facilitated by dropping the urease enzyme because it allows more fleas to survive. And the more fleas that survive, the more rats (and humans) that can become infected.
It should be noted, however, that though the mutation in ureD makes Y. pestis nontoxic to fleas, the little insect still does not escape unharmed. Instead of causing diarrhea, Y. pestis forms a biofilm inside of the flea's digestive tract. This causes the flea to regurgitate its blood meals, which in turn has two effects: First, it facilitates transmission of the bacterium from inside the flea gut to rats and humans; second, it causes the flea to die slowly from starvation.
As it turns out, the evolution of Y. pestis was of little benefit to any species other than itself.
Source: Iman Chouikha and B. Joseph Hinnebusch. "Silencing urease: A key evolutionary step that facilitated the adaptation of Yersinia pestis to the flea-borne transmission route." PNAS. Published online before print: 1-Dec-2014. doi: 10.1073/pnas.1413209111
(Photo: Wikimedia Commons)
When you have a Big Mac, medium fry, and a medium Coca-Cola for lunch, you're not just consuming 1,070 calories, 64 grams of sugar, 43 grams of fat, and 1,150 milligrams of sodium; you're also eating 238,000 microbes -- mostly bacteria, with a few thousand yeast and mold organisms as well.
That's the finding from a trio of scientists at UC-Davis, who, for the first time, tallied the number and type of microorganisms present in the average American diet.
For their study, which was published Tuesday to PeerJ, researchers Jenna Lang, Jonathan Eisen, and Angela Zivkovic purchased and prepared a full day's worth of meals for three separate diets: an average American diet, a USDA-recommended diet, and a vegan diet. The American diet consisted of food from Starbucks and McDonald's, as well as frozen and packaged food from the grocery store. The USDA diet offered cereal, a variety of vegetables, and a turkey sandwich, among other selections. The vegan diet had lots of vegetables, nuts, and fruits, and rounded out with tofu, almond milk, and vegetable protein powder. All of the diets tipped the energy scale at a little over 2,200 calories.
After painstakingly purchasing, preparing, and cooking a day's worth of meals for each diet, the researchers meticulously weighed everything, then obliterated all of it in a blender for analysis.
The USDA diet contained the most microbes, roughly 1.26 billion. The vegan diet came in a distant second at just over 6 million, while the average American diet lagged behind at 1.39 million. Aerobic and anaerobic bacteria dominated the counts. Yeast and mold constituted a far smaller portion.
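For a sense of scale, the gap between the diets can be expressed as a ratio (a sketch using the rounded counts quoted above):

```python
# Daily microbe counts reported above, per diet
counts = {
    "USDA": 1_260_000_000,    # roughly 1.26 billion
    "vegan": 6_000_000,       # just over 6 million (rounded)
    "American": 1_390_000,
}

# The USDA diet delivered roughly 900 times as many microbes
# as the average American diet
usda_vs_american = counts["USDA"] / counts["American"]
```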
The researchers were quick to caution that these are only daily estimates, affected largely by the choice of foods. For example, the USDA diet had three foods -- yogurt, Parmesan cheese, and cottage cheese -- which contained live and active bacterial cultures. These sources immensely inflated the microbe count.
Lang, Eisen, and Zivkovic conducted the study because they noticed a dearth of information concerning the microbes that we eat each and every day.
"Far more attention has been paid to the microbes in our feces than the microbes in our food," Lang wrote.
"There could be interesting relationships between the nutritional content of the foods that we eat, the microbes that associate with those foods, and our gut microbiome, not just because we are “feeding” our gut microbes, but because we are eating them as well."
"Further studies are needed to determine the impact of ingested microbes on the intestinal microbiota, the extent of variation across foods, meals and diets, and the extent to which dietary microbes may impact human health."
Source: Lang JM, Eisen JA, Zivkovic AM. (2014) The microbes we eat: abundance and taxonomy of microbes consumed in a day’s worth of meals for three diet types. PeerJ 2:e659 http://dx.doi.org/10.7717/peerj.659