Despite what you may have heard during Shark Week, Megalodon -- the largest shark that ever lived -- is extinct. At least, that's what all of the best scientific evidence tells us.
The fossil record also clues us in to other fascinating facts about Megalodon. The giant weighed between 53 and 65 tons and grew to as much as 60 feet in length! Additionally, its teeth were over seven inches long, it could bite down with a force between 10.8 and 18.2 tons, and it frequently consumed large whales!
As exciting as it might be to encounter one of these behemoths on a fishing outing (and of course live to tell the tale), that's not at all likely to happen. One, you probably wouldn't survive, and two, you'd be roughly 2.6 million years too late.
That's the most complete estimate yet for when Megalodon went extinct, courtesy of a new analysis from Catalina Pimiento at the Florida Museum of Natural History and Christopher F. Clements at the Institute of Evolutionary Biology and Environmental Studies at the University of Zurich, published in PLoS ONE.
A previous estimate dated Megalodon's extinction to as recently as 1.5 million years ago, but the reports upon which that number is based are widely deemed unreliable. Pimiento and Clements arrived at the new number by combining the most recent, confirmed records of Megalodon fossil findings from the scientific literature and applying a probabilistic model to determine the most likely extinction date.
The model, termed Optimal Linear Estimation, has previously only been used to ascertain the extinction dates of modern species, so it's unknown how well it will apply to a species that lived millions of years ago. But given that the charismatic Megalodon has an abundant fossil record, the researchers are confident that their estimate will hold up, especially considering that the date of Megalodon's extinction has never before been quantitatively assessed.
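For intuition about why such a model is needed at all: fossil finds are sparse, so the youngest known fossil almost always predates the true extinction. The toy Monte Carlo simulation below (a hypothetical illustration of that sampling bias, not the paper's OLE procedure; all numbers are made up) shows the size of the gap that estimators like Optimal Linear Estimation are designed to correct:

```python
import random

def simulate_last_fossil_gap(true_extinction=2.6, n_fossils=20,
                             span=10.0, trials=10_000):
    """Scatter fossil finds uniformly over a species' lifespan (dates in
    millions of years ago, from `true_extinction` back to
    `true_extinction + span`) and return the average gap between the
    youngest fossil and the true extinction date."""
    gaps = []
    for _ in range(trials):
        finds = [random.uniform(true_extinction, true_extinction + span)
                 for _ in range(n_fossils)]
        gaps.append(min(finds) - true_extinction)  # youngest find vs. truth
    return sum(gaps) / trials

# With 20 finds over a 10-million-year span, the youngest fossil lands,
# on average, roughly half a million years before the true extinction.
print(simulate_last_fossil_gap())
```

The takeaway: simply reading off the date of the last known fossil would systematically overestimate how long ago the species died out, which is why a statistical correction is applied.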
The precise reasons for Megalodon's extinction are any paleontologist's guess. Ocean cooling and a drop in sea level may have been to blame, or possibly a decline in the mega shark's food supply -- large whales. Competition from killer whales may also have contributed.
Source: Pimiento C, Clements CF (2014) When Did Carcharocles megalodon Become Extinct? A New Analysis of the Fossil Record. PLoS ONE 9(10): e111086. doi:10.1371/journal.pone.0111086
(Image: Wikimedia Commons)
Thanks to my favorite troublemaker, Hank Campbell, the "who is more anti-vaccine" debate has sprung up again. In 2012, we co-authored a book, Science Left Behind, in which we argued that the anti-vaccine movement began on the political Left but spread to religious conservatives and libertarians. Because the most visible public spokespeople for the anti-vaccine movement (e.g., Robert F. Kennedy, Jr., Bill Maher, and Jenny McCarthy) are mostly on the political Left, we continue to believe that the Left should bear most of the blame. Still, some writers argue that the anti-vaccine movement is truly a bipartisan phenomenon.
New CDC data helps shed some more light on the issue. The CDC has compiled an updated list which depicts vaccine exemption rates in each U.S. state. (See map.)
As shown above, 11 states have 4% or more of the kindergarten population exempted from vaccines. (Generally speaking, the number of religious/philosophical exemptions dwarfed the number of medical exemptions.) The worst 11 states are listed below, from most exemptions to least, along with their Obama-Romney 2012 presidential vote margin (as a quick-and-dirty proxy for how liberal or conservative the state is):
Oregon (7.1%); Obama +12
Idaho (6.4%); Romney +32
Vermont (6.2%); Obama +36
Michigan (5.9%); Obama +9
Maine (5.5%); Obama +15
Alaska (5.3%); Romney +14
Arizona (4.9%); Romney +9
Wisconsin (4.9%); Obama +7
Washington (4.7%); Obama +15
Colorado (4.6%); Obama +5
Utah (4.4%); Romney +48
*Note: Illinois, Minnesota, Missouri and Wyoming did not report 2013-2014 data. However, according to the CDC's 2012-2013 data, Illinois had a vaccine exemption rate of 6.1%. (Its vote margin was Obama +17.) The other states had exemption rates below 4%.
The award for the most anti-vaccine state in the country goes to Oregon. This is not a surprise; the citizens of Portland are also afraid of fluoride. Overall, 4 of the 5 most anti-vaccine states are solid blue. (If Illinois is included, 5 of the 6 most anti-vaccine states are solid blue.) Including Illinois, 8 of the 12 most anti-vaccine states voted for Obama.
Which states are the most pro-vaccine (i.e., the states with exemption rates below 1%)?
Mississippi (<0.1%); Romney +11
West Virginia (0.2%); Romney +26
Virginia (0.6%); Obama +4
Alabama (0.7%); Romney +23
Delaware (0.8%); Obama +19
Louisiana (0.8%); Romney +17
New York (0.8%); Obama +28
Kentucky (0.9%); Romney +22
The two most pro-vaccine states are solid red, and 5 of the 8 most pro-vaccine states overwhelmingly voted for Romney.
There are a few other points worth making. First, the anti-vaccine movement has a strong presence in the West. The Western U.S., particularly states like Alaska, Idaho, and Washington, has a strong libertarian streak, and this libertarianism likely plays a significant role in anti-vaccine ideology. Second, as a whole, the conservative and religious Deep South is the most pro-vaccine part of the country.
The bottom line is that the CDC data makes it very difficult to argue that conservatives and liberals share equal blame in the anti-vaccine war. Anti-vaxxers are clearly more associated with the political Left.
Source: Centers for Disease Control and Prevention. "Vaccination Coverage Among Children in Kindergarten — United States, 2013–14 School Year." MMWR 63 (41): 913-920. (Oct. 17, 2014.)
The CDC's Morbidity and Mortality Weekly Report is a fascinating treasure trove of the macabre. Recently, it published a chart depicting how Americans committed suicide by age group in 2011. (See above.) While firearms are the most popular method in all age groups, a fascinating trend emerges: As age increases, suffocation (including hanging) becomes less popular and firearms become more popular. Why?
The CDC does not speculate, but perhaps the likeliest explanation is access to firearms. Older Americans are more likely to own guns than younger Americans. According to Pew, 26% of 18-29-year-olds own a gun, while 40% of people aged 50+ own a gun.
The issue of access might also partially explain global data. The U.S. leads the way in firearm suicides. According to a 2008 study by the World Health Organization, nearly 61% of suicides among American men are with guns. No other country comes even close. (2nd place goes to Uruguay, with about 48%.) Among women, America and Uruguay tie for 1st place, with about 36% of suicides being due to firearms. (The next closest country is Argentina, with roughly 26%.) Switzerland, which has a large number of guns per capita, also has a relatively high suicide-by-firearm percentage (among men, but not women).
It is widely believed that people who do not have access to one particular method of suicide will find another way. But, this is incorrect. Suicide is often an impulsive decision. That is why making suicide more difficult appears to reduce the suicide rate. In Prevention and Treatment of Suicidal Behaviour: From Science to Practice (PDF), Keith Hawton describes how the suicide rate fell in the United Kingdom as the composition of the domestic gas supply changed. Sticking one's head in the oven and inhaling carbon monoxide was a common method of suicide in the UK. However, as the level of carbon monoxide in the gas supply dropped, so did the suicide rate. Hawton goes on to discuss how reducing the availability of firearms also appears to lower the suicide rate.
Currently, the U.S. is experiencing the highest suicide rate in 25 years, and many analysts are referring to a "suicide epidemic." A sensible policy to lower the suicide rate in America would be to make gun ownership more difficult. But given our current political climate, that idea is almost certainly dead in the water.
Source: Centers for Disease Control and Prevention. "QuickStats: Percentage of Suicide Deaths, by Mechanism and Age Group — United States, 2011." MMWR 63 (38): 845. (Sept. 26, 2014.)
States with higher levels of religiosity and conservatism seek out more sexual content on the Internet, according to a new study.
Psychologists Cara MacInnis and Gordon Hodson of Brock University analyzed Internet use data in 2011 and 2012 with Google Trends, specifically measuring the search volume of the terms "sex," "gay sex," "porn," "xxx," "free porn," "gay porn," and the Google Image search term "sex." They then examined how the data was associated with state-level religiosity and conservatism. An immense 2011 Gallup survey of 350,000 people supplied those values. Religiosity was gauged by the percentage of "very religious" individuals within each state and the percentage of individuals considering religion "an important part of their daily lives." Conservatism was determined by the proportion of self-identified conservatives.
Initially, MacInnis and Hodson found a strong link between a state's level of religiosity and the search volume for "gay sex," "gay porn," and "sex." But after controlling for demographic variables like GDP and the rate of individuals below the poverty line, the associations for "gay sex" and "gay porn" evaporated. However, the link with the search term "sex" remained. Overall, the strongest and most persistent link was between conservatism and a Google Image search for "sex."
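"Controlling for" a demographic variable typically means computing a partial correlation: regress both quantities of interest on the covariate and correlate what's left over. A minimal sketch with synthetic data (not the study's data; the variable names are purely illustrative) shows how a raw association can evaporate this way:

```python
import numpy as np

def partial_corr(x, y, covariates):
    """Correlation between x and y after removing the linear effect of the
    covariates from both (the regression-residual method)."""
    Z = np.column_stack([np.ones(len(x))] + list(covariates))
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]  # residuals of x on Z
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]  # residuals of y on Z
    return np.corrcoef(rx, ry)[0, 1]

# Synthetic demo: both variables are driven partly by a shared covariate.
rng = np.random.default_rng(0)
gdp = rng.normal(size=500)
religiosity = gdp + rng.normal(size=500)
search_volume = gdp + rng.normal(size=500)

raw = np.corrcoef(religiosity, search_volume)[0, 1]
controlled = partial_corr(religiosity, search_volume, [gdp])
# The raw link is sizeable; it shrinks toward zero once "GDP" is controlled for.
print(round(raw, 2), round(controlled, 2))
```

This is presumably the kind of adjustment behind the disappearing "gay sex" and "gay porn" associations: the raw link was carried by the demographic covariates rather than by religiosity itself.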
The results, published last Friday in the journal Archives of Sexual Behavior, seem to support a 2009 study, which found that religious and conservative states tend to have higher adult content subscriptions per capita. Both studies raise the question: Do conservatives and religious people nurture a veiled, paradoxical fascination with sex?
It is possible. Since conservatives and religious people outwardly display more restrained views toward sex, curiosities might be more likely to manifest in private. The authors also note previous research which found that teens in more religious states engage in more premarital unprotected sex.
MacInnis and Hodson also offer an alternative explanation for the data, one that exonerates conservatives: The associations could be liberals' fault.
"It is possible that liberal citizens living in states higher in religiosity or conservatism search more for sexual content due to living in a more sexually-restricted environment."
In contrast to the partisan squabbling and heckling their study is sure to incite, the authors attempt to render a somewhat bipartisan conclusion.
"At minimum, these internet-search data clearly demonstrate that those living in states with greater proportions of very religious or conservative citizenry nonetheless seek out and experience the forbidden fruit of sexuality in private settings."
Source: Cara C. MacInnis, Gordon Hodson. "Do American States with More Religious or Conservative Populations Search More for Sexual Content on Google?" Archives of Sexual Behavior. 03 Oct 2014. DOI: 10.1007/s10508-014-0361-8
Patient: Good evening. I just called because I thought it was a cookie and ...
Physician: ... you have taken a bite of a dishwasher tablet haven’t you?
Patient: Yes. [...] I was watching a game on TV and I wanted to eat something, my wife is so clever, she put it with cookies we buy for our grandchildren.
It's a tale as old as mindless eating itself. Distracted, you reach for the cookie box, lift a morsel out, and take a bite. But when you taste soap instead of chocolaty goodness, you reluctantly turn your gaze from the television to the "cookie box" and realize you've been duped. Then -- naturally -- you blame the error on your spouse.
The above conversation actually took place, by the way -- it was captured by scientists during the course of completing a study recently published in PLoS ONE. French researchers recorded over 30,000 phone calls to the Marseille Poison Control Center over a period of 14 months in an attempt to determine why people accidentally consume household chemicals like cleaning solutions and personal hygiene products. Here's another sample from those recordings:
Patient: I’m calling because, in fact, you are going to laugh, but I actually screwed things up and took a tube of hair gel for a mayonnaise one.
One thing the researchers noticed when reviewing the calls was that people would commonly complain that the mistakenly eaten non-food product looked or smelled a lot like real food. In light of this common thread of anecdotes, they planned an experiment. Subjects would be brought into the laboratory and shown four real products -- two food and two non-food -- while inside an fMRI scanner. Researchers would watch subjects' brain activity for gustatory inferences, patterns centered in the orbitofrontal cortex, the fusiform gyrus, and the insular cortex that materialize when viewing appetizing foods.
Fourteen subjects took part in the experiment. The researchers found clear signs of gustatory inferences when subjects viewed Cottage Happy Shower Tequila Sunrise, a shower gel that conspicuously sounds a lot like a drink you'd order when vacationing in Cancun, and even features a push-pull top like those found on sports drinks for easy d̶r̶i̶n̶k̶i̶n̶g̶ lathering.
But despite the tempting packaging and even more enticing smell, you don't want to drink Cottage Happy Shower Tequila Sunrise ("a" in the image above). It won't make you feel very good. Yet that's exactly what at least one sane 41-year-old woman did, entirely by accident. Given their results, the researchers partially blame deliberate marketing tactics meant to disguise non-food products as food. They suggest that the similarity actually "fools" our brains.
"The consequence of a food metaphor applied to a hygiene product is that implicit gustatory inferences can be found in the brain of the consumers. Such inferences most certainly participate in the accidental ingestion of a hygiene product," the researchers say.
Obviously we can't lay all of the blame at the feet of manufacturers, but one has to admit, a Tequila Sunrise does sound delicious.
(Images: Upstate Medical University, Basso F, Robert-Demontrond P, Hayek M, Anton J-L, Nazarian B, et al)
Source: Basso F, Robert-Demontrond P, Hayek M, Anton J-L, Nazarian B, et al. (2014) Why People Drink Shampoo? Food Imitating Products Are Fooling Brains and Endangering Consumers for Marketing Purposes. PLoS ONE 9(9): e100368. doi:10.1371/journal.pone.0100368
Understanding trends in fertility is one of the most important tasks for demographers. Population growth, or the lack thereof, is linked to economic activity. For instance, as a general rule, wealthy countries have lower fertility rates than poor ones. That is why the "problem" of overpopulation is a self-correcting one; as the developing world becomes more advanced, we will expect its fertility rate to fall.
A new paper published in PNAS adds another layer of complexity to the fertility picture. The authors, both from Princeton University, established short- and long-term links between a rise in unemployment and a drop in fertility. Using birth records, they analyzed over 110 million conceptions in American-born women between 1975 and 2009. (Immigrants were excluded because it was not possible to determine when they had children. They also tend to have more children than the native born.) Women were divided into cohorts based on age and state of birth, and the unemployment rates for the mothers' states of birth were used in the analysis. (The authors note that, while it is true future mothers can move to different states -- perhaps ones with lower unemployment -- most (roughly 2/3) remain in the states in which they were born.)
Their analysis shows a clear correlation: As unemployment increases (x-axis), conception rates drop (y-axis) in every cohort, except women aged 40-44. The biggest drop occurred in women aged 20-24.
Further analysis showed that women who experienced a period of higher unemployment from the ages of 20-24 were likelier not to have any children at all. Specifically, the authors estimate that, if unemployment rises by 1 percentage point, 5 more women per 1,000 will choose to remain childless. The authors admit that the overall effect of the unemployment-fertility link on society, however, is small: The total number of conceptions for American-born women is 1,916 per 1,000; women aged 20-24 are predicted to have only 14 fewer conceptions for every 1 percentage point increase in unemployment.
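To see just how small, here is a back-of-the-envelope calculation from the paper's own figures:

```python
total_conceptions = 1916  # lifetime conceptions per 1,000 American-born women
drop_per_point = 14       # fewer conceptions (ages 20-24, per 1,000 women)
                          # for each 1-point rise in unemployment
childless_per_point = 5   # additional women per 1,000 who remain childless

print(f"{drop_per_point / total_conceptions:.1%} of lifetime conceptions")  # 0.7%
print(f"{childless_per_point / 1000:.1%} more women remain childless")      # 0.5%
```

In other words, a 1-point unemployment shock trims well under one percent from lifetime fertility in the most affected cohort.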
Small though it is, the trend is real. The authors created a "big picture" graph, showing how the conception rate falls as the national unemployment rate rises. (The blue rectangles indicate recessions.)
But note that the overall trend in conception rate is down. That is because there are likely other factors at play.
As discussed above, wealthier countries have fewer children. The United States has grown wealthier since 1975. The following chart, from Wolfram Alpha, shows how real GDP per capita (measured in constant 2005 dollars) has increased from $24,601 in 1975 to $49,506 in 2007.
Furthermore, female education is linked to a drop in fertility. The percentage of American women aged 25-29 who hold a bachelor's degree has increased from under 15% in 1969 to nearly 35% in 2009, as depicted in this graph from the Population Reference Bureau:
Thus, it is quite likely the long-term drop in conception rate among American-born women has multiple causes. It would be interesting to know which of these factors has the greatest impact.
Source: Janet Currie and Hannes Schwandt. "Short- and long-term effects of unemployment on fertility." PNAS. Published online before print: 29-Sept-2014. DOI: 10.1073/pnas.1408975111
"If science and medicine are so great, then why are so many people dying of cancer?" This question has been asked of me more than a few times. The answer is complex and multifaceted.
(1) Most importantly, we are dying of cancer because more of us are living long enough to die of cancer. Thanks to scientific and technological advances, Americans no longer drop dead of diphtheria (which, in 1900, was the #10 cause of death). In 1900, the average American lifespan was a paltry 46.3 years for males and 48.3 years for females. By contrast, in 2010, life expectancy was 76.2 years for men and 81.1 years for women. Of course, as Thaddeus Sim smartly points out on his blog, that doesn't mean that old people didn't exist in 1900. They did. But, a far smaller percentage of Americans made it to old age: Fewer than half of Americans made it to age 60 in 1900, but more than half of Americans made it to age 80 in 2000.
The point is that life expectancy and the percentage of Americans reaching old age are both increasing. That explains why, as a paper in The New England Journal of Medicine showed, cancer was the #8 cause of death in 1900 but the #2 cause of death in 2010.* We aren't dying of cancer because of Monsanto's pesticides and GMOs, as one lady recently said to me in an e-mail. We are dying of cancer because we are running out of things to die from. As George Johnson explained in the New York Times:
"[P]eople between 55 and 84 are increasingly more likely to die from cancer than from heart disease. For those who live beyond that age, the tables reverse, with heart disease gaining the upper hand. But year by year, as more failing hearts can be repaired or replaced, cancer has been slowly closing the gap."
(2) We are becoming better at diagnosing cancer. That's not necessarily good news. It is still possible that early detection will not cause you to live a moment longer. (This is a phenomenon referred to as lead time bias.) But, knowing the cause of death is better than not knowing, and we have become quite good at medical diagnostics.
(3) We are, indeed, slowly winning the war against cancer. How so? As a 2014 paper in CA: A Cancer Journal for Clinicians explains, a combination of factors -- including early detection, preventative measures, and improved treatment -- has reduced the cancer mortality rate from a peak of 215 deaths per 100,000 people in 1991 to 172 deaths per 100,000 people in 2010. A graph from the paper beautifully summarizes:
The red line indicates the number of deaths that would have occurred had the mortality rate stayed at its peak; the blue line shows deaths at the actual rate. The gap between the two means the drop in the cancer mortality rate has saved roughly 1.3 million American lives since 1991.
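The lives-saved figure comes from summing, year by year, the gap between the deaths expected under the frozen 1991 peak rate and the deaths actually recorded. Here is a toy version of that bookkeeping, with hypothetical populations and rates rather than the paper's actual data:

```python
# Hypothetical year-by-year inputs: (population, actual deaths per 100,000).
# The counterfactual holds the mortality rate at its 1991 peak of 215.
PEAK_RATE = 215
years = [
    (260_000_000, 210),
    (270_000_000, 200),
    (280_000_000, 190),
    (300_000_000, 172),
]

deaths_averted = sum(pop * (PEAK_RATE - rate) / 100_000 for pop, rate in years)
print(f"{deaths_averted:,.0f} deaths averted")  # 252,500 for these made-up years
```

Run over two decades of real population and mortality data, the same arithmetic yields the paper's 1.3 million total.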
That sounds like another win for science. Cancer, you're on notice.
*Note: In 1900, the top three causes of death were due to infectious disease: pneumonia/influenza, tuberculosis, and gastrointestinal infections. By contrast, in 2010, the top three causes of death were heart disease, cancer, and noninfectious respiratory disease. This is yet another triumph of science, particularly microbiology, as my favorite microbiology textbook claims.
Source: Rebecca Siegel, Jiemin Ma, Zhaohui Zou, Ahmedin Jemal. "Cancer Statistics, 2014." CA: A Cancer Journal for Clinicians 64 (1): 9-29. Jan/Feb 2014. DOI: 10.3322/caac.21208
You hold in your hands a delicious slice of pizza. Globs of grease sparkle on its cheesy surface. The toppings are piled on and arrayed splendidly. Is this food or art? You start to salivate. But you're not the only one.
Within your gut, bacteria of the phylum Bacteroidetes are salivating, too (figuratively, of course). Fat is one of their favorite foods, and they're about to receive a feast.
Researchers have long pondered our relationship with the 100 trillion bacteria residing within our guts. It seems these squatters dip their flagella in a multitude of pots. Our resident bacteria aid in digestion, influence the immune system, change how we store fat and metabolize sugar, and even prevent allergies.
Therefore, it might come as little surprise that they may also be controlling our minds.
In an article published in the September issue of BioEssays, scientists Joe Alcock, Carlo C. Maley, and C. Athena Aktipis reviewed the research on how microbiota affect the brain, and believe there's a strong case that bacteria influence overall eating behavior. It seems that the bacteria in our guts don't simply wait for whatever leftovers we have to offer. They actively seek out their preferred meals through tricksy deception.
“Microbes have the capacity to manipulate behavior and mood through altering the neural signals in the vagus nerve, changing taste receptors, producing toxins to make us feel bad, and releasing chemical rewards to make us feel good,” Aktipis says.
Around 100 million neurons are stationed in the gut, collectively forming the enteric nervous system, also called the "Second Brain." The enteric nervous system is connected to the human brain via the vagus nerve. Thanks to this setup, bacteria are granted a streamlined pass to the brain, and they're equipped to take advantage. For example, microbes have genes that allow them to produce hormones like serotonin and dopamine.
Studies in humans have shown that probiotics can improve mood. In mice, the effects are even more pronounced. When a team transplanted the gut bacteria from fearless mice into more anxious mice, the anxious mice began displaying markedly bolder behavior. The behavioral change worked in reverse, too.
"Like microscopic puppetmasters, microbes may control the eating behavior of hosts through a number of potential mechanisms including microbial manipulation of reward pathways, production of toxins that alter mood, changes to receptors including taste receptors, and hijacking of neurotransmission," the researchers write.
Basically, bacteria will send positive signals to the brain when you eat foods that they like, and negative signals when you eat foods they don't like.
Since fecal and oral bacteria can be transferred between individuals, particularly among people living together, the authors consider an intriguing possibility.
"The obesity epidemic could be contagious as a result of obesity-causing microbes transmitted from person to person. A social network study of 12,067 people found that a person’s chance of becoming obese increased by 57% if a friend had become obese. This raises up the possibility that cravings and associated obesity might not be socially contagious as the authors of the social network study suggest, but rather truly infectious, like a cold."
The authors offered up some routes for further research. For instance, they hypothesize that populating an animal's gut with a microbe that feeds on seaweed will cause the animal to develop a preference for seaweed. They also suggest that a diverse microbiome will inhibit manipulation on the host because the microbes will be too busy competing or cooperating with each other.
Far more controlled research on humans will be needed before we can truly gauge the extent to which bacteria control our food preferences.
Source: Alcock, J., Maley, C. C. and Aktipis, C. A. (2014), Is eating behavior manipulated by the gastrointestinal microbiota? Evolutionary pressures and potential mechanisms. Bioessays, 36: 940–949. doi: 10.1002/bies.201400071
(Image: San Diego State)
"Defecation is a major activity of daily living." Thus begins a new paper in the journal Scientific Reports, and truer words have rarely been written.
Most humans who are physically unable to drop a deuce, perhaps from cancer or an anatomical anomaly, must undergo a colostomy. In this medical procedure, the large intestine is diverted via a surgically created opening (called a stoma) to an external bag that collects feces, which must be emptied regularly. Many people who wear such devices suffer from psychological problems and a lower quality of life. An alternative -- any alternative -- is preferable.
Japanese researchers have turned to anorectal transplants. In their estimation, this has the potential to surpass traditional reconstructive surgical techniques, which do not properly restore anal function. In order to develop the technique, the authors practiced on four healthy beagles, performing the very first successful anorectal autotransplant in dogs. (In an autotransplant, the transplanted tissue comes from the same animal.)
The researchers removed the existing anorectal region, replaced it with a graft formed from other nearby tissues, and reconnected the blood vessels and nerves. The dogs were then fed a liquid diet. The transplant in Dog #1 was unsuccessful, and it was euthanized the next day. Dogs #2 and #3 survived for three and four days, respectively, before being euthanized.
Dog #4 survived nine days, and the researchers injected it with a dye (indocyanine green) in order to monitor and collect data on blood flow to the tissue graft, a procedure known as an angiography. (See figure.)
As shown above, figure (i) depicts the anorectal region, while figures (j) and (k) depict the anorectal region before and after the angiography had begun. The colored dots represent areas in which the authors were interested in collecting blood flow data. (Red = positive control; Blue = negative control; Yellow = anal canal; Green = perianal skin.) Blood flow, as illustrated in figure (l), was considered satisfactory and compared favorably to a dog that never underwent surgery.
Likely because the dogs were only fed a liquid diet, they did not defecate after the surgery. So, the authors cannot definitively determine if the graft functioned properly. However, demonstrating normal blood flow post-surgery is a positive first step.
Now, you might be wondering, Why did they practice this technique in dogs? Good question. Other researchers have successfully attempted similar procedures in rats and pigs. However, the authors claim that these animal models are unsatisfactory because they do not exhibit the same bowel control that dogs and humans do. In other words, man and man's best friend can poop (or not) on command. Thus, it is best to perfect the technique in dogs.
How about non-human primates? They are problematic because, as the authors write, "they are not usually trained with good lavatory manners." Indeed.
Source: Jun Araki et al. "Anorectal autotransplantation in a canine model: the first successful report in the short term with the non-laparotomy approach." Scientific Reports 4, Article number: 6312. Published: 10-Sept-2014. doi:10.1038/srep06312
A new study published in the journal Nature has found that zero-calorie artificial sweeteners promote glucose intolerance in mice and humans.
Glucose intolerance is marked by higher-than-normal levels of blood sugar. It's a known precursor to diabetes and a risk factor for cardiovascular disease.
Scientists at the Weizmann Institute of Science in Israel spearheaded the research using a step-by-step approach. First, the researchers added one of three artificial sweeteners -- sucralose, aspartame, or saccharin -- to the water of three groups of mice. Three other control groups of mice were given plain water or water with added sucrose or glucose. Eleven weeks later, the mice consuming artificial sweeteners exhibited marked glucose intolerance compared to the control groups.
Since most artificial sweeteners aren't broken down by the digestive system, the research team hypothesized that the glucose intolerance was mediated by changes to gut bacteria. To test this, they induced glucose intolerance in both lean and obese mice with the artificial sweetener saccharin, then gave both groups antibiotics to kill off gut bacteria. After the antibiotic regimen, both groups of mice saw their glucose response and blood sugar levels return to normal.
For further confirmation that the gut flora was regulating glucose intolerance, the team transplanted gut bacteria from mice fed saccharin to control mice. The control mice exhibited signs of glucose intolerance six days later.
To see whether their results in mice would carry over to humans, the researchers conducted two experiments. First, they examined a group of 381 individuals without diabetes and queried them about their intake of artificial sweeteners. They found that artificial sweetener intake was linked to various metabolic parameters, including increased weight, a larger waist-to-hip ratio, and higher blood sugar levels. Second, the researchers fed seven subjects the FDA's maximal acceptable daily intake of saccharin -- equal to roughly 40 cans of diet soda -- for seven days and observed what happened. Four of the individuals developed significantly poorer glycemic responses. The other three also showed worse responses, though the differences did not reach statistical significance.
The authors of the study feel their results merit a serious "reassessment" of massive artificial sweetener use. But others aren't so sure.
Naveed Sattar, Professor of Metabolic Medicine at the University of Glasgow, says that the study's findings regarding mice are interesting, but notes that we should be skeptical of extending them to humans.
“The findings of this study do not prove that sweeteners pose any real risk to humans. If there are any risks, we need well controlled studies in humans to find them.”
Sir Stephen O'Rahilly, a Professor of Clinical Biochemistry and Medicine at the University of Cambridge, agreed with Sattar.
"The authors report an association between artificially sweetened beverages and markers of diabetes in 381 people; however, a recent study involving more than a third of a million people showed no association between consumption of artificially sweetened drinks and the development of diabetes. Suez et al also report adverse effects on glucose levels after 7 days of saccharin ingestion. However these experiments were undertaken in only seven people, so must be deemed preliminary."
According to the Harvard School of Public Health, "The health benefits of artificial sweeteners are inconclusive." Some studies show that they are healthy replacements for sugar, while others have shown no net benefits.
While the current study presents intriguing findings in mice, its human components lack rigor. So don't feel compelled to ditch diet soda just yet.
In general, it's probably best to avoid habitually consuming copious amounts of sugar or artificial sweeteners.
Source: Jotham Suez et al. "Artificial sweeteners induce glucose intolerance by altering the gut microbiota." Nature. 18 Sept. 2014. doi:10.1038/nature13793