Is time travel possible? Physics gives us absolutely no reason to believe that it could be. Last month, research on the possibility of a sort of time travel was published and breathlessly reported by popular science media. Can time travel as envisioned by Madeleine L’Engle and Doctor Who now actually occur? The feasibility of this new claim relies upon two fantastic concepts of modern physics: quantum clones and wormholes.
Quantum cloning sounds like a science fiction plot device. Simply put, it means making an exact replica of a quantum state. Like many scientific plot devices, it is not real. (Some research has gone into imperfect cloning, but a clone by definition is a perfect copy; the term is a misnomer.) Quantum cloning is forbidden in the universe as we know it by a rule with an equally silly name: the no-cloning theorem. Without resorting to mathematical proofs, we can say that quantum cloning is impossible because it would allow us to beat the quantum uncertainty principle.
In a new paper, physicists claim that we can create a quantum clone of a state from the past. This is a sort of time travel. It’s an exciting idea, and the authors include a handy “circuit diagram” for their idea. (See figure, below right.)
The important things to observe in this arcane technical figure are the three lines at the bottom, which I’ve circled in red. These are wormholes! The quantum state to be copied, called ρ, is measured by the operation UICM, which outputs a measured state. The measured state then must enter a wormhole and be retrieved again. To make this process work well, you need lots and lots of wormholes. In fact, it only makes perfect clones when you have so many wormholes that you can’t count them all.
So, what’s the problem? For one thing, wormholes may not exist. But, even if they do, there is no known way to send anything through one. Wormholes, at present, are entirely science fiction.
Albert Einstein and Nathan Rosen first proposed wormholes in the 1930s. Certain configurations of wormholes are theoretically allowed by solutions to Einstein’s theory of general relativity, which geometrically describes gravity. These allowable wormholes have never been observed experimentally. Another great physicist, John Wheeler, then discovered that nothing could be sent into such a wormhole and recovered out of it again within one universe. Other types of theoretical wormholes may be traveled through, but only with the aid of such illusory media as “negative mass matter.”
The obvious fantasy of this idea is clear without resorting to deep analytical methods of quantum mechanics and gravitational theory. Further problems still arise with time travel due to simple logic. You could travel back in time and stop your grandfather from meeting your grandmother. Then you would never be born in the first place. But then you could not have gone back to separate your grandparents. These types of paradoxes of causality plague any idea of time travel to the past.
Be skeptical of any theory that produces the possibility of time travel, particularly if it relies on wormholes. That’s not a theory based on physical reality, but rather fodder for science fiction writers and television shows.
Source: TA Brun, MM Wilde, and A Winter. "Quantum state cloning using Deutschian closed timelike curves." via arXiv.
ON THE MORNING of May 21, 2010, Cherry Woods was taking a walk around her suburban Houston neighborhood when she witnessed an ominous sight. Two large dogs were barreling towards her from down the block. Woods began sprinting towards her home, but the canines quickly overtook her, knocking her down and ferociously ripping at her legs.
Hearing the commotion from inside the house, Woods' husband Harold tore outside and fought to free his wife from the frenzied animals. Unsuccessful at first, he didn't give up. Cherry's life possibly hung in the balance. It was at that moment, when all was looking grim, that an unlikely rescuer joined the fray.
Lima, the couple's cat, leapt out of the nearby bushes and scratched one of the dogs. The attacking pair promptly released Cherry and turned their attention to the crouched, hissing feline. Thanks to the momentary distraction, Harold was able to drag his bleeding wife to safety.
"I’m very glad that we had [Lima] and that she was here, because when it came down to my wife getting hurt, she jumped right in. It’s amazing," Harold Woods told KHOU.
THERE IS a common stereotype that cats are far less affectionate than dogs. Dogs actually want you around; cats just want you to scoop the food. But while stereotypes often contain a grain of truth, this one is a misconception. Cats are just as fond of their owners; their feelings are just a tad more nuanced, and their adoration less demonstrative.
To begin to understand why, we can look back in time to before dogs and cats were domesticated. Thousands of years surviving in distinct ecological niches have molded the two species' behavior into starkly contrasting models. Dogs, by their nature, are pack animals, living in a ranked community and subservient to a leader. Together, they hunt and kill larger prey. Cats, on the other hand, are mostly solitary, but occasionally form communities of related individuals. As capable hunters in their own right, they have no pressing need to group together.
Thousands of years of living with humans have shaped dogs and cats in similar ways, but the vestiges of ancient evolutionary instincts still linger. At home, your pooch is effectively subservient to you; you are the pack leader. Contrastingly, your cat views you as an individual sharing their space. Your dog is faithful to you by nature, but your cat's affection must be earned.
When it is, cats are just as loving as dogs; you just have to know what to look for. In his new book, Cat Sense, anthrozoologist John Bradshaw clues us in.
"The upright tail is probably the clearest way cats show their affection for us." Bradshaw says.
Rubbing their owners' legs or nearby objects also indicates fondness, as do petting invitations. When a cat jumps on your lap, rolls on their back, or subtly maneuvers to make a body part more accessible, they want to be touched.
"By accepting stroking, cats are engaging in a social ritual that is reinforcing the bond with their owner."
According to Bradshaw, purring also shows contentment; however, it is not necessarily a dead giveaway.
"A purring cat also may just be hungry, or mildly anxious. Some continue to purr even when their body language indicates they are angry."
IN MANY WAYS, the human-cat relationship is much like that of lovers in the early stages of dating: the affection is there, it's just difficult to read. Cats seem aloof and unexpressive because they aren't totally accustomed to sociality. A study conducted earlier this year found that cats -- when hearing the voice of their owner -- orient their heads and ears towards the sound. Moreover, their pupils dilate, which the researchers say is a sign of excitement. A different study, in which animal behaviorists recorded hours and hours of interactions between owners and their cats, found that cats seem to remember kindness and lovingly return it later. Additionally, they noted that food is just as much a token of affection as it is a source of nourishment.
Both dogs and cats must be habituated to humans at a young age. Puppies should be handled between 7 and 14 weeks. For kittens, this sensitive period is narrower -- 4 to 8 weeks -- and more crucial. Felines who don't meet humans until ten weeks or later may fear them for the rest of their lives.
If properly cared for and respected, cats are just as doting as dogs. They won't joyously slobber all over you, but they will like you, in their own, peculiar way.
(Image: Cat via Shutterstock)
Several high-profile suicides among former NFL players have stirred a debate over whether or not repetitive concussions in violent sports such as football can cause cognitive decline later in life. This neurodegenerative condition has been called CTE, or chronic traumatic encephalopathy.
Boston Globe sports writer Bob Ryan recently penned a very provocative article about America's most popular game. In it, he lamented the violent nature of football and the fact that almost every player, at some point, gets injured. He then went on to say this:
Football has an enormous appeal to many people who are borderline psychopaths in a manner that no other sport — and this includes the very virile sport of hockey — does not.
That is an absurd statement. First, it is quite unlikely that the medical research community has any data on which sport psychopaths find most enjoyable. Second, I am forced to conclude that Mr. Ryan has never attended a South American soccer match, one of which resulted in the beheading of a referee. (Of course, the fans thought the decapitation was justified since the referee had just murdered one of the players.) In Poland, soccer hooligans regularly beat each other up. Some of them aren't even fans of the game; they simply engage in the activity as some sort of demented pastime.
Setting aside Mr. Ryan's dubious analysis, though, there is a legitimate enough reason to wonder if getting bonked in the head over and over causes long-term damage. Let's leave behind the anecdotes. What does the science say?
Unfortunately, not much. A recent article by Stella Karantzoulis and Christopher Randolph in Neuropsychology Review examined the evidence. CTE isn't a "new" disease; it's been recognized for several decades in boxers and referred to as "dementia pugilistica." However, the trouble with drawing any connection between sports concussions and neurodegenerative disease is the fact that most studies have been done, as the authors say, out of convenience. In other words, typically only athletes who were suspected of having CTE are autopsied. This introduces an enormous bias into the rather limited evidence base.
Other problems for a football-CTE link include: (1) The rate of suicide among former NFL players is actually lower than that of the population at large; (2) the symptoms of CTE are similar to those of other neurodegenerative diseases, such as Alzheimer's; and (3) there is no medical consensus about the distinguishing features of CTE. For instance, the presence of clumps of a protein called tau in the brain (tauopathy) is used to diagnose CTE, but tauopathy also exists in Alzheimer's and nearly two dozen other diseases.
Of course, none of this is to say that sports concussions don't cause CTE. Maybe they do. Or maybe CTE doesn't really exist. It could be that repetitive concussions trigger neurodegenerative disease in individuals who are already genetically predisposed toward it. There just isn't enough scientific evidence to conclude one way or the other. Long-term cohort studies are needed to settle this issue.
Please keep that in mind next time you hear about how football is supposedly causing former players to commit suicide.
Source: Stella Karantzoulis & Christopher Randolph. "Modern Chronic Traumatic Encephalopathy in Retired Athletes: What is the Evidence?" Neuropsychol Rev. Nov 2013.
We all know how we feel when sleep-deprived: tired, groggy, and grumpy. But have you ever considered all of the ways just a single poor night of sleep may be messing with you? Science has revealed a great many ramifications that you've probably never thought of.
1. You're more depressed and anxious. In 2008, researchers assessed 226 individuals who had six or more hours of sleep the previous night and 112 individuals who had less. The "poor sleep" group scored significantly higher in levels of stress, depression, and anxiety compared to those that slept longer.
2. You pee more the next night. During the night, urine production naturally declines, permitting us to achieve uninterrupted sleep. But if you're sleep deprived from the night before, this mechanism doesn't work as efficiently. Examining 10 male and 10 female subjects over a 48-hour period, scientists found that when sleep deprived, both genders produce "markedly" larger amounts of urine, potentially translating to additional nighttime visits to the bathroom.
3. You eat more, and more unhealthily. What happens if young men get four hours of sleep instead of eight? They consume about 560 additional calories the following day. Moreover, when both men and women are sleep-deprived, they choose foods like pizza or doughnuts over healthier fare. According to researchers at Berkeley, a lack of sleep seems to dampen activity in the brain's frontal lobe -- an area tied to complex decision-making -- and elevates activity in the reward centers.
4. If you're a man, you think that women want to have sex with you. Compared with rested men, men deprived of sleep for one night rate women as more interested in having sex. The researchers warn that this more risqué perception could lead to increased incidents of inappropriate advances and sexual harassment.
5. You look sadder and less attractive. In two separate studies undertaken by Tina Sundelin at Stockholm University, untrained observers compared photographs of subjects who had been awake for 31 hours with photos of the same subjects after eight hours of sleep the night before. Observers were blinded to the conditions and viewed the pictures in random order. In the first study, observers rated sleep-deprived people as less attractive. In the second study, sleep-deprived people were deemed to look sadder.
6. You feel more excluded. According to sleep researcher Tina Sundelin, "Pretty much everyone gets upset if they feel others are excluding them, but... a sleep-deprived person reacts even more strongly to social exclusion than their well-rested peers do."
(Image: Tired Panda via Shutterstock)
Peering through his telescope, astronomer Edwin Hubble observed that the Universe was destined to end in ice. Quantum theories say that it might end in fire. Thermodynamics, an older and even more inviolable pillar of physics, says that it may end in a different way, with all things dim, tepid, and sluggish: in equilibrium.
The eventual fate of the universe, from this view, is something called heat death. Thermodynamics dictates that large systems evolve toward equilibrium over time. This is a balanced, calm state where no more reactions are favorable; nothing has energy to gain or lose compared to anything else.
The universe currently varies tremendously in composition from place to place. We live among glowing filaments of matter and energy, clustered together in a background of vast nothingness. Stars, planetary systems, galaxies, nebulae, black holes -- all are incredibly concentrated specks in the colossal void of space.
If the entire universe can be understood as a thermodynamic system, this theory spells out an inevitable consequence. In the far distant future, all these hot, heavy specks will be spread out into the enormous cold void, mixing until everything is a thin, uniform mist. Like boiling water added to a bowl of cold soup, the two extremes will balance out and leave lukewarm broth.
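The soup analogy can be made quantitative: if no heat escapes to the surroundings, the mixture settles at a mass-weighted average temperature. A minimal sketch, with hypothetical masses and temperatures of my own choosing:

```python
def equilibrium_temp(masses, temps):
    """Final temperature when portions of the same liquid mix with no
    heat lost to the surroundings: a mass-weighted average (the
    specific heat capacity cancels because both portions share it)."""
    return sum(m * t for m, t in zip(masses, temps)) / sum(masses)

# Hypothetical numbers: 0.5 kg of boiling water (100 C) poured into
# 1.5 kg of cold soup (5 C), treating the soup as water.
print(round(equilibrium_temp([0.5, 1.5], [100.0, 5.0]), 1))  # 28.8 -- lukewarm
```

The same weighted-average logic is what thermalization does everywhere at once: energy flows from hot to cold until nothing has any left to trade.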
Heat death originated from the work of several prodigious physicists who founded the study of how machines transform heat into mechanical work. Lord Kelvin, Sadi Carnot, and others formed an empirical understanding of how steam engines and other suppliers of motive force do this. They discovered that the machines were harnessing the tendency of energy to flow from hot areas to cold ones. Eventually, the entire system settles down to an intermediate temperature and no more net energy transfer occurs. (This is the maximization of entropy.)
In space, molecules, atoms, and subatomic particles will collide with one another, spreading their momentum and energy from the fast to the slow. The motion of all the particles of the universe will gradually thermalize (become random); they will collide and interact with no net transfer of energy. Eventually they will careen out into empty space after some unlucky bounce to spend eternity traveling alone.
Can this dull fate be avoided? Not if the universe is a completely thermodynamic system. However, it is unclear whether the gravitational forces within and between enormous astrophysical objects can be accurately described this way. Brilliant physicist Freeman Dyson believes gravity will prevent heat death from ever occurring.
In any case, don’t lose sleep over this (or any other) far distant cosmic outcome. If it occurs, this equilibrium process will take what seems like an infinitely long time. (Can you imagine the 10^100 years -- a.k.a. a googol -- required for a large black hole to evaporate completely? I cannot.) We essentially have an eternity of time to enjoy our universe before it fades gently into that all-encompassing night.
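That googol-year figure can be sanity-checked with the standard Hawking evaporation timescale, t ≈ 5120πG²M³/(ħc⁴). A minimal sketch, assuming textbook values for the constants; the 10^11-solar-mass input is my illustrative choice for a "large" (galaxy-scale) black hole, not a figure from the article:

```python
import math

# Physical constants (SI units, approximate textbook values)
G = 6.674e-11        # gravitational constant
HBAR = 1.055e-34     # reduced Planck constant
C = 2.998e8          # speed of light
M_SUN = 1.989e30     # solar mass, kg
YEAR = 3.156e7       # seconds per year

def evaporation_time_years(mass_kg):
    """Hawking evaporation timescale: t = 5120*pi*G^2*M^3 / (hbar*c^4)."""
    t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)
    return t_seconds / YEAR

# A black hole of ~1e11 solar masses takes on the order of a
# googol (1e100) years to evaporate completely.
print(f"{evaporation_time_years(1e11 * M_SUN):.1e}")
```

Because the timescale grows as the cube of the mass, a mere stellar-mass black hole "only" needs about 10^67 years.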
As far as mountains go, the Atapuerca Mountains in Spain aren't much to look at. In many places, they amount simply to scrub-covered, limestone hills rather than towering, craggy heights. If the mighty Rockies of North America could speak, they might very well be scoffing.
But at Atapuerca, the focus is more on the sediments below the ground than the rocks above it. The area is home to a treasure trove of buried archaeological riches: fossils and tools belonging to the earliest known species of ancient humans. Rightfully so, the United Nations Educational, Scientific and Cultural Organization (UNESCO) has designated the archaeological sites at Atapuerca as a protected World Heritage Site, for providing "an invaluable reserve of information about the physical nature and the way of life of the earliest human communities in Europe."
The most famous site at Atapuerca, Sima de los Huesos -- "The Pit of Bones" -- is precisely that. Located at the bottom of a 43-foot chimney in the winding cave system of Cueva Mayor, it contains approximately 5,500 ancient human bones dated at over 350,000 years old! Now, drawing upon this piled wealth of history, Matthias Meyer, a lead researcher at the Max Planck Institute for Evolutionary Anthropology, and a team of colleagues have recovered and analyzed the earliest known human DNA.
DNA, as you may very well know, is the molecular instruction manual for how to build life, and the DNA at Sima de los Huesos is thought to belong to Homo heidelbergensis, a group of extinct humans roughly comparable in height and looks to Neanderthals. Drilling into a femur present at the site, the team collected about two grams of bone, then isolated DNA using a recently developed method that employs silica to make the process more efficient. The team focused on the DNA contained within mitochondria -- the powerhouses of cells -- which holds vastly fewer genes than does nuclear DNA, contained within cells' nuclei. Because mitochondrial DNA is passed down exclusively from mothers, it is not shuffled by recombination; it passes from parent to offspring essentially unchanged, altered only by occasional mutation. This makes it a powerful tool for tracking ancestry, which is precisely what the researchers used it for.
After sequencing 98% of the mitochondrial genome, Meyer and his colleagues estimated the specimen's age using the length of its branch on the mitochondrial family tree as a proxy. The femur clocked in at around 400,000 years old, placing its former owner in the Middle Pleistocene and making it by far the oldest human DNA ever collected. The previous record belonged to 100,000-year-old Neanderthal DNA.
The team then attempted to determine the specimen's position in the ancient human family tree and were surprised to find that the owner did not share a common ancestor with Neanderthals, but instead with Denisovans, a mysterious subspecies of human discovered in 2008 that last shared an ancestor with Neanderthals and Homo sapiens about one million years ago. Indeed, the more scientists discover about our prehistoric ancestors, the further they seem to fall down Alice's Rabbit Hole. Things just get curiouser and curiouser.
Meyer presented three possibilities that could account for the team's unexpected findings.
"First, the Sima de los Huesos hominins may be closely related to the ancestors of Denisovans."
"Second, it is possible that the Sima de los Huesos hominins represent a group distinct from both Neanderthals and Denisovans that later perhaps contributed the mtDNA to Denisovans."
"Third, the Sima de los Huesos hominins may be related to the population ancestral to both Neanderthals and Denisovans."
Next up, Meyer plans to assemble nuclear DNA sequences from the specimens at the Pit of Bones in the hopes of learning even more about where they fit within the annals of human evolution. This will be a tall task, however. With a half-life of 521 years, DNA breaks down fairly rapidly, even under optimal conditions: encased in glaciers or buried beneath arctic tundra, for example. Furthermore, with about 21,000 genes, human nuclear DNA presents a much more complex tome to piece together.
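To see why that half-life makes the feat so remarkable, a quick back-of-envelope sketch, assuming simple exponential decay at the 521-year half-life quoted above (a simplification; real degradation depends heavily on conditions):

```python
def surviving_fraction(age_years, half_life_years=521):
    """Fraction of DNA bonds still intact after a given time, assuming
    simple exponential decay with the stated half-life."""
    return 0.5 ** (age_years / half_life_years)

# After ~400,000 years, that's roughly 768 half-lives -- the surviving
# fraction is around 1e-231, vanishingly small but not quite zero.
print(surviving_fraction(400_000))
```

Those stray surviving fragments, multiplied across thousands of bones, are what make projects like this possible at all.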
Source: Matthias Meyer et al. "A mitochondrial genome sequence of a hominin from Sima de los Huesos." Nature. December 2013. doi:10.1038/nature12788
(Image: Javier Trueba, MADRID SCIENTIFIC FILMS)
Mothers are spending much less time on physical activities like housework, child care, laundry, food preparation, postmeal cleanup, and exercise and much more time on sedentary activities such as watching television, using computers, or commuting, according to a new study. Such a marked reallocation of time could be significantly driving the rise in obesity among women.
Examining the American Heritage Time Use Study, a large collection of data sets extending all the way back to 1965, University of South Carolina exercise scientist Edward Archer and a team of researchers found that mothers of younger children (5 years or less) now engage in 13.9 fewer hours per week of physical activity while concurrently spending an additional 5.7 hours on sedentary pursuits. Mothers of older children (ages 6 to 18) spend 11.1 fewer hours on physical activities and 7.0 hours more on sedentary ones.
This fundamental shift in activity translates to a significant and sustained reduction in energy expenditure. Reduced caloric expenditure, combined with an increase in consumption, spurs weight gain. Compared to mothers in 1965, mothers of younger children now expend 1,572.5 fewer calories per week while mothers of older children now expend 1,237.6 fewer calories per week.
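As a rough consistency check (my arithmetic, not the study's), the two groups' numbers imply a very similar energy cost per hour reallocated from physical to sedentary activity:

```python
# (hours/week of physical activity lost, kcal/week of expenditure lost),
# taken from the figures quoted above.
groups = {
    "mothers of younger children": (13.9, 1572.5),
    "mothers of older children":   (11.1, 1237.6),
}

for label, (hours, kcal) in groups.items():
    # Implied energy cost of each hour shifted to sedentary pursuits
    print(f"{label}: ~{kcal / hours:.0f} kcal per reallocated hour")
```

Both groups come out at roughly 112-113 kcal per hour, which suggests the study applied a consistent per-activity energy model across cohorts.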
Two key societal changes explain this trend. Thanks to evolving attitudes on gender roles, women have now taken their rightful place alongside men in the workplace. This isn't necessarily benefiting their health, however. Since the 1960s, occupations have grown increasingly sedentary (which may by itself explain the obesity epidemic among men). Both men and women often sit in the driver's seats of cars while traveling to work, where they continue to sit all day in front of computer screens. Moreover, household activities are far easier and less strenuous than they used to be. Doing the laundry once involved hanging clothes out to dry; dishes had to be scrubbed by hand; and food wasn't as widely available in a simple, ready-to-eat format. Men are also increasingly performing household chores.
Though society now predisposes Americans to inactivity, Archer resolutely believes that men and women, especially mothers and expecting mothers, need to find ways to be more physically active.
"Children raised by inactive, sedentary, and therefore unhealthy caregivers may have an increased risk of being inactive, sedentary, and unhealthy as adults," he posits.
Growing evidence shows that parents' lifestyles affect their kids' behavior. If parents sit around and watch a lot of television, their kids likely will, too. In this, mothers have an outsized role, because the simple fact is that women spend more time on average raising children than men.
The study isn't perfect. Though the American Heritage Time Use Study contains data from tens of thousands of mothers, it is all based on self-report, a method which Archer has ardently railed against in the past. It should also be made clear that changes in mothers' time-use don't completely explain the obesity epidemic among women.
According to Archer's data, mothers' physical activity and sedentary time have remained fairly consistent since the 1980s. At first glance, this seems suspicious. If physical activity is constant, then how can it be fueling rising levels of obesity? The answer, Archer says, is that chronic idleness drives us to overeat. Since the 1950s, research has shown that activity helps keep appetite in line with energy expenditure. When individuals become inactive, this sensitivity dampens, and so they tend to consume more energy than their bodies actually need.
Both men and women face the challenge of ballooning waistlines together. All told, 35.5% of men and 35.8% of women are obese. Altered dietary patterns, a reduction in sleep, and increased levels of pollution are three of the many causes being considered. Of these, diet likely gets the most attention, as evidenced by laws in New York that restrict soda sizes, the almost religious popularity of nutrition fads, and frequent calls to punish companies that sell junk food. But Archer thinks that lack of physical activity is just as much to blame, if not more so.
"Inactivity has increased significantly over the past 45 years and may be the greatest public health crisis facing the world today," he says.
In order to combat the rising trend in sedentary lifestyles, mothers and fathers need to exemplify healthy lifestyles and make it easy for their kids to be physically active.
"The take home message is that the hand that rocks the cradle rules the health of the next generation," Archer says.
Source: Edward Archer et al. "Maternal Inactivity: 45-Year Trends in Mothers’ Use of Time." Mayo Clinic Proceedings. 2013;88(12):1368-1377
Everybody in Seattle knows Jeff Renner. He's the weather guy on KING5 News, the local NBC affiliate. Each night at around 11:15 p.m., he provides us with little nuggets of knowledge, from the extended forecast to the type of tires we should use when crossing the mountains.
Like all scientists, Mr. Renner has his own special lexicon. Meteorologists are known to use some phrases which may not make an awful lot of sense to the average person. For instance, you might very well hear your weather guy (or gal) say, "The barometric pressure is 30.2 inches and falling." What on Earth does that mean? Well, it helps to know a little physics.
Barometric pressure -- which is measured using a barometer -- is another name for atmospheric (air) pressure. This is the pressure that air molecules exert on the surface of the Earth, largely because of their weight. But, pressure isn't measured in inches; it is measured in pounds per square inch, millibars, pascals, or some other unit. So, why does your meteorologist measure pressure in inches?
The term is a throwback to the days when barometers were made using mercury. (Today, they are digital.) A glass tube is filled to the rim with mercury, then inverted and placed into a reservoir that also contains mercury. The reservoir is open to the atmosphere. Air molecules pushing down on the mercury in the reservoir force mercury up into the tube. If atmospheric pressure increases, the mercury rises further; if atmospheric pressure decreases, the mercury level falls. (Check out this video on how to make a classical mercury barometer.)
As it turns out, "normal" atmospheric pressure at sea level causes mercury to rise to a level of 760 millimeters, written as 760 mmHg. Since Americans reject the metric system, we convert 760 mm to 29.92 inches. Atmospheric pressure typically fluctuates between 29 and 31 inches, with higher altitudes experiencing lower air pressure.
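The conversion comes straight from hydrostatics: a pressure P supports a mercury column of height h = P/(ρg). A minimal sketch, assuming standard sea-level pressure and the textbook density of mercury:

```python
# A mercury barometer balances atmospheric pressure against the weight
# of a mercury column: P = rho * g * h, so h = P / (rho * g).
P_STANDARD = 101_325.0   # Pa, standard sea-level pressure
RHO_MERCURY = 13_595.1   # kg/m^3, density of mercury at 0 deg C
G = 9.80665              # m/s^2, standard gravity
MM_PER_INCH = 25.4

height_m = P_STANDARD / (RHO_MERCURY * G)
height_mm = height_m * 1000
height_in = height_mm / MM_PER_INCH

print(f"{height_mm:.0f} mmHg = {height_in:.2f} inches of mercury")
```

Running this recovers the familiar pair of figures: 760 mmHg, or 29.92 inches of mercury.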
Now, we can translate that mystifying statement, "Barometric pressure is 30.2 inches and falling," into useful English. What your weatherman meant was, "If we were using an old-school barometer, the mercury would be 30.2 inches high and falling."
Note that the most important part of that statement isn't the numerical value of the atmospheric pressure, but the fact that it is falling. Falling air pressure indicates that bad weather is coming, while rising air pressure indicates good weather.
Bizarrely enough, some people can sense changes in atmospheric pressure. You probably know an old person who says, "Storm's comin'; I can feel it in my bones." Old people aren't making that up. People with arthritis are sensitive to low atmospheric pressure because their bones expand a little, causing pain.
The Hitchhiker's Trilogy is one of the best works of science fiction ever penned. As supremely funny as it is intelligent, the series chronicles the adventures of a band of intergalactic misfits as they traverse the universe by way of a stuck-out thumb. Along the way, they trek to prehistoric Earth, converse with a supercomputer the size of a small city, and discover the answer to life, the universe, and everything. (Though sadly, they never actually learn the question.)
Featured in the story are a great many futuristic inventions that -- for the most part -- better the lives of those who use them. Here are eight that we wish existed right this very moment!
8. Nutrimatic Drinks Dispenser
The nutrimatic drinks dispenser is the marriage of a Star Trek-style replicator and a modern-day vending machine. Douglas Adams describes it thusly:
"When the 'Drink' button is pressed it makes an instant but highly detailed examination of the subject's taste buds, a spectroscopic analysis of the subject's metabolism, and then sends tiny experimental signals down the neural pathways to the taste centres of the subject's brain to see what is likely to be well received."
So, theoretically it should produce a beverage that is not only keenly tailored to suit one's metabolic needs, but also tastes pretty good to boot.
However, in the book, this rarely happens. The machine instead vends a concoction described as "almost, but not quite, entirely unlike tea."
7. Crisis Inducer
The crisis inducer is a watch-like device that convinces its wearer that an emergency is taking place, thus boosting the user's awareness and energy levels by engaging the fight-or-flight response. Crises can conveniently be set at any desired severity.
With luck, a creative software designer will translate this idea into a downloadable app available on any smartphone.
6. Joo Janta 200 Super-Chromatic Peril Sensitive Sunglasses
Want to develop "a relaxed attitude to danger"? Simply slip a pair of these glasses over your eyes. "At the first hint of trouble they turn completely black and thus prevent you from seeing anything that might alarm you."
Just make sure not to wear them while driving.
5. Doors with Personality
Opening doors is often a mundane experience, so why not give the job to doors that actually enjoy their work? Doors manufactured by the Sirius Cybernetics Corporation come equipped with chirpy personalities, and love nothing more than to open and close. Not only that, they cheerily and profusely thank anybody who decides to walk through.
4. Matter Transference Beams
The matter transference beam is directly comparable to the transporter in Star Trek. It works by "tearing you apart atom by atom, flinging those atoms through the sub-ether, and then jamming them back together again..." If rendered from sci-fi to real life, it would undoubtedly herald a revolution in transportation.
Take note, however: When used for the first time, the process isn't altogether pleasurable, leaving the user dazed, hung-over, and often with an ear-splitting headache; "not as much fun as... a good solid kick in the head."
3. Point of View Gun
Simple in its purpose, the point of view gun causes the target to see things from the point of view of the shooter. If used responsibly, imagine what good such a "weapon" might do in Washington!
2. Infinite Improbability Drive
Perhaps the hallmark invention of The Hitchhiker's Trilogy, the infinite improbability drive allows for faster-than-light travel. Operating on a curious facet of quantum theory -- that the atoms of an object are immensely likely to be concentrated in one place, but that there's an immeasurably small chance that just one of them might be far off somewhere else -- the drive controls probability to transport all of an object's atoms anywhere in the universe almost instantaneously. Doesn't make sense? That's okay. None of the characters in the book understood it either.
1. Advanced Towels
As everyone knows, the towel is the most useful thing a person can own. Capable of providing warmth, comfort, and protection, a towel can even be used as a gas mask or a weapon for hand-to-hand combat. You'd be daft to leave home without one.
Though towels are already indispensable, it's high time that they be adapted to the rigors of modern life. In the book, towels are enhanced with complex circuitry, fortified with food replacements, and reinforced with industrial-strength seams. Companies and inventors can start there.
Picture a laboratory more than half a mile long, half a mile wide and half a mile tall. Now, carry it to Antarctica and bury it under a mile of ice. This is the IceCube particle detector, located at the Amundsen-Scott station at the South Pole. This vast structure functions as a telescope for viewing neutrinos, elusive particles that flow through our bodies in the trillions every second and yet are close to impossible to detect.
IceCube's job is specifically to find neutrinos with extremely high energy: particles carrying roughly 30 to 1,200 TeV (teraelectronvolts), or about 2-100 times more energy than the particles produced at the LHC. The origins of these very energetic particles are extremely powerful events in the distant universe about which we know very little.
Last week, IceCube reported finding 28 neutrinos from deep outer space in its first two years of operation (2011-2012). We don't yet know exactly where these tiny particles came from, but the enormous energies they carried mean that they likely came from far-distant events of enormous scale.
Sources could include gamma-ray bursts, produced by ultra-powerful supernovae, and quasar emissions -- the destructive energy released as matter is sucked into the supermassive black hole at the center of a massive galaxy. Quasars can be 4 trillion times brighter than the sun, or roughly as bright as 100 very large galaxies. Beyond these observations, the ability to see extremely energetic particles may shed light on processes that explain dark matter or hypothetical supersymmetric particles.
Deep underground in a sheet of incredibly clear ice is the ideal location for IceCube. Most neutrino detectors are buried far underground to reduce false detections triggered by cosmic rays. Then the really hard part begins.
A neutrino must pass through a huge amount of matter to have even a minute chance of interacting with any of it to produce something you can see. For smaller detectors, this is a tank of incredibly pure water. But it is not feasible to build (and fill with pristinely pure water) a tank a cubic kilometer in size.
The collision of the neutrino with an atomic nucleus produces certain byproduct particles, depending on the type of neutrino. The detection medium must be extremely pure so that any detection of byproduct light is traceable back to a neutrino of a precise direction and energy.
These byproduct particles emit a blue glow called Cerenkov radiation as they travel. To see this, you need a large, transparent expanse of material for them to fly through. Flaws in the ice can cause the faint light signals to be distorted or spread out and lost. But deep in the Antarctic ice sheet pressures are so high that air bubbles are forced out over eons and the crystal is nearly perfect.
5,160 individual photomultiplier detectors, each roughly the size and shape of a basketball, are the eyes of the machine, seeing the extremely dim blue light flashes. These detectors are lowered on cables into shaft holes drilled 1.5 miles down into the Antarctic ice with hot-water hoses. The entire detector looks something like nearly 100 strands of giant glass beads, dangling the height of seven Eiffel Towers down into the pitch-black ice.
Detection of Cerenkov light in several of these photomultipliers allows triangulation of the exact location and speed of the particle emitting it. This is then traced back to find the original neutrino and its properties. While finding only 28 candidate cosmic neutrinos, the detector also recorded thousands of other neutrinos from the sun, the atmosphere, and even nuclear power plants. Terabytes of data must be collected and analyzed for years to yield even these few findings. Each discovery is a single needle in an enormous haystack.
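The triangulation step can be illustrated with a toy model: a flash at an unknown point reaches each sensor at a time that depends on its distance, and comparing arrival-time differences across sensors pins down the source. The two-dimensional layout, the speed-of-light-in-ice value, and the brute-force search below are simplifying assumptions for the sketch, not IceCube's actual reconstruction method:

```python
import math

C_ICE = 0.22  # assumed speed of light in ice, meters per nanosecond

sensors = [(0, 0), (100, 0), (0, 100), (100, 100)]  # hypothetical sensor positions (m)
true_source, true_t0 = (30.0, 70.0), 5.0            # simulated flash: place and time

# Each sensor records the flash at the emission time plus the light's travel time.
times = [true_t0 + math.dist(true_source, s) / C_ICE for s in sensors]

def residual(p):
    """Mismatch between recorded and predicted arrival-time differences.
    Working with differences relative to sensor 0 cancels the unknown emission time."""
    pred = [math.dist(p, s) / C_ICE for s in sensors]
    offset = times[0] - pred[0]
    return sum((times[i] - pred[i] - offset) ** 2 for i in range(1, len(sensors)))

# Brute-force grid search over candidate source positions.
best = min(((x, y) for x in range(101) for y in range(101)), key=residual)
print(best)  # (30, 70): the flash location recovered from timing alone
```

In the real detector the same idea runs in three dimensions over thousands of sensors, with the scattering of light in the ice folded into the fit.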
Once in Hawaii, I was taken to see a Buddhist temple. In the temple, a man said, "I am going to tell you something that you will never forget." And then he said "To every man is given the key to Heaven. The same key opens the gates of Hell."
And so it is with science.
Richard Feynman, a Nobel Prize winning American theoretical physicist, celebrated thinker, and architect of the atomic bomb, has long been known as one of science's greatest supporters. But it was he who spoke those forthright words in 1963 at a lecture at the University of Washington.
Years earlier, in the wake of watching his powerful, atom-splitting creations wreak unspeakable havoc and end thousands of lives in Hiroshima and Nagasaki, the affable and cheery Feynman grew melancholic, grappling with a painful nuclear reality -- one he was instrumental in establishing -- as well as uncertainty over the shape of things to come.
"I didn't know what the future was going to look like, and I certainly wasn't anywhere near sure that we would last until now," he recalled in 1987. "Therefore one question was: is there some evil involved in science?"
Feynman was struggling with an existential crisis only a member of the Manhattan Project could truly experience.
"Put another way, what is the value of the science I had dedicated myself to--the thing I loved--when I saw what terrible things it could do? It was a question I had to answer."
In 1955, in an extraordinary address delivered to the National Academy of Sciences, Feynman did. From his soul-searching, born out of the choking dust of a mushroom cloud, the physicist expounded upon three simple but vital values tendered by science.
"The first way in which science is of value is familiar to everyone," Feynman said. "It is that scientific knowledge enables us to do all kinds of things and to make all kinds of things."
This could neither be more obvious nor more true. Once firmly anchored to the ground, man realized that by displacing enough water, even immense objects could float. And so we set out to sea. Next, we found that heating the air within a large fabric envelope made the craft less dense than the air around it. And so we took to the skies. Years later, we fired rockets with enough force to overcome the pull of gravity and break free of our atmosphere. And so we entered space. Science powered it all.
But with that quintessential power to devise and create awesome ideas and inventions comes the power to wield such constructs for evil, Feynman cautioned.
"Scientific knowledge is an enabling power to do either good or bad - but it does not carry instructions on how to use it," he added.
Feynman then shared the second value.
"Another value of science is the fun called intellectual enjoyment which some people get from reading and learning and thinking about it, and which others get from working in it."
Though Feynman recognized that mere enjoyment isn't necessarily valuable to society, he contended that the thrill imparted by science is of a different, more inspirational nature.
"With more knowledge comes a deeper, more wonderful mystery, luring one on to penetrate deeper still. Never concerned that the answer may prove disappointing, with pleasure and confidence we turn over each new stone to find unimagined strangeness leading on to more wonderful questions and mysteries - certainly a grand adventure!"
When a child gets a taste of such an adventure, that is when a scientist is born. Perhaps, like Jack Andraka, they'll develop a simple test for pancreatic cancer? Or maybe, like Taylor Wilson, they'll try to invent the energy source of the future? Such is the exuberant energy that science musters.
"I would now like to turn to a third value that science has," Feynman continued. "The scientist has a lot of experience with ignorance and doubt and uncertainty, and this experience is of very great importance, I think."
Speaking humbly and hopefully, Feynman then shared what he knew.
"Now, we scientists... take it for granted that it is perfectly consistent to be unsure, that it is possible to live and not know. But I don't know whether everyone realizes this is true. Our freedom to doubt was born out of a struggle against authority in the early days of science. It was a very deep and strong struggle: permitting us to question - to doubt - to not be sure. I think that it is important that we do not forget this struggle and thus perhaps lose what we have gained. Herein lies a responsibility to society."
Feynman pressed on, explaining how so many people have, over the centuries, claimed to offer simple and all-encompassing "answers" -- when, in fact, the key to finding genuine answers to life's difficult questions is first embracing that you don't know all of them.
"If we want to solve a problem that we have never solved before, we must leave the door to the unknown ajar," Feynman said. To do so leads to what he described as an "open channel."
"It is our responsibility as scientists... to proclaim the value of this freedom; to teach how doubt is not to be feared but welcomed and discussed; and to demand this freedom as our duty to all coming generations."
Source: "The Value of Science." Richard Feynman. University of Washington.
"SHE IS leaving the room."
Eleanor Longden was a freshman in college the first time she heard it: the voice, as clear as day, calm and decisive. But that wasn't the last time.
Over the following weeks she continued to hear it, harmlessly narrating everything she did. It was like having her own personal sportscaster, minus the haughty voice and occasional exclamations.
"It was neutral, impassive and even, after a while, strangely companionate and reassuring," she recalled in August at TED2013. Sure, it would occasionally sound frustrated, but that was only when she herself was feeling exasperated or annoyed. In a way, her disembodied companion mirrored the parts of herself she kept hidden from the world, and that was comforting.
Everything changed, however, when she revealed the voice's existence to a friend. So was set in motion a chain of events that almost destroyed her life: a visit to a psychiatrist, a hospital admission, a diagnosis of schizophrenia, a prescription of powerful antipsychotics, a stigmatization that would haunt and terrorize her for years.
Two years later, and the deterioration was dramatic. By now, I had the whole frenzied repertoire: terrifying voices, grotesque visions, bizarre, intractable delusions. My mental health status had been a catalyst for discrimination, verbal abuse, and physical and sexual assault, and I'd been told by my psychiatrist, "Eleanor, you'd be better off with cancer, because cancer is easier to cure than schizophrenia." I'd been diagnosed, drugged and discarded, and was by now so tormented by the voices that I attempted to drill a hole in my head in order to get them out.
SOMEWHERE BETWEEN 3 and 10 percent of people hear voices that others don't, and scientists still aren't precisely sure as to why. According to Dr. Ben Alderson-Day, a researcher at Durham University and a member of the Hearing the Voice Project -- a group aiming to better understand and explain the experience of hearing voices -- many of the brain's language centers become activated when voices are heard.
"What we don’t know yet is why they are active," he told Metro Magazine. "One possibility is that the answer will be found in how these language centers are connected to other parts of the brain, such as the motor cortex and areas linked to long-term memory."
We do know, however, that hearing voices is often triggered during prolonged bouts of stress, low self-esteem, or drug use. It may also stem from traumatic events in early life, overwhelming fear, or unexpected situations. When they appear, voices can sound corporeal, as if someone is talking to you. They may also originate internally, yet outside the bounds of one's own control. Either way, they are plainly different from personal thoughts, explains Dr. Anna Tickle, a clinical psychologist at the University of Nottingham who, along with her colleague Dr. Lucy Holt, just completed a review exploring the experience of hearing voices from the hearer's perspective.
"Voices were generally experienced by participants as coming from outside the self but manifested inside the body whereas thoughts were perceived as ‘belonging’ to the self," she said.
According to Tickle and Holt, Western medicine has typically categorized hearing voices as meaningless, nothing more than a sinister symptom of mental illness. But in the past decade or so, that's been changing.
"There has been increasing recognition of the importance of treating voices not just as a 'symptom' but as an experience with meaning," Tickle told RCScience. "Although sadly some professionals are yet to catch up."
To help them do so, Tickle and Holt sought to find and synthesize qualitative studies that explored subjects' own experiences with hearing voices. They targeted research that showcased voice-hearers' own thoughts and opinions, not just those of their psychiatrists.
They found that hearing voices is clearly not a homogenous experience. Some consider their aural companions to be positive, others pernicious. Almost all ascribe some sort of identity to their voices, whether by assigning a name or a gender. Voices were generally described as "powerful" and "invasive."
OUT OF THEIR exploration, Holt and Tickle came up with a number of recommendations to help people deal with hearing voices without unwarranted, potentially damaging treatments. Interventions aimed at promoting the individual's sense of self-worth and reducing their stress are key, as is putting them in touch with others who've experienced hearing voices and have learned to live harmoniously with them. Antipsychotics in moderate doses may also be of some use.
"For some, medication offers a vital component of recovery and can play a role such as 'dampening' distressing voices or just reducing anxiety," Tickle told RCScience. "That said, such medications bring with them a barrage of side effects that can limit recovery, including diabetes, weight gain, lethargy, concentration difficulties, restless legs, muscle stiffness, facial tics, facial hair growth for women, hypersalivation... the list goes on and many of the difficulties have the potential to lead to further social exclusion and stigmatization."
Overall, Tickle believes that hearing voices should be treated first and foremost with understanding.
"Within some non-Western cultures, voice hearing is valued and it would not be seen as indicative of any illness at all," she said.
"As a psychologist, I am less interested in the label(s) somebody has been given and much more interested in their experience and the meaning they give it."
IT WAS THIS sort of compassionate thinking that eventually liberated Eleanor Longden. Pulling her up from the pits of her personal hell were empathetic survivors of schizophrenia, a mother who wouldn't give up, and a doctor who instilled in her a sense that she could prevail over her own demons.
I used to say that these people saved me, but what I now know is they did something even more important in that they empowered me to save myself, and crucially, they helped me to understand something which I'd always suspected: that my voices were a meaningful response to traumatic life events, particularly childhood events, and as such were not my enemies but a source of insight into solvable emotional problems.
Source: Holt L, Tickle A. "Exploring the experience of hearing voices from a first person perspective: A meta-ethnographic synthesis." Psychology and Psychotherapy: Theory, Research and Practice. 14 November 2013. doi: 10.1111/papt.12017.
Ask most serious pen enthusiasts, and they'll probably return only one answer.
The greatest pen ever made? It has to be the Parker 51.
Released in 1941 by the Parker Pen Company, the 51 was streamlined, functional, and crafted with the artistry of a Renoir. Imagine the writing utensil equivalent of the most perfect Apple product. Now multiply that by ten. That was the Parker 51.
"It was elegance herself," Sam Kean described in The Disappearing Spoon. The pen's caps were gold- or chrome-plated, with a gold-feathered arrow for the pen's clasp. The body was as plump and tempting to pick up as a cigarillo and came in dandy colors such as Blue Cedar, Nassau Green, Cocoa, Plum, and Rage Red. The pen's head, colored India Black, looked like a shy turtle's head, which tapered to a handsome, calligraphic-style mouth. And from that mouth extended a tiny gold nib, like a rolled-up tongue, to dispense ink."
But the Parker 51 wasn't merely easy on the eyes; its interior was just as finely crafted as its decadent exterior. The 51 was chiefly designed as the consummate vessel for a revolutionary new ink. Thirteen years in the making, it was an ink that for the first time dried by being absorbed directly into the paper's fibers. Previous inks would simply dry through evaporation of their watery base.
To accommodate the new ink -- which was both highly alkaline and filled with isopropyl alcohol, making it corrosive -- the 51's body couldn't be composed of the celluloid plastics typical of most pens at the time; it required a much sturdier compound. Parker engineers chose Lucite, a new thermoplastic produced by stacking another methyl group -- composed of one atom of carbon and three of hydrogen -- onto the colorless, liquid, methyl ester of methacrylic acid. The result was a shatter-resistant alternative to glass. Besides confining ink within the 51, the plastic was also just coming into use for airplane canopies.
Lucite wasn't even the 51's biggest secret -- the biggest secret rested atop the pen's gold nib. Crowning the utensil was a tiny amount of ruthenium, a rare, hardy, and nearly untarnishable transition metal famous for augmenting the properties of other metals. Though a member of the vaunted platinum group metals, ruthenium also has the added bonus of being relatively cheap compared to its elemental brothers and sisters. Such was the pride that Parker engineers took in their tip that the company advertised the nibs as being made from fictitious "plathenium," apparently to mislead competitors.
Their pride was not unfounded. So grand was the reputation of the 51 that Generals Dwight Eisenhower and Douglas MacArthur each used one to sign the surrender documents that ended World War II. Thus, the 51 became synonymous with victory, and though it was exorbitantly priced between $12.50 and $50 -- roughly equal to $200 and $800 today -- sales hit 2.1 million units in 1947!
Today, the Parker 51's legacy is cemented in history. Recently, a poll conducted by the Illinois Institute of Technology deemed the pen the fourth-best industrial design of the 20th century -- not just among writing utensils, mind you, but among all pieces of technology.
And it all began with a little chemistry.
One of the big problems in science journalism is the tendency to hype scientific research. You're familiar with the routine: A new study comes out on, say, how coffee might lead to a slight increase in a particular disease. Then, plastered all over the front pages of websites and newspapers are headlines like, "Too Much Coffee Will Kill You!" Of course, the following week, a different study will report that coffee might protect you from another disease, and the media hysteria plays out all over again, just in the opposite direction.
This is bad. Poor science journalism misleads the public and policymakers. Is there a way to prevent such hype?
Yes, say three researchers in the latest issue of the journal Nature. They give 20 tips and concepts that readers should keep in mind when trying to properly analyze the claims made in a scientific paper:
1. Variation happens. Everything is always changing. Sometimes the reason is really interesting, and other times it's nothing more than chance. Often, there are multiple causes for any particular effect. Thus, determining the underlying reason for variation is often quite difficult.
2. Measurements aren't perfect. Two people using the exact same ruler will likely give slightly different measures for the length of a table.
3. Research is often biased. Bias can either be intentional or unintentional. Usually, it's the latter. If an experiment is designed poorly, the results can be skewed in one direction. For example, if a voter poll accidentally samples more Republicans than Democrats, then the result will not accurately reflect national opinion. Another example: Clinical trials that are not conducted using a "double blind" format can be subject to bias.
4. When it comes to sample size, bigger is better. Less is more? Please. More is more.
5. Correlation does not mean causation. The authors say that correlation does not imply causation. Yes, it does. It is more accurate to say, "Correlation does not necessarily imply causation" because the relationship might actually be a causal one. Still, always be on the lookout for alternate explanations, which often take the form of a "third variable" or "confounder." A famous example is the correlation between coffee and pancreatic cancer. In reality, some coffee drinkers also smoke, and smoking is a cause of pancreatic cancer, not drinking coffee.
6. Beware regression to the mean. Scientific studies sometimes mistakenly exaggerate a particular result. These outliers are exposed when subsequent research demonstrates a more moderate result. A drug that cures 50% of patients often fails to achieve similar rates of success in follow-up studies.
7. Beware data extrapolation. Just because you got a 2% raise last year and a 3% raise this year doesn't mean you'll get a 4% raise next year and a 5% raise the year after that. Sorry.
8. Mind the predictive value of tests. A positive test result does not necessarily mean you have that disease. The reason is because tests aren't perfect, and each one has a different predictive value. The predictive value of any test is based on the sensitivity and specificity of the test, the nitty gritty details of which you can read about here.
9. Control groups are essential. An experiment without a control group isn't an experiment.
10. Experimental subjects should be randomized. Randomization helps eliminate self-selection and other biases. A patient should not be allowed to say, "I would like to receive the actual treatment, not the placebo, please."
11. Look for replications. Has the result of a study been properly replicated in a different population, preferably by another group of researchers?
12. Scientists aren't perfect. They have biases. They make mistakes. They cherry-pick. In other words, scientists are human.
13. Mind your statistics. Data must be "statistically significant," which means that it is unlikely to have occurred merely by chance. The most common standard used is 0.05, which means that scientists accept a 5% chance that their results were simply due to luck. A new paper, however, suggests that the threshold should be lowered to 0.005.
14. There's a difference between "no effect" and "not significant." If a result does not meet the 0.05 threshold, that doesn't necessarily mean "nothing happened." Sometimes, the effect is real, but the sample size wasn't large enough to detect it using statistics. The best example of this is climate change: Many people wrongly believe that there was no global warming in the 15-year period spanning 1995-2009. But the planet indeed kept warming up; the data just wasn't statistically significant.
15. There's a difference between "significant" and "important." On the flip side, just because a result is statistically significant doesn't mean it is important. If chemical X doubles your risk of disease from 1 in a million to 2 in a million, that's not an effect worth worrying about.
16. Results may not be generalizable. A psychology study examining the sexual fantasies of 18-year-old college freshmen may not be generalizable to the entire human race.
17. People are terrible at risk perception. Many people are fearful of GMOs and nuclear power plants, yet those are both far, far safer than automobiles -- which kill more than 30,000 Americans every year.
18. Don't always assume independent events. The probability of any given event is often independent of another event -- but not always. For example, the probability of you wearing a yellow shirt is independent of your probability of getting hit by a bus in broad daylight. However, the probability of you getting hit by a bus changes dramatically if you are wearing a black shirt and wandering around in the middle of the street at night.
19. Beware of cherry-picked data. An experiment with no hypothesis can find almost anything, especially in the era of Big Data. For example, it is now possible to compare thousands of DNA sequences between different people, and just by sheer luck, some differences between people (e.g., those with Alzheimer's and those without) may be statistically significant. However, if scientists are just "fishing" without knowing what they're looking for, the results they report can be misleading.
20. Beware of extreme data. Any jaw-dropping statistic (school X has the highest percentage of child geniuses!) could have several different explanations, not just the one the authors are trying to convince you of.
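Tip 5's coffee-and-cancer example is easy to demonstrate with a quick simulation. In the toy model below, smoking drives both coffee drinking and cancer, while coffee itself does nothing -- yet coffee drinkers still show an elevated cancer rate. All of the rates are invented for illustration:

```python
import random

random.seed(0)
n = 100_000
coffee_drinkers = cancers = cancers_among_coffee = 0

for _ in range(n):
    smokes = random.random() < 0.3
    # Smokers drink more coffee; coffee has NO effect on cancer here.
    coffee = random.random() < (0.8 if smokes else 0.4)
    cancer = random.random() < (0.02 if smokes else 0.005)
    coffee_drinkers += coffee
    cancers += cancer
    cancers_among_coffee += coffee and cancer

rate_coffee = cancers_among_coffee / coffee_drinkers
rate_all = cancers / n
print(rate_coffee > rate_all)  # True: the confounder creates the correlation
```

The correlation is real, but the cause is the "third variable" lurking behind both habits.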
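Tip 8 comes down to Bayes' rule: how much a positive result means depends on the test's sensitivity and specificity and on how rare the condition is. A minimal sketch, using made-up figures of 99% sensitivity, 99% specificity, and a 1-in-1,000 prevalence:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(disease | positive test), via Bayes' rule."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A 99%-sensitive, 99%-specific test for a disease affecting 1 in 1,000 people:
ppv = positive_predictive_value(0.99, 0.99, 0.001)
print(round(ppv, 2))  # 0.09 -- most positive results are false alarms
```

Even an excellent test, applied to a rare disease, produces mostly false positives.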
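Tip 14 can also be shown numerically: the very same real effect that a small sample fails to detect becomes unmistakable in a large one. The effect size, sample sizes, and crude large-sample z-test below are illustrative choices, not a recipe:

```python
import math, random, statistics

random.seed(2)

def p_value(sample, null_mean=0.0):
    """Two-sided z-test p-value against a null mean (large-sample approximation)."""
    se = statistics.stdev(sample) / math.sqrt(len(sample))
    z = abs(statistics.mean(sample) - null_mean) / se
    return math.erfc(z / math.sqrt(2))

# The true mean really is 0.2, not 0 -- the same real effect in both samples.
small = [random.gauss(0.2, 1) for _ in range(20)]
large = [random.gauss(0.2, 1) for _ in range(2000)]
p_small, p_large = p_value(small), p_value(large)

print("small sample p:", round(p_small, 3))  # a real effect can easily miss 0.05 at n=20
print("large sample significant:", p_large < 0.05)  # True: more data resolves it
```

"Not significant" here would describe the sample, not the world.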
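And tip 19's "fishing" problem is simple to reproduce: run enough comparisons between groups that are truly identical, and about 5% will cross the 0.05 line by luck alone. The 1,000 "gene comparisons" below are pure noise, and the z-test is an illustrative simplification:

```python
import math, random, statistics

random.seed(1)

def z_test_p(a, b):
    """Crude two-sided z-test for equal means (large-sample approximation)."""
    se = math.sqrt(statistics.variance(a) / len(a) + statistics.variance(b) / len(b))
    z = abs(statistics.mean(a) - statistics.mean(b)) / se
    return math.erfc(z / math.sqrt(2))

# 1,000 comparisons between groups drawn from the SAME distribution:
spurious = 0
for _ in range(1000):
    group_a = [random.gauss(0, 1) for _ in range(50)]
    group_b = [random.gauss(0, 1) for _ in range(50)]
    if z_test_p(group_a, group_b) < 0.05:
        spurious += 1

print(spurious)  # on the order of 50 "significant" differences from pure noise
```

Report only the hits and you have a publishable-looking result built entirely on chance.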
The authors have put together a very good list of tips, but I would like to add one more: Extraordinary claims require extraordinary evidence!
Source: William J. Sutherland, David Spiegelhalter and Mark A. Burgman. "Twenty Tips for Interpreting Scientific Claims." Nature 503: 335-337. 21-Nov-2013. doi:10.1038/503335a
It may surprise you to learn that -- like humans -- cells can be male or female. The distinction is more subtle at the cellular level, but it can actually affect how cells react in a variety of experiments. Still, many scientists don't take into account the sex of their cells. According to a new review published in the American Journal of Physiology - Cell Physiology, that oversight may well be skewing research results.
A good chunk of scientific research is performed in vitro. These experiments use components of an organism rather than the whole organism itself. For example, if a researcher has a disease treatment or biological theory they'd like to examine, they'll often start by testing it on lines of cells; they won't simply jump into animal tests.
Each cell line is derived from a single donor, and like every cell in the human body, each of the acquired cells contains 23 pairs of coiled DNA, called chromosomes. Included in this group are the two sex chromosomes: simply dubbed X and Y. Cells in women's bodies have two X chromosomes (XX), while cells in men's bodies have one X and one Y (XY). Thus, we get our male and female cells. Approximately 5% of the human genome resides on these chromosomes -- 1,846 genes on the X and 454 on the Y. This means that male and female cells are fundamentally dissimilar on a genetic level.
The scientists behind the new review, Kalpit Shah, Charles McCormack, and Neil Bradbury, all professors at Chicago Medical School of Rosalind Franklin University, say that these differences are often ignored, despite the fact that genes expressed on sex chromosomes can impact cell function and how they react to all sorts of stimuli.
Previous research has made this clear. Cultured female neurons uptake dopamine -- a neurotransmitter that helps regulate feelings of pain and pleasure -- twice as quickly as male neurons. Female neurons and kidney cells are also more susceptible to chemical agents that lead to programmed cell death. And female liver cells contain more of the gene CYP3A. This last difference is especially crucial, as the actions of CYP3A account for how over half the drugs on the market today are metabolized!
"Thus, for 50% of prescription drugs, the effectiveness of a particular drug dosage... may be quite different in females compared to males," the authors explain. Consider this possible side effect: women are 50 to 75 percent more likely than men to experience an adverse drug reaction. This is caused by a wide range of factors, chiefly because females weigh less, but cellular mechanisms undoubtedly contribute to it.
Such a statistic makes it clear that researchers should consider the sex of their cell lines when testing drugs in vitro, as the effects on male and female lines may not be the same. But it may be even more important when it comes to stem cells, as male and female varieties are decidedly not created equal.
"Female muscle-derived stem cells are less sensitive to oxidative stress and regenerate skeletal muscle much more efficiently than muscle-derived stem cells from their male counterparts," the authors write. "[This] is likely to have a big impact upon other stem cell mediated therapies, should the findings be replicated for other diseases.
Over the years, differences in disease rates and drug effects among males and females have often been attributed to variations in hormone levels. But it's entirely possible that many of these dissimilarities result from underlying differences at the cellular level. Like people, cells are also male and female, and they are plainly not the same. Their unique characteristics must be accounted for in scientific research.
Source: Kalpit Shah, Charles E. McCormack, Neil A. Bradbury. "Do you know the sex of your cells?" American Journal of Physiology - Cell Physiology. Published 6 November 2013. DOI: 10.1152/ajpcell.00281.2013.
Many Americans think of Europe as something of a magical realm. The food is tastier, the people are sexier, and some parts of Poland don't experience gravity. Wait, what?
European Journal, a fairly good television program produced by DW-TV, investigated what looks to be an "anti-gravity" spot in Poland. (See video beginning at the 21:00 mark.) On a road out in the Polish countryside, things appear to roll uphill, including bottles of water and even entire cars. What's going on? The people of Europe demand an answer for this very strange physical phenomenon.
Now, bear in mind that European Journal recently reported that radio waves were causing cancer in Sicily, so our expectations for the program's scientific acumen shouldn't be terribly high.
The opening of the segment offered little hope. The host, Nina Haase, said: "There are places where people have observed seemingly supernatural phenomena for centuries: wandering rocks in deserts, for example, or permanent lightning storms. Scientists are often at a loss for an explanation, and that's also true for a place in Poland, where our reporter has also discovered a fascinating phenomenon."
Off the bat, the host is misleading the audience. Physicists are not at a loss for an explanation of anti-gravity spots. But, let's give the segment a chance.
The reporter says that "the laws of physics don't work quite right here," and the segment portrays gullible elementary school kids placing bottles of water on a road. To their astonishment, the bottles roll uphill. What did they think was going on? One child was convinced a meteorite landed there, creating a magnetic field. Another student suggested UFOs. A third suggested witches.
The reporter then turned to Mariusz Dąbrowski, professor of physics at the University of Szczecin:
It's a magical place, a truly odd hill. We suspect a gravitational anomaly. Ore deposits create a magnetic field, and the earth's gravitational pull works differently from what we're used to. Or, it's an optical illusion. This is the Polish Loch Ness. The monster we're looking for here is an explanation.
Finally, the credulous reporter gets the help of a water diviner. Together, hand-in-hand, they use a divining rod to detect the presence of a water current under the ground. Indeed, the water diviner senses something. Exactly what, nobody knows. But this magical place gives him a headache, and he cautioned that if anybody built a house in the area, they would be dead within a year. Real estate agents, you've been warned.
To summarize, the report offers several explanations:
-Magnetic field from a meteorite
-Magnetic field from an ore deposit distorts earth's gravitational field
-UFOs
-Witches
-Headache-inducing underground water currents
-Optical illusion
Since we live in the 21st Century, we can automatically rule out UFOs, witches and headache-inducing water currents. Magnetic fields got two votes, from a small child and from a physics professor. Is it possible for an electromagnetic field to distort gravity?
Yes, in theory. According to physics professor Charles Torre, who writes in Scientific American:
The difficulty is that the gravitational field produced by a typical electromagnetic field you can produce in a laboratory is predicted to be very, very weak. A better place to look for gravitational effects due to electromagnetic fields would be in astrophysical objects carrying a significant net electric charge.
In other words, an ore deposit is probably not anywhere near large enough to distort a gravitational field.
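A back-of-the-envelope estimate makes the point concrete. In general relativity, a magnetic field gravitates through its energy density, which is equivalent to a (tiny) mass density. The sketch below assumes a 1-tesla field, already far stronger than anything an ore deposit could produce, and compares the result to ordinary rock:

```python
# Order-of-magnitude check (illustrative; the 1 T field strength is an
# assumption). A magnetic field's energy density is u = B^2 / (2*mu0),
# which gravitates like an equivalent mass density rho = u / c^2.

MU0 = 4e-7 * 3.141592653589793   # vacuum permeability, T*m/A
C = 2.998e8                      # speed of light, m/s

B = 1.0                          # tesla -- far beyond any ore deposit
u = B**2 / (2 * MU0)             # energy density, J/m^3
rho_equiv = u / C**2             # equivalent mass density, kg/m^3

rho_rock = 3000.0                # typical crustal rock, kg/m^3
print(f"equivalent mass density: {rho_equiv:.2e} kg/m^3")
print(f"ratio to ordinary rock:  {rho_equiv / rho_rock:.2e}")
```

The equivalent mass density comes out around 10^-12 kg/m^3, some fifteen orders of magnitude below the rock it would be sitting in, which is why no plausible magnetic anomaly can bend gravity on a country road.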
So, that leaves only one answer: Optical illusion. And, that's the correct answer. According to Science Daily:
At several hilly locations around the U.S., know [sic] as "gravity hills," objects such as cars left on neutral supposedly roll uphill, driven by unknown forces and against the force of gravity. Physicists say -- and GPS measurements confirm -- that the effects are illusions caused by the landscape. The position of trees and slopes of nearby scenery, or a curvy horizon line, can blend to trick the eye so that what looks uphill is actually downhill.
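The GPS check described above amounts to nothing more than measuring the true grade between two points. A minimal sketch, with made-up elevations and spacing purely for illustration:

```python
# Hypothetical GPS fixes at two points along a "gravity hill" road.
# All numbers here are invented for illustration only.

start_elev_m = 104.6   # elevation where the bottle is released
end_elev_m = 103.9     # elevation where it comes to rest
distance_m = 90.0      # horizontal distance between the two points

grade = (end_elev_m - start_elev_m) / distance_m
print(f"true grade: {grade:+.2%}")   # negative => the road actually descends
```

A gentle descent of under one percent is easily masked by a tilted horizon or leaning trees, which is exactly the trick these hills play on the eye.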
Sadly, but not unexpectedly, the European Journal reporter concluded: "We just don't know why gravity works differently here, but it does. There's no doubt about that. We've seen it with our own eyes."
In 1950, Dr. Immanuel Velikovsky rewrote history.
Or rather, he attempted to.
A psychiatrist by training, the scholarly Velikovsky fashioned himself a historian, astronomer, chemist, geologist, physicist and author as well. He made the latter clear by penning a wildly popular and controversial book, Worlds in Collision. In it, he postulated that around 3,500 years ago, Jupiter ejected a planet-sized "comet" that danced around the planets of the inner solar system for thousands of years, altering Earth's orbit and axis in the process, and causing various mythological events described in Biblical texts, including the cinematic Parting of the Red Sea. The "comet" would later settle into orbit around the sun, taking its place as the heat-scorched and barren planet Venus. To make such a situation physically possible, Velikovsky also contended that electromagnetism is just as integral to celestial mechanics as gravity. He based all of this on absolutely nothing.
The uproar from the scientific community was nearly unanimous. Piqued over the fact that a brazen psychiatrist was dabbling in areas outside his field, astronomers were furious that Velikovsky took his outlandish ideas straight to the public, effectively dodging the vital and necessary process of peer review. Some called for Velikovsky's book to be banned from print. One scientist, the Harvard astronomer Harlow Shapley, threatened to organize a textbook boycott of the book's publisher, Macmillan. The ultimatum worked. Within two months, Macmillan dropped the book, but it was quickly picked up by a different publisher, Doubleday.
You can't really blame scientists for being vexed. Almost a quarter-century after Worlds in Collision stirred the pot, Velikovsky's erroneous ideas had become so pervasive in the U.S. that the American Association for the Advancement of Science addressed the situation, devoting a session to debunking his ideas at their annual conference.
Velikovsky was, and is, incredibly wrong for many reasons. For starters, there is no known mechanism for Jupiter to eject any sort of object into space. Moreover, if Venus were an offspring of Jupiter, it stands to reason that -- like its immense, gaseous father -- it should contain a lot of hydrogen. The planet contains almost none, and is, in fact, quite a rocky world. Most damning of all to Velikovsky's theory, historical accounts from ancient astronomers place Venus firmly in its celestial position more than 3,500 years ago!
In the celebrated 1980 television series COSMOS, eminent science educator Carl Sagan used some of these facts to thoroughly dismantle Velikovsky's claims. However, he reserved his most damning criticism not for Velikovsky, but for a small sect of the scientific community.
"The worst aspect of the Velikovsky Affair was not that many of his ideas are in gross contradiction to the facts. Rather, the worst aspect is that some scientists attempted to suppress Velikovsky's ideas."
"There are many hypotheses in science which are wrong. That's perfectly alright: it's the aperture to finding out what's right," he further explained, before concluding, "The suppression of uncomfortable ideas may be common in religion or in politics, but it is not the path to knowledge, and there's no place for it in the endeavor of science."
Sagan's sage advice can also be applied to our own lives. When confronted with stances counter to our own cherished thoughts and beliefs, it's often our first impulse to lash out verbally or even physically, to quell any challenging, disquieting notions. But that is not the proper course. Instead, we must respond calmly and sensibly, using evidence, logic, and reason as guides. Heck, we might even learn something, or realize that we were wrong!
Imagine if such thinking had been adopted in the past. Would Martin Luther have been condemned and excommunicated? Would Democritus have been shouted down? Would there have been a Spanish Inquisition? Would Martin Luther King, Jr. still be alive?
We can't change the mistakes of the past, but we can pledge and strive not to repeat them. Ideas should never be suppressed, no matter how revolutionary, no matter how crazy, no matter how discomforting.
When it comes to the human body, microbes trump man. As many as 100 trillion bacteria dwell inside the gut, outnumbering our own cells by a sweeping 10 to 1 margin. Thus, it's no surprise that these microscopic inhabitants leave a mark. Gut bacteria have been tied to ailments like obesity and inflammatory bowel disease.
What is surprising, however, is that microbes' influence may extend to other regions of the body as well, including the heart and brain. Conditions like heart disease, diabetes, and even depression could be swayed by the presence of the little blighters. And last week, researchers at New York University added a new ailment to that list: rheumatoid arthritis.
Put simply, rheumatoid arthritis is friendly fire. Sufferers' own immune systems carry out targeted attacks against their joints. The bombardment is agonizing and persistent. Inflammation, pain, and swelling are the first symptoms, followed by cartilage destruction and even fusion of the joints, at which point almost all mobility is lost. Over 1.5 million American adults battle the disorder, two-thirds of them women.
To date, arthritis' direct causes are still unknown, meaning that physicians can only treat the symptoms. They do so with surgery, patient lifestyle changes, and anti-inflammatory drugs.
But last week's discovery that gut bacteria could play a role opens up a new avenue on the long, winding road to finding a cure.
Researchers compared the gut microbiota of four groups of subjects: 26 patients with chronic, treated rheumatoid arthritis, 44 patients who had just been diagnosed with it, 16 patients with psoriatic arthritis, and 28 healthy subjects. They discovered significant inter-group differences in the composition of gut bacteria. Most strikingly, 75% of patients who had just been diagnosed with rheumatoid arthritis but had not yet been treated for it carried a bacterial species called Prevotella copri in their guts. Members of the treated, healthy, and psoriatic arthritis groups harbored P. copri at rates of 11.5%, 21.4%, and 37.5%, respectively.
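The group sizes above come from the study; the raw P. copri-positive counts below are inferred from the reported percentages, so treat them as an assumption. A quick sketch recovering the figures:

```python
# Back-of-the-envelope check of the reported percentages. Group sizes are
# from the article; the positive counts are inferred, not reported directly.

groups = {
    "new-onset RA":        (33, 44),   # 33/44 = 75%
    "treated RA":          (3, 26),    # 3/26 ~ 11.5%
    "healthy":             (6, 28),    # 6/28 ~ 21.4%
    "psoriatic arthritis": (6, 16),    # 6/16 = 37.5%
}

for name, (positive, total) in groups.items():
    print(f"{name:>20}: {positive}/{total} = {positive / total:.1%}")
```

Worth noting: with groups this small, a difference of a few patients moves the percentages substantially, which is one reason follow-up work on mechanism matters.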
Harvard immunologist Diane Mathis praised the team's work, noting that it successfully translates previous findings in animal subjects to humans. The next step, she says, "will be to determine whether the association between P. copri and rheumatoid arthritis reflects cause, effect, or co-association."
It would be neat if a simple causal mechanism were at work here. For example, genetics are known to predispose a person to autoimmunity and joint inflammation. What if an environmental factor sets it off? P. copri -- which can cause gut inflammation in mice -- could be such a trigger.
There may be other triggers, too. Previous research has linked smoking, vitamin D deficiency, and herpesvirus to arthritis as well.
A different explanation for the current results is that P. copri is merely attracted to the inflammation that is a hallmark of rheumatoid arthritis: more inflammation, more bacteria. The researchers would be wise to design experiments that would bring to light potential mechanisms through which bacteria might cause the disorder.
Source: Jose U. Scher et al. "Expansion of intestinal Prevotella copri correlates with enhanced susceptibility to arthritis." eLife 2013;2:e01202 http://dx.doi.org/10.7554/eLife.01202
Low-carbohydrate diets, where carbohydrates constitute anywhere from 5 to 30 percent of total caloric intake (approximately 25 to 150 grams each day), are all the rage right now. For many, they're a successful impetus to sustained weight loss and improved health. But there could be an unforeseen toll.
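The gram figures follow directly from the percentages: carbohydrate supplies 4 kcal per gram, and on a 2,000-kcal/day diet (an assumption here, the standard reference intake) 5 to 30 percent of calories works out to roughly 25 to 150 grams:

```python
# Sanity check on the article's numbers: at 4 kcal per gram of carbohydrate
# and an assumed 2,000-kcal/day diet, 5-30% of calories from carbs
# corresponds to roughly 25-150 g per day.

KCAL_PER_GRAM_CARB = 4
daily_kcal = 2000

for fraction in (0.05, 0.30):
    grams = daily_kcal * fraction / KCAL_PER_GRAM_CARB
    print(f"{fraction:.0%} of calories -> {grams:.0f} g carbs/day")
```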
Because of the way that the human brain functions, low-carbohydrate diets may adversely impact cognitive ability. Does a low-carb diet really make you duller? To examine this question, let's first discuss its focus: the brain.
There's no reason to beat around the bush: your brain is a pig. Though idle enough when observed outside its home cranium -- all pink, squishy, and squelchy; kind of cute really -- the brain is a charged biological machine. In an unseen electrical storm that would rival even the mightiest lightning display, 86 billion neurons fire -- almost nonstop -- to create the mosaic of thoughts, emotions, and mental images that we call the mind. The whole operation is an immense power suck, ravenously consuming roughly 250 to 300 calories each day, 20-25% of a human's base energy expenditure.
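To put those calories in more familiar units, the conversion below turns the article's 250-300 kcal/day into a continuous power draw, which lands in the range of a dim light bulb:

```python
# Converting the brain's daily energy budget into watts.
# 1 kcal = 4184 J; one day = 86,400 seconds.

KCAL_TO_J = 4184
SECONDS_PER_DAY = 86400

for kcal_per_day in (250, 300):
    watts = kcal_per_day * KCAL_TO_J / SECONDS_PER_DAY
    print(f"{kcal_per_day} kcal/day ~ {watts:.1f} W")
```

That works out to roughly 12 to 15 watts, running around the clock.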
As far as food goes, the brain is a fairly picky eater. Like a young candy-craving child, it prefers simple sugar molecules -- glucose to be specific -- and when the brain doesn't get glucose, it gets crabby and distracted. Since the body most easily creates glucose by metabolizing carbohydrates, it stands to reason that limiting carbohydrates could dampen cognitive function.
When consuming low-carb diets in the short term, this is certainly true. In a 2008 study, psychologists placed 19 women on either a calorie restricted low-carb diet or a calorie restricted high-carb diet for 28 days. Throughout the study, participants' memory, reaction time, and vigilance were tested at regular intervals. While those on the low-carb diet enjoyed a slight boost in vigilance, they suffered impaired reaction time and reduced visuospatial memory.
"The brain needs glucose for energy and diets low in carbohydrates can be detrimental to learning, memory, and thinking," lead investigator Holly A. Taylor, a psychology professor at Tufts University, explained.
But the short-term isn't the long-term. Though the brain prefers to compute on glucose, after about four days of carbohydrate deprivation it sates about 70% of its hunger on ketone bodies, the byproducts produced when fatty acids are broken down by the liver. And by most accounts, the brain can run pretty efficiently on this fuel once it grows accustomed to it after a few weeks.
In fact, researchers have shown that low-carb diets can bring about improvements in cognitive functioning in both aged humans and rodents compared to traditional diets. Writing at Psychology Today, psychiatrist Emily Deans accounted for how this might happen.
"When we change the main fuel of the brain from glucose to ketones, we change amino acid handling," she says. This reduces the levels of glutamate in the brain, an amino acid and neurotransmitter that can cause harm in excessive amounts. Less glutamate leads to "a lower seizure risk and a better environment for neuronal recovery and repair."
In adults, low-carb diets have no adverse cognitive effects in the long-term. A well-executed, year-long study published in the Archives of Internal Medicine in 2009 found no difference in cognitive functioning for subjects consuming either a low-carb weight loss diet or a high-carb weight loss diet. Both groups actually enjoyed improvements to working memory and speed of processing, a result presumably attributable to weight loss.
Older and middle-aged adults aren't dulled by low-carb diets, but what about children and teenagers? With still-developing brains, should they consume such diets? Here -- due to a dearth of long-term data -- the waters are murkier, but one study published in 2004 discerned some troubling results for low-carb diets. Reporting in Pediatric Research, researchers found that young rats fed a low-carb diet gained less weight than their peers on a regular diet (which isn't necessarily healthy during development). Moreover, they also had "significantly impaired visual-spatial learning and memory" and -- most disturbingly -- "significantly impaired brain growth."
Adults looking to lose weight may have their waistlines thinned and senses sharpened by low-carb diets, but those with still-developing brains should probably steer clear.
(Image: Low-carb meal via Shutterstock)
We don't cover economics regularly because it is not traditionally considered science. Furthermore, the field too often generates research and commentary that employs more voodoo than a witch doctor. It is largely for these reasons that economics is often referred to as the "dismal science" and why President Harry Truman wanted to meet a one-armed economist.
Still, economics can provide powerful insights on market behavior. Indeed, economists from various ideological backgrounds have managed to reach a consensus on several major issues, and from that vantage point, we can say the field has developed something resembling scientific knowledge.
One of those insights is that people respond to incentives. If I offer a teenager $50 to mow my lawn -- and an extra $25 if he trims the bushes -- then I can expect to shell out $75. I just offered my little helper a handsome incentive, and there's a very good chance he'll respond to it. This insight on human behavior is so basic and obvious that it is listed as one of the foundations of economics in Harvard economist Greg Mankiw's textbook Principles of Economics.
Unfortunately, socialists never learned this lesson. In a socialist economy, incentives play little (if any) role. Therefore, as University of Michigan-Flint economist Mark J. Perry wrote, "By failing to emphasize incentives, socialism is a theory inconsistent with human nature and is therefore doomed to fail."
Yet, shockingly, socialists can regularly be found on college campuses. Kshama Sawant, an economics teacher at Seattle Central Community College, openly endorses socialism. She also is running for Seattle City Council and, with the latest election returns, claims 49.5% of the vote. With many ballots left to count, she could still win.
How on earth can somebody who rejects basic academic knowledge be so close to winning a city council seat? Even more troublingly, how can somebody with her beliefs be allowed to teach an economics course? This would be analogous to allowing an AIDS denier to teach a medical microbiology course, a 9/11 truther to teach a foreign policy course, or a creationist to teach an evolution course. (Amazingly, UMass-Amherst biologist Lynn Margulis had the dubious distinction of being both an AIDS denier and a 9/11 truther!)
Just how far out of the mainstream is Dr. Sawant? She favors collectivizing Amazon. "Collectivizing" is a nice word socialists use to mean seizing assets and turning control of operations over to the government.
If Dr. Sawant's embrace of socialism isn't bad enough, she also endorses a terribly destructive policy called "rent control." This policy can take various forms, but basically, landlords are not allowed to charge market rates for apartments. That might sound like a nice thing if you're a renter, but Dr. Mankiw -- citing a 1992 paper in American Economic Review -- states that 93% of economists reject rent control because it "reduces the quantity and quality of housing available." University of Chicago lecturer Charles Wheelan, author of Naked Economics, agrees:
...if you asked ten economists why there is a shortage of cabs and apartments in New York City, all ten would tell you that limitations on the number of taxi medallions and rent control are what restrict the supply of these goods and services.
I have two questions to which I will never expect to receive a rational answer.
First, why would Seattle Central Community College allow Dr. Sawant (yes, she actually has a Ph.D. in economics) anywhere near students? And second, to the citizens of Seattle: how does one of the most educated cities in America allow itself to be duped?
(AP Photo: Kshama Sawant)