## February 2012 Archives

### February 29: Why Is There a Leap Year?

How long is a year? A calendar year is the amount of time it takes the Earth to move through one complete orbit (nearly a circle, but not quite) around the sun. A day, on the other hand, is one complete rotation of the Earth about its own axis.

The ancient Greeks were the first to realize that a year is not made up of a whole (1, 2, 3, ... 365 ...) number of days. Hipparchus, a great astronomer, mathematician and scientist of his day, calculated (with amazing accuracy!) that a year consists of 365.246 days. That is to say, the Earth completes 365.246 day-night cycles each time it traces its ellipse through space around the sun.

Noting this fact, Julius Caesar established a system of timekeeping (now known as the 'Julian' calendar) which considered the year to be made up of 365.25 days. If we wanted to make every year exactly the same length, we would have two options. First, we could ignore the extra 0.25 days. The problem with this method is that the system of days and months would shift by roughly 6 hours per year. After 100 years, January 1st would fall at the same point in the Earth's orbit of the sun that January 25th used to! After 1000 years, New Year's Day would come near the end of summer, on September 8th! Clearly unacceptable.
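The drift described above is easy to check with a few lines of arithmetic; here is a quick sketch (the dates are approximate, since real month lengths vary):

```python
from datetime import date, timedelta

# Days of drift per 365-day year if the extra quarter-day is ignored
DRIFT_PER_YEAR = 0.25

for years in (100, 1000):
    drift = round(years * DRIFT_PER_YEAR)
    # Where would New Year's Day land relative to the seasons?
    shifted = date(2001, 1, 1) + timedelta(days=drift)
    print(f"After {years} years: ~{drift} days of drift; "
          f"January 1st falls where {shifted:%B %d} used to be")
```

After 1,000 years the 250 accumulated days land the script on September 8th, matching the shift described above.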

Our other option would be to tack on one roughly six-hour day at the end of every year. This would be even worse, however, since dawn, daylight and dusk would all shift six hours on the clock! In the second year of this calendar, sunrise would occur around noon; in the third year, at 6 PM; in the fourth year, at midnight, before arriving again at roughly 6 AM. Every four years this cycle would repeat.

The system of adding one extra day every four years thus keeps the same daylight hours, adding a day only as the four 0.25 days accumulate to one. This also keeps the total shift of the days of the year to less than one day. A pretty good compromise, and certainly worth the oddness of the extra day every four years.

We no longer use the Julian calendar, but an improved version known as the "Gregorian" calendar. Because of the rounding of 365.246 up to 365.25, the Julian calendar had fallen about 10 days out of step with the seasons over the roughly 1,600 years between its start date and 1582, when the Gregorian calendar was implemented. The 10 days were skipped entirely by the world, which jumped from October 4th straight to October 15th. After the skipped days, a new version of the leap year was put into place: every four hundred years, three leap years are omitted. 1600 was a leap year; 1700, 1800 and 1900 were not. 2000 was again a leap year; 2100, 2200 and 2300 will not include February 29th.
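The Gregorian rule described above (a leap year every four years, except for three of every four century years) fits in a few lines of code; a minimal sketch:

```python
def is_leap_year(year: int) -> bool:
    """Gregorian calendar rule: every fourth year is a leap year,
    except century years, which must also be divisible by 400."""
    if year % 400 == 0:
        return True   # 1600, 2000, 2400, ...
    if year % 100 == 0:
        return False  # 1700, 1800, 1900, 2100, ...
    return year % 4 == 0

# The century years named in the text:
print([y for y in (1600, 1700, 1800, 1900, 2000, 2100, 2200, 2300)
       if is_leap_year(y)])
# → [1600, 2000]
```

Python's standard library ships this exact rule as `calendar.isleap`, if you'd rather not write it yourself.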

Even the Gregorian calendar is not perfectly in step with modern scientific measurement. Over the course of about 500 days, this calendar currently falls one second behind because the Earth's rotation is gradually slowing down!* Since the 1970s, leap seconds have occasionally been added to the time and calendar to correct for this change. This confusing practice may end soon, and the time we all use would then become separate from the Earth's rotation as best we can measure it. GPS satellites and other systems which require time matched to the Earth's rotation would continue to use leap seconds.

*Note: this video attributes the slowing to the drag of the ocean; in actuality, the slowing is mostly due to the change in the Earth's angular momentum caused by the moon's gravity deforming it. This is known as 'tidal acceleration'.


### Five Things You Didn't Know About Pirates

Ahoy, my fellow landlubbers! As many of you are well aware, pirates have been massively popularized in literature and cinema. After all, who hasn't read or heard of Robert Louis Stevenson's Treasure Island? And who hasn't cackled or guffawed at Captain Jack Sparrow's swashbuckling antics in Pirates of the Caribbean?

However, my dear friends, it is my unfortunate duty to inform you that you have been slightly misled. For the most part, pirates weren't the hotheaded hooligans they have been made out to be. In reality, they were hardened mariners striving to survive on the high seas. They relied on rules and order. They utilized democracy and science. They even had workers' comp!

Without further ado, here are five things you may not have known about pirates:

1. No Buried Treasure. Extensive research by University of Pittsburgh Professor Marcus Rediker has debunked this common belief. Pirates rarely buried their treasure, partly because they didn't see the point of saving or hiding their riches, but mostly because the type of loot they took on -- usually food, trading goods, clothes, etc. -- was either perishable or served absolutely no purpose buried in a treasure chest.

For the most part, pirates actually traded their stolen goods in the New World. In fact, this trade infusion may have greatly boosted the local economies of large seaports and struggling settlements in the Americas.

2. Pirates Were Astronomers.
Or at least the navigators were. For pirates plundering in the 15th through the 18th centuries, celestial navigation using astrolabes or sextants was the prime method of navigation at sea. In order to calculate position and course heading, pirates had to recognize several celestial bodies, such as the sun, moon, and certain stars, including Polaris, Rigel, and Procyon.

Pirates didn't actually talk like scallywags. (Illustration by John McCoy)

3. Grog Helped Ward off Scurvy.
Aboard most pirate vessels, grog was the alcoholic beverage of choice. The simplest recipe called for rum and water, but many pirates added lime or lemon juice to ensure adequate intake of vitamin C and to make the beverage slightly more palatable. Some even ingeniously added exotic spices such as cinnamon and nutmeg to their rum cocktails.

4. Pirates Had Workers' Comp. A life spent ransacking and raiding was a dangerous one, which is why it didn't hurt to have some basic insurance. For example, the code of the pirate ship Revenge decreed that, "If any Man shall lose a Joint in time of an Engagement, [he] shall have 400 Pieces of Eight; if a Limb, 800."

By and large, buccaneers in the Caribbean had similar compensation. French writer Alexandre Exquemelin, who served under Admiral Henry Morgan (better known as Captain Morgan), described one such policy:

"...for the loss of a right arm six hundred pieces of eight; for the loss of a left arm five hundred pieces of eight; for a right leg five hundred pieces of eight; for the left leg four hundred pieces of eight; for an eye one hundred pieces of eight; for a finger of the hand the same reward as for the eye."

One thing you may have noticed: lefties got a rotten deal!

5. Pirates Rocked the Vote (but hopefully not the boat). As it turns out, pirate ships weren't crewed by lamebrain individuals serving under dictatorial captains; they were floating bulwarks of democracy. Pirate captains were often elected, and the quartermaster served as an arbiter in matters of dispute. Thus, pirate ships had three branches of government. The crew functioned as the legislative branch, the captain served as the executive branch, and the quartermaster acted as the judicial branch.


### Bacteria! Bacteria in the Ball Pit!

For the family undertaking a long road trip, a McDonald's restaurant outfitted with an indoor playland is an oasis in the interstate desert. These food and fun factories offer something for everyone. Kids get to release pent-up energy after being cooped up in the backseat, parents get a brief respite from their fidgety, nagging children, and McDonald's gets to sell a few Happy Meals and Big Macs. It's a seemingly worry-free, win-win-win situation.

But Dr. Erin Carr-Jordan, a professor at Arizona State University, wasn't so sure. So last year, she bought agar plates, jumped into the rainbow ball pit at her local McDonald's, and started swabbing for bacteria. What she found was somewhat disgusting. Carr-Jordan told CBS News:

We found stuff that causes meningitis, food-borne illness, skin, hair, eye infections... fecal contamination, coliforms, quite a few things can make children ill, and several of which are multi-drug resistant and potentially fatal.

Carr-Jordan's exploits garnered some notoriety last fall, and they also got her banned from a host of McDonald's restaurants in Phoenix, Arizona. The franchise undoubtedly was not too enthused with her findings, which revealed the presence of high amounts of unfriendly bacteria such as Staphylococcus aureus and coliforms.

What other dangers lurk in the ball pit?

To be honest, these findings really should come as no surprise. Whenever you mix greasy fast food and prepubescent children, especially in a location that promotes rambunctious behavior, things are bound to get messy. In fact, I would hazard a guess that at any given time during a Playland's normal hours of operation, there are at least two half-eaten pieces of chicken nugget lurking at the bottom of the ball pit, at least three snot stains defacing the tunnel tubes, and almost certainly a fair amount of fecal coliforms befouling the twisty slide. It's just a "recreational hazard."

Besides, it's likely that you could find the same pathogens on toilet seats and bathroom door handles. There's no reason to believe that McDonald's Playland is uniquely dirty. Even fecal coliforms are everywhere. Think of all the people you have intimate contact with on a daily basis -- then ask yourself, "How many of them properly washed their hands after leaving the bathroom?"

This frankness may not be to the liking of many a parent, who might prefer that their children frolic in bastions of cleanliness and not be exposed to such filth. To them, I would refer the Hygiene Hypothesis, which holds that an excessively clean lifestyle may actually weaken our immune systems.

Make no mistake, fast food restaurants need to cleanse their indoor play pens. And if particularly nasty pathogens are detected -- such as those that cause meningitis -- they should be cleaned with antiseptics. But, in general, parents need not worry: A little grime is probably good for the kiddies.


### What's So Special About Oscar Winners?

Colin Firth and Natalie Portman sort of have a lot in common. For one thing, they are both actors. In fact, they are both Oscar-winning actors. They are both presenting awards at the 84th Annual Academy Awards ceremony this weekend. And...they both are authors of scientific papers?

It's true. Just when you thought they couldn't get any cooler, these actors turn out to be interested in science too.

Colin Firth's involvement with research started when he was a guest editor for a BBC radio program a couple years ago. On the show, he commissioned a researcher to compare the brains of a liberal and a conservative politician with MRI. When the test was replicated with more subjects, the researchers saw correlations between a subject's political affiliation and the amount of grey matter in certain areas of the brain.

Smarty pants Natalie Portman got involved with neuroscience research as an undergraduate at Harvard. She helped with a project on new brain-imaging technology that was used to study object permanence in babies. Infants who have object permanence understand that when a toy is hidden under a blanket it doesn't cease to exist.

While Firth and Portman's Academy Awards have little to do with their own neuroscience exploits, the awards could qualify the actors to be subjects of scientific study, themselves. Apparently the fame and fortune that comes with an Academy Award changes an actor's life so drastically that scientists have found it important to study them. Either that or we common folk are interested in pretty much anything having to do with stardom.

One such study showed that male Oscar-winners have more kids. The men each produced a whopping average of four children, compared to the national American average of 1.2. The researcher came up with three possible explanations of this trend: 1) The winners are considered guaranteed good mates because of their status. 2) The winners now have more time for a family. 3) The winners are actors, so they were probably really good-looking to begin with.

On the flip side, a different study showed that Best Actress winners are more likely to get a divorce after they win the prize. The researchers analyzed the marriages of 751 nominees for Best Actress, and they found that the actresses who won the title had shorter marriages compared to the actresses who were simply nominated. Interestingly, the trend was not seen in the marriages of men nominated for Best Actor. The study speculates that the results are caused by an upset of traditional gender roles when the female partner gains prestige.

Not only do the Academy Awards inspire scientific discovery, they also explicitly celebrate it.

Two weeks ago, the Academy held its annual ceremony dedicated to celebrating the science and technology behind motion pictures. In the past, the Scientific and Technical Awards Presentation has given awards for such work as Dolby Digital surround sound and IMAX screens. Unlike the actual Academy Awards, recipients are not usually recognized for work they did in the previous year. Instead, the Academy only awards those whose work has made an important and lasting contribution to movie-making.

I guess they don't call it the Academy of Motion Picture Arts and Sciences for nothing!


### Can Science Explain McDonald's Shamrock Shake?

"So... In your professional opinion, would you say that making the Shamrock Shake is more art or science?"

The McDonald's employee looked at me with a befuddled expression.

"It comes out of the machine," she said.

"Ah yes, of course!" I exclaimed. "So science, then?"

Another quizzical look. "Sure..."

"Brilliant."

Each year, in the weeks leading up to St. Patrick's Day, McDonald's shamrock shake makes its triumphant, limited-time-only return. The moment is hailed with media mentions, social media shout-outs, and loosened belts across the country.

In the four decades since it first appeared, the shamrock shake has never been widely available. As a result, it has attained almost cult status among its followers. In addition, the YouTube arrival of nostalgic advertisements featuring Uncle O'Grimacey has helped fuel the mania.

So what is it about the shamrock shake that makes it so tempting? Is it the sumptuous taste? Many would describe the flavor as "minty" and "creamy," though the technical description is "pure mint McAwesomeness." But the taste alone cannot adequately explain the shake's allure.

Is there something special about the crafting process? I can confirm that the shamrock has its own special nozzle in McDonald's shake machine, though what exactly is going on within the machine's innards remains a mystery. (Leprechaun laborers, perhaps?)

What about the ingredients? Is it the unique, magical concoction of reduced fat vanilla ice cream, high fructose corn syrup, water, sugar, natural flavoring, xanthan gum, citric acid, sodium benzoate, yellow 5, and blue 1 that arouses such mirth and loyalty within the shake's indulgers? (Strangely enough there's no mention of mint or mentha in the ingredients, but there is mention of carrageenan, which is also used in personal lubricants.)

There's obviously something to be said for the shake's combination of sugar and fat, which research has shown to be quite a powerful force indeed. The simultaneous intake of both fat and sugar activates hunger signals, depresses feelings of satiety, and may boost feelings of happiness.

Such a simple explanation surely doesn't do the shamrock shake justice, but barring the involvement of leprechauns, it probably is the most likely answer. If you have a hypothesis of your own, please share! As of this moment, I have run out of legitimate theories, along with the shamrock shake that I purchased earlier for "study."

In the next edition of "shake science," Jack in the Box's bacon shake: "why?"


### LIFE Fusion on Target for Ignition This Year

Controlled nuclear fusion with net energy gain is on schedule to occur later this year at the National Ignition Facility (NIF) in Livermore, California.

That was the message from NIF's director for laser fusion energy, Mike Dunne, who spoke at a Photonics West 2012 plenary talk in January. Optics.org covered the event:

"We are now in a position to say with some confidence that ignition will happen in the next 6-18 months," stated [Dunne], adding that he felt personally that the breakthrough was likely to happen in around nine months.

The vehicle of what would be a monumental step forward for energy production is called Laser Inertial Fusion Energy (with the awesome acronym "LIFE"). LIFE works by firing a powerful laser into a small, centimeter-scale chamber. The laser generates massive pressures within the chamber and creates temperatures of over 4 million degrees Celsius. Minuscule deuterium-tritium fuel pellets will be rapidly inserted into the chamber, and these conditions will induce the deuterium and tritium to fuse together, creating helium, a free neutron, and massive amounts of energy.
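Where do those "massive amounts of energy" come from? The tiny mass difference between the fuel and the products, via E = mc². A back-of-the-envelope check (the atomic masses below are standard published values, not figures from this article):

```python
# D + T -> He-4 + n: masses in unified atomic mass units (u)
M_DEUTERIUM = 2.014102
M_TRITIUM   = 3.016049
M_HELIUM4   = 4.002602
M_NEUTRON   = 1.008665
U_TO_MEV    = 931.494  # energy equivalent of 1 u, in MeV

# A small fraction of the fuel's mass disappears and becomes energy
mass_defect = (M_DEUTERIUM + M_TRITIUM) - (M_HELIUM4 + M_NEUTRON)
energy_mev = mass_defect * U_TO_MEV
print(f"Energy released per D-T fusion: {energy_mev:.1f} MeV")
# → about 17.6 MeV per reaction, mostly carried off by the neutron
```

Seventeen million electron-volts per reaction is why a few milligrams of fuel per pellet is enough to interest power-plant designers.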

A diagram of a potential LIFE power plant.

While fusion ignition is on schedule, the next eight or so months will be crunch time for NIF scientists. There remain a few key hurdles to overcome, chief among them: actually controlling the fusion reaction.

"As the implosion proceeds at about a million miles an hour, can you create that tiny spark of fusion burn at the very center of the system, without the rest of the system mixing together? The next few months of work will be predominately focused on [this obstacle,]" Dunne said.

Assuming a successful test in the coming months, scientists at NIF are already planning for a fusion future. This has been made realistic by recent advancements in optics, which have dramatically lowered the cost and size of laser technology. In addition, scientists have introduced the concept of line replaceable units into the design of LIFE power plants. This will allow future fusion power plants to minimize downtime for equipment repairs.

"When ignition and gain are achieved on NIF, we will have a substantive delivery plan to take us to a commercial plant," Dunne told Science and Technology Review. "We will be ready to go."

The fact that someone is saying "when" and not "if" in regards to fusion power is incredibly exciting. For decades, many have dreamed of bringing star power to Earth, but it's always been firmly out of reach.

The numerous benefits of fusion power are even more electrifying. Its tagline of "clean, inexhaustible energy" is not an overstatement. There exists about 60 million years' worth of fusion fuel in seawater, and the power generation process would produce no carbon emissions.

My fellow Americans, get excited. Because this fall, the power of the sun may be made in America.


### 3D Printing

Think of what you have to go through to heal a broken bone. What comes to mind? A big plaster cast that you collect signatures on for a month? Surgically implanted titanium and stainless steel plates and screws, making you popular with airport screeners? A new engineering technology has created an easier way to heal and replace damaged bones.

You may still be setting off metal detectors, unfortunately.

This process is intuitively simple: just print out a new bone and implant it! This technology has been under development by companies such as IBM for about 25 years. You start with a computer file which is analogous to a typed document ready to print out in letter form. But instead of words and paragraphs, it encodes the three-dimensional shape of an object. When you hit print, your design is sent to a machine that can form objects out of metal, plastic or ceramics. These can be anything from machine parts to whimsical sculptures:

The machine breaks the design down into tiny component squares, like bricks in a Lego construction. It then squirts out tiny blocks of material much like the single Lego pieces but much smaller. The machine lines these up row after row to build a flat slice of the shape you want. Then the machine lifts up a little bit, and melts new bricks that stack on top of the first layer of bricks. Gradually the entire structure is built from the base up, one vertical layer at a time.
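The slicing idea above can be illustrated with a toy example. This sketch voxelizes a sphere and emits it layer by layer, the way a printer deposits material from the base up (purely illustrative; real slicer software works from mesh files, not equations):

```python
def slice_sphere(radius: int):
    """Yield horizontal layers of a voxelized sphere, bottom to top.
    Each layer is the set of (x, y) cells the printer would fill,
    like one flat course of Lego bricks."""
    for z in range(-radius, radius + 1):
        yield {(x, y)
               for x in range(-radius, radius + 1)
               for y in range(-radius, radius + 1)
               if x * x + y * y + z * z <= radius * radius}

layers = list(slice_sphere(3))
print(f"{len(layers)} layers; the middle layer has {len(layers[3])} cells")
```

The bottom layer is a single cell, the layers widen toward the equator, then narrow again: exactly the base-up stacking the paragraph describes.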

While the first machines of this type literally used plastic similar to Legos, newer technology is beginning to allow metal and ceramic shapes to be made! Auto makers for example can print out the body of a car:

Asimov Robot Included with LX Trim Package

The technology has even matured enough that you can build your own rapid prototype machine from free plans.

### Are Multiracial People More Apt to Be Successful?

It's a question I've occasionally pondered when watching Derek Jeter smack a homer or while laughing hysterically at one of Maya Rudolph's jokes: Are mixed-race people more inclined to be successful?

2006 data shows that mixed-race people make up only two percent of the population of the United States, yet they often seem disproportionately represented in the upper echelons of sports and the arts. Tiger Woods, Blake Griffin, Jessica Alba, Bob Marley, Halle Berry, Alicia Keys, and Lenny Kravitz are but a few of the individuals in this talented group. This trend may even extend to the realms of politics and business. Think of influential people like Booker T. Washington, Frederick Douglass, Steve Jobs, and President Barack Obama.

But it appears that evidence for mixed-race success extends beyond the anecdotal. In 1955, L.S. Penrose of University College London published a paper citing evidence of "exceptional fertility" in mixed-race individuals of combined European and Native American ancestry. At the most basic level of biological success -- reproduction -- multiracial people excelled.

In 2010, psychology professor Dr. Michael Lewis of Cardiff University conducted a study on the perceived attractiveness of mixed-race people. Attractiveness has been linked to many different forms of success.

1205 male and female faces were collected by two research assistants naive to the hypothesis regarding attractiveness. Twenty white psychology students rated each face on its attractiveness on a 9-point scale (5 being of average attractiveness). The averaged results were that the mixed-race faces were perceived as being significantly more attractive than either the white or black faces.

In his discussion, Lewis noted that of the photos in the top 5% of attractiveness, 74% were of mixed-race people. If the study's distributions were extrapolated to the general population of the United Kingdom, it would have been expected that mixed-race people would comprise no more than 9% of the top 1% most attractive.

Something is clearly going on here, and it may have something to do with heterosis, or hybrid vigor. This is the idea that offspring of genetically-different parents will have improved or increased function of certain biological qualities. The breeding of genetically distinct organisms has been tremendously successful with cattle, corn, rice, onions, spinach, sunflowers, broccoli, and marijuana.

However, humans are definitely not corn or cattle, so the theory of hybrid vigor as it pertains to mixed-race success will be difficult to study in a controlled fashion. For now, we'll have to rely almost entirely on anecdotal evidence.

Thus, I would like to present Blake Griffin's recent "monster dunk" as "Exhibit A."


### Is Jackson Blankenship's Face a Good Distraction?

On his Twitter account, University of Alabama freshman Jackson Blankenship proudly declares that he has been "publicly embarrassing [himself] since 1992." Due to recent events, this open humiliation is being taken to a much grander stage, and Blankenship couldn't be more excited.

Last Tuesday, Hal Yeager of The Birmingham News snapped a photo of a Florida player dunking the ball during a basketball game against the Alabama Crimson Tide. Displayed prominently in the background is a skinny, ginger-haired Alabama student making a grotesque, bug-eyed face and holding a large replica of that face above his head. That's Blankenship.

Naturally, the photo has gone viral, and has been featured on such sites as Deadspin and ESPN. Blankenship's moment in the limelight will continue tomorrow on NBC's Today show.

Fame is a welcome byproduct of Blankenship's hilarious shtick, but the Alabama freshman's real goal is to distract opposing players, thus helping his team win. So do his efforts actually work?

According to ESPN's Sport Science, studies have shown that isolated distraction attempts from fans can easily be tuned out by focused athletes. Blankenship's gimmick appears to fall into this category.

On the other hand, coordinated fields of background motion can cause a player's movements to subconsciously drift in the direction of the motion. This distraction method is especially effective when a basketball player is shooting free throws, with an informal study showing a typical reduction in an opponent's average free throw percentage of about eight percent.

While ESPN's evidence appears to discount the effectiveness of Blankenship's distraction technique, research from Professor John Eastwood at York University has found that, "Negative facial expression captures attention and disrupts performance." In two basic experiments, Eastwood had subjects count features of positive, neutral, and negative schematic faces. In both experiments, participants took the longest to count the features of negative faces. According to the study:

Taken together, these findings strongly support the conclusion that negative faces are more effective at involuntarily attracting or capturing attention than are positive faces... The present results suggest that even when participants are not deliberately looking for faces, negative faces also capture attention more effectively than positive faces.

While it is difficult to accurately characterize Blankenship's face without words like "constipated," "bugaboo," or "hysterical," the obvious frown makes his expression an undeniably "negative" one. So -- with Eastwood's study in mind -- it is indeed possible that the over-sized replica of Blankenship's face may cause many an opposing player to shoot a "brick" instead of a "swish."

### Pros and Cons of Being a Cute Adult

Humans have a very simple definition of "cute": anything that remotely resembles a human baby. We think big heads are cute because our babies are born with freakishly large brains. Forward-facing eyes are cute because our babies have forward-facing eyes. We even find clumsiness cute because our babies are born without coordination (and some never achieve it).

When cuteness is retained after an animal reaches adulthood, it is called neoteny.

Neoteny is exhibited by a wide range of animals: humans to tree frogs, manatees to penguins.  A species' livelihood or an individual's well-being can both be drastically affected by characteristics of neoteny--for better or for worse.

Pros

Extinction is less likely

Neoteny can become one of a species' greatest assets in getting support from the public. One peek at a panda's roly-poly body or a penguin's unsteady waddle is enough to make people squeal with delight. In this way, species facing the threat of extinction have a much better chance of surviving if they keep their infant qualities as adults.

"Ugly" creatures (like the one on the right) aren't quite so lucky. Though many are not even closely related to Homo sapiens, endangered species without neoteny are neglected simply because they don't resemble human babies.

Can thrive in modern society

In humans, the psychological characteristics of neoteny have become more and more prominent. Some experts have attributed it to a single source: college. Traditionally, humans began to reach cognitive maturity in their late teens and early twenties, but college students trade in life experiences to pursue higher education.

Copious shenanigans and ridiculous antics make it clear that maturity is certainly lacking in college life. Researchers say, however, that psychological neoteny is actually essential for higher education. In order to maintain a receptive attitude and an ability to learn lots of new information, college students have to hold on to their childlike characteristics.

This cognitive flexibility is an advantage even after graduation. Cute-minded adults can better adapt to the frequent social and professional changes that are required in today's culture.

Cons

Might forget to grow up

Not surprisingly, postponing maturity might also work against a person. And they might never grow up entirely.

Bruce Charlton, founder of the psychological neoteny theory, told Discovery, "People such as academics, teachers, scientists and many other professionals are often strikingly immature outside of their strictly specialist competence in the sense of being unpredictable, unbalanced in priorities, and tending to overreact."

Who knew scientists may have something in common with Peter Pan?

Perceived as submissive

In Japan, cuteness has become an integral part of the culture. Colorful pictures of adorable cartoons can be found on everything from hair accessories to cooking utensils to men's underwear.  The Japanese term for this obsessive cuteness, kawaii, encompasses not just a style but a behavior.

Some critics of the movement are concerned that the kawaii attitude squelches self-assertiveness in favor of helplessness and innocence. Hello Kitty's lack of mouth, for example, may equate cuteness with an inability to speak for yourself.

In an article for Wired, science writer Mary Roach asks Hello Kitty designer Yuuko Yamaguchi about the character's missing feature. "It's hidden in the fur," Yamaguchi says.

Hmmm. See if you can spot it.

May become disfigured

Recent popularity of tiny lap dogs has caused people to breed dogs for their cuteness. So-called "toy" and "tea-cup" varieties have sprung up as breeders try to produce adult dogs that look like puppies.

These new breeds are so small and helpless that it's hard to remember they are descended from wolves. In fact, scientists say toy dogs have been bred to look so young that they resemble not simply infant wolves but fetal wolves.

The dogs' fetal qualities, like large heads and short muzzles, can even cause health problems. In some breeds, the jaw bone is too small to hold all the dog's teeth, which causes dental complications. In other breeds, the brain can simply grow too big for the skull. Talk about cute.


### Geese Aren't Always Gaggles: Animal Collective Nouns

Climbing out of your car in a crowded supermarket parking lot, you hear what sounds like a raucous cocktail party. Glancing skyward, you see the sound's origin: hundreds of migrating, low-flying Canada geese coming right at you. It's at this moment that your ability to remember seemingly pointless facts suddenly comes in handy. You recall that Canada geese defecate up to 92 times a day! Being the upstanding citizen that you are, you look to your right and warn the stranger getting out of his car, "Take cover! There's a gaggle of geese coming in fast!"

Despite your good Samaritan act, the stranger gives you a supercilious gander. He opens his mouth to speak, but the geese are almost upon you. Diving back into your car, you vaguely hear his condescending reply. "No, no, no, get your terminology correct. Geese are only called a gaggle when they're on land. That, or a flock. But when they're in the air you call them a skein, wedge, or te-" Three poo bombs to the face prevent him from finishing his comment. Apparently migrating geese don't like grammar police, either.

There's a time, place, and manner for discussing the correct use of collective nouns, and the earlier fictitious situation wasn't one of those times. Now, however, is a good time.

The earliest use of collective nouns stems from the Book of St. Albans, published in 1486. The work contained three essays on hawking, hunting, and heraldry. This is why almost all collective nouns refer to animals. The book was so wildly popular that it was reprinted in the 16th century in dozens of different editions. It's perhaps for this reason that the hundreds of terms coined in the book still persist in Oxford Encyclopedias today.

Ever heard of a "shrewdness" of apes? A "flange" of baboons? You should be calling many bears a "sleuth," and bees a "grist." You can call a group of asses a "coffle," but only when they're in a roped line. It's a "bellowing" of bullfinches; a "pounce" of cats; a "rangale" of deer; a "stand" of flamingoes; a "business" of flies; a "whoop" of gorillas; and a "gam" of whales. And don't forget that owls are a "parliament" and parrots are a "pandemonium."

A pounce of cats. (Image by speedycrabinc)

Believe it or not, these terms are supposed to make sense. According to Michael Quinion, a contributor to the Oxford English Dictionary, many of them were originally based on zoological observations:

"...an exaltation of larks is a poetic comment on the climb of the skylark high into the sky while uttering its twittering song; a murmuration of starlings is a muted way to describe the chattering of a group of those birds as they come into roost each evening; ...a spring of teal is an apt description of the way they bound from their nests when disturbed."

Animal collective nouns: they're descriptive, clever-sounding, and slightly based in science. I say it's high time they made a comeback in everyday use! If you agree, Wikipedia has a terrific list to study. Just remember to use this new-found vocabulary responsibly and avoid turning into the grammar police.

### No More Mercury in Science Class? Totally Lame.

The school was deserted. There was no teaching nor hand raising, no recess nor laughter. Instead, EPA officials scoured the grounds in haz-mat suits.

In 2005, a mercury vapor scare forced South Granville High School in North Carolina to shut down for over a week. School officials took this drastic measure after two students stole liquid mercury used for a science class demonstration. After the heist, the students passed it around to their friends, who may have been awed by its silvery luster and its unbelievably high density. The students then brought the mercury home and continued to toy with the toxic substance. Their homes later had to be evacuated.

A similar situation occurred in Washington, D.C. two years prior to the Granville incident. Except that school was shut down for over a month! Similarly in 2009, students at a high school in Avondale, Arizona nabbed some mercury from science class. Their school was shut down for a week, and two of the students' families had to be relocated from their contaminated homes.

For decades, mercury, also known as quicksilver, has been synonymous with science class. As the only metal that is liquid at room temperature, it makes for a thoroughly entertaining exhibition. Lead can float in a pool of mercury. (Whoa!) And a mere two tablespoons of the element weigh in at nearly one pound. (Cool!)
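That back-of-the-envelope claim is easy to check. The sketch below multiplies mercury's density by the volume of two US tablespoons; the density and tablespoon figures are standard reference values, not numbers from the original article.

```python
# Sanity-check the claim that two tablespoons of mercury weigh nearly a pound.
# Density and volume below are standard reference values (an assumption here).
MERCURY_DENSITY_G_PER_ML = 13.53   # g/mL at room temperature
TABLESPOON_ML = 14.79              # one US tablespoon, in mL
GRAMS_PER_POUND = 453.59

mass_g = 2 * TABLESPOON_ML * MERCURY_DENSITY_G_PER_ML
mass_lb = mass_g / GRAMS_PER_POUND
print(f"Two tablespoons of mercury: {mass_g:.0f} g, or {mass_lb:.2f} lb")
```

That works out to roughly 0.88 pounds, which is indeed "nearly one pound."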

Liquid mercury. (Bionerd, Wikimedia Commons)

In his 1950s fourth-grade science class, my father was told to hold out his hand. The teacher would then pour mercury into his cupped palm. Mesmerized, he would swish the metal around and watch it trace the creases of his palm. But that was then.

This is now. The dangers of mercury are well known. Cognitive impairment, memory lapses, lung damage, and death are a few of the alarming hazards. Chronic exposure to mercury causes fatigue and leads to birth defects.

In light of these dangers, it appears that mercury demonstrations in high school science class may be on the way out. In Minnesota, it already is. The Minnesota Legislature passed a law in 2007 banning mercury in elementary and secondary schools.

I must admit, I do mourn for mercury's almost inevitable removal from science class. It was one of the more memorable demonstrations from high school chemistry. But considering mercury's dangers and our society's propensity towards suing school districts, the move is understandable. Luckily there are plenty of YouTube videos out there to pick up where chemistry class will leave off.

### Strange Attractors

Taking the V-day bait, let's ask, "How are geeks spending their supposedly solitary Valentine's Days?" If they are into all that sweet mathematical physics action, they may be envisioning attractors. Like a mathematical pin-up, just seeing a snapshot of the fractal spirals of a strange attractor can keep a physicist up late at night.

There are two types of patterns inherent to the strange attractor. The first is called a fractal. When you zoom in on a fractal, you find smaller and smaller patterns that look just like the larger pattern. No matter how far in you go, a fractal never resolves into a smooth line, but always into a pattern of shapes that mirror the big shape.

Mandelbrot fractal. Each circle branching off to the left is a complete recreation of the larger one to its right
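Astonishingly, that endless self-similarity comes from a very simple rule. As a rough illustration (not tied to any particular source), the sketch below uses the classic escape-time algorithm: repeatedly square a complex number and add the starting point, then count how long the result takes to fly off toward infinity.

```python
# Escape-time sketch of the Mandelbrot set: iterate z -> z*z + c and count
# how long each starting point c takes to escape. Points that never escape
# (within max_iter steps) belong to the set.
def escape_count(c: complex, max_iter: int = 50) -> int:
    z = 0
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:       # once |z| > 2, the orbit is guaranteed to diverge
            return n
    return max_iter          # treated as "inside" the set

# Crude ASCII rendering of the familiar cardioid-and-bulb shape
for im in [i / 10 for i in range(-10, 11)]:
    row = ""
    for re in [r / 20 for r in range(-40, 11)]:
        row += "*" if escape_count(complex(re, im)) == 50 else " "
    print(row)
```

Zooming in is just a matter of sampling a smaller rectangle of the complex plane more finely; the same miniature buds keep appearing at every scale.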

Spirals require somewhat less description: they are simply patterns that start from some point and circle around that point, constantly moving further away.

Where are these strange patterns found to exist? The answer is, literally, everywhere. You just have to look through some strange glasses; these patterns occur when you look at the world from a different perspective.

When you walk your dog, you worry about where the dog is in terms of "how far in front of me, how far to the left and right, and how high off the ground." ("And is there a car at that spot?") The strange patterns appear when you instead look at something like how fast the dog is walking compared to how far away it is from you.

What do they actually mean? To the mathematically minded, strange attractors are "a type of pattern in the geometry of manifolds in the phase space of chaotic dynamical systems."

For the rest of us, a fractal pattern arises from situations that are chaotic. Chaos, to a physicist, means that even a good model of reality cannot accurately predict what happens, because the outcome of an experiment differs wildly with minuscule, unmeasurable changes in the starting conditions.

This is known popularly as the "butterfly effect," the idea that the flapping of a butterfly's wings on one side of the earth may noticeably affect the wind patterns on the other side. Another example might be the behavior of the stock market. The buying or selling of one share at one moment can trigger completely unpredictable swings, booms and crashes that would not happen if the trade were made a second later. Two plumes of smoke never form anything close to an identical pattern, regardless of how similar the stuff burning is.
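The butterfly effect is easy to demonstrate numerically. The sketch below integrates the Lorenz system -- the original "strange attractor" -- with a crude Euler method (the parameter values are the standard textbook choices, an illustrative assumption rather than anything from the article) and shows that a one-billionth nudge to the starting point eventually produces a completely different trajectory.

```python
# Sensitive dependence on initial conditions, illustrated with the Lorenz
# system. Two trajectories that start a hair's breadth apart end up in
# completely different places on the attractor.

def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One crude Euler step of the Lorenz equations."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def trajectory(start, steps=10000):
    """Follow a starting point for `steps` Euler steps (50 time units)."""
    state = start
    for _ in range(steps):
        state = lorenz_step(*state)
    return state

a = trajectory((1.0, 1.0, 1.0))
b = trajectory((1.0, 1.0, 1.0 + 1e-9))   # the "butterfly": a billionth of a nudge

print("trajectory a ends at:", a)
print("trajectory b ends at:", b)
```

Both trajectories stay on the same butterfly-shaped attractor, yet their endpoints bear no resemblance to each other: the tiny initial difference is amplified exponentially, which is exactly why long-range weather forecasts fail.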

### Bug Love: Searching for Romance in Class Insecta

In the 1998 animated movie, Antz, the protagonist, "Z," a worker ant, falls in love with Bala, the princess of the colony. Though only a meager worker, Z eventually triumphs through numerous trials, rescues the entire colony from certain doom, and marries Princess Bala. How romantic!

But I wonder, does this kind of endearment exist in the real-life Class Insecta?

"Two dragonflies form a heart shape as they mate." Chelyapin Dmitry / Caters News

Let's first look at the Japanese red bug. After mating, the female immediately lays her eggs within fifteen to fifty feet of the tree S. jasminodora, which produces small, fleshy fruits called drupes. Once her eggs hatch, the devoted mother will tirelessly trek along the forest floor looking for drupes to carry back to her ravenous young. But, if she doesn't return fast enough or provide enough nourishment, her children will ditch the burrow in search of a better mother. Ouch. Talk about tough love!

Okay, so the Japanese red bug was sort of a bust -- but wait -- maybe we'll have better luck finding insect affection with Darwin's Beetle. Known for their massive jaws, male bugs climb to the top of great trees in search of females high above in the lofty upper branches. Along the way, males engage in macho duels against rivals. The winner continues climbing on his quest for romance. The loser gets thrown out of the tree.

After many thrilling "battle royale" bouts, a champion finally rises high enough to claim his lovely prize. Awwww. It's like when the prince in the fairy tale finally overcomes harrowing tribulations to rescue the princess.

The two beetles appear amorous as they approach each other, but then it all goes wrong! The fornicating is passionless (though I must admit that bug sex is hard to judge), and afterwards, the male often picks up the female and tosses her off the branch. Well... I guess nothing says "I love you" like throwing your mate out of a tree.

Letting her down gently. (See the amazing video from Discovery's Life here!)

But don't fret, romance isn't lost on all insects. During and after mating, adult pairs of the aptly named Lovebug remain coupled for up to several days. Considering that these insects only live for four to five days, that sounds like the equivalent of an enduring, beautiful relationship.

Happy Valentine's Day!

### Six Romantic Ideas That Are Shot Down by Science

It's the season for cupids, dinner reservations, and teeth-breaking candy hearts. Everywhere I look I see couples holding hands, whispering secrets, and giving each other piggy-back rides. I can't even turn on the TV or radio without being bombarded with sappy commercials.

Now, I'm not bitter, but I think that a few especially nauseating couples could use a good smack upside the head when it comes to romance. And what better way to give a good smack than with science?

1. Naming a star is a unique gift

Naming or buying a star has become a popular way to show your feelings for someone. Basically, you give a company money, and they give you the coordinates of a star and a pretty certificate with the name you chose printed on it. Oh, and they promise they will never let anyone else name "your" star.

The gift has become so popular that the International Astronomical Union (IAU) has found it necessary to address the issue on their official website. The IAU's hilariously cynical statement says that people can waste their money to name stars whatever the heck they want, but the names will never have any validity in the scientific community. A star's official name is simply a catalog number and coordinates, and that's how it's gonna stay.

2. Flower bouquets are thoughtful

Okay so the gesture of giving someone flowers may be thoughtful, but the fact that you're also giving them a face-full of chemicals probably isn't. The flower industry is different from the food industry in that there aren't strict guidelines about the amount of pesticides flower growers are allowed to use. Therefore the growers use large amounts of chemicals to keep the flowers looking pretty. While a whiff of chemically-dosed roses probably won't hurt you, the chemicals are not good for the environment where the flowers are grown or for the workers who tend them.

Also, most of the U.S.'s cut flowers are grown in warmer regions such as South America and Africa. Then, to meet the demands of ardent lovers, the flowers must be shipped long distances, which releases tons of carbon dioxide emissions. While the recipients of the flowers may be pleased, I can't imagine that the atmosphere feels very cherished at all.

3. Chocolate is an aphrodisiac

While chocolate is ridiculously delicious, it probably does not stimulate sexual arousal. It's true that chocolate contains a couple chemicals, tryptophan and phenylethylamine, that could feasibly contribute to arousal. However, the amounts in chocolate are so small that it's very unlikely they have much effect. Research backs up this hypothesis. In one study, women who ate larger daily doses of chocolate did not report a greater rate of sexual arousal than the women who ate smaller amounts.

4. Oysters are an aphrodisiac

Slurping down oysters probably won't help you with your sex drive either. Oysters contain a lot of zinc, which is involved in testosterone production, but the effects of zinc, if there are any, would not be immediate. Oysters also contain dopamine, a neurotransmitter involved in the brain's reward pathways. Here again, there's probably not enough to have any realistic effect on arousal.

But don't despair! After surveying hundreds of studies about edible aphrodisiacs, one research team found that ginseng and saffron are two that actually work.

While chocolate and oysters don't have many chemical properties that contribute to arousal, their texture may be enough to do the trick. The oh-so-velvety feeling of chocolate and the slippery sensation of oysters in your mouth may contribute to arousal, giving you the illusion that these foods are aphrodisiacs.

5. Driving fast impresses girls

Some people have been smart enough to figure out that if you give a person a thrill, she is more likely to fall for you. However, I'm sorry to say that the reason she's burying her head in your shoulder and clutching your arm isn't because she likes you--it's because she's legitimately scared.

The trick, however, is that she has a hard time telling the difference between her usual symptoms of attraction (fast heartbeat, sweaty palms, etc.) and the symptoms of fear generated by the scary experience. She might think that she likes you, but really she's just frightened. This phenomenon is called misattribution of arousal.

Misattribution of arousal can work with speedy cars, scary movies, rollercoasters, and even coffee. And it definitely doesn't just apply to women. The concept was first discovered by observing whether men who walked across a stationary bridge or men who braved a swaying bridge were more likely to ask out a girl waiting at the end. Guess which group of men fell for the bait.

6. Doves are accurate symbols of romance

Traditionally, doves are considered symbols of love and romance because they mate for life and have affectionate courtship behavior. Pigeons, on the other hand, are symbols of excrement and stupidity because they smear droppings all over public places and bob their necks when they walk.

Well guess what: doves are pigeons.

Yep. According to the American Dove Association, these terms are synonymous. In fact, the "doves" released at weddings are usually homing pigeons so that they can be released outside and find their way home. That's right: the beautiful white birds you ceremoniously released on your Big Day are the same little buggers that defile historic monuments. How romantic.

### Are We Entering an Epigenetic Spiral of Obesity?

The facts are telling.

Thirty-four percent of American adults aged twenty years or older are obese. In 1996, no state had an obesity rate above 20%. In 2010, no state had an obesity rate below 20%. There's no doubt that the American public's collective girth is growing, and growing fast.

How and why is this happening? Theories abound, some incredibly convincing, but nobody can precisely pinpoint the cause of our portly plight. And the people who say they have the definitive answer are almost always selling something.

Over the past three decades, we've seen a convergence of situations that is driving obesity sky high. Computers have shifted the workforce out of manufacturing lines and into desk chairs. New technologies have transformed our hobbies from the physical to the virtual. Food -- and fast food -- is readily available. The consumption of fast food quadrupled between 1977 and 1995, and agriculture subsidies mean that food is cheaper and more abundant than ever before. This is just a smattering of causes; I'm sure you can add more.

Unfortunately, many of these trends aren't likely to change anytime soon, potentially bringing another factor into play: genetics. The relatively new fields of epigenetics and nutrigenomics are showing that changes in gene expression can be produced by environmental mechanisms. Could rising levels of obesity alter our genes, and, in turn, could these obesity-favoring alterations be passed on to future generations?

Recent animal research shows that it's certainly possible. A 2009 study from the University of Pennsylvania linked a mother rodent's diet-induced obesity to offspring adiposity, risk of cardiovascular disease and impaired glucose metabolism. This research was reaffirmed in 2010 when a study published in Nature showed that obesity can alter gene expression in lab rats, and these changes can then be passed on to progeny. Descendants of rat parents fed an obesity-triggering diet were born with impaired insulin production.

An obese mouse. (Image courtesy University of Tennessee)

But could this same situation happen in humans? As our society rapidly becomes more obese via lifestyle and dietary changes, will we enter a genetic spiral of obesity? Let's turn to Randy Jirtle, director of the Laboratory of Epigenetics and Imprinting at Duke University, for an answer:

"There is already evidence that epigenetic transgenerational inheritance can also occur in humans in response to food supply and smoking. Nevertheless, until the epigenetically changeable targets in humans are defined, it will not be possible to determine if such associations are directly mediated by epigenetic changes..."
So, an epigenetic spiral of obesity via transgenerational inheritance is possible, but at this time, there's simply not enough evidence to support it. However, one thing is for certain. If current obesity trends continue, there will be plenty of opportunities to study this theory.

### The Science of Alcohol: Beware the 'Beer Blanket'

The "beer blanket." The "vodka veil." The "moonshine mantle." The "shroud of spirits." If you've ever had a nip o' the "good stuff" then you have undoubtedly felt the warm, fuzzy feeling that swathes you at each inebriating indulgence. It's this feeling that makes you feel invincible to cold and allows you to attempt brazen acts of tomfoolery on even the most frigid of nights. You might have your own clever name for this phenomenon, but whatever you call it, to most, it remains an enigma.

In order to remove this shroud of mystery, we must first settle on a name for it. For the purpose of clarity, this post will use "beer blanket" as the official terminology, because it's actually in the (urban) dictionary. Now, on to the science.

First off, alcohol intake does NOT actually warm you up. There is no bodily heat-producing reaction catalyzed by a sip of blackberry brandy.

In reality, what's happening is this: Alcohol causes your blood vessels to dilate, which shifts blood flow to the surface of the skin. This warm blood circulates past millions and millions of tiny cutaneous receptors found within the epidermis. Specifically, the skin's thermoreceptors sense the blood's warmth and transmit this information via the somatosensory system to your brain, making you feel like you're cradled in a cozy cocoon.

Sorry, Mr. Saint Bernard. Your delivery doesn't actually keep me warm.

So, the "beer blanket" is literally caused by heat rushing to the borders of your body, forming a seemingly protective barrier against freezing temperatures. But beware: this "heated shell" does not make you impervious to cold; quite the contrary, in fact. When warm blood flows close to the surface of the skin, its heat is more easily sapped by the cold air outside. This vastly increases the risk of hypothermia. In fact, alcohol was partially implicated in many of the recent deaths caused by a brutal cold spell in Ukraine.

There are two lessons to be learned here. Number one, drink responsibly. And number two, beware the false warmth of the "beer blanket." A brief snowball fight under its thermal guise is one thing, but skinny-dipping in the dead of winter -- well -- that's another.

### The Potential of Thought-Controlled Computing

Thought-controlled computing sounds right out of science fiction, and it's exactly what you think it is: an emerging technology that allows you to directly control a computer through thought.

Speaking at TEDxToronto in September 2011, Ariel Garten shed some light on thought-controlled computing. Her company, InteraXon, has been working extensively with the technology since 2007.

Thought-controlled computing functions by first utilizing an electroencephalograph (EEG) to read the electrical activity produced by firing neurons within the brain. Software then separates the readings by frequency into alpha, beta, gamma and theta waves. InteraXon's system is designed to respond to alpha waves (correlated with relaxation) and beta waves (correlated with focus and attentiveness). By wiring the system into pieces of technology -- an iPad or light switch, for instance -- users can turn a light on or off or play a video game simply by focusing or relaxing. According to Garten:

In the first phase of development we were really enthused by all the things we could control with our mind. We were making things activate, light up and work just by thinking. We brought to life a vast array of prototypes and products... like thought-controlled home appliances or slot car games or video games or a levitating chair.
Thought-controlled levitation.
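The frequency separation that makes all this possible can be sketched with a Fourier transform: sum the spectral power that falls inside each conventional band. The band edges, the synthetic "relaxed" signal, and the function names below are illustrative assumptions, not InteraXon's actual pipeline.

```python
import numpy as np

# Conventional (approximate) EEG band edges in Hz -- an assumption for
# illustration, not InteraXon's actual processing.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, sample_rate):
    """Split a sampled signal into total spectral power per frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    power = np.abs(np.fft.rfft(signal)) ** 2
    return {name: power[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}

# Synthetic "relaxed" signal: a strong 10 Hz (alpha) oscillation plus noise
np.random.seed(0)
rate = 256                           # samples per second
t = np.arange(0, 4, 1.0 / rate)      # four seconds of data
signal = np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.randn(len(t))

powers = band_powers(signal, rate)
print("dominant band:", max(powers, key=powers.get))
```

A system like the one Garten describes would then map the dominant band to an action: lots of alpha power means the user is relaxed, lots of beta power means they are focused.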

But Garten believes that there's much more to thought-controlled computing than just operating other pieces of technology. She believes that we can use it to better know ourselves.

With humanized technology we can monitor the quality of our sleep cycles. When our productivity starts to slacken, we can go back to that data and see how we can make more effective balance between work and play. Do you know what causes fatigue in you or what brings out your energetic self, what triggers cause you to be depressed or what fun things are going to bring you out of that funk? Imagine if you had access to data that allowed you to rank on a scale of overall happiness which people in your life made you the happiest, or what activities brought you joy. Would you make more time for those people? Would you prioritize? Would you get a divorce?
Garten envisions a world where technology serves as a vital tool for introspection. She wants to develop brain scanners and interpretive software that can show us what's really going on inside our minds.

It's an intriguing notion to be sure, but one that thus far leaves me unconvinced and with questions galore. Will a piece of technology really help you get in touch with your inner self? How in-depth can a brain scanner really get? Can the complexity of human life and human consciousness really be broken down into ones and zeroes? And honestly, do you really want a computer to decipher your brain waves and tell you how stressed you are?

Don't get me wrong; I think that thought-controlled computing has a promising future. I'm just not sure of its potential as a digital shrink.

### Why Do We Philosophize?

"Anyone can call themselves a philosopher and announce any claim as a philosophical truth, and no one can say that the author is not a philosopher nor the claim is not a philosophical truth; [therefore] philosophy is useless."
-Philip Atkinson

The relationship between philosophy and science is often framed as a "chicken or the egg" situation. Which came first?

Well, it appears that philosophy came first, with natural philosophy acting as the precursor for science. This notion is also evidenced by the fact that a lot of early philosophers dabbled in science. Fast-forwarding to today, philosophers often pontificate on science, but not a lot of scientists study philosophy.

Over the years, plenty of philosophers have tackled the philosophy of science. For example, they've attempted to delineate the border between science and non-science. In addition, they've dissented on how science should be interpreted. Scientific realists claim that we ought to consider scientific theories as true, approximately true or likely true. Conversely, scientific antirealists argue that scientific theories should only be regarded as useful, but not necessarily true.

To me, this philosophizing seems to be naught but aimless drivel, and historically, many in the scientific community have privately or openly agreed. Prominent physicist Richard Feynman was quoted as saying, "Philosophy of science is about as useful to scientists as ornithology is to birds."

Despite a great many philosophers' eagerness to probe and dissect science, we have not seen a great many scientists applying the same rigorous scrutiny to philosophy. It's time for this to change by launching a new discipline: "science of philosophy."

A good way for scientists to start is by addressing a fundamental question: "Why do we philosophize?"

Scientists can begin to answer this query by examining what philosophy truly means to human beings. Since the dawn of rational thought, humans have been preoccupied with the fundamental problems of existence, language, knowledge and values. Is this philosophical search an essential trait of humanity? Psychologists can conduct preliminary research on this topic through surveys of prospective college philosophy majors.

Neurologists also have a role to play in the science of philosophy through examining hormonal responses and brain activity catalyzed by considering philosophical questions. This would undoubtedly yield fascinating results!

One brain I'd surely love to see analyzed is the one belonging to the professor from my Philosophy of Religion class at the University of Wisconsin - Madison. His head-down ramblings, conducted entirely without any form of notes or visual aids, scythed across the continuum of religion and philosophy. And he somehow found a way to relate such topics as Creationism and Atheism back to the Boston Celtics. The man radiated an aura of mystical brilliance that -- though it regularly put me to sleep -- was utterly amazing to behold.

I'd love to know why he -- and the rest of us -- philosophize.

### How to Think with Your Hands

Our hands help us do lots of physical things: weed gardens, button buttons, and pick noses. However, it turns out that our hands can also help us think.

According to a recent study, explaining a task using hand gestures may be even more effective than practicing it. To me it makes sense that practicing a task will help me get better at it. It also makes sense that if I were to explain to someone how to do a task using gestures, I might also get better at it. But can gesturing how to do a task actually make me better at it than practicing the task itself?

Gesture is a unique case because, although it is an action, it does not have a direct effect on the world the way other actions usually do. Gesture has its effect by representing ideas. We have argued here that actions whose primary function is to represent ideas--that is, gestures--can influence thinking, perhaps even more powerfully than actions whose function is to affect the world more directly.
Another study used an entirely different angle to observe the impact of gesture on thought. This study's somewhat whimsical goal was to see if acting out metaphors about creativity could produce creativity itself. For one of the experiments, the researchers fashioned a box and then tested to see if subjects thought more creatively when sitting inside or outside it. They were literally "thinking outside the box." Get it?

A similar experiment from the same study required subjects to generate creative solutions to a problem by holding up a hand. Some subjects were asked to hold up their right hand and then their left hand while others only ever held up their right hand. The study's results showed that subjects who were asked to look at the issue "on one hand, then on the other hand" came up with more unique answers to the problem.

Right- or left-handedness is also related to the way you think and how your brain is set up. For instance, most right-handed people have the majority of their language activity in the left side of their brain. However, left-handed people can have their language-related area in the left or right side of the brain (or both).

Also, left-handed people usually have a bigger corpus callosum, the structure that bridges the brain's two hemispheres. This feature may provide left-handed people with more interhemispheric connectivity, which may be related to a higher incidence of left-handedness among gifted children (those who have an IQ over 132) compared to non-gifted children.

Whether you are left-handed or right-handed, your hands work hard in more ways than you may expect. Sometimes it's important to let them cut loose and have a little fun.

### Without NASA, Future of Space Coast Up in the Air

Last week, spectators from across the country watched with mild incredulity as Newt Gingrich spoke to Floridians about his vision for a permanent lunar colony. Upon hearing this, many of us snickered from our couches at home, but at the Holiday Inn in Cocoa, Florida, where Newt was holding his rally, the message struck home. His statement, boldly going where few politicians have gone before, lifted many in attendance off their seats and drove hundreds to cheers.

Cocoa, Florida sits nestled on the storied Space Coast, and the town is really hurting right now. Brevard County, which contains Cocoa and most of the Space Coast, lost an estimated 13,000 jobs with the demise of the space shuttle program last July. Cocoa pool halls, restaurants and bars are now empty, where once they were packed with NASA scientists, engineers and contract workers looking to relax after a hard day's work.

Roughly fifteen miles to the north of Cocoa, Titusville, Florida sits directly across the narrow bay from the Kennedy Space Center, and it's also hurting. The city of 44,510, which residents have proudly dubbed "Space City," sported a 13.8% unemployment rate before the NASA layoffs struck. In addition, many of its businesses depended upon space tourism, which would skyrocket around every NASA launch. With the loss of the space shuttle program, the city is now grounded in a painful economic reality.

Residents of the Space Coast watch a shuttle launch. (AP Photo)

Titusville's story took center stage in a short documentary by Ride5 Films entitled Welcome to Titusville. The film primarily focused on residents' memories of the 30-year space shuttle program. Watching the documentary, you could really feel the pride they took in their connection to NASA and the celestial, and their sense of loss when that connection was severed last July. The film struck a somewhat somber tone and evoked memories of Flint, Michigan after the closure of its General Motors plant.

"I don't know how to excite kids about growing algae... It's not the same as flying in space like Buck Rogers," resident Brenda Mulberry, owner of Spaceshirts.com, said in the film.

Despite the town's current setback, "Welcome to Titusville" showed that some residents are hopeful. Observed Mark Conklin, "We'll see the next space program begin out here. We have survived Apollo. We have survived the Challenger incident. We spring back surprisingly well."

Titusville's website proudly proclaims that the city is the "Gateway to Nature and Space." For the moment, this gateway is closed, but hopefully the day will come when it reopens. Perhaps when we start sending settlers to the Moon.

### Thomas Jefferson: Founding Father of Science

Newton follows up Monday's piece on Benjamin Franklin with a look at how Thomas Jefferson was a founding father not only of America, but also of American science.

Thomas Jefferson was a pioneer. Of this you are undoubtedly aware. He played a pivotal role in crafting the Declaration of Independence. He helped mold and nurture our political system when it was young and susceptible to corruption. He struck the greatest bargain of all time when he arranged for the purchase of the Louisiana Territory from France.

What you might not know about Jefferson was that he was a "statesman of science." For over two decades -- including the time when he was President of the United States -- Jefferson served as president of the American Philosophical Society, the preeminent scientific foundation at the time (and coincidentally founded by Benjamin Franklin). In this dual role, he effectively served as America's scientific leader, as well as its political leader.

"Nature intended me for the tranquil pursuits of science, by rendering them my supreme delight..." Jefferson said in 1809.

As this quote makes plain, Jefferson was supremely fascinated by science. His hobbies included -- among many others -- paleontology, archaeology and agriculture, each a subject to which he made noted contributions. In addition, he absolutely adored mathematics. "When I was young, mathematics was the passion of my life," he said.

In government, Jefferson promoted science on a national scale. He recommended that Congress commission a survey to accurately chart the coast of America, a project that would later evolve into the National Geodetic Survey. As secretary of state, he headed the patent office and was instrumental in laying the groundwork for patent law.

Though he invented the folding chair, the swivel chair, the polygraph (a letter-copying device, not a lie detector) and many other items that we today take for granted, Jefferson never patented any of them. Like Benjamin Franklin, he believed in the "natural right of all mankind to share useful improvements without restraint." His inventions and ideas didn't belong to him; they belonged to everyone.

Above all, Jefferson popularized the notion that the pursuit of scientific endeavors is an American ideal, one that is crucial to our country's freedom and success. "Science has liberated the ideas of those who read and reflect, and the American example has kindled feelings of right in the people," he said.

Over two centuries ago, Thomas Jefferson helped endow our country with the means to flourish in the world of science. Today, we all have our own opinions on the problems and faults of American science. But despite these shortcomings, I think that Thomas Jefferson would be proud of all that we have accomplished. To honor his instrumental contribution, we must make every effort to uphold America's lead in science and innovation.

### NASA Manned Spaceflight Slowly Dies

Just before dawn on July 21st of last year, Space Shuttle Atlantis touched down at the Kennedy Space Center in Florida. It is the same complex from which all six of the country's missions that landed men on the moon were launched. Each shuttle mission, all 135 of them, lifted off from the concrete of this complex. This successful one-chance-only landing in the dark, on the center's three-mile-long runway, was nearly the same as the many before it. There was one major difference, however: these four crew members may be the very last humans NASA ever launches into space.

News this week suggests that the Space Shuttle's planned successor, the Orion spacecraft, has had its first manned flight delayed from the scheduled date of 2016. The new estimated date? 2021. That's if everything goes according to plan. (How often does THAT ever happen?)

Whether you proclaim manned spaceflight a glorious pinnacle of human ingenuity and exploration or deride it as an impractical, expensive national show of "swag," the US government's time at the forefront is likely coming to an end. What is coming to replace it, and how are we going to ferry astronauts between Earth and the $100-billion International Space Station?

The plan was to buy seats on the famously reliable Russian Soyuz spacecraft until our next-generation ship was finished. Here, too, things failed to go according to plan, when an unmanned Soyuz mission crashed last August. The Russian Federal Space Agency suspended launches until it could determine the cause of the crash and attempt to address it. For roughly three months there was no way for anyone to go up, and no way for anyone already in orbit to make it back!

With only a tenuous lifeline to the ISS and any other human activities in space, this latest NASA delay is a major blow to all human endeavors in that realm. The Hubble Space Telescope was launched on one Shuttle mission and has been repaired and upgraded five different times by astronauts carried up on others. Many other satellites and space telescopes were likewise deployed into orbit by Shuttle crews.

Given NASA's likely failure to produce any vehicles capable of carrying people into space for the foreseeable future, we will probably soon be using private companies to get there. While these companies also miss deadlines, they are still realistically planning manned flight in two to three years. What's more, they have spacecraft that are already completely designed and built, and they have flown them to space and back.

Even if NASA ever does get its act together, it may already have been long surpassed by the next generation of space explorers and workers!