Put Down That Bucket of Ice Water. Read Lou Gehrig's Story. Learn About the Science of ALS. Then Donate.

BY NOW, THOUSANDS, perhaps millions, of Americans have already filmed themselves dumping ice water on their heads in the name of amyotrophic lateral sclerosis (ALS) -- Lou Gehrig's disease. Thousands more will follow suit. Whether or not you're a fan of the Ice Bucket Challenge -- and particularly its narcissistic nature -- you cannot deny that it's been extremely successful. As of Thursday, the ALS Association had received $41.8 million in donations between July 29 and August 21, compared with just $2.1 million during the same period last year.

The Ice Bucket Challenge has been an undeniable boon to the fight against ALS and online egos everywhere (look at all the Facebook "likes"!), but how much awareness has it truly raised? While videos of people creatively dousing themselves with cold water abound on social media, the story and the science of ALS seem either absent or drowned out.

This is an attempt to fill that void. If social pressure isn't enough to convince you to donate to ALS research, the heart-wrenching story of Lou Gehrig and the science behind the illness that bears his name should be.

THOSE ATTENDING YANKEES spring training in 1939 saw slugging first baseman Lou Gehrig, the "Iron Horse," set to return for his 17th season. Up until that point, Gehrig had been an "institution of the American League," hitting 493 home runs, batting .341, and playing in 2,122 consecutive games. But onlookers wondered how long that could continue. Gehrig was now 35, and his prior season had been a bit off his usual pace. He hit only .295, an impressive mark by most standards, but squarely subpar for Lou. Yankees fans hoped that Gehrig would get back on track in 1939.

However, as spring training pressed on, it was clear something was amiss. Sports writers picked up on it.

"They watch him at the bat and note he isn't hitting the ball well. They watch him round the bag and it's plain he isn't getting the balls he used to get. They watch him run and they fancy they can hear his bones creak and his lungs wheeze as he lumbers around the bases," the New York World Telegram's Joe Williams wrote.

"On eyewitness testimony alone, the verdict must be that of a battle-scarred veteran falling apart."

A rare few, like the New York Sun's James Kahn, were more perceptive.

"I think there is something wrong with him. Physically wrong, I mean. I don't know what it is, but I am satisfied that it goes far beyond his ball-playing. I have seen ballplayers 'go' overnight, as Gehrig seems to have done. But they were simply washed up as ballplayers. It's something deeper than that in this case, though. I have watched him very closely and this is what I have seen: I have seen him time a ball perfectly, swing on it as hard as he can, meet it squarely — and drive a soft, looping fly over the infield. In other words, for some reason that I do not know, his old power isn't there... He is meeting the ball, time after time, and it isn't going anywhere."

Things didn't improve when the season began. One day at batting practice, Yankees teammate Joe DiMaggio watched as Gehrig swung at and missed ten "fat" pitches in a row. Eight games in, right before a game against the Detroit Tigers, and despite the protests of his teammates and manager, Gehrig benched himself "for the good of the team." Everyone, even the stadium announcer for the Tigers, was shocked. "Ladies and gentlemen," he announced, "this is the first time Lou Gehrig's name will not appear on the Yankee lineup in 2,130 consecutive games." Gehrig received a standing ovation from the Detroit fans. Tears glistened in his eyes.

A month later, Gehrig visited the Mayo Clinic in Rochester, Minnesota. The six-day visit produced the following diagnosis from Dr. Harold C. Habein:

"After a careful and complete examination, it was found that he is suffering from amyotrophic lateral sclerosis. This type of illness involves the motor pathways and cells of the central nervous system. The nature of this trouble makes it such that Mr. Gehrig will be unable to continue his active participation as a baseball player."

Gehrig was also informed that the disease was incurable, and that he likely did not have long to live. Despite the earth-shattering diagnosis, he remained optimistic.

"The road may come to an end here," he wrote his wife. "Seems like our backs are to the wall. But there usually comes a way out. Where and what I know not, but who can tell that it might lead right on to greater things."

On July 4, between games of a doubleheader against the Washington Senators, a ceremony was held to honor Lou Gehrig and allow him to announce his retirement. 61,808 fans watched as Yankees manager Joe McCarthy -- who had been like a father to Gehrig -- handed the outgoing slugger a trophy. They watched as Gehrig bent down with the apparent effort of a man forty years his senior to set it on the ground. They watched as Gehrig stood silently with his head slightly turned down, too moved to move. And then, they watched as Gehrig gathered himself, walked to the collection of microphones, and gave one of the greatest, most humble speeches ever delivered.

"Fans, for the past two weeks you have been reading about the bad break I got. Yet today I consider myself the luckiest man on the face of the earth...

When the New York Giants, a team you would give your right arm to beat, and vice versa, sends you a gift—that's something. When everybody down to the groundskeepers and those boys in white coats remember you with trophies—that's something. When you have a wonderful mother-in-law who takes sides with you in squabbles with her own daughter—that's something. When you have a father and a mother who work all their lives so that you can have an education and build your body—it's a blessing. When you have a wife who has been a tower of strength and shown more courage than you dreamed existed—that's the finest I know.     

So I close in saying that I might have been given a bad break, but I've got an awful lot to live for. Thank you."

Two years later, Lou Gehrig died.

ALS NOW AFFECTS more than 30,000 Americans. In those diagnosed, the motor neurons -- the cells that signal muscles to move -- suddenly and mysteriously start to degrade. As they dwindle, the muscles they formerly controlled diminish as well from underuse. Paralysis eventually sets in, but cognitive function is often spared. In this respect, ALS is the opposite of Alzheimer's: the body goes, but the mind remains. Still incurable, the disease is often fatal within five years of diagnosis. Most patients die from respiratory failure.

Though precise causes and risk factors haven't been identified, a number of genes and mutations have been linked to ALS. That means that those with a family history of the disease can get tested and receive an imperfect estimate of their risk.

In February, researchers revealed how ALS spreads from neuron to neuron. A mutant form of the enzyme SOD1 appears to cause the cells to go haywire. The researchers also found that certain antibodies can block SOD1 from being transmitted, which could potentially halt the progression of ALS. The method has yet to be tried in humans.

The ALS Ice Bucket Challenge has arrived at an "exciting time" for ALS research. With new drugs undergoing clinical trials and promising research pathways being elucidated, the money raised is sure to be put to good use. To donate, visit the website of the ALS Association.

Now you can dump that bucket of ice water on your head.

Primary Source and Images: Baseball: a Film by Ken Burns, AP

The Legendary Study That Embarrassed Wine Experts Across the Globe

A LITTLE OVER a dozen years ago, "la merde... hit le ventilateur" in the world of wine.

Nobody remembers the 2001 winner of Amorim Academy's annual competition to crown the greatest contribution to the science of wine ("a study of genetic polymorphism in the cultivated vine Vitis vinifera L. by means of microsatellite markers"), but many do recall the runner-up: a certain dissertation by Frédéric Brochet, then a PhD candidate at the University of Bordeaux II in Talence, France. His big finding lit a fire under the seats of wine snobs everywhere.

In a sneaky study, Brochet dyed a white wine red and gave it to 54 oenology (wine science) students. The supposedly expert panel overwhelmingly described the beverage like they would a red wine. They were completely fooled.

The research, later published in the journal Brain and Language, is now widely used to show why wine tasting is total BS. But more than that, the study says something fascinating about how we perceive the world around us: visual cues can effectively override our senses of taste and smell (which are, of course, pretty much the same thing).

WHEN BROCHET BEGAN his study, scientists already knew that the brain processes taste and smell cues approximately ten times more slowly than sight -- 400 milliseconds versus 40 milliseconds. It's likely that in the interest of evolutionary fitness, i.e. spotting a predator, the brain gradually developed to fast-track visual information. Brochet's research further demonstrated that, in the hierarchy of perception, vision clearly takes precedence.

Here's how the research went down. First, Brochet gave 27 male and 27 female oenology students a glass of red and a glass of white wine and asked them to describe the flavor of each. The students described the white with terms like "floral," "honey," "peach," and "lemon." The red elicited descriptions of "raspberry," "cherry," "cedar," and "chicory."

A week later, the students were invited back for another tasting session. Brochet again offered them a glass of red wine and a glass of white. But he deceived them. The two wines were actually the same white wine as before, but one was dyed with tasteless red food coloring. The white wine (W) was described similarly to how it was described in the first tasting. The white wine dyed red (RW), however, was described with the same terms commonly ascribed to a red wine.

"The wine’s color appears to provide significant sensory information, which misleads the subjects’ ability to judge flavor," Brochet wrote of the results.

"The observed phenomenon is a real perceptual illusion," he added. "The subjects smell the wine, make the conscious act of odor determination and verbalize their olfactory perception by using odor descriptors. However, the sensory and cognitive processes were mostly based on the wine color."

Brochet also noted that, in general, descriptions of smell are almost entirely based on what we see.

"The fact that there are no specific terms to describe odors supports the idea of a defective association between odor and language. Odors take the name of the objects that have these odors."

Now that's deep. Something to ponder over your next glass of Merlot, perhaps?

A FEW YEARS after publishing his now famous paper, the amiable, bespectacled, and lean Brochet turned away from the unkind, meritocratic, and bloated culture of French academia and launched a career that blended his love for science and his passion for "creating stuff."

Yep. You guessed it. He makes wine.

(Images: AP, Morrot, Brochet, and Dubourdieu)

Six Big Lessons from the Ebola Outbreak


The Ebola outbreak in West Africa, which continues to rage and has now claimed the lives of more than 1,100 people, offers some big lessons for America.

#1. For all its flaws, the American public health system is pretty good. We transported two patients infected with one of the world's deadliest viruses from the middle of a hot zone to a major metropolitan area in the United States. We did this without infecting anybody else or putting the public in danger. The two Americans were treated with a "secret" remedy (which we reported on two years ago) and are continuing to improve. One of them may actually be discharged soon.

#2. Bringing the sick Americans home was the right thing to do. On August 1, the ever-present Donald Trump tweeted: "The U.S. cannot allow EBOLA infected people back. People that go to far away places to help out are great-but must suffer the consequences!" If Ebola were as infectious as, say, measles or influenza, then Trump would be right to be concerned. If such a virus were to emerge, quarantining the patients abroad would probably be the appropriate course of action to prevent unnecessary risk to the American public. But Ebola is not that infectious. Ignorance is no excuse to stir up public anxiety, and Trump's comments were completely out of line.

#3. Biotechnology and GMOs save lives. The antibody cocktail used to treat the patients was the product of biotechnology, specifically GMOs. Mouse genes were modified to become human-like, and then they were placed inside a tobacco plant. The medicine was then extracted from the plant and given to the patients. (Read John Timmer's excellent article for the details.) Keep in mind that this is the sort of life-saving research that anti-GMO activists are fighting to prevent.

#4. Do not destroy smallpox. A few months ago, the world was once again debating whether or not to destroy the known vials of smallpox that exist at the CDC in Atlanta and at a facility in Russia. Since that debate, the Ebola outbreak has exploded, and some previously forgotten vials of smallpox have turned up in an NIH storage room. When scientists say we should keep smallpox around "just in case," these are the sorts of surprises they are talking about. Yes, there is a real risk that smallpox (or some other deadly pathogen) could escape from a laboratory. But is the world really better off if we forgo research out of fear?

#5. Americans need to pay more attention to global affairs. Separated by two vast oceans, and bordered by two friendly neighbors, we tend to be rather insular in terms of our global perspective. Unless there is a war or some other geopolitical instability that directly threatens our interests, we remain uninterested in the rest of the world. Even then, we still may not be able to find the troubled spot on a map, as 84% of Americans were unable to do with Ukraine. If merely 1 in 6 Americans can find a gigantic country bordering Russia on a map, just how few could find Liberia, Guinea, or Sierra Leone -- the center of the outbreak? In our modern, interconnected world, what happens on one side of the globe can and will affect the other side. Maybe it's time to teach more geography in school.

#6. NIH funding should be increased. The U.S. government has neglected the National Institutes of Health (NIH), more or less letting funding slide ever since 2003. As Pacific Standard reported last year, "the Obama administration's budget request for the 2014 fiscal year is $31.3 billion, more than 23 percent lower than the 2003 funding level in purchasing power." If the U.S. wants to remain globally competitive and ready to fight disease, this downward trend needs to be reversed. Maybe the Ebola outbreak will force some very much needed bipartisanship.

Will Epigenetics Be Used to Oppress Women?

Epigenetics is the next big field that the media, fearmongers, and political hacks will attempt to exploit. How do we know? Because there is a flurry of research in the field (which is not always a good sign), and journalists are already hacking away. You can find articles blaming epigenetics for obesity, cancer, personality, homosexuality, and (absurdly) how we vote.

Never mind the fact that there is serious reason to believe epigenetic changes are temporary and may not be passed down to multiple generations, particularly among mammals.

To be sure, epigenetic changes are probably linked (albeit very slightly) to some of those things. But epigenetics will turn out to be just like regular genetics: any one allele or epigenetic variation will make a person only ever so slightly more or less inclined toward a particular health outcome. That is because out of thousands or even millions of such potential variables, the impact of any one of them is usually marginal.

In other words, epigenetics is incredibly complicated. It is a field still in its infancy, and there is much left to learn. If the human genome is the black box of an aircraft, the epigenome is the black box of a UFO. Therefore, be wary of over-simplifications and sweeping pronouncements. They will almost certainly be incorrect.

Other researchers have made similar observations. A commentary in the journal Nature indicates that the biggest headline-grabbers mostly involve women, specifically mothers:

‘Mother’s diet during pregnancy alters baby’s DNA’ (BBC), ‘Grandma’s Experiences Leave a Mark on Your Genes’ (Discover), and ‘Pregnant 9/11 survivors transmitted trauma to their children’ (The Guardian). Factors such as the paternal contribution, family life and social environment receive less attention.

The authors fear that "exaggerations and over-simplifications are making scapegoats of mothers, and could even increase surveillance and regulation of pregnant women."

They have a point. And they used a particularly persuasive example to illustrate it.

Everybody knows that pregnant women shouldn't drink. But everybody is wrong. Though it is completely taboo for a pregnant woman to be seen sipping a glass of wine, research has shown that moderate amounts of alcohol probably do not harm the developing fetus. Yet the concern surrounding "fetal alcohol syndrome" -- a serious condition that can arise when an expectant mother drinks excessively -- resulted in sweeping government regulation, in some cases even making it illegal for pregnant women to drink at all.

The authors worry, perhaps rightly so, that the media hype surrounding epigenetics will once again turn its focus on mothers. Will the government once again regulate what pregnant women can eat, drink, and do? And if so, why not regulate the behavior of men, as well? Epigenetics, after all, can affect sperm quality.

The authors' conclusion provides an excellent framework for the media and policymakers to make sense of epigenetic studies:

First, avoid extrapolating from animal studies to humans without qualification. The short lifespans and large litter sizes favoured for lab studies often make animal models poor proxies for human reproduction. Second, emphasize the role of both paternal and maternal effects. This can counterbalance the tendency to pin poor outcomes on maternal behaviour. Third, convey complexity. Intrauterine exposures can raise or lower disease risk, but so too can a plethora of other intertwined genetic, lifestyle, socioeconomic and environmental factors that are poorly understood. Fourth, recognize the role of society. Many of the intrauterine stressors that DOHaD [developmental origins of health and disease] identifies as having adverse intergenerational effects correlate with social gradients of class, race and gender. This points to the need for societal changes rather than individual solutions.

This is eminently reasonable. Media and politicians, the ball is in your court.

Source: SS Richardson et al. "Don't Blame the Mothers." Nature 512:131-132. (2014)

(Images: AP Photo; Blame it on my epigenetics)

Should You Eat Organs?

The prospect of eating stomach is not something that would excite a great many Americans. But what's disgusting to us is a delicious, balanced meal to a great many animal predators. I mean, think about it.

"Stomachs are especially valuable because of what's inside them. The predator benefits from the nutrients of the plants and grains in the guts of its prey," science writer Mary Roach wrote in Gulp.

It's not just stomachs, of course. Bodily organs in general are amazingly nutritious.

"A serving of lamb spleen has almost as much vitamin C as a tangerine. Beef lung has 50 percent more," Roach wrote.

Yet thanks in part to our culturally evolved sense of disgust, most citizens of the Western world tend to turn up their noses at organs and ship them elsewhere.

"In 2009, the United States exported 438,000 tons of frozen livestock organs... Egypt and Russia are big on livers. Mexico eats our brains and lips. Our hearts belong to the Philippines."

Are we missing out? Alternative medicine proponent and supplement guru Joseph Mercola thinks so. But Mercola, who funds anti-vaccine and anti-fluoridation groups, regularly spouts woo-y mumbo jumbo. What are the facts?

Right now, Americans binge on muscle meat -- wings, legs, breasts, ribs... you get the idea. You should know, however, that muscle meat contains plenty of protein but little else. By comparison, a single 100-gram serving of liver contains roughly 800% of your daily value of vitamin A and 1,100% of vitamin B12. Most organs sport similarly copious amounts of B vitamins, and only slightly less protein than muscle meat. They also generally contain more fat and cholesterol. For example, a serving of liver has more cholesterol than an egg, while a 4-ounce serving of brain has as much as ten eggs! (Coincidentally, brains are often served scrambled.)

It's obvious that organs are more nutritionally complete, but we live in an era of overnutrition, not undernutrition. To live a healthy life, you don't need a serving of liver each and every day. In fact, that might be unhealthy. The Institute of Medicine, a part of the National Academy of Sciences, recommends consuming on average no more than 3,000 micrograms of vitamin A each day, and one serving of pork liver alone contains 6,500 micrograms! The vitamin A that we don't use is stored mostly in the liver (your own, not the one you'd be eating) and fat to be used later. So consuming more than needed over long periods can lead to a toxic build-up, known as hypervitaminosis A.
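
To see how quickly liver stacks up against that ceiling, here is a minimal back-of-the-envelope sketch in Python. The two constants come straight from the figures quoted above; the weekly serving counts are hypothetical, chosen just for illustration:

```python
# Back-of-the-envelope check of the vitamin A figures above.
# Constants are from the paragraph; serving counts are hypothetical.

UPPER_LIMIT_UG_PER_DAY = 3_000      # IOM tolerable upper intake, micrograms
PORK_LIVER_UG_PER_SERVING = 6_500   # one serving of pork liver, micrograms

def average_daily_intake(servings_per_week: int) -> float:
    """Average daily vitamin A from liver alone, in micrograms."""
    return servings_per_week * PORK_LIVER_UG_PER_SERVING / 7

for servings in (1, 3, 7):
    avg = average_daily_intake(servings)
    verdict = "over" if avg > UPPER_LIMIT_UG_PER_DAY else "within"
    print(f"{servings} serving(s)/week -> {avg:,.0f} ug/day ({verdict} the limit)")
```

By this rough math, an occasional serving stays comfortably under the limit; it's the daily liver habit that blows past it.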

Several members of a polar expedition to Franz Josef Land in the late 1800s reportedly suffered from vitamin A toxicity after consuming a stew made from the liver, heart, and kidneys of a single polar bear. (The organs of walruses, bears, seals, and moose contain especially high amounts of vitamin A.) Within four hours, the explorers who partook of the stew grew nauseous, drowsy, and suffered severe headaches. Over the next 24 hours, the skin of half of those individuals began peeling off. All eventually recovered.

Organs from livestock animals such as cows, pigs, sheep, and chickens can certainly be consumed safely, but I'd caution against making liver an everyday staple. Eating brain can also be a risky proposition. You might think it would make you smart, but it can actually transmit neurodegenerative diseases like bovine spongiform encephalopathy, more commonly known as mad cow disease.

Now that you're sufficiently informed, care for some skewered heart?

Source: Gulp: Adventures on the Alimentary Canal, Mary Roach, 2013

(Image: AP, Shutterstock)

Antiperspirants Alter Your Armpit Bacteria and Could Actually Make You Smell Worse

In modern society, antiperspirants are widely hailed as a godsend, dispelling the inconvenient odors wafting from armpits everywhere. But a new study casts doubts on their vaunted position. As it turns out, antiperspirants may actually make you smell worse in the long run.

For 90% of all Americans, slathering on deodorants and antiperspirants is a daily occurrence, a precautionary measure against foul odors and unsightly sweat stains. The odors arise when bacteria living in our armpits break down lipids and amino acids excreted in sweat into more smelly substances. Deodorants employ antimicrobial agents that kill off bacteria, as well as chemicals that replace noxious odors with pleasant aromas. Deodorants that double as antiperspirants, like Degree, Old Spice, and Dove, take the process one step further by physically plugging sweat glands with aluminum-based compounds.

While most of us might only concern ourselves with the dry, aromatic benefits of antiperspirants and deodorants, researchers at the Laboratory of Microbial Ecology and Technology at the University of Ghent in Belgium are more interested in the effects on bacteria. Billions of bacteria dwell in the "rain forests" under our arms, and the substances we apply are mucking with their habitats!

To uncover how deodorants and antiperspirants affect armpit bacteria, Chris Callewaert, a Ph.D. student specializing in microbial ecology, and a team of researchers recruited eight subjects for a task a great many people (and especially their friends) might deem unbearable: six males and two females pledged not to use deodorant or antiperspirant for an entire month. Specifically, four subjects stopped using their deodorants and another four stopped using their antiperspirant deodorant. (Most antiperspirants are also deodorants.) A ninth control subject who did not regularly use either was asked to use deodorant for a month. The duration was chosen because it takes approximately 28 days for a new layer of skin cells to form.

The researchers analyzed the diversity and abundance of subjects' armpit bacteria at various time points: before they stopped using antiperspirant, during the period of abstaining, and for a few weeks after resuming use. Switching hygiene habits plainly altered the armpit bacterial communities of every subject. Since no two armpits and their resident bacteria are identical, it was difficult to pinpoint precise changes brought about by deodorants or antiperspirants, but one trend did materialize: antiperspirant use resulted in a clear increase of Actinobacteria.

You might not recognize the name Actinobacteria, but chances are, you've smelled them. Dominated by Corynebacterium, they are the major instigators of noxious armpit odor. Other microbes that inhabit the armpit, like the Firmicutes (a group that includes Staphylococcus), don't produce odors as quickly, nor are those odors nearly as pungent.

Callewaert believes the aluminum compounds in antiperspirants may be to blame, killing off "good," less smelly bacteria and allowing "bad" bacteria to dominate. His study found that deodorants which lack these sweat-blocking antiperspirant compounds are actually linked to a slight decrease of stinky Actinobacteria. (Below: An example of the bacterial community of one subject. When using an antiperspirant, Firmicutes [diamonds] and Actinobacteria [dashes] dominate. Without antiperspirant, Firmicutes are more abundant and Actinobacteria are hardly present.)

Though antiperspirants and deodorants are widely used, they are only a temporary fix.

"The measures we utilize today do not take away the initial source: the odor causing bacteria," Callewaert told RealClearScience. "Deodorants only mask unpleasant odors. We can do better than that. The follow up of this research is finding better solutions."

And Callewaert is already working on one: "armpit bacterial transplantation."

"We take away the bad bacteria from the armpit of somebody with a body odor, and replace it with the good bacteria of a relative who doesn't have a body odor," he explained.

"So far we have helped over 15 people. For most subjects it brings immediate improvements. Most of them on a permanent time scale, although there are also people who suffer again from a body odor after some months."

For now, this approach seems rather extreme, but maybe it should be reserved as a last resort for the sort of person you can smell from the other side of the room.

The big limitation of the current study is its sample size. Just nine subjects took part.

"The sample size is rather small," Callewaert admits. "However, we see consistent outcomes."

Callewaert also says that this is the first study to specifically examine the effect of deodorant on the diversity and abundance of armpit bacteria.

"We barely know what lives in our armpits, on our clothes, in our laundry machines, and what causes all these unpleasant odors."

While the current study strongly suggests that antiperspirants can make their users smell worse via the growth of Actinobacteria, it does not directly assess body odor. As a follow up, Callewaert should recruit subjects to use antiperspirants and utilize methods like gas chromatography to directly measure the amount of the stinky volatile organic compounds emanating from their armpits. Professional smellers could also be put to work... at their own peril.

Source: Chris Callewaert, Prawira Hutapea, Tom Van de Wiele, Nico Boon. "Deodorants and antiperspirants affect the axillary bacterial community." Arch Dermatol Res. 2014 Jul 31.

(Images: Shutterstock, Callewaert et al.)

NASA's Impossible Space Engine Is Total BS

Last week NASA released results of a test on a new space engine design. It seems to produce thrust without burning fuel. Is the impossible science fiction of the future now possible?

A simple picture of the proposed idea reveals its fundamental absurdity. Grab a friend and hop into a car with a back seat (not like that). You push on the windshield repeatedly, while they push on the rear window just after every time you push on the front. Now, does the car roll forward? NASA is claiming that it might.

Peer a little deeper into the story and a forest of red flags starts to appear. The design of the engine is disarmingly simple, but it makes no conceptual sense. There are lots of equations to confuse the average reader. NASA didn't actually build the engine. It was delivered fully built, based on a design that physicists have criticized, refuted, discredited, and described as a fraud.

NASA's test? Conducted by a guy who believes in warp drives. The second test of validity? A study published by an unknown researcher in a fourth-rate academic journal and never cited by anyone else.

Even worse, the control engine that was supposed to produce no thrust produced the same thrust as the test engine. When a "null" experimental control doesn't produce a null result... well, that's very bad.

Looks like NASA got duped.

According to Roger Shawyer and Guido Fetta, the respective peddlers of the EmDrive and the Cannae drive, the engines work by bouncing microwaves back and forth within a metal container. The claim is that by making one side of the container bigger than the other, more thrust is deposited on that wall. Face that wall to the back and the engine pushes forward. What's the flaw?

The momentum carried by a wave doesn't change as it bounces back and forth, except for dissipation. Translation: the microwaves push on the front of the engine just as hard as they push on the back. You cannot get net thrust out of a closed system that expels nothing; conservation of momentum forbids it. In reality, the net force is zero, producing zero thrust.
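
To put that objection in symbols, here is a minimal momentum-balance sketch for an idealized closed cavity with perfectly reflecting walls (the idealization is mine, for clarity):

```latex
% Radiation pressure on a perfect reflector: reflected power P
% transfers momentum at a rate 2P/c to the wall it strikes.
\[
  F_{\text{front}} = +\frac{2P}{c}, \qquad
  F_{\text{back}} = -\frac{2P}{c}, \qquad
  F_{\text{net}} = F_{\text{front}} + F_{\text{back}} = 0
\]
```

An asymmetric cavity only changes where the momentum lands (some of it on the slanted side walls); conservation of momentum guarantees the forces on all interior walls still sum to zero.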

Once the basic idea is busted, the creators resort to true BS. First, they claimed relativistic electrodynamics explained the device's power. Then they switched tacks and claimed that vacuum quantum energy is the key. Physicists have judged these explanations to be somewhere between implausible and nonsensical. What other extraordinary evidence can we look for to support this extraordinary claim?

How about the only other test of the engine? Great scientific work is not always published in top journals, and sometimes fraudulent work is. However, better work from better researchers generally tends to be published in a select few well-respected journals. This academic test was not published in one of the 10 best publications in physics, nor one of the 50 best, nor even one of the 500 best. It was published in a journal ranked 688th in the field, a place where weak research findings go to quietly die. The journal isn't even published in English, the universal language of science.

Finally, common sense can be a last check: the smell test. Extracting free energy with no loss of fuel? Does this sound plausible? I leave that to you. Personally, I'd bet my salary against it. Not that that's much money, mind you.

Technically, Earth Does Not Orbit Around the Sun

The discovery that Earth revolves around the Sun was revolutionary. It fundamentally changed how we viewed the cosmos, as well as ourselves.

But the Earth does not revolve around the Sun. At least, not exactly. Time to get pedantic.

"Technically, what is going on is that the Earth, Sun and all the planets are orbiting around the center of mass of the solar system," writes Cathy Jordan, a Cornell University Ask an Astronomer contributor.

"The center of mass of our solar system very close to the Sun itself, but not exactly at the Sun's center."

Every single object in the solar system, from the gargantuan Sun to the tiniest speck, exerts a gravitational pull on everything else. The solar system is basically a massive game of tug of war, and all of the yanking balances out at a specific point: the center of mass, or "barycenter." Everything in the solar system orbits around that point. Sometimes, it's almost smack dab at the Sun's center. Right now, the barycenter is just outside the Sun's surface. But it's constantly changing depending upon where the planets are in their orbital paths.


Because the Sun holds 99.87% of all the mass in the solar system, it's always going to win the tug of war. Even if all the planets were perfectly lined up on one side of the Sun, the center of mass would be just 800,000 kilometers off the surface of the Sun. That sounds like a lot, but remember, our solar system is big! Such a barycenter would be roughly 70 times closer to the Sun than the closest planet to the Sun, Mercury.
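
That claim follows from the standard two-body center-of-mass formula, r = d * m2 / (m1 + m2). Here's a minimal Python sketch applying it to the Sun and Jupiter, the pair that dominates the solar system's barycenter; the constants are approximate textbook values, and treating the two bodies in isolation is a simplifying assumption:

```python
# Two-body barycenter: distance from body 1's center to the center of mass.
# Approximate values; the Sun and Jupiter are treated in isolation.

M_SUN = 1.989e30         # kg
M_JUPITER = 1.898e27     # kg
D_SUN_JUPITER = 7.785e8  # km, Jupiter's average distance from the Sun
R_SUN = 6.957e5          # km, the Sun's radius

# r1 = d * m2 / (m1 + m2)
r_barycenter = D_SUN_JUPITER * M_JUPITER / (M_SUN + M_JUPITER)

location = "outside" if r_barycenter > R_SUN else "inside"
print(f"Sun-Jupiter barycenter: {r_barycenter:,.0f} km from the Sun's center")
print(f"Solar radius: {R_SUN:,.0f} km -> the barycenter lies {location} the Sun")
```

Run the numbers and Jupiter alone drags the barycenter to roughly 742,000 kilometers from the Sun's center, just past the solar surface, which is about where it sits today.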


An even better illustration of the center of mass is a binary star system. When two stars of comparable mass cohabit the same corner of space, they orbit about a point between them.


However, rather than play a circular, billion-year game of cosmic tag, binary stars more often than not settle into elliptical orbits around that shared point!


The rest of the universe certainly doesn't revolve around the Earth, but, like so many topics in science, it's an oversimplification to say that everything orbits around the Sun.

(Images: Chandra Observatory, Wikimedia Commons, Lsmpascal, ESO, Zhatt)

Surprise! E-Cigs Do More Good than Harm

Some things in life are so indisputably true that it is a surprise that anyone disagrees. The safety and efficacy of vaccines, the benefits of GMOs, and the inability of Nicolas Cage to convincingly perform a single role all would be at the top of Captain Obvious's list. Now, the list has a new entry: Electronic cigarettes produce far more good than harm.

The idea behind e-cigarettes is straightforward. Tobacco cigarettes are bad for you, mostly because of the tar and other toxic chemicals that can lead to cancer or chronic obstructive pulmonary disease (COPD). The addictive component is nicotine, a nervous system stimulant that mimics the neurotransmitter acetylcholine. That is why smokers keep coming back for more; they need the "high" that nicotine delivers.

Though it is a powerful drug, nicotine does not cause cancer or COPD. It is widely assumed that, similar to nicotine patches or other forms of nicotine replacement therapy, e-cigarettes will give smokers an innovative (and fun) way to quit tobacco.

But there are naysayers. Some studies have shown that e-cigs produce carcinogens, and long-term biological effects are unknown. UCSF researchers insist that e-cigs do not help smokers quit, but merely lead to "dual use of e-cigarettes with conventional cigarettes." Still others claim that e-cigs may serve as a gateway to regular cigarettes, particularly among children. Thus, some adversaries have concluded that e-cigs are no good and, in fact, may be worse than regular cigarettes.

Such logic beggars belief. The worst part of a tobacco cigarette is the tar; e-cigs contain none of that, which means they will produce fewer carcinogens. It is true that long-term biological effects are unknown, but a reasonable hypothesis is that e-cigs will prove to be a much safer alternative. "Dual use" is a rather strange complaint; time spent vaping an e-cig is certainly better than inhaling tobacco smoke. And since the notion that marijuana serves as a gateway to harder drugs is a dubious one at best, why should we believe that e-cigs will serve as a gateway to real cigarettes?

Indeed, a new literature review in the journal Addiction analyzing the safety and use of e-cigs should quell some of those fears. The authors write:

"EC [E-cigarette] aerosol can contain some of the toxicants present in tobacco smoke, but at levels which are much lower. Long-term health effects of EC use are unknown but compared with cigarettes, EC are likely to be much less, if at all, harmful to users or bystanders. EC are increasingly popular among smokers, but to date there is no evidence of regular use by never-smokers or by non-smoking children. EC enable some users to reduce or quit smoking."

They conclude:

"Allowing EC to compete with cigarettes in the market-place might decrease smoking-related morbidity and mortality. Regulating EC as strictly as cigarettes, or even more strictly as some regulators propose, is not warranted on current evidence. Health professionals may consider advising smokers unable or unwilling to quit through other routes to switch to EC as a safer alternative to smoking and a possible pathway to complete cessation of nicotine use." (Emphasis added.)

Based on current data, the authors see no reason to regulate e-cigs as strictly as the real thing. Many e-cig opponents will balk at that, claiming that e-cigs have no business being in the hands of children or other non-smokers. Indeed, they could be right. But there is a simple fix: Regulate them so they are sold only to people over the age of 18 or, alternatively, as a prescription drug.

Source: Peter Hajek, Jean-François Etter, Neal Benowitz, Thomas Eissenberg & Hayden McRobbie. "Electronic cigarettes: review of use, content, safety, effects on smokers and potential for harm and benefit." Addiction. Published online before print. DOI: 10.1111/add.12659

(AP photo)

Why Not Nuclear-Powered Aircraft?

We have nuclear submarines and nuclear ships, so why not nuclear planes?

Well, that's a very good question, one the United States spent $1.04 billion back in the 1950s trying to answer.

The idea for a nuclear-powered plane was originally hatched in 1944, during a time when the Manhattan Project was astonishing everyone with the potential of nuclear energy. Three years later, engineers at the newly established U.S. Air Force were awarded initial funding of $10 million to research methods of wielding atomic power for aircraft. The request was buoyed by an obvious advantage: theoretically, such a plane could remain aloft for weeks at a time.

Under the new Aircraft Nuclear Propulsion program, Air Force engineers immediately went to work, developing three experimental engines (two of which are pictured below). By 1951, they settled on a method for directly transferring heat from a reactor and using it to propel an aircraft, described thusly in a 1963 government report:

"Air enters through the compressor, is forced into the reactor, and is heated by the fuel elements. After passing through the turbine, where energy is extracted to drive the compressor, the heated air is expelled at high velocity through the exhaust nozzle."

Concurrently, engineers at General Electric built a 2.5-megawatt nuclear reactor designed specifically for a plane. It was a molten salt reactor, the first of its kind. Unlike the typical light water reactors used today -- where the uranium fuel is submerged in water -- a molten salt reactor uses a uranium-bearing salt mixture as both coolant and fuel, allowing for a much smaller design.

With everything coming together on the technical end and funding continuing to pour in, it was time to face a key practical roadblock: how to protect the crew from the radiation emanating from their own plane. Thus, the early 1950s saw engineers affixing a gargantuan 11-ton cockpit lined with lead and rubber to the nose of a Convair B-36 "Peacemaker" bomber. By the middle of the decade, Air Force personnel had loaded General Electric's reactor onboard via the bomb bay and switched it on during the majority of 47 test flights. The reactor didn't propel the aircraft or power any of its systems, however. The flights were purely intended to test the radiation shielding, and their results were promising.

"All the data collected by these tests showed the program managers that the possibility of using a nuclear power plant to provide an aircraft with unlimited operational range was indeed at their disposal at this time," Raul Colon wrote for Aviation History. Reports from the time back Colon's assessment.

Meanwhile, January 1956 brought successful ground tests of the X-39 engine, powered completely by a nuclear reactor. Forecasts of atomic jets by 1980, published in Popular Science, were apparently on schedule!

But then the fickle nature of politics abruptly intervened. Hesitance toward nuclear power supplanted excitement in the late 1950s and early 1960s, and politicians and a public nervous about the escalating Cold War started viewing nuclear planes less as a futuristic life changer and more as a recipe for a mobile meltdown. President Kennedy canceled the Aircraft Nuclear Propulsion program in 1961.

Every so often, a call for nuclear-powered planes resurfaces, but it's either ignored or swiftly shot down. Given the current regulatory attitude toward grounded nuclear energy, the ubiquity and safety of fossil fuel-powered flight, and advancements in electric aircraft, it's difficult to imagine that nuclear power will ever take wing.

(Images: USAF, Wtshymanski)

Less Water on Exoplanets than Expected

Distant stars are pinpoint specks, too small to resolve. Exoplanets are ten times smaller in diameter and don't emit light of their own. They're vastly fainter than any star; we couldn't detect a single one until 20 years ago.

That's what makes a painstaking new study conducted with the Hubble Space Telescope so beautiful. It not only observed three exoplanets, but precisely measured the water content of their atmospheres. There's water, but less than we expected. How is this possible?

Astronomers often find exoplanets by watching the light of many stars. If the brightness of a particular sun has a tiny (usually 1% or smaller) flicker that repeats in a regular pattern, we can calculate whether a planet's continuous orbit crossing in front of the star is the cause. Astronomical techniques have now evolved to the point that we not only look for a flicker in the total light from the star: we can see precisely how much each and every color of the rainbow flickers.
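
The size of that flicker follows from simple geometry. As a sketch, ignoring second-order effects like limb darkening, the fractional dimming during a transit is just the ratio of the two disks' areas:

```latex
% Transit depth: fraction of starlight blocked by the planet's disk.
% Limb darkening and starspots are neglected in this idealization.
\[
  \frac{\Delta F}{F} = \left( \frac{R_p}{R_s} \right)^{2}
\]
```

For a Jupiter-sized planet crossing a Sun-like star, the planet's radius is about a tenth of the star's, so the dip is about (0.1)^2 = 1%, which is why the flickers are typically 1% or smaller.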

This is a more difficult version of the way we've been investigating stars for a century. We know the chemical makeup of faraway suns because of their absorption spectra: the colors missing from the light they broadcast to us.

The searing-hot plasma of a massive star emits light of all colors in the spectrum. The elements in the atmosphere of the star absorb a little bit of that light, however, leaving certain colors absent from the light that reaches us.

Due to the quantum nature of energy states in atoms and molecules, they absorb and emit energy only in certain exact amounts, i.e., at very specific electromagnetic frequencies.

Water molecules absorb and emit a certain group of wavelengths due to their quantum transitions, too. Upon absorbing an infrared photon of the correct wavelength (1380 nm, for example), the atoms of the molecule are kicked into vibrating back and forth in a characteristic pattern.
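
For a sense of scale, the energy of one of those 1380 nm photons follows from the Planck relation (a standard textbook calculation, not a figure from the study itself):

```latex
% Photon energy E = hc / lambda for the 1380 nm water absorption band.
\[
  E = \frac{hc}{\lambda}
    = \frac{(6.626\times10^{-34}\,\mathrm{J\,s})(2.998\times10^{8}\,\mathrm{m/s})}
           {1.38\times10^{-6}\,\mathrm{m}}
    \approx 1.4\times10^{-19}\,\mathrm{J} \approx 0.9\,\mathrm{eV}
\]
```

That's far too little energy to rearrange the molecule's electrons, but just right to set the molecule vibrating, which is why water's fingerprint shows up in the infrared.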

An exoplanet atmospheric survey first looks to see how much light the star emits toward us at the colors absorbed by water when nothing blocks any of the light. When the planet passes between the star and our telescopes during its orbit, we look at each of these wavelengths and see how much of the light has been blocked. A certain percentage will be blocked solely by the opaque disk of the planet itself.

However, a small part of the star's light will pass through the atmosphere of the planet and escape to the other side. We then look at this light that has passed through the sky of an alien world to see if it is missing a greater amount of those wavelengths that water likes to absorb.

The amount of water present in the measured alien atmospheres was roughly 100 times less than predicted. This could mean that our models of how elements are distributed and retained during planet formation need tweaking. It could also be due to patterns of cloud or haze in the atmospheres of the planets. Measurements like these are the payoff of incredible recent improvements in astronomical instruments and techniques.

If this trend continues, it may not be long before we begin to look at something even more exciting: the atmospheres of earth-like planets.

(AP photo)

The Most Boring Time on Earth

Starting roughly 1.8 billion years ago and continuing for a billion years -- the "Boring Billion" -- life on Earth was slimy, and evolution stagnated. When the first eukaryotic organisms evolved millions of years earlier, things looked like they were going to get exciting! What followed was a mega letdown. Life consisted predominantly of microbes and algae. Our planet, so resplendent and colorful today, amounted to little more than a watery, brown-green mess.

Scientists don't have time machines, so how do they know this? The best ancient snapshots come from fossilized structures called stromatolites (pictured above). When scientists examine these primordial records of life -- specifically the ones that date back approximately 800 million to 1.8 billion years -- they see bacteria, single-celled organisms called Archaea, some eukaryotic algae, and little else.

While geologists generally agree on the monotony of the period, they offer many competing explanations for precisely why it was so dull. In the first half of this year alone, three new studies joined the theoretical fray.

In a study published in April, geologists at the University of St. Andrews blamed the gradual cooling of Earth's interior, which may have caused Earth's surface to stabilize. Thus, with almost no tectonic activity to shake things up, life remained contentedly simplistic.

Ross Large, a distinguished professor of geology at the University of Tasmania, offered a different theory in March. By analyzing seafloor sediments from all across the world, Large and his team found that trace metals like cobalt, selenium, copper, zinc, molybdenum, vanadium, and cadmium seemed to decline in the oceans during the "boring billion." These elements, he says, are critical to life and evolution, and without them, life wouldn't have been able to diversify and advance.

Another recent hypothesis points the finger squarely at oxygen, or rather the lack thereof. Around 2.3 billion years ago, cyanobacteria sent atmospheric oxygen levels soaring 1000 times over, which in turn sent an untold number of oxygen-hating anaerobic bacteria to their graves. But according to a team of American researchers, after the brief spike -- which could very well have catalyzed the evolution of eukaryotes -- oxygen levels seem to have dipped back down. And so, eukaryotes persisted in oxygenated parts of the ocean, but evolution as a whole stalled.

While scientists research and debate why the "Boring Billion" was so boring, a few holdouts, like Harvard University's Andrew Knoll, argue that the period wasn't as banal as its name suggests.

"There’s reason to believe that all of the properties of cell biology that made complex life possible in the next geologic era were put in place here: cytoskeletons that allow eukaryotic cells to change shape, and cell polarity that allows cells to send a molecular message to one side of the cell but not the other, and to interact with nearby cells. The molecular circuitry and cross talk that allow complex organisms like us to exist today all took root in the so-called Boring Billion."

(Image: Shutterstock)

Turkish Researcher Claims that Virgin Mary Was a Hermaphrodite and Souls Are Dark Matter

The history of science is populated with three types of ideas.

First, there are the great ideas. Peter Atkins counts ten of them in his book Galileo's Finger: The Ten Great Ideas of Science. Evolution by natural selection, the heritability of genetic information, the conservation of energy, and an expanding universe are among those ten.

Second, there are the fascinating-but-wrong ideas, such as alchemy, phlogiston, and spontaneous generation. All of those ideas seemed reasonable in their day, but subsequent investigation proved they were incorrect.

And then there are the stone cold crazy ideas. This is a post about those.

At RealClearScience, we have had the immense pleasure of stumbling across a Turkish researcher named M. Kemal Irmak. It was he who proposed that schizophrenia is actually demonic possession. His other published papers suggest he also believes that fluoride causes diabetes in Finland.

He has other ideas. And, bless him, he's not afraid to tell us what they are.

The New Testament narrative of the virgin birth of Jesus has two widely held explanations. On the one hand, most Christians (and Muslims) accept the biblical account as true but hold that, due to its miraculous nature, it is well beyond the reach of science. On the other hand, most skeptics and non-Christians insist the story is a myth, not unlike the Greco-Roman tales of the semi-divine births of demigods such as Hercules.

Dr. Irmak has his own hypothesis: The Virgin Mary was a hermaphrodite, and her pregnancy was the result of self-fertilization. He explains:

"Virgin Mary is a chimera of 46,XX/46,XY type resulting from the fusion of two embryos of different sex types and both ovarian and testicular tissues develop in Her body as seen in a beautiful plant."

In other words, the mother of the Virgin Mary was going to give birth to twins. Instead, the twins -- one male, one female -- fused together, forming a hermaphroditic Virgin Mary. After reaching adulthood, she accidentally self-fertilized. As crazy as this sounds, Dr. Irmak actually found an obscure 1990 paper in the journal The Veterinary Record that documents the curious case of a hermaphroditic rabbit that apparently became pregnant after living in isolation.

But, there are (at least) two very big problems with the "hermaphroditic Virgin Mary" hypothesis: (1) Mary's husband, Joseph, probably would have noticed; and (2) While human hermaphrodites certainly do exist, it is very unlikely that they would develop two sets of functioning genitals.

For the sake of argument, however, let's assume that the hypothesis is correct. Are there any milestones we should be looking for in the developing fetus? Dr. Irmak does not address the embryological development of Jesus, specifically, but he does have something rather unique to contribute to the field of embryology: It involves dark matter.

The allocortical birth theory, apparently coined by the esoteric Dr. Irmak, claims that a fetus receives a soul during the 13th week of development. This is predicated upon Dr. Irmak properly understanding a brain structure called the allocortex. Granting him that, where does the soul come from? Dr. Irmak explains:

"The dark matter constitutes most of the mass in our universe, but its nature remains unknown. The soul is likely to work into man’s physical body directly via that dark matter."

Aha! Dark matter, that mysterious substance that makes up 26.8% of the mass-energy of the universe, is actually soulstuff. And it enters the fetal brain during the 13th week of development... through its nose, via a vestigial structure called the vomeronasal organ.

If you've made it this far, you are probably thinking, I'm not terribly convinced by his argument. After all, can he make a testable prediction? I would strongly argue yes.

People who do not have noses would not have souls. People who don't have souls are evil. Can you think of any malevolent humans lacking a proboscis? Of course you can. His name is Lord Voldemort.

Q.E.D.

Source: Irmak MK. "Embryological basis of the virgin birth of Jesus." J Exp Integr Med 4(2): 143-146. (2014) doi: 10.5455/jeim.060113.hp.011

Source: Irmak MK. "Cosmological dark matter and ensoulment." J Exp Integr Med 3(4): 343-346. (2013) doi: 10.5455/jeim.110813.hp.006

(AP Photo)

The Strange Disorder Where Your Hand Has a Mind of Its Own

THE DAY WAS progressing normally enough for Pam Johnson (not her real name, in the interest of privacy), but when the 77-year-old sat down to watch television, the monotony turned disturbing. Out of the corner of her eye, she witnessed her left hand creep off its resting position on the armchair and slowly arc across her field of vision. It then began to stroke her face and hair. Now, none of this would be out of the ordinary... had she actually been controlling the appendage. But the hand was doing it all by itself!

Frightened, she called for her husband, and the pair zoomed off to the hospital. By the time they arrived, the hand was back under Pam's control. The 30-minute ordeal seemed like a hallucination, but she didn't imagine it. Brain scans confirmed that something was amiss.

The story of Pam's temporarily wayward hand is not at all foreign to neurologists.

"ALIEN HAND SYNDROME is a phenomenon in which one hand is not under control of the mind. The person loses control of the hand, and it acts as if it has a mind of its own," scientists recently described in the Proceedings of Baylor University Medical Center.

Pam got off lucky. Her bout of alien hand syndrome is the shortest known to science. In the dozens of cases documented previously, the condition usually lasted around a year, but it can persist for more than a decade.

Pam's case of alien hand appeared to have been caused by a very minor stroke, but that's quite out of the ordinary. Most often, alien hand syndrome arises in the wake of some sort of damage to the corpus callosum or the supplementary motor area of the brain.

For the average patient, once the initial shock wears off, having an arm with a mind of its own can grow to be somewhat amusing, if a little annoying. The syndrome generally does not pose risks, though there are rare instances of out-of-control hands slapping or choking their owners. Occasionally, the hand will reach out and pick up random objects. Oftentimes, it does nothing. However, in some cases, it will maddeningly do the opposite of what the consciously controlled hand does. Imagine trying to change the television channel or get dressed in the morning!

Sergio Della Sala, one of the foremost experts on "anarchic hand" syndrome (as he calls it), described a meeting with one of his patients in a 2005 issue of The Psychologist:

We were discussing the implication of her medical condition for her and her relatives, when, out of the blue and much to her dismay, her left hand took some leftover fish-bones and put them into her mouth. A little later, while she was begging it not to embarrass her any more, her mischievous hand grabbed the ice-cream that her brother was licking. Her right hand immediately intervened to put things in place and as a result of the fighting the dessert dropped on the floor. She apologised profusely for this behaviour that she attributed to her hand’s disobedience. Indeed she claimed that her hand had a mind of its own and often did whatever ‘pleased it’.

IN THE PAST decade, neuroscientists have used functional magnetic resonance imaging (fMRI) to compare the brains of people with alien hand syndrome to those of normal individuals.

"In normal individuals, initiation of motor activity shows activation of multiple extensive neural networks," the Baylor scientists detailed. "However, in patients with alien hand syndrome, only isolated activation of the contralateral primary motor cortex is observed."

So while the systems that move the hand function normally, adjacent systems in the brain that allow patients to consciously register the actions of their hand do not.

Currently, there is no cure for alien hand syndrome, but long-term sufferers have been known to deal with their disobedient hands by constantly wearing an oven mitt or giving the hand something to hold on to. Dousing it in warm water throughout the day also may sate the hand's desire to move. It's possible that constant tactile stimulation serves as a basic form of treatment.

(Image: AP)

Scarlett Johansson's New Movie Is Based on One of the Biggest Scientific Myths of All Time

The reviews aren't yet in for Scarlett Johansson's new movie Lucy, but a single viewing of the trailer is enough to give the film a resounding "two thumbs down" on science.

Here's the synopsis:

In a near future where corruption reigns, the Taiwanese mob forces a young woman named Lucy to work as a drug runner. They implant a mysterious compound into her body for her to transport. But when its container ruptures, Lucy begins to experience radical changes. The chemical contained within boosts her brain capacity far beyond the normal 10% that humans utilize. She gains superhuman strength, dexterity, and accuracy, absorbs knowledge almost instantaneously, and develops telepathic abilities. Commence kickassery.

Have you spotted the problem?

The idea that humans only use 10% of their brains is a complete, utter, and total myth. Lucy is entirely premised on neuroscientific BS.

Now, as an ardent cinephile (I've often wondered how much better a science writer I'd be if my memory were packed with chemistry and physics facts instead of useless movie trivia), I'm not usually one to be overly pedantic when it comes to science and cinema. I remain in awe of Gravity, despite its scientific shortcomings. I'm willing to suspend my belief in physics to allow the existence of a 700-foot wall of ice in Westeros. Heck, I even enjoyed Prometheus, despite the fact that the "scientists" in the film acted with complete and total disregard for, well, science.

Oh, and I'm almost certainly going to see Lucy.

But I refuse to pretend that the "10% Myth" is anywhere close to grounded in reality, even though 65% of the public believes it! Thanks to modern brain scanners, scientists have shown that virtually every part of the brain is in use at all times, even when we're sleeping! Moreover, it makes no logical sense that humans would evolve such large brains if we didn't use them.

It's easy to see why the myth has perpetuated for so long: It's an enticing fallacy! There might be psychic powers and super intelligence locked away within our minds. All that's needed is a key!

Or illicit drugs, apparently.

Because Lucy is totally premised on a notion that's been thoroughly disproven, yet strives to maintain a semblance of realism, I think it enters contention for the most unscientific movie of all time. If you can think of any other nominees, share them in the comments below!

What to Do with a Failed $5 Billion Experiment?

So you spent 17 years and $5 billion to build a fusion experiment. You built a facility wider than the length of three football fields. You built a 400-foot-long laser with more than 33,000 optical parts; it is currently the highest-energy laser in the world. You've been through more budget overruns and management problems than you'd care to admit.

Now, you finally turn the thing on at full power and carry out your experiment. And it fails monumentally. Now what?

This is the dilemma facing the National Ignition Facility (NIF). Built with the promise of achieving ignition -- producing more energy from fusion than the laser energy used to trigger it -- NIF fell 28,000 times short of its goal. No one knows how to fix it. So NIF has now been finding other things to occupy its time.
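
To see what a number like that means, it helps to think in terms of gain, Q: fusion energy out divided by laser energy in, with ignition requiring a Q of at least 1. Here's a minimal sketch, assuming, purely for illustration, that "28,000 times short" refers to energy gain:

    # "Gain" (Q) for laser fusion: fusion energy out / laser energy in.
    # Ignition requires Q >= 1. Treating "28,000 times short" as an
    # energy-gain shortfall is an assumption made only for this sketch.
    laser_energy_joules = 1.8e6        # the 1.8 MJ laser system
    shortfall_factor = 28_000
    q_achieved = 1.0 / shortfall_factor
    fusion_yield_joules = laser_energy_joules * q_achieved
    print(f"Q = {q_achieved:.1e}, yield = {fusion_yield_joules:.0f} J")  # ~64 J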

There are many useful things to do with a 1.8 MJ laser system.

One recent experiment tested whether diamond can take on a new crystalline form if you squeeze it hard enough. A pressure of 49 million Earth atmospheres was applied to a tiny sliver of the hardest natural material by placing it in the target chamber and firing the giant laser. The predicted crystal restructuring didn't occur: as the pressure increased, the diamond remained resilient instead of abruptly reordering its atoms. This tells us more about diamond, and also about the behavior of carbon in the dense cores of giant planets like Jupiter.
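
For scale, converting that pressure to SI units (1 atmosphere is 101,325 pascals):

    # Unit check: 49 million Earth atmospheres in SI units.
    ATM_IN_PASCALS = 101_325
    pressure_pa = 49e6 * ATM_IN_PASCALS
    print(f"{pressure_pa:.2e} Pa")  # ~4.96e+12 Pa -- roughly 5 terapascals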

Another use for the NIF laser is to gather data for the design and upkeep of nuclear weapons. Since 1992, the US has not tested a single nuclear device, so real-world tests of new bomb designs and of the working condition of older weapons can no longer be carried out. The task falls to extremely complex codes run on supercomputers. NIF can produce the temperatures and pressures found in bomb detonations. It also mimics the crushing action applied by the fission detonation to trigger the far more powerful fusion reaction used in thermonuclear "hydrogen" bombs. Implosion-triggered fusion data from NIF can be used to guide the computer simulations.

NIF has also been running experiments on the properties of various materials at extremely high pressures and temperatures, on shock-wave creation and hydrodynamics, and on fuel-pellet designs that might yield better fusion results.

There are also a few useless things to do with a 1.8 MJ laser system.

In 2012, work was halted so that scenes from the film Star Trek Into Darkness could be shot at the laser target chamber. Issuing misleading press releases also falls under things not to do.

This seems like piling on, so let's be clear: fusion power is a goal worth spending billions on. It will take decades of work and massive resources to succeed. Taking gambles on facilities like NIF is a part of this process, and certainly a worthwhile investment. Now that this effort has failed, however, we need to find new science to produce at this uniquely powerful and capable facility.

Outbreak of Political Correctness in Science Media

A free and objective press: A quaint idea.

The American media is widely perceived to lean to the Left. Though most journalists won't openly admit the fact, it is indisputably true. As reported in the Washington Post, a 2014 study showed that among journalists Democrats outnumber Republicans by four to one. (The exact numbers were: 28.1% Democrat, 7.1% Republican, 50.2% Independent, and 14.6% "other" -- whatever that means.) It is impossible to know exactly what to make of the roughly 65% of journalists who refused to put a label on themselves, but it is perhaps safe to assume that Left-leaning independents outnumber Right-leaning independents by the same margin. After all, about 93% of DC-based journalists vote Democrat, and 65% of donations from journalists went to Democrats in 2010.

For science journalists, political affiliation shouldn't be a problem because the job of a science writer is to report data and facts. Yet, it is a problem. As Hank Campbell and I detailed in our book, Science Left Behind, science journalists are quick to point out unscientific flaws in Republican statements and policies, but shy away from doing the same for Democrats. (Thankfully, this is slowly beginning to change, as more journalists are rebuking Democrats for being opposed to GMOs.)

The left-wing echo chamber that is the modern-day science newsroom has resulted in some very troubling controversies. A recent outbreak of political correctness led to the termination of a Scientific American blogger whose unspeakable crimes were giving a favorable review to a controversial book on genetics by New York Times writer Nicholas Wade and defending Richard Feynman against exaggerated accusations of sexism.

Then, the science writing community expressed bewildering outrage over a cover photo from the journal Science that depicted transgendered prostitutes for a special issue about AIDS. Of course, banging a hooker is a risk factor for acquiring HIV, and the spread of HIV via prostitution has become a giant problem in places like China. Initially, the faux outrage was directed at the supposed objectification of women, particularly because the photo does not show the subjects' faces. But the photo depicted transgendered individuals, not biological women. Besides, showing their faces surely would have been criticized as a violation of privacy. Either way, Science loses.

Finally, science writer Jeffrey Kluger penned an article titled "The Myth of the Diseased Immigrant" for TIME. In regard to the refugee crisis on the border with Mexico, he provocatively writes:

Now the nativists and xenophobes have played their nastiest -- and least surprising -- card: the border must be secured and the immigrants sent back because they are, of course, diseased.

In his entire 700-word screed, he states precisely one fact in support of his argument: In most Central American countries, people are vaccinated against measles at higher rates than U.S. citizens. It's hardly a relevant difference, however; 92% of Americans are vaccinated, while the rate of measles vaccination in Central America ranges from 93% to 99% (with the exception of Costa Rica at 90%).

It's a little strange that this needs to be pointed out, but measles isn't the only disease with which we need to be concerned. The World Health Organization estimates that around 2 billion people (nearly 1/3 of the planet) are infected with tuberculosis, a disease that still kills 1.5 to 2 million people annually.

In the U.S., the prevalence (i.e., the proportion of people currently infected) is 4.7 per 100,000, and the incidence (i.e., the rate of new infections) is 3.6 per 100,000. Compare that to Central American countries (all data per 100,000):

Mexico: Prevalence: 33; Incidence: 23
Belize: Prevalence: 51; Incidence: 40
Guatemala: Prevalence: 110; Incidence: 60
Honduras: Prevalence: 82; Incidence: 54
El Salvador: Prevalence: 34; Incidence: 25
Nicaragua: Prevalence: 55; Incidence: 38
Costa Rica: Prevalence: 12; Incidence: 11
Panama: Prevalence: 64; Incidence: 48

The data speaks for itself. Central America has a vastly higher burden of tuberculosis than the U.S. In the worst-case scenario, 90,000 children (mostly from El Salvador, Honduras, and Guatemala) cross into the country. Based on the prevalence data, we can expect several dozen refugees to be carrying tuberculosis. Fortunately, long-term treatment can cure the disease, so hopefully these kids are being screened properly.
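
That "several dozen" figure falls straight out of the numbers above. A minimal sketch, assuming, hypothetically, an even split of the 90,000 children among the three countries named:

    # Back-of-the-envelope: expected TB carriers among 90,000 children,
    # assuming (hypothetically) an even split among the three countries named.
    prevalence_per_100k = {"El Salvador": 34, "Honduras": 82, "Guatemala": 110}
    children = 90_000
    per_country = children / len(prevalence_per_100k)
    expected = sum(per_country * p / 100_000 for p in prevalence_per_100k.values())
    print(f"Expected carriers: {expected:.0f}")  # ~68 -- i.e., "several dozen"

Shift the mix toward Guatemala or Honduras and the estimate grows, but it stays in the same ballpark.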

Beyond Central America, many developing countries are veritable Petri dishes of infection. Thankfully, many of the diseases, such as malaria, do not spread from person-to-person. But, some do. Hepatitis B, which is transmitted through body fluids like blood and semen, infects about 240 million people in the world, and more than a quarter of a million die every year. The virus is particularly prevalent in Africa and East Asia.

MERS, the deadly new virus similar to SARS that has killed at least 282 people, has already shown up in the U.S. Granted, the patients were travelers, not immigrants, but the threat is precisely the same. Many experts believe that it is merely a matter of time before Ebola -- which has killed more than 600 in the ongoing outbreak in West Africa -- turns up in the United States.

Simply put, an increasingly globalized world poses the very realistic threat of exotic diseases coming to the United States. It would be naive and irresponsible to pretend that immigrants, especially those from developing countries, pose absolutely no challenge to our public health system. Yet, that's what Mr. Kluger implies when he refers to the "myth of the diseased immigrant." He ought to know better; a brief scan of medical headlines proves he is incorrect.

Mr. Kluger is correct about one thing, however: We shouldn't fear or demonize immigrants. Any diseases that they might carry are usually curable or at least controllable. That is one of the benefits of living in a fully modern society. Therefore, we can welcome the world's tired, poor, huddled masses without worrying excessively about any microbes that might be coming along with them.

But, that's not what Mr. Kluger wrote. Instead, he absurdly claims that "immigrants have more to fear from us than we do from them." Then, he calls you a xenophobic bigot if you disagree.

Why on earth would a science journalist write such unmitigated nonsense? Could it be because Mr. Kluger places more emphasis on political ideology than on epidemiology and medical microbiology? Could it be because of political correctness? Given the other events this past week, these are tempting explanations.

Unfortunately, political correctness is a disease with no cure.

(Image: Censorship via Shutterstock)

The Things We Think Make Us Happy... Don't

What's likelier to make you happy: striking it rich in the lottery or getting paralyzed from the waist down?

The answer is obvious, right? Winning the lottery!

But when psychologists studied the question empirically, the results did not precisely follow our intuitive predictions. Oh, at first they did. In the 12 months after hitting the jackpot, lottery winners were significantly happier than paraplegics. Exotic trips and extravagant luxuries easily topped losing one's ability to walk.

But after a year, the gap vanished. Both groups reverted to the same levels of happiness they had experienced before their life-changing events. Once they acclimated to their new status, their expectations altered to match. Paraplegics found joy in small victories, while lottery winners grew accustomed to their wealthy lifestyle.

This example illustrates a fundamental error of human thinking: we are incredibly poor at gauging what will make us happy. University of Southern Indiana philosopher Garret Merriam attributes this ineptitude to what Harvard psychologist Dan Gilbert calls "impact bias."

"We're generally pretty good at imagining what it's going to feel like in the short term -- over the next couple days, the next week or so -- and we project that feeling out in perpetuity for much, much further than it will actually last. That really throws of our self evaluations of what's going to make us happy."

It's not just winning the lottery, of course. Many things that we think will improve our happiness over the long-term in fact do not. Moving to California? Nope. Getting married? Nada. Having children? Definitely not.

The graph is a tad difficult to read, but the description makes its message clear:

"As the four separate studies in this graph show, marital satisfaction decreases dramatically after the birth of the first child and increases only when the last child leaves home."

It should be noted that marital satisfaction is not a perfect proxy for happiness. Nevertheless, the data is enough to counter the conventional wisdom that children make us happy.

If we can't count on reproduction to make us content, surely that promotion at work will! Sorry, no. A 2012 study found "no evidence that promotions impact general health or life satisfaction." However, the researchers noted that two or more years after a promotion, worker mental health is significantly worse, driven predominantly by increased anxiety.

"Fame, talent, wealth, beauty; most people think that 'if I only had these things I would be happy,'" Merriam stated at a 2011 talk at Sacramento State University. "But it turns out that the people who have these things are not on average happier than people who lack these things. The grass is always greener on the other side of the street."

(Image: AP)

Scientific American's 'PC Police' Fires Blogger

Last week, while perusing Scientific American's blog section, I stumbled upon a post entitled "Richard Feynman, sexism and changing perceptions of a scientific icon," written by Ashutosh Jogalekar. Already a Feynman fan, I had an inkling of the article's thrust even before reading it: Richard Feynman was, on occasion, a total jerk to women.

That much is clear to anyone who has read Feynman's book, Surely You're Joking, Mr. Feynman! In it, the Nobel Prize-winning physicist candidly reveals a great many personal anecdotes and beliefs. Some aren't exactly politically correct. For example, in regard to women, he describes his very questionable approach to picking up girls at parties or bars. With a hat-tip to Field of Dreams, it is best described as "If you disrespect them, they will come." Don't buy them anything, don't be polite to them, and don't do what they want... until they've agreed to sleep with you, that is.

When I read that section, I was taken aback. Feynman's actions were archaic, rude, and unacceptable, unbefitting of one of my scientific heroes.

Ashutosh Jogalekar, who penned the article at Scientific American, described having a similar reaction to Feynman's "casual sexism," which also manifested in more than just social arenas. But, he noted, though some of Feynman's actions are "disturbing and even offensive" when viewed through the socially evolved lens of today, "they were probably no different than the attitudes of a male-dominated American society in the giddy postwar years." Thus, Jogalekar reasoned, we should not condemn Feynman wholly as a sexist.

That seems to make sense. While anecdotes from Feynman's own book show that he was a jerk to women in certain settings, there's no evidence that Feynman ever discriminated against women in science. In actuality, it was quite the opposite. As Julia Lipman wrote in 1999:

"Feynman took the side of a female Caltech professor who brought a sexual discrimination complaint against the school. He encouraged his younger sister’s career as a physicist even though their parents didn’t believe that women should pursue scientific careers."

And so, Jogalekar concluded, "We can condemn parts of his behavior while praising his science. And we should."

The article stirred some controversy on Twitter, but generally prompted diverse, reflective discussion. Not a big deal.

Ashutosh Jogalekar's Feynman article appeared last Friday. The next day, it was taken down, and Jogalekar was abruptly excused from Scientific American's blog network. (The article has since been reposted "in the interest of openness and transparency.")

Scientific American editor Curtis Brainard offered an explanation for the dismissal earlier this week. He said that some of Jogalekar's posts lacked clarity, which made them insensitive to "valid concerns that many readers have about past and existing biases and prejudices in our society."

In addition to the Feynman piece, Brainard referenced two earlier articles that stoked the ire of a few readers, expressed almost entirely through social media. "The first was a guest post in April about Larry Summers’ statement regarding women in science. The second was a post in May, which favorably reviewed a controversial book by Nicholas Wade, A Troublesome Inheritance: Genes, Race and Human History." (Robert VerBruggen also gave this book a moderately positive review for RealClearScience.)

The first post attempted to navigate the muddy waters of gender discrimination in science, and why, in certain fields, there are more men than women and vice versa. The guest author, Chris Martin, respectfully contended, "The research clearly shows that such discrimination exists—among other things, women seem to be paid less for equal work... but the latest research suggests that discrimination has a weaker impact than people might think, and that innate sex differences explain quite a lot."

What exactly is "insensitive" about that?

In his review of Nicholas Wade's controversial book, Jogalekar wrote:

"Overall I found this book extremely well-researched, thoughtfully written and objectively argued. Wade draws on several sources, including the peer reviewed literature and work by other thinkers and scientists. The many researchers whose work Wade cites makes the writing authoritative; on the other hand, where speculation is warranted or noted he usually explicitly points it out as such. Some of these speculations such as the effects of genetics on the behavior of entire societies are quite far flung but I don’t see any reason why, based on what we do know about the spread of genes among groups, they should be dismissed out of hand. At the very least they serve as reasonable hypotheses to be pondered, thrashed out and tested. Science is about ideas, not answers."

While I disagree with Jogalekar's favorable view of the book, there was nothing in his review that struck me as distasteful. His article was well within the mainstream of scientific thought.

In the wake of his removal from Scientific American's blog network, Jogalekar has remained polite and pensive, expressing nothing but respect for Brainard and the magazine. He did, however, ask some open questions. For example:

"How much should a brand care about opinions (particularly negative ones) on social media, especially in an age when waves of such criticism can swell and ebb rapidly and often provide a transient, biased view of content?"

The simple fact is that science is occasionally uncomfortable and sometimes runs counter to what we believe. But that doesn't mean we should shy away from it. Yet, that is what Scientific American has chosen to do; they have dismissed a blogger for tackling controversial topics and ruffling a few overly sensitive feathers.

"A scientific topic cannot be declared off limits or whitewashed because its findings can be socially or politically controversial," Jogalekar sagely wrote in one of his pieces.

Apparently, Scientific American disagrees. And in their politically correct world where feelings come before facts, that means you lose your job.

(Photo: Censorship via Shutterstock)

Robert Hooke: A Genius Comparable to Newton

Isaac Newton's preeminence in the history of science and mathematics is fully deserved. However, his enormous reputation overshadows the importance and work of some of the other founding fathers of modern science. Leibniz, Huygens, Boyle and other contemporaries are outshone by his incomparable brilliance.

Another early natural philosopher with gifts rivaling Newton's was Robert Hooke (1635-1703). Physics, biology, microscopy, paleontology, astronomy and engineering all bear his fingerprints. Hooke was an experimentalist and the definition of a polymath; he was a leading expert and discoverer in nearly every field in which he worked.

Hooke is known to physicists as the creator of Hooke's Law: force is proportional to stretch. This is an accurate approximation of how elastic materials, like a spring, pull back against being stretched. A resisting force linearly proportional to the pull on an object is ubiquitous in physics and engineering.
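
In modern textbook form, the law reads F = -kx: the restoring force F grows linearly with the displacement x, scaled by a stiffness constant k. A minimal sketch (the stiffness and stretch values are arbitrary illustrations):

    # Hooke's law: an ideal spring pulls back with a force proportional
    # to how far it has been stretched or compressed (F = -k * x).
    def spring_force(stiffness_n_per_m, displacement_m):
        return -stiffness_n_per_m * displacement_m

    print(spring_force(80.0, 0.25))  # -20.0 N: the spring resists the stretch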

While the fundamental law of gravity is named for Newton, it appears that Hooke nearly discovered the law himself before and during the time Newton worked on the problem. Scientists and historians agree that Newton was the first to derive and investigate in detail the mathematical aspects of universal gravitation. Hooke attempted to prove the law by a series of experiments which were beyond the means of the time. Details of his ideas of an inverse square law of attraction between all heavenly bodies, pulling toward the centers of their spheres, were conveyed in his correspondence with Newton during the latter's time working on the Principia.
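
In modern notation (Newton's formulation, not Hooke's own words), the law they corresponded about is the familiar inverse-square attraction:

    F = \frac{G\, m_1 m_2}{r^2}

where F is the attractive force between masses m_1 and m_2 separated by a distance r, and G is the gravitational constant.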

Perhaps Hooke's greatest contribution to science was in the field of biology. Hooke, along with Leeuwenhoek, was among the first to construct a practical microscope and use it to study nature. Peering at plant tissues, Hooke saw a pattern of tiny container units, which he named cells after the tiny cubicles inhabited by monks. Later, he verified Leeuwenhoek's discovery of microbes.

Hooke's famous work Micrographia contains beautiful, meticulous illustrations of flies' eyes, hairs on mites, crystals, veins in leaves, fibers in cloth and other subjects under his microscope. Hooke particularly marveled at the way that biological organisms were revealed to have tinier and tinier levels of beautiful detail, while man-made objects such as razor blades proved to be rough and ugly when magnified.

Fossils fascinated Hooke. He was the first to observe their microscopic structure; the similarity between living and petrified wood led him to propose that the bodies of dead organisms transformed into rock as their tissues were replaced with minerals. Many previous thinkers had concluded that fossils simply grew within the rocks and had never been alive.

Turning his powerful magnifying lenses to the Heavens, Hooke pursued further studies. He attempted what Brahe and Galileo had failed to do: measure the distance to a star through the method of stellar parallax. The attempt failed. Even the best 17th-century telescopes were far too weak to detect the tiny shift, and Hooke also misinterpreted his data. Other studies included the rings of Saturn and double-star systems.
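
For a sense of why the attempt was doomed (using modern numbers Hooke never had): distance in parsecs is simply the reciprocal of the parallax angle in arcseconds, and even the nearest star shifts by less than one arcsecond.

    # Stellar parallax: distance (parsecs) = 1 / parallax angle (arcseconds).
    # Proxima Centauri, the nearest star, has a parallax of only ~0.77
    # arcseconds, far below the resolving power of any 17th-century telescope.
    def distance_parsecs(parallax_arcsec):
        return 1.0 / parallax_arcsec

    print(distance_parsecs(0.77))  # ~1.3 parsecs, about 4.2 light-years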

Science wasn't the only discipline in which Hooke was a master. After the Great Fire of 1666 burned much of London to the ground, Hooke was appointed chief surveyor for the reconstruction. He proposed to rebuild the city on a grid and carried out an enormous amount of surveying and architectural design. Hooke also invented some of the methods and devices used to make clocks extremely accurate for the time, along with maritime navigational instruments and other optical and mechanical devices, including the universal joint and a practical vacuum pump.

All this raises the question: Why don't we mention Hooke in the same breath as Newton? Blame Newton for a fraction of this. He was an incredibly jealous and vindictive man who liked to destroy those with whom he disagreed. It's not all on Newton, though. Hooke himself took a turn for the worse as he aged. Even short biographies of Hooke often include a prominent mention of his disagreeability. It always pays to make friends and be nice.

It also helps to not irritate the most famous scientist of all time.