How to Spread Misinformation

With the advent of the Internet and social media, the average person is afforded unprecedented power to consume and spread information. But with great power comes great responsibility -- to be skeptical, to seek out facts and evidence, and to restrict, or at least not aid, the spread of rumors. Of course, one also has the ability to completely ignore all those responsibilities. When that happens, you may discover that you're a purveyor of misinformation.

And if you're going to spread misinformation, you might as well do it right. To all the devout Natural News readers, anti-vaxxers, and alternative medicine scam artists out there, this guide's for you! If truth and logic are your thing, feel free to read these tips as well, then forget them, or better yet, do the exact opposite.

Embrace your biases. Whatever you're into, chances are you can find an advocacy group, corporation, or political party whose tweets and Facebook posts you can pass on without question! We humans have a tendency to favor information that confirms our beliefs -- give in to it! Seek out sources that share your stances and like-minded individuals to revel with. Turn your life into an echo chamber.

Don't stop. Don't think. According to psychologists, assessing the veracity of a piece of information requires both motivation and brain power. You certainly don't want to waste either of those precious resources. Better to stick to your carefully tailored and guarded mindset. Avoid visiting pesky websites like Snopes or TruthOrFiction. Don't bother searching out conflicting sources or reading the entire article. Let the mass media be your guide: if you want to be the most popular, you have to share the information first!

Repeat, repeat, repeat! Your friends, family, and followers need to know that juicy bit of information or controversial discovery! So what if all the "facts" don't line up? Facts don't change minds! Opinions are formed and molded by a barrage of information. Whether that information is true or not doesn't matter.

Ignore the Fallout. Who cares if vaccine-preventable diseases like measles, mumps, and whooping cough are on the rise because you've convinced everyone that vaccines are evil? So what if you blindly shared a fake story that an asteroid was set to wipe out all life on Earth? Who cares if you fueled a fire of false allegations that nearly destroyed somebody's life? If you do start to feel guilty, just take a look at that insane photo that's too unbelievable to be true... then hit "share."

 

Whatever you do, don't read these:

"Digital Wildfires in a Hyperconnected World." World Economic Forum, 2013.

Stephan Lewandowsky, Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz, and John Cook. "Misinformation and Its Correction: Continued Influence and Successful Debiasing." Psychological Science in the Public Interest, December 2012, vol. 13, no. 3, pp. 106-131.

Maria Konnikova. "I Don't Want to Be Right." The New Yorker, May 2014.

Can You Poop Via Your Mouth?

Last week, with the help of Mary Roach's fascinating book Gulp, I tackled a question surely in the back of everybody's mind: Can you eat via your anus?

This week, again guided by Gulp, I do the disgusting and flip the question. Ladies and gentlemen, restrain your gag reflexes.

Can you poop via your mouth?

Sketchy websites, questionable YouTube videos, and a hilarious episode of South Park all present evidence pointing to "yes." What's more, the American Heritage Medical Dictionary defines a condition dubbed fecal vomiting: "The vomiting of fecal matter that has been drawn into the stomach from the intestine by spasmodic contractions of the gastric muscles." The term even appears numerous times in the medical literature.

The answer to the question, however, seems to be a nuanced "no."

Indeed, fecal vomiting is a genuine, albeit rare, condition, occurring in cases of severe constipation in which the colon is completely full of feces. The vomit, however, doesn't actually contain what we recognize as poop, which comes from the colon. It contains liquid from the small intestine, ejected with the help of powerful, reverse contractions of muscles in the small intestine and esophagus.

"A well-formed stool does not exit the upper end of the colon," Roach writes.

To the few unfortunates afflicted with fecal vomiting, the difference may seem borderline semantic. After all, liquid from the small intestine can be darker in color and doesn't exactly smell like roses.

Strangely enough, there are a couple dozen reports of actual feces being expelled from the mouth; however, all of them come from the early days of modern medicine and are highly dubious. The reports prompted one skeptical physician, Dr. Gustav Langmann, to put a claimant to the test. In 1889, he undertook the care of a female patient who, witnesses reported, had passed stools via the mouth. At some point during the woman's hospital stay, nurses discovered "some hard feces, wrapped in paper" under her pillow. Ewwww. Case closed.

Source: Gulp: Adventures on the Alimentary Canal, Mary Roach, 2013

(Image: South Park Studios)

Why Eating McDonald's Is Completely 'Natural'

To a great many ardent purveyors of nutrition advice, the Big Mac is the embodiment of evil: greasy, loaded with hard-to-pronounce chemicals, and entirely unnatural.

Packing in 550 calories, 29 grams of fat, and 65% of your daily value of sodium, it's easy to see why you probably shouldn't eat a Big Mac at every meal. That said, caricaturing the act of eating McDonald's and other fast food as unnatural is completely unwarranted. (It is also, of course, fallacious to assume that what is "natural" is always better.)

Humans are perhaps the ultimate omnivores. We evolved to eat whatever is around, be it plant, animal, or burger. Before the rise of agriculture roughly 10,000 years ago, our ancestors were predominantly hunter-gatherers. Spread far across the globe, their diets matched their surroundings. For evidence, we can look to modern hunter-gatherers. Roughly 95% of the Inuit diet consists of meat and fish. The !Kung of southern Africa eat mostly seeds, nuts, fruits, and vegetables. Farther north, the Hadza predominantly consume meat, fish, and roots.

There is no specific "natural" human diet, evolutionary biologist Marlene Zuk wrote in 2009. What's "natural" for us is to eat whatever we can. Over time, our bodies may adapt to take advantage.

"Take dairy products, one of the classic modern foods we supposedly aren’t meant to eat. Most people who can’t tolerate them lack a gene that confers the ability to break down lactose, the sugar in milk, after the age of weaning. Our Stone Age ancestors couldn’t digest milk as adults either, but a recent study shows that about 5,000 years ago, mutations that keep that gene switched on spread throughout Northern Europe. That’s also when cattle began to be domesticated; being able to drink milk as well as lower-lactose cheese would have been advantageous as a source of nutrition and fluids."

So then what about the epidemic levels of obesity? Our Paleolithic ancestors certainly weren't as voluptuous as we are. That's true, but their average life expectancy was also only around forty. Moreover, obesity levels aren't sky high because humans have strayed from our natural dietary path. They're sky high because food is much easier to obtain than it used to be. In the developed world, the question now is not "Will we eat?" It's "How much will we eat?" If an "unnatural" McDonald's diet made us fat, then you wouldn't be able to lose weight eating there. But you can!
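To see why, it helps to sketch the calorie arithmetic. This is only an illustration: the 550-calorie figure comes from the Big Mac numbers above, while the 2,200 kcal daily maintenance budget is an assumed round number, not something from the article.

```python
# Illustrative calorie budget only; the maintenance figure is an assumed round number.
BIG_MAC_KCAL = 550        # per the nutrition figures cited above
MAINTENANCE_KCAL = 2200   # assumed daily maintenance budget for an average adult

daily_intake = 3 * BIG_MAC_KCAL              # eating three Big Macs a day
deficit = MAINTENANCE_KCAL - daily_intake    # positive number = calorie deficit
print(f"Daily intake: {daily_intake} kcal, deficit: {deficit} kcal")
# A sustained deficit means weight loss, no matter how "unnatural" the menu.
```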

Writing at Science-Based Medicine, Harriet Hall dismantles the notion of a natural human diet:

Arguments that we should eat what we evolved to eat are undercut by three facts: humans have continued to evolve since the Paleolithic; humans evolved to be adaptable and to thrive on a wide variety of dietary intakes; and we evolved to have the survival advantage of intelligence and inventiveness to develop technology to improve our access to food (for instance, cooking). In other words, technology is “natural” for humans, and eating in a variety of ways is natural for humans too.

Imagine if humans weren't naturally flexible eaters.

"If our ancestors had been less adaptable and if there was a single healthy diet, humans could not have spread to new continents or survived the climate changes of the Ice Age," Hall notes.

Humans' natural ability to thrive on almost any diet is a key to our species' success. So go ahead. Have that Big Mac.

Changing the World, One Tweet at a Time

It is easy for journalists to succumb to the notion that we make very little difference in this world. According to Pew, only 28% of Americans believe journalists contribute "a lot" to society, while 27% believe journalists contribute "not very much" or "nothing at all." And a Gallup poll showed that only about 20% of Americans believe that reporters are honest and ethical. On the bright side, at least we beat out car salesmen (9%), Congressmen (8%), and lobbyists (6%). Huzzah!

Many young people go into journalism because they "want to change the world," but by that, they often mean engaging in advocacy journalism. I believe that is the wrong approach. Too much of it, in my opinion, comes across as biased and even dishonest, far more akin to propaganda than to journalism. That is because advocacy journalists -- by definition -- are wedded to an ideological worldview and, hence, are not particularly amenable to changing their opinions if the facts change. Instead, they resort to cramming and distorting evidence to fit their worldview.

We manifestly reject that sort of journalism at RealClearScience. Like most journalists, we also want to change the world, but we do not want to do so by embracing political or social causes. Instead, we just want to tell the truth to the best of our ability. And when the facts change, our opinions change. Those are two of the guiding principles upon which this website is based.

A logical corollary to those principles is that we should hold influential people accountable for their actions and words. When we come across scientific misinformation, we believe it is our duty to correct the record. We believe the traditional "watchdog" role of journalism is important for keeping the public properly informed and maintaining a healthy democracy.

To that end, Todd Myers -- the environmental director of the Washington Policy Center -- recently brought to my attention a curious tweet posted by the "Ecoconsumer," a King County (Seattle and vicinity) employee whose job is to promote various environmental causes. (No, I don't think that's a legitimate use of taxpayer money, but that's a subject for another day.)

His tweet, since removed (but forever memorialized here), advertised a new "documentary" about how cell phones cause cancer. Loyal readers of RealClearScience, as well as most reasonably educated people, know that is complete rubbish. Because cell phones operate using microwaves, it is not physically possible for them to cause cancer. As Michael Shermer explains, microwaves do not have enough energy to break chemical bonds, which is a requirement for something to be carcinogenic. Besides, we are constantly bathed in electromagnetic radiation -- from the sun, wi-fi devices, laptop computers, radio broadcasts, etc. If all electromagnetic radiation caused cancer, none of us would be alive.
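To put rough numbers behind that argument, here is a back-of-the-envelope comparison of a microwave photon's energy with a typical chemical bond. The 2.4 GHz frequency and the ~3.6 eV carbon-carbon bond energy are generic textbook values I'm supplying for illustration; they don't come from Shermer's piece.

```python
# Photon energy vs. bond energy: a rough sketch, not a rigorous dosimetry argument.
# The frequency and bond energy below are illustrative textbook values.
PLANCK_H = 6.626e-34        # Planck's constant, J*s
JOULES_PER_EV = 1.602e-19   # joules per electron-volt

freq_hz = 2.4e9                                  # a typical cell-phone/Wi-Fi band frequency
photon_ev = PLANCK_H * freq_hz / JOULES_PER_EV   # energy of one microwave photon, in eV
bond_ev = 3.6                                    # approximate carbon-carbon bond energy, in eV

print(f"Microwave photon: ~{photon_ev:.1e} eV")            # roughly 1e-5 eV
print(f"Typical C-C bond: ~{bond_ev} eV")
print(f"Photon falls short by a factor of ~{bond_ev / photon_ev:,.0f}")
```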

Disturbed that a government employee would post such nonsense, I contacted the Ecoconsumer. He sounded rather flustered by my phone call, so I reached out to his superior. Within mere moments following a brief conversation, in which he agreed with me that King County should be disseminating accurate information, the offending tweet was removed.

I must admit that I was pleasantly surprised. Encounters with bureaucrats rarely end so quickly and favorably. I felt that, for the first time, my job as "watchdog" actually paid off. And though removing a single post from Twitter hardly constitutes an Earth-shaking victory, maybe it is quite possible for journalists to make a difference in this world -- albeit, for some of us, just one tweet at a time.

(Photo: Make a Difference via Shutterstock)

The NFL Has a Lower Rate of Domestic Violence Than the General Population

This year, three National Football League players -- Adrian Peterson, Ray Rice, and Greg Hardy -- have either admitted to or been convicted of domestic violence. Their stories coalesced into a storm this past week with the release of a damning new video of Ray Rice punching his wife (then fiancée) and the indictment of Adrian Peterson, arguably the NFL's best running back, for child abuse.

The media onslaught of updates, analysis, and opinion on what has been called the National Football League's "worst week ever" leaves a distinct impression: the NFL is a league stocked full of criminals.

Evidence, however, doesn't bear that out.

Back in 1999, leading criminologist Alfred Blumstein teamed up with author Jeff Benedict, who has written five books focused on crime and athletics, to compare rates of criminal violence among NFL players to those of the general population. Controlling for age, they found that the annual rate of assault and domestic violence among NFL players was less than half that of the general population.

But Blumstein and Benedict's analysis is fifteen years old. Perhaps things have changed in that time?

It doesn't appear they have. Back in July, FiveThirtyEight's Benjamin Morris tallied up the incidents in USA Today's NFL Arrests Database to discern crime rates among NFL players. He then compared those numbers to the national averages among 25-29 year olds, and found the rate of domestic violence in the NFL to be 55.4% that of the general population. And the overall crime rate was a mere 13% of the national average.

So why then do 69% of Americans believe that the NFL suffers a "widespread epidemic of domestic violence problems"? The answer is rooted in how we think. Humans are prone to rely on examples and experiences that can be easily recalled. The idea is that if we can remember it, it must be important. This mental shortcut is termed the availability heuristic. A key drawback of the heuristic is that it leads us to overestimate the prevalence of memorable events. Here, you can legitimately blame popular media. Because plane crashes are widely covered, many erroneously view flying as more dangerous than driving. Thanks to Shark Week, people are warier of sharks than deer. Because 91% of people have seen, read, or heard something about Ray Rice's domestic violence, they overestimate the problem of domestic violence in the NFL.

That's not to say that domestic violence isn't a problem in the NFL. By type of crime, domestic violence is the closest the NFL comes to the national average. Moreover, Morris noted that NFL players do seem to commit acts of domestic violence at a higher rate than individuals with a similar socioeconomic status, though a direct comparison wasn't available.

As public figures, football players must hold themselves to higher standards, and be punished appropriately when they fail to meet them. But more fundamentally, as human beings, they need to recognize that unprovoked violence against others, particularly those not able to defend themselves, is utterly reprehensible.

(Image: AP)

Should Macho Men Shave Their Legs?

Every man comfortable enough with his masculinity to squeeze into performance-enhancing lycra athletic body wear has sooner or later confronted the next frontier: The question of whether to shave his legs!

Aside from perhaps staunch feminists, female athletes don't face this social conundrum. Among male athletes, the otherwise socially uncool body-smoothing "manscaping" has long been a tradition. Cyclists will tell you that it might improve recovery from road rash or make leg massages better. They'll also admit that it's fashion. Tough, big-deal bike riders do it. Nobody races Le Tour de France with hairy legs. Male shaving is a form of machismo.

It also has negligible performance benefits, or so the conventional wisdom goes. Now, new data from men on bikes in wind tunnels contradicts this view. Bicyclists were measured to move more quickly with shaved legs! In theory this makes sense. Generally, smooth surfaces are more aerodynamic than rough or uneven ones.

Aerodynamics is so important to cyclists because the practical limit of their speed is not their muscle power, but the aerodynamic drag of their ride: bike and body. Terminal velocity on level ground (on a properly geared ride) is determined by how cleanly the forward-facing shapes cut into the wind. The more carefully a surface cleaves oncoming air into parts without disturbing it into a chaotic turbulent mess, the faster it goes.

An everyday pleasure rider may hit 15 mph on a brisk ride, a commuter may cruise at 16-18, and a professional racer can hold speeds in the mid 20s. A rider on a bike with an extremely aerodynamic fairing like the nose of a rocket can reach speeds of more than 80 mph!

"Aero" has become a huge buzzword and selling point in the cycling industry. Most competitive races have actually banned certain bike designs for being too fast. Within a limited bicycle geometry range, the next gains to be made are those from the other half of the aerodynamics of the system: the rider himself. Riders often employ a hunched position, with the arms out and the head tucked down, to reduce aerodynamic profile. They may smooth even their natural body profiles with seamless skinsuits.

Here's where the hairy legs come in. Smooth legs should be slightly more aerodynamic than hairy or, heaven forbid, "stubbly" legs, right?

Previous tests said no, there was no measurable effect. Leg-shaving is just machismo. This new test says otherwise. A cyclist going into the wind tunnel for aero testing at the bike manufacturer Specialized forgot to shave his legs first. His test showed significantly higher drag. Surprised, he came back days later with legs as smooth as a baby's cheek, and picked up a 7% gain in aerodynamic slipperiness!

It was a repeatable result, too. Several more cyclists tested in the same wind tunnel gained a similar aerodynamic advantage. Seven percent doesn't sound massive, but it can mean more than a minute saved in a one-hour race against the clock. That is a huge competitive advantage. A similar gain in aerodynamic profile might require hundreds of dollars of specialized bike parts.
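That time estimate checks out with a quick calculation. The sketch below assumes aerodynamic drag dominates and the rider holds constant power, so power scales with drag area times speed cubed; both assumptions are mine, not Specialized's published protocol.

```python
# Back-of-the-envelope check of "7% less drag ~= a minute or more per hour".
# Assumes drag force ~ v**2 and constant rider power, so power ~ CdA * v**3.
drag_reduction = 0.07
speed_gain = (1 / (1 - drag_reduction)) ** (1 / 3) - 1   # about 2.4% faster

old_time_min = 60.0                              # a one-hour time trial at the old speed
new_time_min = old_time_min / (1 + speed_gain)   # same distance at the new speed
print(f"Speed gain: {speed_gain:.1%}")
print(f"Time saved: {old_time_min - new_time_min:.1f} minutes")   # roughly 1.4 minutes
```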

These results fly in the face of the last serious study of the subject in the 1980s. The big lesson: verifying previous results is really important. Also, men need to get to work with those razors. I'll no longer be making fun of you for your "macho fashion."

Can You Eat Via Your Anus?

It began in Ancient Egypt, was used to sustain an ailing president, and has been popularized on South Park. But is it true? Can you eat via your anus?

If eating were merely a matter of stuffing something in a hole, whether up top or down below, the answer would be "yes." But while insertion is certainly a key part of the eating process, it's only the equivalent of walking through the door of a room; what happens inside is what matters most.

Let's first follow the path of a piece of pizza consumed the good old-fashioned way: through the mouth. Once chewed and swallowed, the food travels down the esophagus and into the stomach, where gastric juices begin to break down proteins. After roughly one or two hours, a valve opens and the pizza -- now liquefied and unrecognizable from its once delicious form -- continues its journey, first through the duodenum -- basically the waiting room for the small intestine -- and eventually into the small intestine itself. Here is where the digestive heavy lifting occurs. Roughly 95% of all nutrient absorption occurs within the small intestine. The pizza's next stop is the large intestine -- or colon -- where water, minerals, and some vitamins are taken in. Lastly, what remains of the pizza arrives at the rectum, to be excreted at your convenience.

As recently as 1926, the medical community believed that process could be reversed to an extent. Doctors regarded rectal feeding as a legitimate method for sustaining patients unable to eat normally. It made some sense. After all, the rectum is much closer to the small intestine than the mouth is. Who's to say that food won't wind its way upwards?

Well, as research would elucidate, the digestive system is not a two-way street. Scientists examining cadavers found that the small intestine was unreachable from below. A 1926 study using live medical students as guinea pigs was even more conclusive. The researchers found that nutrient enemas were only good for hydration. Food would enter the colon via the rectum. Sit around for a bit. Then come right back out... smelling a whole lot worse.

In time, tube and intravenous feeding supplanted rectal feeding, no doubt to the cheers of patients, caregivers, and "guinea pig" medical students alike.

(Image: Shutterstock)

Source: Gulp: Adventures on the Alimentary Canal, Mary Roach, 2013

Debunking the Anti-Fracking Fearmongers

World events have made it quite clear to most Americans that we should develop more of our own energy sources. Reducing our reliance on foreign oil by exploiting the natural gas under our feet is not only smart foreign policy but also smart environmental policy: Natural gas burns cleaner than coal or oil, and it has already lowered our CO2 emissions. Natural gas is a win for America and the planet.

But not according to anti-technology environmentalists, who have made all sorts of wild, unsubstantiated claims about the supposed harms of fracking. Three claims in particular are worth examining: (1) Fracking causes a dangerous leakage of methane into drinking water; (2) Fracking causes earthquakes; and (3) Fracking chemicals contaminate drinking water.

Claim #1 should be considered thoroughly debunked. The "documentary" Gasland, which depicted a guy lighting his tap water on fire, kickstarted the anti-fracking movement. The infamous scene, however, was built upon a lie: The methane in his tap water was due either to natural methane migration or to faulty well casings, not to fracking itself. And methane is neither toxic nor likely to cause your house to explode, so the note above the faucet, which read, "Do not drink this water," was nothing more than theatrics.

Even if basic chemistry and physics do not constitute sufficient evidence against Claim #1, then a new study in the journal PNAS should provide the final nail in the coffin. The researchers closely examined eight instances of drinking water contamination associated with the Marcellus and Barnett Shales. Their analysis reconfirmed the emerging consensus: Fracking itself does not cause methane to contaminate groundwater, but shoddy construction work can. Specifically, the researchers blamed leaky annulus cement and production casings.

Claim #2, that fracking causes earthquakes, is also misleading. Anti-fracking activists, including Rachel Maddow, have ignored research that suggests a nearby existing fault is necessary for fracking to trigger an earthquake. And as Bryan Walsh reported in TIME, the earthquakes are relatively minor and caused not by fracking itself but by the wastewater injection wells. (It should also be noted that injection wells are used for purposes other than disposing of fracking wastewater, and those wells can trigger earthquakes too.)

Claim #3, that fracking contaminates drinking water with various chemicals, is the only one that might have legs. The EPA detected carcinogenic benzene in Wyoming groundwater, and other researchers found arsenic in Texas groundwater.

If it is true that fracking is responsible for various chemicals leaking into groundwater, then the next step should be to determine if the pollutants are at unsafe levels. If they are, then the government should tighten regulations. Alternatively, Mr. Walsh suggested that companies "work on ways to clean, recycle and reuse wastewater from wells, eliminating the need for the deep injection wells." That's a good idea. It would prevent both minor earthquakes and groundwater contamination.

The EPA is set to publish a comprehensive report on fracking, but it has been delayed until 2016. Until then, there will probably be a lot more fearmongering in need of nuance.

Source: Thomas H. Darrah, Avner Vengosh, Robert B. Jackson, Nathaniel R. Warner, and Robert J. Poreda. "Noble gases identify the mechanisms of fugitive gas contamination in drinking water wells overlying the Marcellus and Barnett Shales." PNAS. Published online before print: 15-Sept-2014. doi: 10.1073/pnas.1322107111

Huffing Xenon: A New Cheat for Athletes

As long as money, fame and love are to be won, professional athletes will continue to swallow, snort, shoot up, huff and "suppositorize" strange chemicals. Word of the benefits of gulping down rare xenon gas has been out for years now. How does it work, and why do they do it?

First, a brief history of the pervasive nature of chemically enhanced cheating in sports. Let's start with the most grueling competition: Le Tour de France.

Cyclists in the first half of the 20th Century eased their pain and boosted their stamina with numerous harsh substances: Alcohol, rags soaked in ether, strychnine and even cocaine. After World War II, amphetamines became prevalent in the sport until heart attack deaths during competition, such as Tom Simpson's death on the side of the road on a famous mountain in the 1967 Tour, brought about bans and tests. Around this time, injected male hormones and steroids also began to be used. The 1980s and 1990s saw the rise of blood doping. Impossible performances were recorded and riders died. Men in their late 20s weighing less than 150 pounds had heart attacks in their sleep. Lance Armstrong, of course, was the most infamous of this generation of cheaters.

New technologies such as better synthetic hormones and the Pandora's box of genetic doping are now becoming possible. Cheaters always win for a while -- that is, until rule-makers catch on and athletes are pushed to find something new.

Most other professional sports have a similar legacy: An evolution of drug-taking to improve performance, counterattacked by rules to thwart it. Often a chemical is taken for many years before authorities even devise methods to test for it, let alone become aware of it. As this pharmaceutical arms race continues, the xenon gas technique has emerged. It relies upon the same fundamental biology as many other performance-enhancing drugs: It boosts oxygen levels in the bloodstream.

The most obvious method to pack more oxygen into blood is to directly inject concentrated red blood cells (called blood doping). This method was used with great success in the 1980s and 1990s before tests were implemented to sniff out its telltale tracks. Populations of red blood cells in one body with different surface antigens (protein "fingerprints") indicate transfusion from one person to another. Reinjecting an athlete's own blood -- drawn earlier and stored, then returned to the body before competition to boost its oxygen-carrying capacity -- is harder to detect chemically. Careful recording of an athlete's blood profile over time can catch these activities when certain blood indicators spike abnormally (or reach levels naturally impossible for an adult human being). It makes cheating more difficult, but there are workarounds.

Dopers responded by taking medical research into a dark alley. They learned to inject the body with substances that cause the body's own natural red cell production to skyrocket. These chemicals, primarily erythropoietin (EPO), do not introduce new blood cells directly, so they must be found by direct detection of the chemical or of other agents used to mask its presence. EPO is a hormone naturally produced (chiefly by the kidneys) to regulate red blood cell production in the bone marrow, but testing can distinguish the natural chemical from the synthetic version commonly injected. Using EPO and similar chemicals is now harder to get away with, and some athletes rely on microdosing, which reduces effectiveness.

This is where the huffing of gas comes in. Recent medical research has shown that breathing concentrated xenon, argon and possibly other noble gases (those on the far right column of the periodic table) triggers production of natural EPO in the body. These studies originally looked at xenon as a well-known anesthetic or as a treatment to alleviate lack of oxygenated blood flow to tissue, such as a kidney after injury.

The dark science of sports doping keeps very close tabs on medicinal research. As soon as word got out that these benefits were the result of enhanced natural production of EPO, the technique began to appear in athletic competitions. As well as being too new to be understood and banned, it was easier to get away with: It is not uncommon to see athletes inhaling oxygen through a gas mask apparatus on the sidelines. Breathing in a mixture of oxygen and xenon acts almost immediately to trigger an increase in red blood cell production. As explained in this wonderfully thorough article, Russian companies that produce inhalable oxygen/xenon mixtures claim that the effects begin in minutes and last up to three days.

Last week, intentional inhalation of all gases that increase EPO production was officially banned by the World Anti-Doping Agency (WADA), the biggest anti-doping authority in sports. However, there's no test yet. Catching an athlete in the act is difficult, but governing bodies say that they will be able to detect it soon.

How they will do this is not easy to guess. Xenon is very quickly eliminated from the blood; within a minute, half of it is gone. In an hour, the concentration in the bloodstream is probably indistinguishable from what is to be expected from breathing atmospheric xenon. Detection may rely upon finding a side effect caused by inhalation of unusual gas mixtures.
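A quick half-life calculation shows just how little would be left to find. This is a simple sketch assuming straightforward exponential elimination with the roughly one-minute half-life quoted above.

```python
# Exponential elimination sketch: xenon remaining after an hour,
# assuming a ~1-minute blood half-life as described above.
half_life_min = 1.0
elapsed_min = 60.0

fraction_left = 0.5 ** (elapsed_min / half_life_min)
print(f"Fraction remaining after {elapsed_min:.0f} minutes: {fraction_left:.1e}")
# About 9e-19 of the original dose -- effectively nothing to measure directly.
```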

Will we see athletes busted for smuggling tanks of inert gas? Worse, will there be adverse medical effects discovered down the line?

Many athletes probably don't care. They just want to win. Cheaters gonna cheat.

(AP Photo)

Science: It's Okay to Feel Stupid

In 2008, University of Virginia microbiologist Martin Schwartz recalled a meeting with an old friend, one who had been a Ph.D. student with him and had left to attend Harvard Law School instead. At one point during their meeting, he asked why she had dropped out.

"She said it was because it made her feel stupid. After a couple of years of feeling stupid every day, she was ready to do something else."

Schwartz was astonished at the answer.

"I had thought of her as one of the brightest people I knew and her subsequent career supports that view," he wrote.

Schwartz pondered on what his good friend had told him.

"What she said bothered me. I kept thinking about it; sometime the next day, it hit me. Science makes me feel stupid too. It's just that I've gotten used to it. So used to it, in fact, that I actively seek out new opportunities to feel stupid. I wouldn't know what to do without that feeling. I even think it's supposed to be this way."

Science humbles even the most brilliant people, bringing them to their intellectual knees. Such is the nature of an enterprise that delves into the unknown.

Schwartz' meeting with his friend inspired an essay, "The importance of stupidity in scientific research," published in 2008 in the Journal of Cell Science. In it, he argued not only that it's okay to feel stupid, but that it's a necessity.

He began his explanation with a simple and true statement.

"For almost all of us, one of the reasons that we liked science in high school and college is that we were good at it."

But unfortunately, that leaves aspiring scientists with a misleading impression. Because, as most established scientists know, science is not about taking tests or getting correct answers! Even the laboratory work most students perform in high school and college is structured to reach a predetermined end. In research, the conclusion is never known at the outset. Researchers may have a strong inkling of what might happen, but they don't know for certain.

When aspiring scientists reach graduate school and doctoral programs, being correct is no longer the goal. The goal is solving problems. It's not the same.

"A Ph.D., in which you have to do a research project, is a whole different thing," Schwartz wrote. "For me, it was a daunting task. How could I possibly frame the questions that would lead to significant discoveries; design and interpret an experiment so that the conclusions were absolutely convincing; foresee difficulties and see ways around them, or, failing that, solve them when they occurred?"

Schwartz' personal breakthrough came when he realized that nobody, not even the advisors he looked up to, had the answers to his problem.

"The crucial lesson was that the scope of things I didn't know wasn't merely vast; it was, for all practical purposes, infinite. That realization, instead of being discouraging, was liberating. If our ignorance is infinite, the only possible course of action is to muddle through as best we can."

Muddling earned Schwartz his Ph.D, as it has for countless other students. In fact, muddling is simply what researchers do. Science is like wading through a swamp only to reach a vast unexplored ocean.

"Science involves confronting our `absolute stupidity'. That kind of stupidity is an existential fact, inherent in our efforts to push our way into the unknown," Schwartz wrote.

He believes scientists should embrace that stupidity.

"One of the beautiful things about science is that it allows us to bumble along, getting it wrong time after time, and feel perfectly fine as long as we learn something each time. No doubt, this can be difficult for students who are accustomed to getting the answers right. No doubt, reasonable levels of confidence and emotional resilience help, but I think scientific education might do more to ease what is a very big transition: from learning what other people once discovered to making your own discoveries. The more comfortable we become with being stupid, the deeper we will wade into the unknown and the more likely we are to make big discoveries."

In the six years since it was published, Schwartz' essay has become a source of solace for despairing doctoral students, a reminder that feeling lost is a sign you're on the right course.

(Image: Shutterstock)

Is Anything Certain in Science?

Last week, I was at a coffee shop working when a lady approached me and invited me to attend a science discussion group. The topic was the "limits of science." Intrigued, I put away my laptop and joined the group, which consisted mainly of elderly people who were thoughtful, well-spoken, and seemingly intelligent. I had no idea what to expect in terms of the tone of the conversation, so I listened eagerly as the discussion leader (who has a master's degree in geology) started the meeting.

"Science is subjective, though we like to think of it as objective," he began. "When I speak of 'facts,' I put them in quotation marks." He elaborated that things we once thought to be true were later overturned by further study.

Right away, I knew I was going to be in for a ride. While the geologist didn't clarify exactly what he meant, we can deduce one of two things: Either (1) he does not believe facts are real or (2) he believes facts are not accessible to scientific investigation.

Both of these beliefs are problematic from a scientific viewpoint. The first implies that there is no such thing as a fact, and hence, no such thing as truth. My favorite philosophy professor, former mentor, and (I'm honored to say) friend, Robert Hahn of Southern Illinois University, once quipped, "If the ultimate truth about the universe is that there is no truth, what sort of truth is that?" I would add that if there is no such thing as truth, then science is merely chasing after the wind. Science would be pointless. As fictitious Tottenham Hotspur coach Ted Lasso would say, "Why do you even do this?"

The second belief poses a much bigger challenge to science because there is no convincing response to it. Philosopher Immanuel Kant wrote of the noumenon (actual reality) and the phenomenon (our experience of reality). Because we experience reality through our imperfect senses, we do not have direct access to it. For instance, we perceive plants as green, but that is simply the result of our eyes and brains processing photons and interpreting them as the color green. How do we know that perception is reliable? Isn't it possible that plants are actually some other color? Given that we are limited by our sensory capabilities, we can never know the answer to that question. Our experience of the greenness of a plant (phenomenon) is separate from the underlying reality of a plant's color (noumenon).

Humans in general, and scientists specifically, ignore this philosophical challenge. We assume that our perception of reality matches actual reality. Do we have any other option? How could we live daily life or accept the findings of scientific research if we believed otherwise?

The point of that lengthy aside is that the geologist's comment was at odds with a practical scientific worldview. But, things got even weirder after that.

When our conversation turned to the reliability of the scientific method, I commented, "Scientific laws are generalized in such a way that if you perform an experiment like a chemical reaction on Earth or on Mars, you should get the same result."

One of the ladies asked, "But how do we know? We've never been to Mars."

I answered, "We have a basic understanding of how chemical reactions work. To our knowledge, they aren't affected by gravity.* So, we should get the same reaction on Mars."

"In theory."

Well, yes, in theory. But this sort of extreme skepticism is difficult to address. Chemistry is a mature science whose basic principles are well understood. Until we have sufficient reason to believe otherwise, we should expect chemical reactions to be identical whether they are performed on Earth or on Mars.

Strangely, a bit later on, the same skeptical lady asked me, "How do you explain telepathy?" She added that there have been times when, as she was speaking to another person, she knew what the other person was going to say before she said it.

"Scientists don't believe telepathy is real. That's how I explain telepathy," I responded.

"Some scientists do believe in it," retorted the geologist.

True. But, some scientists believe that HIV doesn't cause AIDS. That doesn't mean we should take them seriously. I decided to elaborate: "Think of all the times that you thought of words, but nobody said them. Or all the times you thought of somebody, but they didn't call. You forget all of those, but you remember the few times where a coincidence occurred. That's called confirmation bias."

Unsurprisingly, I didn't win her over. The conversation then took one final turn.

The skeptical lady believed the future would be run entirely by robots and machines. This is referred to as the "singularity" and has been popularized by Ray Kurzweil. It is also probably bunk. Not only are we unable to model a worm's brain accurately, but the scientific knowledge and sheer computing power necessary to properly replicate a human brain -- with its 86 billion neurons and some 100 trillion synapses -- are massive. Besides, there is no compelling reason to believe that computing power will grow exponentially forever. Eventually, some mundane physical factor will limit our technological progress. If (and that's a big if) the "singularity" is even possible, it is likely centuries away.

Our evening ended there. Over the next 24 hours, I pondered what could make otherwise intelligent people embrace pseudoscience and science fiction. Moreover, what could make a person doubtful of chemistry, but accepting of telepathy?

I'm still not sure, but I have a clue. Conspiracy theorists are known to believe contradictory ideas. For instance, as Live Science reported, "people who believed [Osama] bin Laden was already dead before the raid were more likely to believe he is still alive." Similarly, the lady who believed that science wasn't advanced enough to fully understand chemistry yet also somehow so advanced that it could build Earth-conquering robots may be engaging in conspiracy-like thinking. She had no awareness that her skepticism of chemistry and credulity toward telepathy were, in many ways, fundamentally incompatible.

Extreme skepticism and extreme credulity are anathema to the scientific mindset. Successful scientists accept the reliability of the scientific method but question extraordinary claims that are not founded upon extraordinary evidence. That is healthy skepticism, and it was curiously absent from the science discussion group.

*Note: The kinetics of chemical reactions could possibly vary under different gravitational conditions. See an interesting discussion here.

(AP Photo)

The Gluten of the 1900s

Since the rise of both modern medicine and society, a large subset of the Western World's population has required a scapegoat to explain their everyday ills. Today, it's gluten. A decade ago, it was monosodium glutamate (MSG). One hundred years ago, it was poop.

Yes, poop. But I'm not talking about the occasional dog doo along the side of the road. (Though in the early 1900s, there was plenty of horse manure to go around.) I'm referring to the feces stored inside you, within the wondrous trash receptacle that is the colon.

Thousands of years ago, the ancient Egyptians were affronted by the idea that, at any given time, feces were inside their bodies. If it was so nasty coming out, surely it must be equally nasty roiling about within! They reasoned that putrefying poop releases toxins that leach into the circulatory system, causing fever, creating pus, and making people sick.

We can excuse the ancient Egyptians for their naiveté, but it's harder to go easy on physicians of the early 1900s. With a shiny new name -- autointoxication -- and just a few preliminary studies to back it, the theory became widespread. Admittedly, it must have been difficult to go against Ilya Ilyich Mechnikov, who won the 1908 Nobel Prize in Medicine for his work on phagocytosis, the process in which a cell -- often a white blood cell -- engulfs an invading particle or microbe. Mechnikov argued that intestinal toxins shorten lifespan, and that lactic acid could break them down. That's why he drank sour milk every day.

Sir William Arbuthnot Lane was also an influential proponent of autointoxication. But it was his wild overreaction to the theory that eventually helped reveal it as pseudoscience. Lane advocated irrigating, and sometimes even removing, the colon entirely to treat conditions ranging from general fatigue to epilepsy. Needless to say, both approaches did far more harm than good. Colon removal was particularly ill-advised. After all, if you eliminate an integral part of a sewage treatment plant, pretty soon you'll find $%#@ everywhere. When Walter Alvarez publicly pointed out, in the Journal of the American Medical Association in 1919, the sheer lunacy of demonizing a vital bodily organ, physicians everywhere finally shook off their fecal infatuation.

"Autointoxication was one of the most pervasive and enduring concepts in the long, bloated history of medical pseudoscience," Mary Roach wrote in Gulp. "It made no difference that neither the specific poisons nor the mechanisms by which they might be causing harm were known or named. In the realm of quackery, vague is better."

"It met a need," wrote James Whorton, a historian of science at the University of Washington, "that medicine has felt in every age, providing an explanation and diagnosis for all those exasperating patients who insist they are sick but are unable to present the physician with any clear organic pathology to prove it."

"Autointoxication was the gluten of the early 1900s," Roach commented.

Today, autointoxication lives on in the form of fruitless cleanse diets and enemas of all sorts. The lingering stench of pseudoscience never fully dissipates, especially when it comes to bull@#$%.

Source: Gulp: Adventures on the Alimentary Canal, Mary Roach, 2013

(Image: prostok / Shutterstock.com )

Why It's So Hard to Swap the Toilet Paper Roll

Will Reid was getting tired of asking his teenage children to change the toilet paper roll, so he did what any enlightened father would do: call them out on YouTube. His "instructional video" on toilet roll changing has amassed over one million views since being posted on August 29th.

Of course, laziness isn't unique to teenagers. Countless others willingly neglect to empty the overflowing trash bin, clean the leaning tower of dirty dishes, or replace the toilet paper roll.

Why do millions of Americans persistently put off easy chores like these? The answer is tied to motivation, or rather, a lack thereof.

Richard Ryan and Edward Deci, a prolific duo of psychologists based at the University of Rochester in New York, are the preeminent researchers on the science of motivation. They've narrowed down the basis of human action to two main drivers: intrinsic and extrinsic. We either perform an activity out of interest or enjoyment for the activity itself -- intrinsic -- or we perform an activity to attain an external, separate outcome -- extrinsic. Intrinsic activities are inherently motivating. Extrinsic ones are not.

After outlining these two categories, it's already easy to see why menial chores are often ignored. They're not stimulating in the slightest, so they certainly aren't intrinsically motivated. But as extrinsic activities, they lack enticing outcomes. For example, if you take out the trash, you're rewarded with an empty bin and a guarantee that you'll have to repeat the chore in a couple of days. Totally lame.

According to Ryan, there also seems to be an inherent "control issue."

"One side is pressuring and demanding—the other (procrastinator) side is either unmotivated or rebelling," he explained in an email.

Ryan and Deci break down motivation as part of a framework called Self-determination Theory. For humans to really want to do something, they say, the task must satisfy three psychological needs: competence, autonomy, and relatedness. It must be hard enough to make us feel like we're accomplishing something and challenging ourselves: competence. Replacing the toilet paper roll comes up short here. The task must also grant some degree of freedom, like we're not being controlled: autonomy. If doing the dishes isn't a form of bondage, I don't know what is. Without clean pots, pans, dishes, or utensils, the task of feeding oneself can seem insurmountable. We've become slaves to modern modes of cooking and eating. Lastly, the task should at least partially sate our desires to feel that we belong to something grander than ourselves and that we are connected to others: relatedness. In co-op living arrangements, chores fulfill this psychological need. But in typical households, there's a fundamental disconnect. It's often every roommate or family member for themselves.

Enhancing feelings of competence, autonomy, and relatedness surrounding a boring task has been shown to significantly boost motivation without altering the task itself. In 1994, Deci completed an experiment in which subjects were seated in front of a computer and told to press the spacebar whenever a dot of light randomly appeared on the screen. One group simply completed the task with sufficient, but minimal, instruction, while another group performed it after having it presented in a slightly different manner.

"Doing this activity has been shown to be useful," the researchers told the participants. "We have found that those subjects who have done it have learned about their own concentration."

Researchers also acknowledged the subjects' dislike for the task. "I know that doing this is not much fun; in fact many subjects have told me that it's pretty boring."

Lastly, the wording in the task description was changed to make the subjects feel less compelled to take part, a subtle hint that they were free individuals who could walk away at any time.

The participants in the latter group reported feeling happier with the task, as well as more motivated to complete it.

Can Self-Determination Theory be put to use where menial housework is concerned? It probably won't work if you experiment on yourself, but you can certainly try it out on your indolent roommate or your neglectful kids! Tell them that taking out the trash builds character and competence, and that loading the dishwasher is an exercise in problem solving (How can I arrange the dishes most efficiently?). Impress upon them how important it is to you and the rest of the family that everyone pitch in, instilling a sense of relatedness. And lastly, grant some autonomy in how and when they perform the chores.

If you're the guilty one, Ryan also offered a blunter tidbit of motivation.

"Remember why taking out the trash is worthwhile."

(Image: Shutterstock)

Are You a Crackpot? Take the Test!

In 1992, UC-Riverside mathematician and physicist John Baez was overloaded, not with his day-to-day activities, but with emails from people touting "revolutionary ideas" that required his learned fine-tuning. This would have been fine, had the ideas at least had a foundation in reality. Sadly, almost none of them were in accordance with recognized laws of nature.

In response, Baez created The Crackpot Index: "A simple method for rating potentially revolutionary contributions to physics." The index comprises 36 items tailored to determine whether an idea and the person behind it are brilliant or daffy. If your score is low, you might have something. But as it starts inching up, you might want to consider donning a hat made from aluminum foil and reconsidering your perception of reality.

Here are a few of the items (a toy scoring sketch follows the list):

1 point for every statement that is widely agreed on to be false.

5 points for using a thought experiment that contradicts the results of a widely accepted real experiment.

10 points for each new term you invent and use without properly defining it.

20 points for talking about how great your theory is, but never actually explaining it.

40 points for comparing those who argue against your ideas to Nazis, stormtroopers, or brownshirts.
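To make the scoring mechanics concrete, here is a toy sketch of how an index like this tallies up. It uses only the handful of items quoted above plus the index's -5-point starting credit; it is an illustration, not Baez's full list.

```python
# Toy version of a crackpot-style index: each matched item adds its point value.
# Only the items quoted above are included; this is not Baez's complete index.
ITEMS = {
    "statement widely agreed to be false": 1,
    "thought experiment contradicting a real experiment": 5,
    "new term used without definition": 10,
    "praising the theory without explaining it": 20,
    "comparing critics to Nazis": 40,
}

def crackpot_score(counts: dict[str, int]) -> int:
    """Sum points for each matched item, starting from the index's -5 credit."""
    return -5 + sum(ITEMS[item] * n for item, n in counts.items())

# Example: two undefined terms and one Nazi comparison.
print(crackpot_score({"new term used without definition": 2,
                      "comparing critics to Nazis": 1}))   # -5 + 20 + 40 = 55
```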

Now, let's put The Crackpot Index to use. Andrea Rossi and Sergio Focardi's cold fusion Energy Catalyzer (E-Cat) should do nicely. A brief visit to E-Cat's website provides a number of examples:

1 point for every statement that is widely agreed on to be false.

Too many to count. Rossi and Focardi's international patent application for the E-Cat was judged to "offend against the generally accepted laws of physics and established theories."

10 points for offering prize money to anyone who proves and/or finds any flaws in your theory.

"So Rossi arranged a challenge for Prof. Focardi, telling him 'I will give you a prize (size non-disclosed) if you can show me that what I have done is wrong and does not work.'"

20 points for suggesting that you deserve a Nobel Prize.

"I believe -- forgive me if I say it -- that this is the greatest discovery in human history. So let's say that if they were to award us the Nobel Prize, I think it would be well deserved."

50 points for claiming you have a revolutionary theory but giving no concrete testable predictions.

"Rossi knew he was on to something big, something so powerful it could change the world forever." Yet Rossi repeatedly conducts misleading, "black box" demonstrations not giving full access to independent reviewers.

Despite crafting the index, Baez is very empathetic to crackpots and cranks. As he told This American Life in 2005:

"I think they do it because they really want to understand the universe and they have very noble albeit grandiose motivations trying to do what us regular physicists are also trying to do... And I think what distinguishes them from physicists who can make a useful contribution is that they don't want to be somebody whose epitaph says they tightened the screws on a particle accelerator that made a great experiment, they want to be Einstein. And most of us can't be Einstein."

(Image: Shutterstock)

Time to Stop Testing Magic in Medicine

Why waste time and money testing medical treatments that defy the laws of physics and chemistry?

That's the pointed question posed by Drs. David Gorski and Steven Novella in a new op-ed published in the journal Trends in Molecular Medicine. To most, the answer is obvious: we shouldn't. But in the past decade, alternative medicines without any basis in science, like acupuncture, homeopathy, and chiropractic, have received hundreds of millions of dollars from the U.S. government, money which, in turn, has been used to fund hundreds of randomized clinical trials.

Alternative medicine supporters insist that these trials are necessary to find out what does and does not work. That seems reasonable. But unlike proper scientists, they don't cast off that which the evidence shows to be worthless. When a study's result is negative -- and almost all of them are -- they ignore it. And on the rare occasion when a study's result is positive -- however minuscule the effect may be -- they cling to it like there's no tomorrow. In the eyes of the alternative medicine proponent, more research will always be needed.

So what we're left with is a medical community endlessly analyzing treatments that amount to nothing more than a placebo, thus lending credibility to the practices themselves.

Evidence is the lifeblood of science and rational thought. But should we analyze hocus-pocus? Take homeopathy, for example.

"Homeopathy violates multiple laws of physics with its claims that dilution can make a homeopathic remedy stronger and that water can retain the ‘memory’ of substances with which it has been in contact before," Gorski and Novella write.

In other words, it's based on magic.

"Thus, treatments like homeopathy should be dismissed as ineffective on basic scientific grounds alone."

In evidence-based medicine, a treatment must first be shown to be plausible with basic science, then further studied in vitro on cell cultures and in vivo on animals. Only then is it allowed to continue to clinical trials in humans. But alternative medicine consistently seems to get a pass on the first three steps, proceeding straight to human trials, Gorski and Novella say. It is in these clinical trials that confounding variables seep in and occasionally produce false positives. Moreover, it's ethically dubious to test implausible alternative treatments on patients with serious medical conditions. The $30 million TACT study analyzed unsubstantiated chelation therapy on patients with heart disease, who -- unsurprisingly -- received no benefit. Another trial examined an alternative treatment strategy for pancreatic cancer in which patients drank juices, used coffee enemas, and took large quantities of supplements. The results of this disturbing trial were tragically unsurprising.

"One year survival of subjects undergoing this protocol was nearly fourfold worse than subjects receiving standard-of-care chemotherapy," Novella and Gorski describe.

Terrible research like that can be avoided with a simple rule.

"All clinical trials should be based on scientifically well-supported preclinical observations that justify them," the duo says.

Until alternative medical practices pass the basic science test, we shouldn't waste time or money testing them on humans.

Source: David H. Gorski, Steven P. Novella. "Clinical trials of integrative medicine: testing whether magic works?" Trends in Molecular Medicine. August 2014. DOI: http://dx.doi.org/10.1016/j.molmed.2014.06.007

Can Tiny Nuclear Plants Thwart Regulatory Hell?

1996 was the last year that a commercial nuclear reactor came online in the U.S. That project, at the Watts Bar plant in Tennessee, began all the way back in 1973. We haven't broken ground on a new facility in 40 years.

A new startup called UPower is hoping to thaw some of this frozen market. Their plan: think small.

Currently, it is nearly impossible to open a new plant in the U.S. The reasons for this are well laid out here; they boil down to overregulation. A continuous increase in the number and complexity of regulations beginning in the early 1970s caused materials and construction costs to increase dramatically, and it caused the time required to construct a plant to nearly triple.

Vastly longer construction time has two huge negative effects. First, the loans needed to pay the high initial cost of building a plant accrue far more interest during those extra years of construction. Thus an exponential increase in cost occurs before the plant can begin its very profitable operating years. Second, during construction, new regulations often are introduced. This can require a redesign and perhaps even a partial tear-down and rebuild before the plant even opens.
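To make the compounding problem concrete, here is a minimal back-of-the-envelope sketch in Python. The loan size, interest rate, and construction times are hypothetical round numbers chosen purely for illustration; they are not figures from UPower, regulators, or any specific project.

```python
# Hypothetical illustration of construction-loan interest compounding.
# All numbers are assumed round figures, not actual project data.

def amount_owed(principal, annual_rate, years):
    """Total owed after `years` of compounding, before the plant earns anything."""
    return principal * (1 + annual_rate) ** years

loan = 4e9    # assume a $4 billion construction loan
rate = 0.08   # assume 8% annual interest

print(f"5-year build:  ${amount_owed(loan, rate, 5):,.0f}")   # ~$5.9 billion owed
print(f"15-year build: ${amount_owed(loan, rate, 15):,.0f}")  # ~$12.7 billion owed
```

Under these assumed terms, merely stretching the build from 5 to 15 years more than doubles what is owed before the first kilowatt-hour is ever sold.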

The worst part? Most of these regulations would have done little to prevent previous accidents. Many nuclear engineers and scientists doubt they are useful at all. The rules stay on the books because it's bad politics to oppose them.

Nuclear regulations, driven by a hype-fueled media and anti-nuclear fearmongers such as the Union of Concerned Scientists, have strangled the building of nuclear plants. Ironically, these policies have directly contributed to our nation's reliance on fossil fuels, further damaging the environment and empowering tyrants in the Middle East. Given that nuclear power is our best energy strategy (as well as a good foreign policy strategy), what can be done to cut through this mess?

Go small. UPower's proposed reactor is tiny, making its design, testing, and implementation much easier. A typical U.S. nuclear reactor produces roughly 700-1300 megawatts (MW) of power at all times. In the current toxic regulatory environment, these reactors cost billions of dollars and take more than a decade to build. UPower's nuclear reactor would produce only about 2 MW of usable electricity. But after initial production of the first few units, the company hopes to bring the complete cost below $10 million each. The small, simple design allows a fast build time and easier accommodation of future regulatory burdens.
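For a sense of scale, the cost-per-watt arithmetic looks roughly like the sketch below. The dollar figures are assumed round numbers loosely based on the "billions" and "$10 million" targets mentioned above, not verified estimates; the takeaway is less that the small reactor wins on cost per watt than that the upfront bet, and the window during which regulations can shift mid-build, are dramatically smaller.

```python
# Rough cost-per-watt comparison using assumed round numbers (illustrative only).

large_plant_cost = 8e9      # assume $8 billion for a conventional ~1,000 MW plant
large_plant_watts = 1000e6  # ~1,000 MW of electrical output

upower_unit_cost = 10e6     # UPower's hoped-for complete cost per unit
upower_unit_watts = 2e6     # ~2 MW of usable electricity

print(f"Conventional plant: ${large_plant_cost / large_plant_watts:.2f} per watt")  # ~$8.00/W
print(f"UPower unit:        ${upower_unit_cost / upower_unit_watts:.2f} per watt")  # ~$5.00/W
```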

Its nuclear core can be made of several common nuclear fuels, depending upon availability, but it will not be suspended in water like most current nuclear plants. Instead, the reactor cycles coolant through an enclosed system within the device, carrying heat from the core to the outside. A particular strength of the design is that it is self-contained. No water, steam, or external electricity needs to be hooked up. The unit is placed in the ground and runs for more than a decade without needing constant micromanagement.

There are some hurdles. The reactor unit does not directly produce electricity: its output is heat. UPower will need to design and package the machinery for turning that heat into electricity. That is a relatively simple engineering task, one that has been well understood for centuries. Current nuclear, coal, natural gas, solar thermal, and geothermal power plants all generate power the same way: by converting heat into electricity via steam turbines.

In addition, UPower hasn't yet produced a working model. However, nuclear reactor engineering is a technologically mature field. Hundreds of fission reactors run all over the world today (e.g., in nuclear submarines and aircraft carriers), many of them routinely cranking out more than the power for which they were originally designed, day after day, without incident. Remember, too, that these are 40-year-old designs; far better designs now exist, despite being stifled in the U.S.

What about the nuclear waste? UPower claims that it will be minimal. After the plant runs for 12 years, the reactor is shut down, leaving some spent material behind. This doesn't immediately become waste, however; the company claims the spent fuel can easily be converted into a second material that powers the plant for another 12-year cycle. Only after 24 years total is the fuel truly spent and left as waste. How much? Roughly the volume of a basketball. Not bad!

Whether this vision reaches commercial reality is anybody's guess. The idea, however, seems sound and could help melt the glacier of nuclear regulation in America.

(AP photo)

What RealClearScience Is For and Against

One of the hazards of science journalism is the regularity with which we are called names, by both the Left and the Right. "Shills for Monsanto," "lackeys for the pharmaceutical industry," "enablers of the global warming hoax," and (of course) "Nazis" are some of the nicer things that have been said. But just like an auto mechanic who spends his day with oily, greasy hands, we too don't mind getting a little dirtied up for the sake of science. It's all in a day's work.

Because the relentless pursuit of data-based knowledge is our sole guiding principle at RealClearScience, we are not wedded to any particular scientific outcome. For instance, we are staunch supporters of the Big Bang, not because we want there to have been a Big Bang but because we accept the overwhelming data that backs it. The same goes for evolution, anthropogenic climate change, the benefits of GMOs, and so many other supposedly hot-button topics. However, if the evidence changes, our opinion changes. That is the primary benefit of having a fact-based worldview.

After reading literally thousands of articles and writing hundreds, we have become quite familiar with the scientific evidence favoring or opposing various controversial issues. The editorial team thought it would be useful to compile a list of those issues, categorizing them based on how well supported (or unsupported) they are by current evidence. For those issues on which we have written an article further explaining our position, we have provided a link.

The weight of scientific evidence FAVORS:

-Evolution
-Anthropogenic climate change
-Safety and benefits of GMOs
-Vaccines
-Necessity of animal testing
-Nuclear power
-Necessity of embryonic stem cell research

The weight of scientific evidence OPPOSES:

-Most forms of alternative medicine
-Vaccine-autism link
-"Benefits" of organic food
-Cold fusion

Based on current scientific evidence, we are CAUTIOUSLY OPTIMISTIC toward:

-Fracking
-Breakthrough in fusion power
-Breakthrough in solar power
-Existence of life on other planets
-Reconciliation of science and religion

Based on current scientific evidence, we are SKEPTICAL of:

-Apocalyptic climate change
-Warp drive technology
-The technological "singularity"
-Gluten sensitivity
-Home birth
-String theory
-Existence of a multiverse
-Evolutionary psychology

Again, we are not wedded to any of these conclusions. If the data changes, so too will our opinion!

(AP photo)

Put Down That Bucket of Ice Water. Read Lou Gehrig's Story. Learn About the Science of ALS. Then Donate.

BY NOW, THOUSANDS, perhaps millions, of Americans have already filmed themselves dumping ice water on their heads in the name of amyotrophic lateral sclerosis (ALS) -- Lou Gehrig's disease. Thousands more will follow suit. Whether or not you're a fan of the Ice Bucket Challenge -- and particularly its narcissistic nature -- you cannot deny that it's been extremely successful. As of Thursday, The ALS Association had received $41.8 million in donations between July 29 and August 21, compared with just $2.1 million during the same period last year.

The Ice Bucket Challenge has been an undeniable boon to the fight against ALS and online egos everywhere (look at all the Facebook "likes"!), but how much awareness has it truly raised? While videos of people creatively dousing themselves with cold water abound on social media, the story and the science of ALS seem either absent or drowned out.

This is an attempt to fill that void. If social pressure isn't enough to convince you to donate to ALS research, the heart-wrenching story of Lou Gehrig and the science behind the illness that shares his name should be.

THOSE ATTENDING YANKEES spring training in 1939 saw slugging first baseman Lou Gehrig, the "Iron Horse," set to return for his 17th season. Up until that point, Gehrig had been an "institution of the American League," hitting 493 home runs, batting .341, and playing in 2,122 consecutive games. But onlookers wondered how long that could continue. Gehrig was now 35, and his prior season had been a bit off his usual pace. He hit only .295, an amazing feat by most standards but squarely subpar for Lou. Yankees fans hoped that Gehrig would get back on track in 1939.

However, as spring training pressed on, it was clear something was amiss. Sports writers picked up on it.

"They watch him at the bat and note he isn't hitting the ball well. They watch him around the bag and it's plain he isn't getting the balls he used to get. They watch him run and they fancy they can hear his bones creak and his lungs wheeze as he lumbers around the bases," the New York World Telegram's Joe Williams wrote.

"On eyewitness testimony alone, the verdict must be that of a battle-scarred veteran falling apart."

A rare few, like the New York Sun's James Kahn, were more perceptive.

"I think there is something wrong with him. Physically wrong, I mean. I don't know what it is, but I am satisfied that it goes far beyond his ball-playing. I have seen ballplayers 'go' overnight, as Gehrig seems to have done. But they were simply washed up as ballplayers. It's something deeper than that in this case, though. I have watched him very closely and this is what I have seen: I have seen him time a ball perfectly, swing on it as hard as he can, meet it squarely — and drive a soft, looping fly over the infield. In other words, for some reason that I do not know, his old power isn't there... He is meeting the ball, time after time, and it isn't going anywhere."

Things didn't improve when the season began. One day at batting practice, Yankees teammate Joe DiMaggio watched as Gehrig swung at and missed ten "fat" pitches in a row. Eight games in, right before a game against the Detroit Tigers, and despite the protests of his teammates and manager, Gehrig benched himself "for the good of the team." Everyone, even the stadium announcer for the Tigers, was shocked. "Ladies and gentlemen," he announced, "this is the first time Lou Gehrig's name will not appear on the Yankee lineup in 2,130 consecutive games." Gehrig received a standing ovation from the Detroit fans. Tears glistened in his eyes.

A month later, Gehrig visited the Mayo Clinic in Rochester, Minnesota. The six-day visit produced the following diagnosis from Dr. Harold Habein:

"After a careful and complete examination, it was found that he is suffering from amyotrophic lateral sclerosis. This type of illness involves the motor pathways and cells of the central nervous system. The nature of this trouble makes it such that Mr. Gehrig will be unable to continue his active participation as a baseball player."

Gehrig was also informed that the disease was incurable, and that he likely did not have long to live. Despite the earth-shattering diagnosis, he remained optimistic.

"The road may come to an end here," he wrote his wife. "Seems like our backs are to the wall. But there usually comes a way out. Where and what I know not, but who can tell that it might lead right on to greater things."

On July 4, in-between a double-header against the Washington Senators, a ceremony was held to commemorate Lou Gehrig and allow him to announce his retirement. 61,808 hushed fans watched as Yankees manager Joe McCarthy -- who had been like a father to Gehrig -- handed the outgoing slugger a trophy. They watched as Gehrig bent down with the apparent effort of a man forty years his senior to set it on the ground. They watched as Gehrig stood silently with his head slightly turned down, too moved to move. And then, they watched as Gehrig gathered himself, walked to the collection of microphones, and gave one of the greatest, most humble speeches ever delivered.

"Fans, for the past two weeks you have been reading about the bad break I got. Yet today I consider myself the luckiest man on the face of the earth...

When the New York Giants, a team you would give your right arm to beat, and vice versa, sends you a gift—that's something. When everybody down to the groundskeepers and those boys in white coats remember you with trophies—that's something. When you have a wonderful mother-in-law who takes sides with you in squabbles with her own daughter—that's something. When you have a father and a mother who work all their lives so that you can have an education and build your body—it's a blessing. When you have a wife who has been a tower of strength and shown more courage than you dreamed existed—that's the finest I know.     

So I close in saying that I might have been given a bad break, but I've got an awful lot to live for. Thank you."

Two years later, Lou Gehrig died.

ALS NOW AFFECTS more than 30,000 Americans. In those diagnosed, the motor neurons -- the cells that signal muscles to move -- suddenly and mysteriously start to degrade. As the motor neurons dwindle, the muscles they once controlled waste away from disuse. Paralysis eventually sets in, but cognitive function is often spared. In this respect, ALS is the opposite of Alzheimer's: the body goes, but the mind remains. Still incurable today, the disease is often fatal within five years of diagnosis. Most patients die from respiratory failure.

Though precise causes and risk factors haven't been identified, a number of genes and mutations have been linked to ALS. That means that those with a family history of the disease can get tested and receive an imperfect estimation of their risk.

In February, researchers revealed how ALS spreads from neuron to neuron. It seems that a mutant form of the enzyme SOD1 causes the cells to go haywire. The researchers also found that certain antibodies can block the mutant SOD1 from being transmitted, which could potentially halt the progression of ALS. The method has yet to be tried in humans.

The ALS Ice Bucket Challenge has arrived at an "exciting time" for ALS research. With new drugs undergoing clinical trials and promising research pathways being elucidated, the money raised is sure to be put to good use. To donate, visit the website of the ALS Association.

Now you can dump that bucket of ice water on your head.

Primary Source and Images: Baseball: a Film by Ken Burns, AP

The Legendary Study That Embarrassed Wine Experts Across the Globe

A LITTLE OVER a dozen years ago, "la merde... hit le ventilateur" in the world of wine.

Nobody remembers the 2001 winner of Amorim Academy's annual competition to crown the greatest contribution to the science of wine ("a study of genetic polymorphism in the cultivated vine Vitis vinifera L. by means of microsatellite markers"), but many do recall the runner-up: a certain dissertation by Frédéric Brochet, then a PhD candidate at the University of Bordeaux II in Talence, France. His big finding lit a fire under the seats of wine snobs everywhere.

In a sneaky study, Brochet dyed a white wine red and gave it to 54 oenology (wine science) students. The supposedly expert panel overwhelmingly described the beverage like they would a red wine. They were completely fooled.

The research, later published in the journal Brain and Language, is now widely cited to show why wine tasting is total BS. But more than that, the study says something fascinating about how we perceive the world around us: visual cues can effectively override our senses of taste and smell (which are, of course, pretty much the same thing).

WHEN BROCHET BEGAN his study, scientists already knew that the brain processes olfactory (taste and smell) cues roughly ten times more slowly than visual ones -- 400 milliseconds versus 40 milliseconds. It's likely that in the interest of evolutionary fitness, e.g., spotting a predator, the brain gradually evolved to fast-track visual information. Brochet's research further demonstrated that, in the hierarchy of perception, vision clearly takes precedence.

Here's how the research went down. First, Brochet gave 27 male and 27 female oenology students a glass of red and a glass of white wine and asked them to describe the flavor of each. The students described the white with terms like "floral," "honey," "peach," and "lemon." The red elicited descriptions of "raspberry," "cherry," "cedar," and "chicory."

A week later, the students were invited back for another tasting session. Brochet again offered them a glass of red wine and a glass of white. But he deceived them. The two wines were actually the same white wine as before, but one was dyed with tasteless red food coloring. The white wine (W) was described similarly to how it was described in the first tasting. The white wine dyed red (RW), however, was described with the same terms commonly ascribed to a red wine.

"The wine’s color appears to provide significant sensory information, which misleads the subjects’ ability to judge flavor," Brochet wrote of the results.

"The observed phenomenon is a real perceptual illusion," he added. "The subjects smell the wine, make the conscious act of odor determination and verbalize their olfactory perception by using odor descriptors. However, the sensory and cognitive processes were mostly based on the wine color."

Brochet also noted that, in general, descriptions of smell are almost entirely based on what we see.

"The fact that there are no specific terms to describe odors supports the idea of a defective association between odor and language. Odors take the name of the objects that have these odors."

Now that's deep. Something to ponder over your next glass of Merlot, perhaps?

A FEW YEARS after publishing his now famous paper, the amiable, bespectacled, and lean Brochet turned away from the unkind, meritocratic, and bloated culture of French academia and launched a career that blended his love for science and his passion for "creating stuff."

Yep. You guessed it. He makes wine.

(Images: AP, Morrot, Brochet, and Dubourdieu)

Six Big Lessons from the Ebola Outbreak

Not an Ebola expert.

The Ebola outbreak in West Africa, which continues to rage and has now claimed the lives of more than 1,100 people, offers some big lessons for America.

#1. For all its flaws, the American public health system is pretty good. We transported two patients infected with one of the world's deadliest viruses from the middle of a hot zone to a major metropolitan area in the United States. We did this without infecting anybody else or putting the public in danger. The two Americans were treated with a "secret" remedy (that we reported on two years ago) and are continuing to improve. One of them may actually be discharged soon.

#2. Bringing the sick Americans home was the right thing to do. On August 1, the ever-present Donald Trump tweeted: "The U.S. cannot allow EBOLA infected people back. People that go to far away places to help out are great-but must suffer the consequences!" If Ebola were as infectious as, say, measles or influenza, then Trump would be right to be concerned. If such a virus were to emerge, quarantining the patients abroad would probably be the appropriate course of action to prevent unnecessary risk to the American public. But Ebola is not that infectious. Ignorance is no excuse to stir up public anxiety, and Trump's comments were completely out of line.

#3. Biotechnology and GMOs save lives. The antibody cocktail used to treat the patients was a product of biotechnology, specifically GMOs. Mouse antibody genes were modified to be more human-like and then placed inside tobacco plants. The antibodies were then extracted from the plants and given to the patients. (Read John Timmer's excellent article for the details.) Keep in mind that this is the sort of life-saving research that anti-GMO activists are fighting to prevent.

#4. Do not destroy smallpox. A few months ago, the world was once again debating whether or not to destroy the known vials of smallpox that exist at the CDC in Atlanta and at a facility in Russia. Since that debate, the Ebola outbreak has exploded, and some previously forgotten vials of smallpox turned up in an NIH storage room. When scientists say we should keep smallpox around "just in case," these are the sorts of surprises they are talking about. Yes, there is a real risk that smallpox (or some other deadly pathogen) could escape from a laboratory. But is the world really better off if we forgo research out of fear?

#5. Americans need to pay more attention to global affairs. Separated by two vast oceans and bordered by two friendly neighbors, we tend to be rather insular in our global perspective. Unless there is a war or some other geopolitical instability that directly threatens our interests, we remain uninterested in the rest of the world. Even then, we still may not be able to find the troubled spot on a map, as 84% of Americans were unable to do with Ukraine. If merely 1 in 6 Americans can find a gigantic country bordering Russia on a map, just how few could find Liberia, Guinea, or Sierra Leone -- the center of the outbreak? In our modern, interconnected world, what happens on one side of the globe can and will affect the other side. Maybe it's time to teach more geography in school.

#6. NIH funding should be increased. The U.S. government has neglected the National Institutes of Health (NIH), more or less letting funding slide ever since 2003. As Pacific Standard reported last year, "the Obama administration's budget request for the 2014 fiscal year is $31.3 billion, more than 23 percent lower than the 2003 funding level in purchasing power." If the U.S. wants to remain globally competitive and ready to fight disease, this downward trend needs to be reversed. Maybe the Ebola outbreak will force some very much needed bipartisanship.