In 1992, UC-Riverside mathematician and physicist John Baez was overloaded, not with his day-to-day duties, but with emails from people touting "revolutionary ideas" that demanded his expert attention. This would have been fine had the ideas at least had some foundation in reality. Sadly, almost all of them flouted the recognized laws of nature.
In response, Baez created The Crackpot Index: "A simple method for rating potentially revolutionary contributions to physics." The index comprises 36 items tailored to determine whether an idea and the person behind it are brilliant or daffy. If your score is low, you might have something. But as it starts inching up, you might want to consider donning a hat made from aluminum foil and reconsidering your perception of reality.
Here are a few of the items:
1 point for every statement that is widely agreed on to be false.
5 points for using a thought experiment that contradicts the results of a widely accepted real experiment.
10 points for each new term you invent and use without properly defining it.
20 points for talking about how great your theory is, but never actually explaining it.
40 points for comparing those who argue against your ideas to Nazis, stormtroopers, or brownshirts.
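Scoring with the index is just a weighted tally of offenses. A minimal sketch in Python (the five point values come from the items quoted above; the short item labels and the example tally are my own shorthand, not Baez's):

```python
# Toy Crackpot Index scorer. Only the five items excerpted above are
# included; Baez's full index has many more.
CRACKPOT_ITEMS = {
    "statement widely agreed to be false": 1,
    "thought experiment contradicting a real experiment": 5,
    "new term used without definition": 10,
    "praising the theory without explaining it": 20,
    "comparing critics to Nazis": 40,
}

def crackpot_score(counts):
    """Sum points over (item, number-of-occurrences) pairs."""
    return sum(CRACKPOT_ITEMS[item] * n for item, n in counts.items())

# Example: three undefined terms plus one Nazi comparison.
print(crackpot_score({"new term used without definition": 3,
                      "comparing critics to Nazis": 1}))  # 70
```

A low total might mean you are on to something; a high one suggests the aluminum-foil hat.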
Now, let's put The Crackpot Index to use. Andrea Rossi and Sergio Focardi's cold fusion Energy Catalyzer (E-Cat) should do nicely. A brief visit to E-Cat's website provides a number of examples:
1 point for every statement that is widely agreed on to be false.
Too many to count. Rossi and Focardi's international patent application for the E-Cat was judged to "offend against the generally accepted laws of physics and established theories."
10 points for offering prize money to anyone who proves and/or finds any flaws in your theory.
"So Rossi arranged a challenge for Prof. Focardi, telling him 'I will give you a prize (size non-disclosed) if you can show me that what I have done is wrong and does not work.'"
20 points for suggesting that you deserve a Nobel Prize.
50 points for claiming you have a revolutionary theory but giving no concrete testable predictions.
"Rossi knew he was on to something big, something so powerful it could change the world forever." Yet Rossi has repeatedly conducted misleading "black box" demonstrations without giving independent reviewers full access.
Despite crafting the index, Baez is very empathetic to crackpots and cranks. As he told This American Life in 2005:
"I think they do it because they really want to understand the universe and they have very noble albeit grandiose motivations trying to do what us regular physicists are also trying to do... And I think what distinguishes them from physicists who can make a useful contribution is that they don't want to be somebody whose epitaph says they tightened the screws on a particle accelerator that made a great experiment, they want to be Einstein. And most of us can't be Einstein."
Why waste time and money testing medical treatments that defy the laws of physics and chemistry?
That's the pointed question posed by Drs. David Gorski and Steven Novella in a new op-ed published in the journal Trends in Molecular Medicine. To most, the answer is obvious: we shouldn't. But in the past decade, alternative medicines without any basis in science, like acupuncture, homeopathy, and chiropractic, have received hundreds of millions of dollars from the U.S. government, which, in turn, has been used to fund hundreds of randomized clinical trials.
Alternative medicine supporters insist that these trials are necessary to find out what does and does not work. That seems reasonable. But unlike proper scientists, they don't cast off what the evidence shows to be worthless. When a study's result is negative -- and almost all of them are -- they ignore it. And on the rare occasion when a study's result is positive -- however minuscule the effect may be -- they cling to it like there's no tomorrow. In the eyes of the alternative medicine proponent, more research will always be needed.
So what we're left with is a medical community endlessly analyzing treatments that amount to nothing more than a placebo, thus lending credibility to the practices themselves.
Evidence is the lifeblood of science and rational thought. But should we analyze hocus-pocus? Take homeopathy, for example.
"Homeopathy violates multiple laws of physics with its claims that dilution can make a homeopathic remedy stronger and that water can retain the ‘memory’ of substances with which it has been in contact before," Gorski and Novella write.
In other words, it's based on magic.
"Thus, treatments like homeopathy should be dismissed as ineffective on basic scientific grounds alone."
In evidence-based medicine, a treatment must first be shown to be plausible by basic science, then studied in vitro on cell cultures and in vivo in animals. Only then is it allowed to continue to clinical trials in humans. But alternative medicine consistently seems to get a pass on the first three steps, proceeding straight to human trials, Gorski and Novella say. It is in these clinical trials that confounding variables seep in and occasionally produce false positives. Moreover, it's ethically dubious to test implausible alternative treatments on patients with serious medical conditions. The $30 million TACT study analyzed unsubstantiated chelation therapy in patients with heart disease, who -- unsurprisingly -- received no benefit. Another trial examined an alternative treatment strategy for pancreatic cancer in which patients drank juices, used coffee enemas, and took large quantities of supplements. The results of this disturbing trial were tragically unsurprising.
"One year survival of subjects undergoing this protocol was nearly fourfold worse than subjects receiving standard-of-care chemotherapy," Novella and Gorski describe.
Terrible research like that can be avoided with a simple rule.
"All clinical trials should be based on scientifically well-supported preclinical observations that justify them," the duo says.
Until alternative medical practices pass the basic science test, we shouldn't waste time or money testing them on humans.
Source: David H. Gorski, Steven P. Novella. "Clinical trials of integrative medicine: testing whether magic works?" Trends in Molecular Medicine. August 2014. DOI: http://dx.doi.org/10.1016/j.molmed.2014.06.007
1996 was the last year that a commercial nuclear reactor came online in the U.S. That project, at the Watts Bar plant in Tennessee, began all the way back in 1973. We haven't built a new facility in 40 years.
A new startup called UPower is hoping to thaw some of this frozen market. Their plan: think small.
Currently, it is nearly impossible to open a new plant in the U.S. The reasons for this are well laid out here; they boil down to overregulation. A continuous increase in the number and complexity of regulations beginning in the early 1970s caused materials and construction costs to rise dramatically, and construction times nearly tripled as a result.
Vastly longer construction time has two huge negative effects. First, the loans needed to pay the high initial cost of building a plant accrue far more interest during those extra years of construction. Thus an exponential increase in cost occurs before the plant can begin its very profitable operating years. Second, during construction, new regulations often are introduced. This can require a redesign and perhaps even a partial tear-down and rebuild before the plant even opens.
The worst part? Most of these regulations would have done little to prevent previous accidents. Nuclear engineers and scientists don't believe they are useful at all. Rules stay because it's bad politics to oppose them.
Nuclear regulations, driven by a hype-fueled media and anti-nuclear fearmongers such as the Union of Concerned Scientists, have strangled the building of nuclear plants. Ironically, these policies have directly contributed to our nation's reliance on fossil fuels, further damaging the environment and empowering tyrants in the Middle East. Given that nuclear power is our best energy strategy (as well as a good foreign policy strategy), what can be done to thwart this mess?
Go small. UPower's proposed reactor is tiny, making its design, testing, and implementation much easier. A typical U.S. nuclear reactor produces roughly 700-1300 megawatts (MW) of power at all times. In the current toxic regulatory environment, these reactors cost billions of dollars and take more than a decade to build. UPower's nuclear reactor would only produce about 2 MW of usable electricity. But after initial production of the first few units, the company hopes to reach a complete cost below $10 million each. The small, simple design allows a fast build time and easier accommodation of future regulatory burdens.
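A quick back-of-the-envelope comparison of capital cost per megawatt puts those numbers in perspective. The $6 billion figure for a conventional plant below is an illustrative assumption (the text says only "billions of dollars"); the UPower figures are the targets stated above:

```python
# Rough capital cost per megawatt. The conventional-plant cost is an
# assumed illustrative figure, not a number from the article.
conventional_cost = 6e9       # dollars (assumed)
conventional_power_mw = 1000  # MW, mid-range of the 700-1300 MW cited
upower_cost = 10e6            # dollars, UPower's target unit cost
upower_power_mw = 2           # MW of usable electricity

print(conventional_cost / conventional_power_mw)  # $/MW, large plant
print(upower_cost / upower_power_mw)              # $/MW, UPower unit
```

Under these assumptions the per-megawatt costs land in the same ballpark (roughly $5-6 million per MW), which suggests the pitch is not cheaper energy per watt but shorter build times, smaller loans, and far less exposure to mid-construction regulatory changes.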
Its nuclear core can be made of several common nuclear fuels, depending upon availability, but it will not be suspended in water like most current nuclear plants. Instead, the reactor cycles coolant through an enclosed system within the device, carrying heat from the core to the outside. A particular strength of the design is that it is self-contained. No water, steam, or external electricity needs to be hooked up. The unit is placed in the ground and runs for more than a decade without needing constant micromanagement.
There are some hurdles. The reactor unit does not directly produce electricity: its output is heat. UPower will need to design and package the machinery for turning that heat into electricity. It's a relatively simple engineering task that has been well understood for centuries. Current nuclear, coal, natural gas, solar thermal, and geothermal power plants all generate power by converting heat into electricity via steam turbines.
In addition, UPower hasn't yet produced a working model. However, nuclear reactor engineering is a technologically mature field. Hundreds of nuclear fission reactors run all over the world today (e.g., in nuclear submarines and aircraft carriers), many of them cranking out more than 100% of their originally rated power output, day after day, without incident. Also, remember that these are 40-year-old designs; far better designs now exist, despite being stifled in the U.S.
What about the nuclear waste? UPower claims that it will be minimal. After the plant runs for 12 years, the reactor is shut down, leaving some matter behind. This doesn't immediately become waste, however; UPower claims that this spent fuel can easily be converted to a second material that can power the plant for another 12-year cycle. Then, after 24 years total, the fuel is spent and becomes waste. How much? Roughly the volume of a basketball. Not bad!
Whether this vision reaches commercial reality is anybody's guess. The idea, however, seems sound and could help melt the glacier of nuclear regulation in America.
One of the hazards of science journalism is the regularity with which we are called names, by both the Left and the Right. "Shills for Monsanto," "lackeys for the pharmaceutical industry," "enablers of the global warming hoax," and (of course) "Nazis" are some of the nicer things that have been said. But just like an auto mechanic who spends his day with oily, greasy hands, we too don't mind getting a little dirtied up for the sake of science. It's all in a day's work.
Because the relentless pursuit of data-based knowledge is our sole guiding principle at RealClearScience, we are not wedded to any particular scientific outcome. For instance, we are staunch supporters of the Big Bang, not because we want there to have been a Big Bang but because we accept the overwhelming data that backs it. The same goes for evolution, anthropogenic climate change, the benefits of GMOs, and so many other supposedly hot-button topics. However, if the evidence changes, our opinion changes. That is the primary benefit of having a fact-based worldview.
After reading literally thousands of articles and writing hundreds, we have become quite familiar with the scientific evidence favoring or opposing various controversial issues. The editorial team thought it would be useful if we compiled a list of those issues, categorizing them based on how well supported (or unsupported) they are by current evidence. For those issues in which we have written an article that further explains our position, we have provided a link.
The weight of scientific evidence FAVORS:
The weight of scientific evidence OPPOSES:
Based on current scientific evidence, we are CAUTIOUSLY OPTIMISTIC toward:
Based on current scientific evidence, we are SKEPTICAL of:
Again, we are not wedded to any of these conclusions. If the data changes, so too will our opinion!
BY NOW, THOUSANDS, perhaps millions, of Americans have already filmed themselves dumping ice water on their heads in the name of amyotrophic lateral sclerosis (ALS) -- Lou Gehrig's disease. Thousands more will follow suit. Whether or not you're a fan of the Ice Bucket Challenge -- and particularly its narcissistic nature -- you cannot deny that it's been extremely successful. As of Thursday, the ALS Association had received $41.8 million in donations between July 29 and August 21, compared with just $2.1 million during the same period last year.
The Ice Bucket Challenge has been an undeniable boon to the fight against ALS and online egos everywhere (Look at all the Facebook "likes!"), but how much awareness has it truly raised? While videos of people creatively dousing themselves with cold water abound on social media, the story and the science of ALS seem either absent or drowned out.
This is an attempt to fill that void. If social pressure isn't enough to convince you to donate to ALS research, the heart-wrenching story of Lou Gehrig and the science behind the illness that shares his name should be.
THOSE ATTENDING YANKEES spring training in 1939 saw slugging first baseman Lou Gehrig, the "Iron Horse," set to return for his 17th season. Up until that point, Gehrig had been an "institution of the American League," hitting 493 home runs, batting .341, and playing in 2,122 consecutive games. But onlookers wondered how long that could continue. Gehrig was now 35, and his prior season had been a bit off his usual pace. He hit only .295, an amazing feat by most standards, but squarely subpar for Lou. Yankees fans hoped that Gehrig would get back on track in 1939.
However, as spring training pressed on, it was clear something was amiss. Sports writers picked up on it.
"They watch him at the bat and note he isn't hitting the ball well. They watch him around the bag and it's plain he isn't getting the balls he used to get. They watch him run and they fancy they can hear his bones creak and his lungs wheeze as he lumbers around the bases," the New York World Telegram's Joe Williams wrote.
"On eyewitness testimony alone, the verdict must be that of a battle-scarred veteran falling apart."
A rare few, like the New York Sun's James Kahn, were more perceptive.
"I think there is something wrong with him. Physically wrong, I mean. I don't know what it is, but I am satisfied that it goes far beyond his ball-playing. I have seen ballplayers 'go' overnight, as Gehrig seems to have done. But they were simply washed up as ballplayers. It's something deeper than that in this case, though. I have watched him very closely and this is what I have seen: I have seen him time a ball perfectly, swing on it as hard as he can, meet it squarely — and drive a soft, looping fly over the infield. In other words, for some reason that I do not know, his old power isn't there... He is meeting the ball, time after time, and it isn't going anywhere."
Things didn't improve when the season began. One day at batting practice, Yankees teammate Joe DiMaggio watched as Gehrig swung at and missed ten "fat" pitches in a row. Eight games into the season, right before a game against the Detroit Tigers, and despite the protests of his teammates and manager, Gehrig benched himself "for the good of the team." Everyone, even the stadium announcer for the Tigers, was shocked. "Ladies and gentlemen," he announced, "this is the first time Lou Gehrig's name will not appear on the Yankee lineup in 2,130 consecutive games." Gehrig received a standing ovation from the Detroit fans. Tears glistened in his eyes.
A month later, Gehrig visited the Mayo Clinic in Rochester, Minnesota. The six-day visit produced the following diagnosis from Dr. Harold C. Habein:
"After a careful and complete examination, it was found that he is suffering from amyotrophic lateral sclerosis. This type of illness involves the motor pathways and cells of the central nervous system. The nature of this trouble makes it such that Mr. Gehrig will be unable to continue his active participation as a baseball player."
Gehrig was also informed that the disease was incurable, and that he likely did not have long to live. Despite the earth-shattering diagnosis, he remained optimistic.
"The road may come to an end here," he wrote his wife. "Seems like our backs are to the wall. But there usually comes a way out. Where and what I know not, but who can tell that it might lead right on to greater things."
On July 4, between games of a doubleheader against the Washington Senators, a ceremony was held to commemorate Lou Gehrig and allow him to announce his retirement. 61,808 hushed fans watched as Yankees manager Joe McCarthy -- who had been like a father to Gehrig -- handed the outgoing slugger a trophy. They watched as Gehrig bent down with the apparent effort of a man forty years his senior to set it on the ground. They watched as Gehrig stood silently with his head slightly turned down, too moved to move. And then, they watched as Gehrig gathered himself, walked to the collection of microphones, and gave one of the greatest, most humble speeches ever delivered.
"Fans, for the past two weeks you have been reading about the bad break I got. Yet today I consider myself the luckiest man on the face of the earth...
When the New York Giants, a team you would give your right arm to beat, and vice versa, sends you a gift—that's something. When everybody down to the groundskeepers and those boys in white coats remember you with trophies—that's something. When you have a wonderful mother-in-law who takes sides with you in squabbles with her own daughter—that's something. When you have a father and a mother who work all their lives so that you can have an education and build your body—it's a blessing. When you have a wife who has been a tower of strength and shown more courage than you dreamed existed—that's the finest I know.
So I close in saying that I might have been given a bad break, but I've got an awful lot to live for. Thank you."
Two years later, Lou Gehrig died.
ALS NOW AFFECTS more than 30,000 Americans. In those diagnosed, the motor neurons -- the cells that signal muscles to move -- suddenly and mysteriously start to degrade. As the motor neurons dwindle, the muscles they formerly controlled diminish as well from underuse. Paralysis eventually sets in, but cognitive function is often spared. In this respect, ALS is the opposite of Alzheimer's: the body goes, but the mind remains. Still incurable today, the disease is often fatal within five years of diagnosis. Most patients die from respiratory failure.
Though precise causes and risk factors haven't been identified, a number of genes and mutations have been linked to ALS. That means that those with a family history of the disease can get tested and receive an imperfect estimation of their risk.
In February, researchers revealed how ALS spreads from neuron to neuron. It seems that a mutant form of the enzyme SOD1 causes the cells to go haywire. The researchers also found that certain antibodies can block SOD1 from being transmitted, which could potentially halt the progression of ALS. The method has yet to be tried in humans.
The ALS Ice Bucket Challenge has arrived at an "exciting time" for ALS research. With new drugs undergoing clinical trials and promising research pathways being elucidated, the money raised is sure to be put to good use. To donate, visit the website of the ALS Association.
Now you can dump that bucket of ice water on your head.
A LITTLE OVER a dozen years ago, "la merde... hit le ventilateur" in the world of wine.
Nobody remembers the 2001 winner of Amorim Academy's annual competition to crown the greatest contribution to the science of wine ("a study of genetic polymorphism in the cultivated vine Vitis vinifera L. by means of microsatellite markers"), but many do recall the runner-up: a certain dissertation by Frédéric Brochet, then a PhD candidate at the University of Bordeaux II in Talence, France. His big finding lit a fire under the seats of wine snobs everywhere.
In a sneaky study, Brochet dyed a white wine red and gave it to 54 oenology (wine science) students. The supposedly expert panel overwhelmingly described the beverage like they would a red wine. They were completely fooled.
The research, later published in the journal Brain and Language, is now widely used to show why wine tasting is total BS. But more than that, the study says something fascinating about how we perceive the world around us: that visual cues can effectively override our senses of taste and smell (which are, of course, pretty much the same thing).
WHEN BROCHET BEGAN his study, scientists already knew that the brain processes olfactory (taste and smell) cues approximately ten times slower than sight -- 400 milliseconds versus 40 milliseconds. It's likely that in the interest of evolutionary fitness, i.e. spotting a predator, the brain gradually developed to fast track visual information. Brochet's research further demonstrated that, in the hierarchy of perception, vision clearly takes precedence.
Here's how the research went down. First, Brochet gave 27 male and 27 female oenology students a glass of red and a glass of white wine and asked them to describe the flavor of each. The students described the white with terms like "floral," "honey," "peach," and "lemon." The red elicited descriptions of "raspberry," "cherry," "cedar," and "chicory."
A week later, the students were invited back for another tasting session. Brochet again offered them a glass of red wine and a glass of white. But he deceived them. The two wines were actually the same white wine as before, but one was dyed with tasteless red food coloring. The white wine (W) was described similarly to how it was described in the first tasting. The white wine dyed red (RW), however, was described with the same terms commonly ascribed to a red wine.
"The wine’s color appears to provide significant sensory information, which misleads the subjects’ ability to judge flavor," Brochet wrote of the results.
"The observed phenomenon is a real perceptual illusion," he added. "The subjects smell the wine, make the conscious act of odor determination and verbalize their olfactory perception by using odor descriptors. However, the sensory and cognitive processes were mostly based on the wine color."
Brochet also noted that, in general, descriptions of smell are almost entirely based on what we see.
"The fact that there are no specific terms to describe odors supports the idea of a defective association between odor and language. Odors take the name of the objects that have these odors."
Now that's deep. Something to ponder over your next glass of Merlot, perhaps?
A FEW YEARS after publishing his now famous paper, the amiable, bespectacled, and lean Brochet turned away from the unkind, meritocratic, and bloated culture of French academia and launched a career that blended his love for science and his passion for "creating stuff."
Yep. You guessed it. He makes wine.
(Images: AP, Morrot, Brochet, and Dubourdieu)
Not an Ebola expert.
The Ebola outbreak in West Africa, which continues to rage and has now claimed the lives of more than 1100 people, offers some big lessons for America.
#1. For all its flaws, the American public health system is pretty good. We transported two patients from the middle of a hot zone who were infected with one of the world's deadliest viruses to a major metropolitan area in the United States. We did this without infecting anybody else or putting the public in danger. The two Americans were treated with a "secret" remedy (that we reported on two years ago) and are continuing to improve. One of them may actually be discharged soon.
#2. Bringing the sick Americans home was the right thing to do. On August 1, the ever-present Donald Trump tweeted: "The U.S. cannot allow EBOLA infected people back. People that go to far away places to help out are great-but must suffer the consequences!" If Ebola were as infectious as, say, measles or influenza, then Trump would be right to be concerned. If such a virus were to emerge, quarantining the patients abroad would probably be the appropriate course of action to prevent unnecessary risk to the American public. But Ebola is not that infectious. Ignorance is no excuse to stir up public anxiety, and Trump's comments were completely out of line.
#3. Biotechnology and GMOs save lives. The antibody cocktail that was used to treat the patients was the product of biotechnology, specifically GMOs. Mouse genes were modified to become human-like, and then they were placed inside of a tobacco plant. The medicine was then extracted from the plant and given to the patients. (Read John Timmer's excellent article for the details.) Keep in mind that this is the sort of life-saving research that anti-GMO activists are fighting to prevent.
#4. Do not destroy smallpox. A few months ago, the world was once again debating whether or not to destroy the known vials of smallpox that exist at the CDC in Atlanta and at a facility in Russia. Since that debate, the Ebola outbreak exploded, and some previously forgotten vials of smallpox reappeared in an NIH storage room. When scientists say we should keep smallpox around "just in case," these are the sorts of surprises they are talking about. Yes, there is a real risk that smallpox (or some other deadly pathogen) could escape from a laboratory. But is the world really better off if we forego research out of fear?
#5. Americans need to pay more attention to global affairs. Separated by two vast oceans, and bordered by two friendly neighbors, we tend to be rather insular in terms of our global perspective. Unless there is a war or some other geopolitical instability that directly threatens our interests, we remain disinterested in the rest of the world. Even then, we still may not be able to find the troubled spot on a map, as 84% of Americans were unable to do with Ukraine. If merely 1 in 6 Americans can find a gigantic country bordering Russia on a map, just how few could find Liberia, Guinea, or Sierra Leone -- the center of the outbreak? In our modern, interconnected world, what happens on one side of the globe can and will affect the other side. Maybe it's time to teach more geography in school.
#6. NIH funding should be increased. The U.S. government has neglected the National Institutes of Health (NIH), more or less letting funding slide ever since 2003. As Pacific Standard reported last year, "the Obama administration's budget request for the 2014 fiscal year is $31.3 billion, more than 23 percent lower than the 2003 funding level in purchasing power." If the U.S. wants to remain globally competitive and ready to fight disease, this downward trend needs to be reversed. Maybe the Ebola outbreak will force some very much needed bipartisanship.
Epigenetics is the next big field that the media, fearmongers, and political hacks will attempt to exploit. How do we know? Because there is a flurry of research in the field (which is not always a good sign), and journalists are already hacking away. You can find articles blaming epigenetics for obesity, cancer, personality, homosexuality, and (absurdly) how we vote.
Never mind the fact that there is serious reason to believe epigenetic changes are temporary and may not be passed down to multiple generations, particularly among mammals.
To be sure, epigenetic changes are probably linked (albeit very slightly) to some of those things. But, epigenetics will turn out to be just like regular genetics: Any one allele or epigenetic variation will only make a person ever so slightly more/less inclined toward a particular health outcome. That is because out of thousands or even millions of such potential variables, the impact of any one of them is usually marginal.
In other words, epigenetics is incredibly complicated. It is a field still in its infancy, and there is much left to learn. If the human genome is the black box of an aircraft, the epigenome is the black box of a UFO. Therefore, be wary of over-simplifications and general pronouncements. They will almost certainly be incorrect.
Other researchers have made similar observations. A commentary in the journal Nature indicates that the biggest headline-grabbers mostly involve women, specifically mothers:
‘Mother’s diet during pregnancy alters baby’s DNA’ (BBC), ‘Grandma’s Experiences Leave a Mark on Your Genes’ (Discover), and ‘Pregnant 9/11 survivors transmitted trauma to their children’ (The Guardian). Factors such as the paternal contribution, family life and social environment receive less attention.
The authors fear that "exaggerations and over-simplifications are making scapegoats of mothers, and could even increase surveillance and regulation of pregnant women."
They have a point. And they used a particularly persuasive example to illustrate it.
Everybody knows that pregnant women shouldn't drink. But everybody is wrong. Though it is completely taboo for a pregnant woman to be seen sipping a glass of wine, research has shown that moderate amounts of alcohol probably do not harm the developing fetus. Yet the concern surrounding "fetal alcohol syndrome" -- a serious condition that can arise when an expectant mother drinks excessively -- resulted in widespread government regulation, in some cases even making it illegal for pregnant women to drink at all.
The authors worry, perhaps rightly so, that the media hype surrounding epigenetics will once again turn its focus on mothers. Will the government once again regulate what pregnant women can eat, drink, and do? And if so, why not regulate the behavior of men, as well? Epigenetics, after all, can affect sperm quality.
The authors' conclusion provides an excellent framework for the media and policymakers to make sense of epigenetic studies:
First, avoid extrapolating from animal studies to humans without qualification. The short lifespans and large litter sizes favoured for lab studies often make animal models poor proxies for human reproduction. Second, emphasize the role of both paternal and maternal effects. This can counterbalance the tendency to pin poor outcomes on maternal behaviour. Third, convey complexity. Intrauterine exposures can raise or lower disease risk, but so too can a plethora of other intertwined genetic, lifestyle, socioeconomic and environmental factors that are poorly understood. Fourth, recognize the role of society. Many of the intrauterine stressors that DOHaD [developmental origins of health and disease] identifies as having adverse intergenerational effects correlate with social gradients of class, race and gender. This points to the need for societal changes rather than individual solutions.
This is eminently reasonable. Media and politicians, the ball is in your court.
Source: SS Richardson et al. "Don't Blame the Mothers." Nature 512:131-132. (2014)
The prospect of eating stomach is not something that would excite a great many Americans. But what's disgusting to us is a delicious, balanced meal to a great many animal predators. I mean, think about it.
"Stomachs are especially valuable because of what's inside them. The predator benefits from the nutrients of the plants and grains in the guts of its prey," science writer Mary Roach wrote in Gulp.
It's not just stomachs, of course. Bodily organs in general are amazingly nutritious.
"A serving of lamb spleen has almost as much vitamin C as a tangerine. Beef lung has 50 percent more," Roach wrote.
Yet thanks in part to our culturally evolved sense of disgust, most citizens of the Western world tend to turn up their noses at organs and ship them elsewhere.
"In 2009, the United States exported 438,000 tons of frozen livestock organs... Egypt and Russia are big on livers. Mexico eats our brains and lips. Our hearts belong to the Philippines."
Are we missing out? Alternative medicine proponent and supplement guru Joseph Mercola thinks so. But Mercola, who funds anti-vaccine and anti-fluoridation groups, regularly spouts woo-y mumbo jumbo. What are the facts?
Right now, Americans binge on muscle meat -- wings, legs, breasts, ribs... you get the idea. You should know, however, that muscle meat contains plenty of protein but little else. By comparison, a single 100-gram serving of liver contains roughly 800% of your daily value of Vitamin A and 1100% of Vitamin B12. Most organs sport similarly copious amounts of B vitamins, and only slightly less protein than muscle meat. They also generally contain more fat and a higher amount of cholesterol. For example, a serving of liver has more cholesterol than an egg, while a 4-ounce serving of brain has as much as ten eggs! (Coincidentally, brains are often served scrambled.)
It's obvious that organs are more nutritionally complete, but we live in an era of overnutrition, not undernutrition. To live a healthy life, you don't need a serving of liver each and every day. In fact, that might be unhealthy. The Institute of Medicine, a part of the National Academy of Sciences, recommends consuming on average no more than 3,000 micrograms of Vitamin A each day, and one serving of pork liver alone contains 6,500 micrograms! The Vitamin A that we don't use is stored mostly in the liver (your own, not the one you'd be eating) and fat to be used later. So consuming more than needed over long periods can lead to a toxic build-up, known as hypervitaminosis A.
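The arithmetic behind that warning is easy to check. A minimal sketch, using only the figures quoted above (the 3,000-microgram daily ceiling and the 6,500 micrograms in a serving of pork liver -- the article's numbers, not an independent nutrient database):

```python
# Quick check of the Vitamin A figures quoted above.
UPPER_LIMIT_UG = 3_000  # IOM recommended upper intake, micrograms/day
PORK_LIVER_UG = 6_500   # Vitamin A in one serving of pork liver, micrograms

ratio = PORK_LIVER_UG / UPPER_LIMIT_UG
print(f"One serving of pork liver supplies {ratio:.1f}x the daily upper limit")
# → about 2.2x
```

A daily liver habit would therefore run at more than double the recommended ceiling, day after day -- exactly the slow build-up the paragraph above warns about.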
Several members of a late-1800s polar expedition to Franz Josef Land reportedly suffered from Vitamin A toxicity after consuming a stew made from the liver, heart, and kidneys of a single polar bear. (The organs of walruses, bears, seals, and moose contain especially high amounts of Vitamin A.) Within four hours, the explorers who partook of the stew grew nauseous, drowsy, and suffered severe headaches. Over the next 24 hours, the skin of half of those individuals began peeling off. All eventually recovered.
Organs from livestock animals such as cows, pigs, sheep, and chickens can certainly be consumed safely, but I'd caution against making liver an everyday staple. Eating brain can also be a risky proposition. You might think it would make you smart, but it can actually transmit neurodegenerative diseases like bovine spongiform encephalopathy, more commonly known as mad cow disease.
Now that you're sufficiently informed, care for some skewered heart?
Source: Gulp: Adventures on the Alimentary Canal, Mary Roach, 2013
(Image: AP, Shutterstock)
In modern society, antiperspirants are widely hailed as a godsend, dispelling the inconvenient odors wafting from armpits everywhere. But a new study casts doubts on their vaunted position. As it turns out, antiperspirants may actually make you smell worse in the long run.
For 90% of all Americans, slathering on deodorants and antiperspirants is a daily occurrence, a precautionary measure against foul odors and unsightly sweat stains. The odors arise when bacteria living in our armpits break down lipids and amino acids excreted in sweat into more smelly substances. Deodorants employ antimicrobial agents that kill off bacteria, as well as chemicals that replace noxious odors with pleasant aromas. Deodorants that double as antiperspirants, like Degree, Old Spice, and Dove, take the process one step further by physically plugging sweat glands with aluminum-based compounds.
While most of us might only concern ourselves with the dry, aromatic benefits of antiperspirants and deodorants, researchers at the Laboratory of Microbial Ecology and Technology at the University of Ghent in Belgium are more interested in the effects on bacteria. Billions of bacteria dwell in the "rain forests" under our arms, and the substances we don are mucking with their habitats!
To uncover how deodorants and antiperspirants affect armpit bacteria, Chris Callewaert, a Ph.D. student specializing in microbial ecology, and a team of researchers recruited eight subjects for a task a great many people (and especially their friends) might deem unbearable: Six males and two females pledged not to use deodorant or antiperspirant for an entire month. Specifically, four subjects stopped using their deodorants and another four stopped using their antiperspirant deodorant. (Most antiperspirants are also deodorants. See image below for an example.) Another control subject who did not regularly use either was asked to use deodorant for a month. The duration was chosen because it takes approximately 28 days for a new layer of skin cells to form.
The researchers analyzed the diversity and abundance of subjects' armpit bacteria at various timepoints before they stopped using antiperspirant, during the period of abstaining from antiperspirant, and for a few weeks after resuming the use of antiperspirant. Switching hygiene habits plainly altered the armpit bacterial communities of every subject. Since no two armpits and their resident bacteria are identical, it was difficult to pinpoint precise changes brought about by deodorants or antiperspirants, but one clear trend did materialize: antiperspirants resulted in a clear increase of Actinobacteria.
You might not recognize the name of Actinobacteria, but chances are, you've smelled them. Dominated by Corynebacterium, they are the major instigators of noxious armpit odor. Other microbes that inhabit the armpit, like Firmicutes and Staphylococcus, don't produce odors as quickly, nor are those odors nearly as pungent.
Callewaert believes the aluminum compounds in antiperspirants may be to blame, killing off "good," less smelly bacteria and allowing "bad" bacteria to dominate. His study found that deodorants which lack these sweat-blocking antiperspirant compounds are actually linked to a slight decrease of stinky Actinobacteria. (Below: An example of the bacterial community of one subject. When using an antiperspirant, Firmicutes [diamonds] and Actinobacteria [dashes] dominate. Without antiperspirant, Firmicutes are more abundant and Actinobacteria are hardly present.)
Though antiperspirants and deodorants are widely used, they are only a temporary fix.
"The measures we utilize today do not take away the initial source: the odor causing bacteria," Callewaert told RealClearScience. "Deodorants only mask unpleasant odors. We can do better than that. The follow up of this research is finding better solutions."
And Callewaert is already working on one: "armpit bacterial transplantation."
"We take away the bad bacteria from the armpit of somebody with a body odor, and replace it with the good bacteria of a relative who doesn't have a body odor," he explained.
"So far we have helped over 15 people. For most subjects it brings immediate improvements. Most of them on a permanent time scale, although there are also people who suffer again from a body odor after some months."
For now, this approach seems rather extreme, but perhaps it could serve as a last resort for the sort of person you can smell from the other side of the room.
The big limitation of the current study is its sample size. Just nine subjects took part.
"The sample size is rather small," Callewaert admits. "However, we see consistent outcomes."
Callewaert also says that this is the first study to specifically examine the effect of deodorant on the diversity and abundance of armpit bacteria.
"We barely know what lives in our armpits, on our clothes, in our laundry machines, and what causes all these unpleasant odors."
While the current study strongly suggests that antiperspirants can make their users smell worse via the growth of Actinobacteria, it does not directly assess body odor. As a follow up, Callewaert should recruit subjects to use antiperspirants and utilize methods like gas chromatography to directly measure the amount of the stinky volatile organic compounds emanating from their armpits. Professional smellers could also be put to work... at their own peril.
Source: Chris Callewaert, Prawira Hutapea, Tom Van de Wiele, Nico Boon. "Deodorants and antiperspirants affect the axillary bacterial community." Arch Dermatol Res. 2014 Jul 31.
(Images: Shutterstock, Callewaert et al.)
Last week NASA released results of a test on a new space engine design. It seems to produce thrust without burning fuel. Is the impossible science fiction of the future now possible?
A simple picture of the proposed idea reveals its fundamental absurdity. Grab a friend and hop into a car with a back seat (not like that). You push on the windshield repeatedly, while your friend pushes on the rear window just after each of your pushes. Now, does the car roll forward? NASA is claiming that it might.
Peer a little deeper into the story and a forest of red flags starts to appear. The design of the engine is disarmingly simple, but conceptually doesn't make sense. There are lots of equations to confuse the average reader. NASA didn't actually build the engine. It was given to them completely built from a design that has been criticized, refuted, discredited and described as a fraud by physicists.
NASA's test? Conducted by a guy who believes in warp drives. The second test of validity? A study published by an unknown researcher in a fourth-rate academic journal and never cited by anyone else.
Even worse, the control engine that was supposed to produce no thrust produced the same thrust as the test engine. When a "null" experimental control doesn't produce a null result... well, that's very bad.
Looks like NASA got duped.
According to Roger Shawyer and Guido Fetta, the peddlers of the EMDrive or Cannae drive, the engine works by bouncing microwaves back and forth within a metal container. The claim is that by making one side of the container bigger than the other, more thrust is deposited on that wall. Face that wall to the back and the engine pushes forward. What's the flaw?
The total energy flux of a wave doesn't change, apart from dissipation, as it bounces back and forth. Translation: the radiation pushes on the front of the engine exactly as hard as it pushes on the back. You cannot possibly get net thrust out of a closed system that expels nothing; momentum is conserved. In reality, the net force is zero, producing zero thrust.
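The cancellation is simple enough to sketch with toy momentum bookkeeping. This little loop (my own illustration with arbitrary units, not anything from the EmDrive papers) tracks the impulse a wave reflecting between the two ends of a closed cavity delivers:

```python
# Toy model: a wave bouncing between the front (+1) and back (-1)
# walls of a closed cavity. Each elastic reflection transfers an
# impulse of 2p to the wall it hits; the hits alternate in direction.
p = 1.0                     # wave momentum per reflection, arbitrary units
net_impulse = 0.0
for bounce in range(1000):
    wall = +1 if bounce % 2 == 0 else -1   # alternate front/back hits
    net_impulse += wall * 2 * p            # elastic reflection: 2p per hit
print(f"Net impulse on the cavity: {net_impulse}")
# → 0.0
```

However asymmetric the cavity's shape, the same wave eventually pushes on both ends, and the impulses sum to zero -- which is the article's point about zero net thrust.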
Once the basic idea is busted, the creators resort to true BS. First, they claimed relativistic electrodynamics explained the device's power. Then they switched tacks and claimed that vacuum quantum energy is the key. Science seems to refute these findings as somewhere between implausible and nonsensical. What other extraordinary evidence can we look for to support this extraordinary claim?
How about the only other test of the engine? Great scientific work is not always published in top journals, and sometimes fraudulent work is. However, better work from better researchers generally tends to be published in a select few well respected journals. This academic test was not published in one of the 10 best publications in physics, nor one of the 50 best, nor even one of the 500 best. It was published in a journal ranked 688th in the field, a place where weak research findings go to quietly die. This publication isn't even translated into English, the universal scientific language of the Earth.
Finally, common sense can be a last check: the smell test. Extracting free energy with no loss of fuel? Does this sound plausible? I leave that to you. Personally, I'd bet my salary against it. Not that that's much money, mind you.
The discovery that Earth revolves around the Sun was revolutionary. It fundamentally changed how we viewed the cosmos, as well as ourselves.
But the Earth does not revolve around the Sun. At least, not exactly. Time to get pedantic.
"Technically, what is going on is that the Earth, Sun and all the planets are orbiting around the center of mass of the solar system," writes Cathy Jordan, a Cornell University Ask an Astronomer contributor.
"The center of mass of our solar system is very close to the Sun itself, but not exactly at the Sun's center."
Every single object in the solar system, from the gargantuan sun to the tiniest speck, exerts a gravitational pull on everything else. The solar system is basically a massive game of tug of war, and all of the yanking balances out at a specific point: the center of mass, or "barycenter." Everything in the solar system orbits around that point. Sometimes, it's almost smack dab at the Sun's center. Right now, the barycenter is just outside the Sun's surface. But it's constantly changing depending upon where the planets are in their orbital paths.
Because the Sun holds 99.87% of all the mass in the solar system, it's always going to win the tug of war. Even if all the planets were perfectly lined up on one side of the Sun, the center of mass would be just 800,000 kilometers off the surface of the Sun. That sounds like a lot, but remember, our solar system is big! Such a barycenter would be roughly 70 times closer to the Sun than the closest planet to the Sun, Mercury.
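You can reproduce the "just outside the Sun's surface" claim with a two-body approximation. The sketch below uses standard approximate values for the Sun and Jupiter, which supplies almost all of the planetary pull (the formula and figures are my illustration, not from the Ask an Astronomer piece):

```python
# Two-body barycenter: distance of the center of mass from the Sun's
# center is r = a * m_planet / (m_sun + m_planet).
M_SUN = 1.989e30   # kg
M_JUP = 1.898e27   # kg, about 1/1000 of the Sun's mass
A_JUP = 7.785e8    # km, Jupiter's mean orbital distance
R_SUN = 6.96e5     # km, the Sun's radius

r = A_JUP * M_JUP / (M_SUN + M_JUP)
location = "outside" if r > R_SUN else "inside"
print(f"Sun-Jupiter barycenter: {r:,.0f} km from the Sun's center ({location} the Sun)")
# → roughly 742,000 km, just outside the 696,000 km solar surface
```

Add the other planets' tugs and the point wanders, sometimes inside the Sun and sometimes out -- which is exactly why the barycenter's position keeps changing with the planets' orbital positions.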
An even better illustration for center of mass is the binary star system. When two stars of comparable mass cohabit the same corner of space, they orbit about a point between each other.
However, rather than play a circular, billion-year game of cosmic tag, binary stars will, more often than not, settle into elliptical orbits around that shared center of mass!
The rest of the universe certainly doesn't revolve around the Earth, but, like so many topics in science, it's an oversimplification to say that everything orbits around the Sun.
Some things in life are so indisputably true that it is a surprise that anyone disagrees. The safety and efficacy of vaccines, the benefits of GMOs, and the inability of Nicolas Cage to convincingly perform a single role all would be at the top of Captain Obvious's list. Now, the list has a new entry: Electronic cigarettes produce far more good than harm.
The idea behind e-cigarettes is straightforward. Tobacco cigarettes are bad for you, mostly because of the tar and other toxic chemicals that can lead to cancer or chronic obstructive pulmonary disease (COPD). The addictive component is nicotine, a nervous system stimulant that mimics the neurotransmitter acetylcholine. That is why smokers keep coming back for more; they need the "high" that nicotine delivers.
Though it is a powerful drug, nicotine does not cause cancer or COPD. It is widely assumed that, similar to nicotine patches or other forms of nicotine replacement therapy, e-cigarettes will give smokers an innovative (and fun) way to quit tobacco.
But there are naysayers. Some studies have shown that e-cigs produce carcinogens, and long-term biological effects are unknown. UCSF researchers insist that e-cigs do not help smokers quit, but merely lead to "dual use of e-cigarettes with conventional cigarettes." Still others claim that e-cigs may serve as a gateway to regular cigarettes, particularly among children. Thus, some adversaries have concluded that e-cigs are no good and, in fact, may be worse than regular cigarettes.
Such logic beggars belief. The worst part of a tobacco cigarette is the tar; e-cigs contain none of that, which means they will produce fewer carcinogens. It is true that long-term biological effects are unknown, but a reasonable hypothesis is that e-cigs will prove to be a much safer alternative. "Dual use" is a rather strange complaint; time spent vaping an e-cig is certainly better than inhaling tobacco smoke. And since the notion that marijuana serves as a gateway to harder drugs is a dubious one at best, why should we believe that e-cigs will serve as a gateway to real cigarettes?
Indeed, a new literature review in the journal Addiction analyzing the safety and use of e-cigs should quell some of those fears. The authors write:
"EC [E-cigarette] aerosol can contain some of the toxicants present in tobacco smoke, but at levels which are much lower. Long-term health effects of EC use are unknown but compared with cigarettes, EC are likely to be much less, if at all, harmful to users or bystanders. EC are increasingly popular among smokers, but to date there is no evidence of regular use by never-smokers or by non-smoking children. EC enable some users to reduce or quit smoking."
"Allowing EC to compete with cigarettes in the market-place might decrease smoking-related morbidity and mortality. Regulating EC as strictly as cigarettes, or even more strictly as some regulators propose, is not warranted on current evidence. Health professionals may consider advising smokers unable or unwilling to quit through other routes to switch to EC as a safer alternative to smoking and a possible pathway to complete cessation of nicotine use." (Emphasis added.)
Based on current data, the authors see no reason to regulate e-cigs as strictly as the real thing. Many e-cig opponents will balk at that, claiming that e-cigs have no business being in the hands of children or other non-smokers. Indeed, they could be right. But there is a simple fix: Regulate them so they are sold only to people over the age of 18 or, alternatively, as a prescription drug.
Source: Peter Hajek, Jean-François Etter, Neal Benowitz, Thomas Eissenberg & Hayden McRobbie. "Electronic cigarettes: review of use, content, safety, effects on smokers and potential for harm and benefit." Addiction. Published online before print. DOI: 10.1111/add.12659
We have nuclear submarines and nuclear ships, so why not nuclear planes?
Well, that's a very good question, one the United States spent $1.04 billion back in the 1950s trying to answer.
The idea for a nuclear-powered plane was originally hatched in 1944, during a time when the Manhattan Project was astonishing everyone with the potential of nuclear energy. Three years later, engineers at the newly established U.S. Air Force were awarded initial funding of $10 million to research methods of wielding atomic power for aircraft. The request was buoyed by an obvious advantage: Theoretically, such a plane could remain aloft for weeks at a time.
Under the new Aircraft Nuclear Propulsion program, Air Force engineers immediately went to work, developing three experimental engines (two of which are pictured below). By 1951, they settled on a method for directly transferring heat from a reactor and using it to propel an aircraft, described thusly in a 1963 government report:
"Air enters through the compressor, is forced into the reactor, and is heated by the fuel elements. After passing through the turbine, where energy is extracted to drive the compressor, the heated air is expelled at high velocity through the exhaust nozzle."
Concurrently, engineers at General Electric built a 2.5-megawatt nuclear reactor designed specifically for a plane. It was a molten salt reactor, the first of its kind. Unlike the typical light water reactors used today -- where the uranium fuel is submerged in water -- a molten salt reactor dissolves the uranium in a salt mixture that serves as both fuel and coolant, allowing for a much smaller design.
With everything coming together on the technical end and funding continuing to pour in, it was time to face a key practical roadblock: how to protect the crew from the radiation emanating from their own plane. Thus, the early 1950s saw engineers affixing a gargantuan 11-ton cockpit lined with lead and rubber to the nose of a Convair B-36 "Peacemaker" bomber. By the middle of the decade, Air Force personnel loaded General Electric's reactor onboard via the bomb bay and switched it on during the majority of 47 test flights. The reactor didn't propel the aircraft or power any of its systems, however. The flights were purely intended to test the radiation shielding, and their results were promising.
"All the data collected by these tests showed the program managers that the possibility of using a nuclear power plant to provide an aircraft with unlimited operational range was indeed at their disposal at this time," Raul Colon wrote for Aviation History. Reports from the time back Colon's assessment.
Meanwhile, January 1956 brought successful ground tests of the X-39 engine, powered completely by a nuclear reactor. Forecasts of atomic jets by 1980, published in Popular Science, were apparently on schedule!
But then the bipolar nature of politics abruptly intervened. Hesitance towards nuclear power supplanted excitement in the late 1950s and early 1960s, and politicians and a public nervous about the escalating Cold War started viewing nuclear planes less as a futuristic life changer and more as a recipe for mobile meltdown. President Kennedy canceled the Aircraft Nuclear Propulsion program in 1961.
Every so often, a call for nuclear-powered planes resurfaces, but it's either ignored or swiftly shot down. Given the current regulatory attitude toward grounded nuclear energy, the ubiquity and safety of fossil fuel-powered flight, and advancements in electric aircraft, it's difficult to imagine that nuclear power will ever take wing.
(Images: USAF, Wtshymanski)
Distant stars are pinpoint specks, too small to resolve. Exoplanets are ten times smaller in diameter and don't emit light of their own. They're vastly fainter than any star; we couldn't even see a single one until 20 years ago.
That's what makes a painstaking new study completed using the Hubble Space Telescope so beautiful. It not only located three exoplanets, but also measured the water content of their atmospheres. There's water, but less than we expected. How is this possible?
Astronomers often find exoplanets by watching the light of many stars. If the brightness of a particular sun has a tiny (usually 1% or smaller) flicker that repeats in a regular pattern, we can calculate whether a planet's continuous orbit crossing in front of the star is the cause. Astronomical techniques have now evolved to the point that we not only look for a flicker in the total light from the star: we can see precisely how much each and every color of the rainbow flickers.
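That "usually 1% or smaller" figure falls straight out of geometry: the dip in brightness is just the fraction of the stellar disk the planet covers. A minimal sketch for a Jupiter-sized planet crossing a Sun-like star (my own back-of-the-envelope, using standard radii):

```python
# Transit depth ≈ (R_planet / R_star)^2: the fraction of the star's
# disk blocked by the planet's silhouette.
R_STAR = 696_000    # km, solar radius
R_PLANET = 71_492   # km, Jupiter's equatorial radius

depth = (R_PLANET / R_STAR) ** 2
print(f"Transit depth: {depth:.2%}")
# → about 1% for a Jupiter-sized planet; an Earth-sized one gives ~0.008%
```

An Earth-sized planet blocks roughly a hundred times less light, which is why spotting small worlds -- let alone reading their atmospheres -- demands such exquisite precision.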
This is a more difficult version of the way we've been investigating stars for a century. We know the chemical makeup of far away suns because of their absorption spectrum: the colors missing from the light they broadcast to us.
The searing hot plasma at the core of a massive star emits light of all colors in the spectrum. The elements in the star's atmosphere, however, absorb a little of that light, leaving certain colors absent from the light that reaches us.
Due to the quantum nature of energy states in atoms and molecules, they absorb and emit energy only in certain exact amounts, i.e., at very specific electromagnetic frequencies.
Water molecules absorb and emit a certain group of wavelengths due to their quantum transitions too. Upon absorbing an infrared photon of the correct wavelength (1380 nm for example), the atoms of the molecule will be kicked into vibrating back and forth in a certain pattern. This picture shows what such states look like.
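For the 1380 nm water band mentioned above, the exact photon energy follows from the Planck relation E = hc/λ. A quick sketch (the wavelength is the article's example; the constants are standard physical values):

```python
# Photon energy of the ~1380 nm water absorption band: E = h*c / wavelength.
H = 6.626e-34          # Planck's constant, J*s
C = 2.998e8            # speed of light, m/s
EV = 1.602e-19         # joules per electronvolt
wavelength = 1380e-9   # m, a near-infrared water absorption band

energy_ev = H * C / wavelength / EV
print(f"Photon energy at 1380 nm: {energy_ev:.2f} eV")
# → about 0.9 eV
```

Only photons within a hair of that energy kick the water molecule into its vibrational state, which is why the missing wavelengths act as such a clean chemical fingerprint.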
An exoplanet atmospheric survey first looks to see how much light the star emits toward us at the colors absorbed by water when nothing else blocks any of the light. When the planet passes between the star and our telescopes during its orbit, we look at each of these wavelengths of light and see how much of it has been blocked. A certain percentage will be blocked solely by the mass of the planet itself.
However, a small part of the star's light will pass through the atmosphere of the planet and escape to the other side. We then look at this light that has passed through the sky of an alien world to see if it is missing a greater amount of those wavelengths that water likes to absorb.
The amount of water present in the measured alien atmosphere was actually something around 100 times less than predicted. This could mean that our models of how elements are distributed and retained in planet formation need tweaking. It could also be due to patterns of cloud or haze in the atmospheres of the planets. Measurements like these are the payoff of incredible recent improvements in astronomical instruments and techniques.
If this trend continues, it may not be long before we begin to look at something even more exciting: the atmospheres of earth-like planets.
Starting roughly 1.8 billion years ago and continuing for a "Boring Billion" years hence, life on Earth was slimy, and evolution stagnated. When the first eukaryotic organisms evolved hundreds of millions of years earlier, things looked like they were going to get exciting! What followed was a mega letdown. Life consisted predominantly of microbes and algae. Our planet, so resplendent and colorful today, amounted to little more than a watery, brown-green mess.
Scientists don't have time machines, so how do they know this? The best ancient snapshots come from fossilized structures called stromatolites (pictured above). When scientists examine these primordial records of life -- specifically the ones that date back approximately 800 million to 1.8 billion years -- they see bacteria, single-celled organisms called Archaea, some eukaryotic algae, and little else.
While geologists generally agree on the monotony of the period, they offer many competing explanations for precisely why it was so dull. Through the first half of the year alone, three new studies joined the theoretical fray.
In a study published in April, geologists at the University of St. Andrews blamed the gradual cooling of Earth's interior, which may have caused Earth's surface to stabilize. Thus, with almost no tectonic activity to shake things up, life remained contentedly simplistic.
Ross Large, a distinguished professor of geology at the University of Tasmania, offered a different theory in March. By analyzing seafloor sediments from all across the world, Large and his team found that trace metals like cobalt, selenium, copper, zinc, molybdenum, vanadium, and cadmium seemed to decline in the oceans during the "boring billion." These elements, he says, are critical to life and evolution, and without them, life wouldn't have been able to diversify and advance.
Another recent hypothesis points the finger squarely at oxygen, or rather the lack thereof. Around 2.3 billion years ago, cyanobacteria sent atmospheric oxygen levels soaring 1000 times over, which in turn sent an untold number of oxygen-hating anaerobic bacteria to their graves. But according to a team of American researchers, after the brief spike -- which could very well have catalyzed the evolution of eukaryotes -- oxygen levels seem to have dipped back down. And so, eukaryotes persisted in oxygenated parts of the ocean, but evolution as a whole stalled.
While scientists research and debate why the "Boring Billion" was so boring, a few holdouts, like Harvard University's Andrew Knoll, proffer that the period wasn't as banal as its name suggests.
"There’s reason to believe that all of the properties of cell biology that made complex life possible in the next geologic era were put in place here: cytoskeletons that allow eukaryotic cells to change shape, and cell polarity that allows cells to send a molecular message to one side of the cell but not the other, and to interact with nearby cells. The molecular circuitry and cross talk that allow complex organisms like us to exist today all took root in the so-called Boring Billion."
The history of science is populated with three types of ideas.
First, there are the great ideas. Peter Atkins counts ten of them, in his book Galileo's Finger: The Ten Great Ideas of Science. Evolution by natural selection, the heritability of genetic information, the conservation of energy, and an expanding universe are among those ten ideas.
Second, there are the fascinating-but-wrong ideas, such as alchemy, phlogiston, and spontaneous generation. All of those ideas seemed reasonable in their day, but subsequent investigation proved they were incorrect.
And then there are the stone cold crazy ideas. This is a post about those.
At RealClearScience, we have had the immense pleasure of stumbling across a Turkish researcher named M. Kemal Irmak. It was he who proposed that schizophrenia is actually demonic possession. Looking through his other published papers, he also apparently believes that fluoride causes diabetes in Finland.
He has other ideas. And, bless him, he's not afraid to tell us what they are.
The New Testament narrative of the virgin birth of Jesus has two widely held interpretations. On the one hand, most Christians (and Muslims) accept the biblical account as true but hold that, due to its miraculous nature, it is well beyond the reach of science. On the other hand, most skeptics and non-Christians insist the story is a myth, not unlike the Greco-Roman tales of the semi-divine birth of demigods, such as Hercules.
Dr. Irmak has his own hypothesis: The Virgin Mary was a hermaphrodite, and her pregnancy was the result of self-fertilization. He explains:
"Virgin Mary is a chimera of 46,XX/46,XY type resulting from the fusion of two embryos of different sex types and both ovarian and testicular tissues develop in Her body as seen in a beautiful plant."
In other words, the mother of the Virgin Mary was going to give birth to twins. Instead, the twins -- one male, one female -- fused together, forming a hermaphroditic Virgin Mary. After reaching adulthood, she accidentally self-fertilized. As crazy as this sounds, Dr. Irmak actually found an obscure 1990 paper in the journal The Veterinary Record that documents the curious case of a hermaphroditic rabbit that apparently became pregnant after living in isolation.
But, there are (at least) two very big problems with the "hermaphroditic Virgin Mary" hypothesis: (1) Mary's husband, Joseph, probably would have noticed; and (2) While human hermaphrodites certainly do exist, it is very unlikely that they would develop two sets of functioning genitals.
For the sake of argument, however, let's assume that the hypothesis is correct. Are there any milestones we should be looking for in the developing fetus? Dr. Irmak does not address the embryological development of Jesus, specifically, but he does have something rather unique to contribute to the field of embryology: It involves dark matter.
The allocortical birth theory, apparently coined by the esoteric Dr. Irmak, claims that a fetus receives a soul during the 13th week of development. This is predicated upon Dr. Irmak properly understanding a brain structure called the allocortex. Granting him that, where does the soul come from? Dr. Irmak explains:
"The dark matter constitutes most of the mass in our universe, but its nature remains unknown. The soul is likely to work into man’s physical body directly via that dark matter."
Aha! Dark matter, that mysterious substance that makes up 26.8% of the mass-energy of the universe, is actually soulstuff. And it enters the fetal brain during the 13th week of development... through its nose, via a vestigial structure called the vomeronasal organ.
If you've made it this far, you are probably thinking, I'm not terribly convinced by his argument. After all, can he make a testable prediction? I would strongly argue yes.
People who do not have noses would not have souls. People who don't have souls are evil. Can you think of any malevolent humans lacking a proboscis? Of course you can. His name is Lord Voldemort.
Source: Irmak MK. "Embryological basis of the virgin birth of Jesus." J Exp Integr Med 4(2): 143-146. (2014) doi: 10.5455/jeim.060113.hp.011
Source: Irmak MK. "Cosmological dark matter and ensoulment." J Exp Integr Med 3(4): 343-346. (2013) doi: 10.5455/jeim.110813.hp.006
THE DAY WAS progressing normally enough for Pam Johnson (not her real name, in the interest of privacy), but when the 77-year-old sat down to watch television, the monotony turned disturbing. Out of the corner of her eye, she witnessed her left hand creep off its resting position on the armchair and slowly arc across her field of vision. It then began to stroke her face and hair. Now, none of this would be out of the ordinary... had she actually been controlling the appendage. But the hand was doing it all by itself!
Frightened, she called for her husband, and the pair zoomed off to the hospital. By the time they arrived, the hand was back under Pam's control. The 30-minute ordeal seemed like an apparition, but she didn't imagine it. Brain scans confirmed that something was amiss.
The story of Pam's temporarily wayward hand is not at all foreign to neurologists.
"ALIEN HAND SYNDROME is a phenomenon in which one hand is not under control of the mind. The person loses control of the hand, and it acts as if it has a mind of its own," scientists recently described in the Proceedings of Baylor University Medical Center.
Pam got off lucky. Her bout of alien hand syndrome is the shortest known to science. In the dozens of previously documented cases, the condition has usually lasted around a year, and it can persist for more than a decade.
Pam's case of alien hand appeared to have been caused by a very minor stroke, but that's quite out of the ordinary. Most often, alien hand syndrome arises in the wake of some sort of damage to the corpus callosum or the supplementary motor area of the brain.
For the average patient, once the initial shock wears off, having an arm with a mind of its own can grow to be somewhat amusing, if a little annoying. The syndrome generally does not pose risks, though there are rare instances of out-of-control hands slapping or choking their owners. Occasionally, the hand will reach out and pick up random objects. Oftentimes, it does nothing. However, in some cases, it will maddeningly do the opposite of what the consciously controlled hand does. Imagine trying to change the television channel or get dressed in the morning!
Sergio Della Sala, one of the foremost experts on "anarchic hand" syndrome (as he calls it), described a meeting with one of his patients in a 2005 issue of The Psychologist:
We were discussing the implication of her medical condition for her and her relatives, when, out of the blue and much to her dismay, her left hand took some leftover fish-bones and put them into her mouth. A little later, while she was begging it not to embarrass her any more, her mischievous hand grabbed the ice-cream that her brother was licking. Her right hand immediately intervened to put things in place and as a result of the fighting the dessert dropped on the floor. She apologised profusely for this behaviour that she attributed to her hand’s disobedience. Indeed she claimed that her hand had a mind of its own and often did whatever ‘pleased it’.
IN THE PAST decade, neuroscientists have used functional magnetic resonance imaging (fMRI) to compare the brains of people with alien hand syndrome to those of normal individuals.
"In normal individuals, initiation of motor activity shows activation of multiple extensive neural networks," the Baylor scientists detailed. "However, in patients with alien hand syndrome, only isolated activation of the contralateral primary motor cortex is observed."
So while the systems that move the hand function normally, adjacent systems in the brain that allow patients to consciously register the actions of their hand do not.
Currently, there is no cure for alien hand syndrome, but long-term sufferers have been known to deal with their disobedient hands by constantly wearing an oven mitt or giving the hand something to hold on to. Dousing it in warm water throughout the day also may sate the hand's desire to move. It's possible that constant tactile stimulation serves as a basic form of treatment.
The reviews aren't yet in for Scarlett Johansson's new movie Lucy, but a single viewing of the trailer is enough to give the film a resounding "two thumbs down" on science.
Here's the synopsis:
In a near future where corruption reigns, the Taiwanese mob forces a young woman named Lucy to work as a drug runner. They implant a mysterious compound into her body for her to transport. But when its container ruptures, Lucy begins to experience radical changes. The chemical contained within boosts her brain capacity far beyond the normal 10% that humans utilize. She gains superhuman strength, dexterity, and accuracy, absorbs knowledge almost instantaneously, and develops telepathic abilities. Commence kickassery.
Have you spotted the problem?
The idea that humans only use 10% of their brains is a complete, utter, and total myth. Lucy is entirely premised on neuroscientific BS.
Now, as an ardent cinephile (I've often wondered how much better a science writer I'd be if my memory were packed with chemistry and physics facts instead of useless movie trivia), I'm not usually one to be overly pedantic when it comes to science and cinema. I remain in awe of Gravity, despite its scientific shortcomings. I'm willing to suspend belief in physics to allow the existence of a 700-foot wall of ice in Westeros. Heck, I even enjoyed Prometheus, even though the "scientists" in the film acted with complete and total disregard for, well, science.
Oh, and I'm almost certainly going to see Lucy.
But I refuse to pretend that the "10% Myth" is anywhere close to grounded in reality, even though 65% of the public believes it! Thanks to modern brain scanners, scientists have shown that virtually every part of the brain is in use at all times, even when we're sleeping! Moreover, it makes no logical sense that humans would evolve such large brains if we didn't use them.
It's easy to see why the myth has perpetuated for so long: It's an enticing fallacy! There might be psychic powers and super intelligence locked away within our minds. All that's needed is a key!
Or illicit drugs, apparently.
*Because Lucy is totally premised on a notion that's been thoroughly disproven, yet strives to maintain a semblance of realism, I think it enters contention for the most unscientific movie of all time. If you can think of any other nominees, share them in the comments below!
So you spent 17 years and $5 billion to build a fusion experiment. You built a facility the size of three football fields. You built a 400-foot-long laser with more than 33,000 optical parts; it is currently the highest-energy laser in the world. You've been through more budget overruns and management problems than you'd care to admit.
Now, you finally turn the thing on at full power and carry out your experiment. And it fails monumentally. Now what?
This is the dilemma facing the National Ignition Facility (NIF). Built with the promise of achieving ignition -- producing more fusion energy than the laser energy delivered to the target -- NIF fell 28,000 times short of its goal. No one knows how to fix it. So NIF has now been finding other things to occupy its time.
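The size of that gap is easier to grasp as back-of-the-envelope arithmetic. Here is a minimal sketch, assuming "fell 28,000 times short" means the fusion yield was 1/28,000 of the 1.8 MJ of laser energy (a simplifying reading; the real energy accounting at NIF is more involved):

```python
# Ignition, as defined above, means the fusion energy released exceeds
# the laser energy delivered to the target -- a gain of at least 1.
# Figures come from the article: a 1.8 MJ laser, 28,000 times short of goal.

LASER_ENERGY_J = 1.8e6       # laser energy delivered per shot, in joules
SHORTFALL_FACTOR = 28_000    # how far the shots fell below ignition

fusion_yield_j = LASER_ENERGY_J / SHORTFALL_FACTOR
gain = fusion_yield_j / LASER_ENERGY_J

print(f"Fusion yield per shot: ~{fusion_yield_j:.0f} J")
print(f"Gain: {gain:.1e} (ignition requires >= 1)")
```

Under this reading, a shot that pours in 1.8 million joules gets back only on the order of tens of joules of fusion energy, which is why "monumentally" is not an overstatement.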
There are many useful things to do with a 1.8 MJ laser system.
One recent experiment tested whether diamond can take on a new crystalline form if you squeeze it hard enough. A pressure of 49 million Earth atmospheres was applied to a tiny sliver of the hardest natural material by placing it in the target chamber and firing the giant laser. The predicted crystal restructuring didn't occur: as the pressure increased, the diamond remained resilient instead of abruptly reordering its atoms. This tells us more about diamond, and also about the behavior of carbon structures in the cores of giant planets like Jupiter.
Another use for the NIF laser is to gather data for the design and upkeep of nuclear weapons. The US has not tested a single nuclear device since 1992, so real-world tests of new bomb designs and of the working condition of older weapons can no longer be carried out. The task falls to extremely complex codes run on supercomputers. NIF can reproduce the temperatures and pressures found in bomb detonations. It can also mimic the crushing action of the fission stage that triggers the far more powerful fusion reaction in thermonuclear "hydrogen" bombs. Implosion-triggered fusion data from NIF can be used to guide the computer simulations.
NIF has also been running experiments on the properties of various materials at extremely high pressures and temperatures, shock wave creation, hydrodynamics and fuel pellets to look for better fusion results.
There are also a few useless things to do with a 1.8 MJ laser system.
This seems like piling on, so let's be clear: fusion power is a goal worth spending billions on. It will take decades of work and massive resources to succeed. Taking gambles on facilities like NIF is a part of this process, and certainly a worthwhile investment. Now that this effort has failed, however, we need to find new science to produce at this uniquely powerful and capable facility.