From the pulpit of University Presbyterian Church in Seattle, the now-retired Rev. Earl Palmer once said, "We have nothing to fear from science." It was one of the many times I knew that I had found the right church for me.
Rev. Palmer is not alone in his embrace of science. Indeed, theologians past and present have found science to be illuminating rather than threatening, undermining the widespread notion of a "war" between science and religion. Just as many prominent scientists have spoken kindly of religion, many theologians have spoken highly of science. We have compiled some of the best quotes below:
"Men...have a moral responsibility to be intelligent. Must we not admit that the church has often overlooked this moral demand for enlightenment? At times it has talked as though ignorance were a virtue and intelligence a crime."
--Rev. Dr. Martin Luther King, Jr. (Strength to Love)
"Art along with science is the highest gift God has given [man]."
--Pope Benedict XVI
"No doubt those who really founded modern science were usually those whose love of truth exceeded their love of power."
--C.S. Lewis (The Abolition of Man)
"Science can purify religion from error and superstition; religion can purify science from idolatry and false absolutes."
--Pope John Paul II (Letter to Director of the Vatican Observatory)
"Even the other sciences and their development help the church in its growth in understanding... The thinking of the church must recover genius and better understand how human beings understand themselves today, in order to develop and deepen the church's teaching."
--Pope Francis (Interview with America magazine)
"I don't think that there's any conflict at all between science today and the Scriptures. I think we have misinterpreted the Scriptures many times and we've tried to make the Scriptures say things that they weren't meant to say, and I think we have made a mistake by thinking the Bible is a scientific book. The Bible is not a book of science."
--Rev. Billy Graham
"Those theologians who are beginning to take the doctrine of creation very seriously should pay some attention to science's story."
--Rev. Dr. John Polkinghorne
Pioneering 19th century German physician Carl Reinhold August Wunderlich is widely known for championing the empirical observation of hospital patients and sagely spreading the idea that fever is a symptom, not a disease, but he is best known for persistently sticking a one-foot rod in the armpits of thousands of people.
The rod was a thermometer, of course, and the temperature measurements he recorded led him to reveal in his 1868 magnum opus, Das Verhalten der Eigenwärme in Krankheiten, a number that remains with us even today: 37 °C, or 98.6 °F, the average human body temperature.
Except that number is wrong.
In 1992, researchers at the University of Maryland used the latest equipment and employed rigorous methodology to determine the average human body temperature from a sample of 148 healthy men and women aged 18 through 40 years. Taking over 700 temperature readings spaced out at various times throughout the day, they found that the average human body temperature is closer to 98.2 °F, and, in a bold conclusion that flew in the face of 120 years of common knowledge, stated:
"Thirty-seven degrees centigrade (98.6 °F) should be abandoned as a concept relevant to clinical thermometry."
Even if Wunderlich was truly wrong -- and it's not looking good for him -- one can't really denigrate his effort. His readings reportedly came from a sizable sample group: as many as 25,000 individuals! But unfortunately, they also came from now-antiquated devices. Wunderlich's data were almost certainly compromised by shoddy thermometers.
"Thermometers used by Wunderlich were cumbersome, and had to be read in situ, and, when used for axillary measurements [under the arm] required fifteen to twenty minutes to equilibrate," the researchers noted.
With 98.2 °F now considered to be the correct average human body temperature, it's worth mentioning that, if you took your temperature right now, it almost certainly won't be that number. Your body's temperature fluctuates throughout the day, from roughly 97.6 °F at six in the morning to 98.5 °F at six in the evening. In fact, a temperature as high as 99.5 °F is still considered healthy.
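The daily swing described above can be sketched as a simple sinusoid, with a low around 6 a.m. and a high around 6 p.m. This is a toy model for illustration, not the Maryland study's actual fitted curve; only the two endpoint temperatures come from the text.

```python
from math import cos, pi

# Toy model: daily body-temperature cycle, minimum ~6 a.m., maximum ~6 p.m.
LOW_F, HIGH_F = 97.6, 98.5
MEAN_F = (LOW_F + HIGH_F) / 2        # midpoint of the daily swing
AMPLITUDE_F = (HIGH_F - LOW_F) / 2   # half the swing

def body_temp_f(hour):
    """Approximate temperature (in °F) at a given hour of the day (0-24)."""
    # cos() peaks at hour 6, so subtracting it puts the minimum there
    return MEAN_F - AMPLITUDE_F * cos(2 * pi * (hour - 6) / 24)

print(round(body_temp_f(6), 1))   # 97.6 at 6 a.m.
print(round(body_temp_f(18), 1))  # 98.5 at 6 p.m.
```

Note that even this smooth curve never passes through 98.6 °F; the old textbook number sits above the model's daily maximum.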
The human body relies on consistent temperatures to function properly. A body temperature ten degrees too warm or twenty degrees too cool more often than not results in death. But there are notable exceptions to the norm. Willie Jones of Atlanta, Georgia survived after reaching an internal temperature of 115.7 °F during a bout of heatstroke. Two-year-old Karlee Kosolofski's body temperature dipped down to 57.5 °F after spending five hours outside in a Canadian winter, yet she lived to tell the tale.
EPA says it focuses on environmental protection. The Animas River disaster shows that it is more concerned with protecting itself.
The accidental spill of toxic wastewater into Colorado’s Animas River is an ironic case study: The very organization meant to protect Americans from environmental catastrophes was responsible for perpetrating it. How should the Environmental Protection Agency be held accountable?
Colorado, and the states downstream of the spill, should sue the EPA. But, instead of merely recovering the cost of environmental damage, the lawsuit should focus on taming the leviathan the EPA has become.
Created in 1970 by President Richard Nixon, the EPA, at its best, has been an important part of improving air and water quality. Clear standards, enforced in a straightforward way have been successful. The fact that the American environment is cleaner and safer than it has been in a century is partially due to EPA action.
In recent years, however, the EPA has moved away from those clear standards, preferring to exercise vague discretion in a way that is costly and often ineffective.
After the Gulf oil spill, the agency was vindictive in its treatment of BP. It banned the oil company, as well as 21 subsidiaries unconnected to the spill, from obtaining new federal contracts due to a “lack of business integrity.” The ban was lifted only after BP sued the EPA. In total, BP paid $54 billion in settlements, including $5.5 billion to the EPA for violating the Clean Water Act.
To be clear, it is not vindictive to hold BP – or anyone else – accountable for environmental damage. But, it is not responsible for the EPA to strain its authority to engage in a self-serving money grab.
The situation with the Animas provides more evidence that EPA’s desire to expand or protect its power can too often trump environmental stewardship.
For example, EPA Director Gina McCarthy told reporters, “The good news is [the Animas River] seems to be restoring itself.” Imagine the (justifiable) outrage from the EPA had BP made such a claim only a few days after the Gulf spill was capped when much of the damage had yet to be assessed.
And it’s not just British oil companies the EPA targets. The EPA threatened a Wyoming man with a $75,000-per-day fine for building a pond on his own property. Such behavior led a Washington Post editorial to observe, “The EPA is earning a reputation for abuse.”
The EPA often argues that money should be no object when protecting the environment. The same agency, however, has been circumspect about paying the significant costs for the damage it caused.
The wide gap between the EPA's cavalier attitude toward businesses and personal property rights and its squeamishness about holding itself accountable demonstrates that institutional – rather than environmental – protection is a decisive factor in EPA decision-making.
If the EPA chooses to protect its own, rather than holding employees accountable, can we accuse Director McCarthy of a “lack of integrity”? To what standard will she be held?
The contrasting way the EPA dealt with BP and its own damage at the Animas River demonstrates that agency motives are not always entirely pure. They are quick to demand others pay and give them power, using the environment as a lever. But when their own funding and power is questioned, they minimize the environmental damage and cost. Director McCarthy even had the lack of awareness to tell the people of Colorado not to worry because the “EPA is here.”
The bottom line is that while the EPA has done much good, it has come to associate environmental protection with its own aggrandizement. Now is the time to make it clear that environmental protection, not a self-serving power grab, is what the public wants.
Gas giants are remarkable planets, if for no other reason than that they are so unlike our own. But their name is misleading -- most of the matter in gas giants is not in gaseous form. Owing to the immense temperatures and pressures within these planetary monstrosities, most matter exists as neither liquid nor gas, but as a supercritical fluid, which shares the properties of both.
The two gas giants in our solar system -- Jupiter and Saturn -- are both visible with the naked eye. But seeing and imagining are far from actually being there. No human has yet experienced the grandeur of Jupiter from up close. The Galileo spacecraft is one of the few manmade objects to have touched the Jovian atmosphere. On September 21, 2003, it plunged into the planet at a speed of thirty miles per second, never to be heard from again.
What happened to Galileo? Did it disintegrate? Is its wreckage floating in Jupiter's supercritical atmosphere? We can only speculate. And on that matter, some of the best speculation recently appeared over at the AskScience section of Reddit.
In response to the question, "Could you stand on a gas planet or would you fall to the center?" user Astromike23 described what it would be like, as a human, to fall through Jupiter's atmosphere. A fair warning, if you're not a fan of heights, this story may disturb you:
You start falling through the high, white ammonia clouds starting at 0.5 atmospheres, where the Sun is still visible. It's very cold here, -150 C (-240 F). Your rate of descent is roughly 2.5x that of Earth, since gravity is much stronger on Jupiter.
You emerge out the bottom of the cloud deck somewhere near 1 atmosphere. It's still somewhat bright, with sunlight filtering through the ammonia clouds much like an overcast day on Earth. Below, you see the second cloud-deck made of roiling brown ammonium hydrosulphide, starting about 2 atmospheres.
As you fall through the bottom of this second cloud deck, it's now quite dark, but warming up as the pressure increases. Beneath you are white water clouds forming towering thunderstorms, with the darkness punctuated by bright flashes of lightning starting somewhere around 5 atmospheres. As you pass through this third and final cloud-deck it's now finally warmed up to room temperature, if only the pressure weren't starting to crush you.
Emerging out the bottom, the pressure is now intense, and it's starting to get quite warm, and there's nothing but the dark abyss of ever-denser hydrogen gas beneath you. You fall through this abyss for a very, very long time.
You eventually start to notice that the atmosphere has become thick enough that you can swim through it. It's not quite liquid, not quite gas, but a "supercritical fluid" that shares properties of each. Your body would naturally stop falling and settle out somewhere at this level, where your density and the atmosphere's density are equal. However, you've brought your "heavy boots" and continue your descent.
After a very, very long time of falling through ever greater pressure and heat, there's no longer complete darkness. The atmosphere is now warm enough that it begins to glow - red-hot at first, then yellow-hot, and finally white-hot.
You're now 30% of the way down, and have just hit the metallic region at 2 million atmospheres of pressure. Still glowing white-hot, hydrogen has become so dense as to become a liquid metal. It roils and convects, generating strong magnetic fields in the process.
Most materials passing through this deep, deep ocean of liquid metallic hydrogen would instantly dissolve, but thankfully you've brought your unobtainium spacesuit...which is good, because it's now 10,000 C (18,000 F). Falling ever deeper through this hot glowing sea of liquid metal, you reflect that a mai tai would really hit the spot right about now.
After a very, very, very long time falling through this liquid metal ocean, you're now 80% of the way down...when suddenly your boots hit a solid "surface", insomuch as you can call it a surface. Beneath you is a core weighing in at 25 Earth-masses, made of rock and exotic ices that can only exist under the crushing pressure of 25 million atmospheres.
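The milestones of Astromike23's imagined descent can be gathered into a small table. The pressures, depth fractions, and conditions below are the figures quoted in the account above; the layout is just a sketch.

```python
# Descent milestones from the account above, as (pressure in atmospheres,
# fraction of the way down, description). Figures are those quoted in the text.
milestones = [
    (0.5, 0.00, "white ammonia cloud tops, -150 C, Sun still visible"),
    (2, 0.00, "brown ammonium hydrosulphide cloud deck"),
    (5, 0.00, "water clouds and lightning, near room temperature"),
    (2_000_000, 0.30, "liquid metallic hydrogen begins, eventually ~10,000 C"),
    (25_000_000, 0.80, "solid core of rock and exotic ices"),
]

for pressure_atm, depth_fraction, note in milestones:
    print(f"{pressure_atm:>14,.1f} atm ({depth_fraction:.0%} down): {note}")
```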
(Image: NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute, Ron Miller via NASA)
Scurvy doesn't just turn your skin yellow.
In fact, in the later stages of the disease, the skin turns black, often right before you die, horribly, from massive internal hemorrhaging near the brain or heart.
While in middle school health class, you probably learned that sailors of centuries past suffered scurvy when they didn't eat enough oranges. But what you didn't hear was that between 1500 and 1800, an estimated two million of them died from it!
"It was such a problem that ship owners and governments counted on a 50 percent death rate from scurvy for their sailors on any major voyage," science journalist Catherine Price wrote in her book Vitamania. "[A]ccording to historian Stephen Bown, scurvy was responsible for more deaths at sea than storms, shipwrecks, combat, and all other diseases combined."
British Commodore George Anson's celebrated voyage around the world may have earned him fame and fortune, but it also resulted in the deaths of 65% of his crew. Some 1,300 sailors, stationed across six ships, lost their lives, the vast majority of them to scurvy. Nice job, commodore.
Scurvy wasn't simply a recurring nuisance, it was an appalling scourge, one exacerbated by its gruesome symptoms.
"Scurvy starts with lethargy so intense that people once believed laziness was a cause, rather than a symptom, of the disease," Price wrote. "Your body feels weak. Your joints ache. Your arms and legs swell, and your skin bruises at the slightest touch. As the disease progresses, your gums become spongy and your breath fetid; your teeth loosen and internal hemorrhaging makes splotches on your skin. Old wounds open; mucous membranes bleed."
That's not even the worst of it. Two separate accounts -- one from a chaplain and the other from a surgeon -- describe how the gums engorge and grow over the teeth. If not cut off, the tissue may protrude from the mouth and start to decay. The dying tissue endowed sufferers with the worst possible breath imaginable.
The scope and severity of scurvy was remarkable, especially considering how easy it is to prevent and treat. Scurvy results from a deficiency of vitamin C, which is commonly found in citrus fruits, peppers, and a variety of other plant sources. As little as 10 milligrams of the vitamin per day -- one-fifth the amount found in a single orange -- administered over a week or so, can bring a scurvy sufferer back from the brink of death.
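A quick back-of-envelope check of the dosage claim above: if 10 milligrams is one-fifth the vitamin C in a single orange, the implied content per orange is easy to recover.

```python
# Implied vitamin C content of one orange, given that the 10 mg/day
# therapeutic dose is said to be one-fifth of an orange's worth.
THERAPEUTIC_MG_PER_DAY = 10
FRACTION_OF_ORANGE = 1 / 5

orange_mg = THERAPEUTIC_MG_PER_DAY / FRACTION_OF_ORANGE
print(orange_mg)  # 50.0 mg of vitamin C per orange
```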
But back when scurvy was at its deadliest, humanity was unaware of the existence of vitamins. Ships on long overseas voyages were also ill equipped to store fresh fruits and vegetables. Moreover, cooks onboard didn't know that vitamin C is destroyed by heat, as well as cutting, and even exposure to air. But perhaps the biggest impediment to solving scurvy was a slow spread of information. Journals of ship physicians dating back to the 17th century reveal that a good few stumbled upon the healing powers of oranges, limes, lemons, and cabbage, but their discoveries never made it to common knowledge.
Scurvy's decline began in 1747, when James Lind demonstrated and publicized citrus' power to treat the disease. In the 19th century, scurvy dwindled at a healthy pace. Today, scurvy is mostly a disease of the past, but citizens of developing countries -- particularly children -- remain susceptible.
Source: Price, Catherine. Vitamania: Our Obsessive Quest For Nutritional Perfection. Penguin Press, 2015. The science and technology focused Alfred P. Sloan Foundation helped make this book possible.
Imagine if Congress voted on whether or not to teach evolution and climate change in school. And imagine that 73% of Republicans voted against it. The backlash would be easy to predict: The national media, and science journalists in particular, would spend a week making somber declarations of impending educational and scientific collapse that would reverberate across the cosmos.
As it so happens, Congress did just vote on something of tremendous scientific importance: Biotechnology. And, as it so happens, 73% of Democrats voted against the bill. Yet, the national media remained deafeningly and hypocritically silent.
On July 23, the U.S. House of Representatives passed a bill, H.R. 1599, that, among other things, would block states from requiring foods containing genetically modified ingredients to carry special labels. From a scientific viewpoint, this is the correct policy. Yet, the Democratic Party, which has branded itself the "pro-science" party over the last two decades, overwhelmingly opposed it.
Why? Well, it's hard to say, though the fact that places like the GMO-hating Whole Foods tend to be located in counties that voted for Barack Obama might have something to do with it.
In the final vote tally, 94% of House Republicans supported the bill, while a stunning 73% of Democrats voted against it. Even Democrats who represent districts with a large biotechnology constituency voted against the bill: Nancy Pelosi (CA-12), Jackie Speier (CA-14), Mike Honda (CA-17), and Anna Eshoo (CA-18) -- all from the Bay Area -- as well as Boston's Michael Capuano (MA-7) and Stephen Lynch (MA-8) and Seattle's Jim McDermott (WA-7).
The vote pattern made it abundantly clear: On the needlessly hot-button issue of genetic modification, Democrats sided with fearmongers and organic foodies, while Republicans sided with the medical and scientific mainstream.
And yes, just like vaccines, evolution, and anthropogenic climate change, GMOs are mainstream and non-controversial in the scientific community. Indeed, the American Medical Association (AMA) and the American Association for the Advancement of Science (AAAS) (PDF) -- organizations that represent our nation's finest doctors and scientists -- reject GMO labels.
But don't just take their word for it. A massive literature review published in 2013 in the journal Critical Reviews in Biotechnology, which examined 1,783 papers on the topic, found that GMOs are safe for humans and the environment. In other words, the scientific community is solidly united behind the science of genetic modification; in fact, the toxic C-word, "consensus," is entirely appropriate.
Unfortunately, Democratic politicians aren't the only ideologues who are opposed to GMOs. The $72-billion organic food industry is, too. And anti-GMO activists, such as Gary Ruskin, use the legal system to harass academic scientists. His group, U.S. Right to Know, abuses FOIA requests in order to smear the reputation of honest biotech scientists. And who serves on his Board of Directors? None other than former Democratic Party apparatchik, Lisa Graves, who is now Executive Director of the far left-wing propaganda outlet, Center for Media and Democracy.
Our food is precious. Labels are meant for nutritional and health purposes, not for scoring political points against Monsanto or buttressing Luddite protests against biotechnology.
Let us hope that President Obama and the U.S. Senate can unite behind a bipartisan victory for science and approve the House bill.
In 2007, a 44-year-old happily married man with a white-collar job and two children visited a hospital in Marseille, France complaining of mild weakness in his left leg. Some time later, he left the hospital with his leg weakness cured, but with another, intriguing diagnosis in tow: he was missing most of his brain.
A disconcerting notion to most, the condition didn't seem to trouble the man much at all. Sure, his IQ tested a tad below average, but his medical history and neurological development were otherwise normal. So how did he develop his strange, yet innocuous infirmity?
The doctors soon learned that when he was just six months old, the man had a condition called hydrocephalus, where an excess amount of cerebrospinal fluid accumulated in the ventricles of his brain. Luckily, it was caught early, and doctors inserted a shunt -- a valve of sorts -- to ensure that the fluid drained properly. Fourteen years later, the shunt was removed.
Perhaps it should have been left in, because over the next thirty years, the fluid gradually built up in the ventricles, ever so slowly compressing the actual brain matter until only a thin layer remained at the outer recesses of the skull.
How the man was able to function normally remains a mystery, but then again, so do many aspects of the brain's operation. The best explanation scientists gave is that the brain is plastic and highly adaptable.
Discover Blogs' Neuroskeptic offered another theory:
While the enormous “holes” in these brains seem dramatic, the bulk of the grey matter of the cerebral cortex, around the outside of the brain, appears to be intact and in the correct place – this is visible as the dark grey ‘shell’ beneath the skull. What appears to be missing is the white matter, the nerve tracts that connect the various parts of the cerebral cortex with each other, and with the other areas of the brain.
According to Neuroskeptic, instances like these call into question how much white matter is truly needed for the brain to function properly. Could it be that a good chunk of the 1,400-gram human brain is superfluous?
Perhaps by closely studying more of these cases of "missing brain," scientists can find out.
Paul Frampton has prodigious intelligence. A (former) tenured professor of physics at the University of North Carolina with more than 450 publications to his name, he undoubtedly possesses a high IQ, as well as a considerable capacity for "logic, abstract thought, understanding, self-awareness, communication, learning, emotional knowledge, memory, planning, creativity, and problem solving."
But Paul Frampton is not intelligent. After all, no smart 68-year-old man would get tricked into thinking that transporting $400,000 worth of cocaine from Bolivia to Brussels would earn him the love and affection of world-renowned bikini model Denise Milani. To fall for such an obvious honey trap, one would have to completely lack skeptical reason, sound judgment, and the ability to think critically.
Frampton's case perfectly exemplifies the most important thing to know about intelligence: Intelligent people can be, and often are, very stupid.
"Intelligence is a capacity," he said. "Intelligence tests don’t tell us what people will do, they tell us what people are potentially able to do. It’s how people use their abilities that make the difference."
In her piece, Drescher described joining Mensa, the international high-IQ society open to anyone with a tested IQ in the top 2%. Upon joining, she was instantly irked to find that the organization featured special interest groups for pseudoscientific topics like extrasensory perception, astrology, and parapsychology. "Why would Mensa, an organization made up of the world's smartest people, maintain groups for disproven subjects?" she wondered. The thought prompted her to explore the science of intelligence, and specifically why supposedly smart people can be so irrational.
Drescher found answers courtesy of a series of laboratory experiments conducted by researchers Keith Stanovich and Richard West. Stanovich and West posed various situations and problems to participants and examined how they tried to think through and solve them. In particular, the duo was interested in how and why subjects answered incorrectly. They found that people tended to be wrong by being ignorant, arrogant, cognitively lazy, or closed-minded.
If you haven't noticed, all of those traits are almost entirely under one's control! We can choose to be curious and seek out information. We can choose to admit that we might be wrong. We can choose to think slowly and surely. We can choose to be open-minded!
We can choose to be intelligent! Or we can choose to be stupid.
So what'll it be?
Over the past month, the first close-ups ever taken of Pluto were beamed in from the New Horizons spacecraft, transforming our best images of the dwarf planet from a blurry, featureless smudge into crisp, detailed portraits.
NASA describes the bounty of new science seen in these photographs as "flowing ices, exotic surface chemistry, mountain ranges, and vast haze." These flowing ices are like Earth's glaciers, but instead made of frozen solid nitrogen. Other surface features include mountains likely made of water ice. Understanding the geology of Pluto is an entire scientific field that just turned three weeks old.
The New Horizons Pluto flyby is the latest of several space missions designed to reveal the little details of our solar system. We've known about the big obvious stuff for a long time. The Sun and each planet out to Saturn can be seen with the naked eye. These bodies, as well as the more distant gas giants, have been studied through telescopes for centuries. Between 1960 and 1990 we took a closer look: flying past, orbiting, crashing into, or landing on every planet.
Now we are exploring the smaller and farther stuff. These include objects in and around the asteroid belt, a lot of strange things that make up the "Kuiper belt" beyond Neptune, the comets that live further out still (but occasionally plunge in past us), and the many intriguing moons that orbit the larger bodies. These are the details that transform a sketch of the solar system into a complete landscape.
The asteroid belt is home to a huge number of tiny rocks, but fully half of its mass is concentrated into just four larger bodies. NASA's Dawn mission orbited the second largest body, Vesta, in 2011 and reached the largest, Ceres, this year. Ceres is recognizable as a nearly round dwarf planet, while Vesta is slightly too aspherical to qualify. The ability of a body to create strong enough gravity to pull all of its mass into a nearly round shape is the hallmark of a property called hydrostatic equilibrium. This property is currently used to separate the eight major planets and the dwarf planets from smaller objects.
The Kuiper belt is home to Pluto, as well as at least three more dwarf planets similar in size to Pluto. We have yet to see Eris, Haumea or Makemake in detail. Dozens to hundreds more dwarf planets likely also exist in this region. These objects are hard to discover and even harder to measure because they are too distant to reflect enough sunlight to appear as more than mere specks.
Rosetta and Philae, the European Space Agency's comet probe and lander, have visited two more asteroids and touched down on a comet. The mission discovered what makes up a comet's surface and atmosphere and also fueled some hilarious crackpot speculation.
We've also been checking out interesting moons. Saturn's moon Titan has been called the closest thing to Earth in our solar system because space probe Cassini found a thick atmosphere, dunes, mountains and lakes. Flybys of Enceladus, Io and Triton revealed hot, active internal geology and tectonics, punctuated by volcanoes erupting into space. Colder cryovolcanoes ejecting liquid water or ammonia instead of lava have been observed as well. A tiny moon (Dactyl) was even found orbiting a mid-size asteroid (243 Ida).
After Voyager 2 flew by Neptune in 1989, the first era of solar system exploration was in some sense complete. Now, we've finished much of the second phase. We've had a lot of success visiting dwarf planets, comets, asteroids and moons. What comes next? Manned exploration, perhaps? Hopefully we'll at least send some kick-ass robots.
By now, you've probably heard that Cecil, one of Africa's most beloved lions, was killed by a Minnesota dentist on a trophy hunt. Cecil, who dwelled in Zimbabwe's Hwange National Park and was closely monitored via GPS collar by scientists at Oxford, was apparently lured out of the wildlife refuge -- where hunting is illegal -- shot with a crossbow, stalked, and finally finished off with a rifle forty hours later. The incident, whose legality has yet to be conclusively determined, has already sparked international outrage. Many are calling for lion trophy hunting to be banned altogether.
Whatever your personal feelings about killing a majestic animal solely to have a trophy (I personally find it repugnant), reasoned discourse demands a dispassionate examination of the evidence, and the evidence suggests that, when managed properly, trophy hunting isn't a problem, and can actually help species recover.
African lions have taken a beating over the decades. While they numbered in the hundreds of thousands a century ago, today, between 23,000 and 39,000 animals remain, spread across just one-fifth of their historic territory. The International Union for Conservation of Nature lists lions as "vulnerable," one category short of "endangered." Habitat loss, disease, and human interference are the major reasons for the decline.
Considering those dire statistics, you might think that the IUCN would oppose trophy hunting. After all, the singular act of killing reduces a species' population. But the organization actually supports it.
"Trophy hunting is a form of wildlife use that, when well managed, may assist in furthering conservation objectives by creating the revenue and economic incentives for the management and conservation of the target species and its habitat, as well as supporting local livelihoods," the IUCN announced in a 2012 report.
That same report reveals two case studies where the establishment and proper regulation of trophy hunting grounds actually helped threatened animal populations recover. Nature writer Richard Conniff shared even more examples in a 2014 op-ed published in the New York Times, including that of Namibia, where lion populations are now increasing. In Conservation Magazine, Jason Goldman shared another instance:
"According to a 2005 paper by Nigel Leader-Williams and colleagues in the Journal of International Wildlife Law and Policy... the legalization of white rhinoceros hunting in South Africa motivated private landowners to reintroduce the species onto their lands. As a result, the country saw an increase in white rhinos from fewer than one hundred individuals to more than 11,000, even while a limited number were killed as trophies."
In a 2013 study published in PLoS ONE, an international team of researchers zeroed in on the trophy hunting of lions. They found that the number of hunting kills in Africa has fallen considerably, down to just 244 per year. That number was as high as 550 a decade ago. They also urged countries with legalized lion trophy hunting to restrict trophy kills to males six years of age or older, to ban the hunting of females altogether, and to require minimum hunt lengths of at least 21 days to ensure that hunters are being properly selective. Most importantly, the researchers recommended that countries take an evidence-based approach to setting hunting quotas.
When it came to choosing between bans and management reforms for lion trophy hunting, the authors favored the latter.
"Reforms are arguably preferable to trade bans because they would provide scope for the retention of financial and economic incentives for the retention of land for wildlife and for tolerance of lions, while reducing the negative impacts on lion populations. Given the resilience of lions, populations affected by excessive trophy harvests would likely recover rapidly if lion hunting was managed more sustainably."
If properly wielded, trophy hunting can be a valuable tool for conservation. While Cecil the Lion's death is regrettable, let's make sure it doesn't result in knee-jerk decisions that could actually harm his species as a whole. That would make Cecil's demise even more of a tragedy than it already is.
(Image: Associated Press)
John Harvey Kellogg did not like sex. A Seventh-day Adventist since he was twelve years old, his distaste for procreation went far beyond even his religion's conservative stance. By all accounts, Kellogg abstained from the act his entire life, even through 41 years of marriage! He and his wife Ella slept in different rooms, and adopted all eight of their children.
Kellogg bore an even greater resentment towards masturbation, which he called the "solitary vice." As a prominent surgeon, speaker, and writer in the late 19th and early 20th centuries, he was in a solid position to make war on it.
Through books, speeches, and as chief medical officer of the Battle Creek Sanitarium, a wildly popular holistic health resort that attracted celebrities, presidents, and business moguls to its doors, Kellogg relentlessly lectured on the evils and risks of masturbation. Masturbation, he argued, led to impotence, urinary diseases, insanity, poor posture, acne, epilepsy, and blindness, as well as a host of other infirmities.
As Kellogg wrote in his book Plain Facts for Old and Young, to avoid the temptation of masturbation one must be wary of "sexual precocity, idleness, pernicious literature, abnormal sexual passions, exciting and irritating food, gluttony, sedentary employment, libidinous pictures, and many abnormal conditions of life..."
Kellogg particularly urged purity in diet.
"A man that lives on pork, fine-flour bread, rich pies and cakes, and condiments, drinks tea and coffee and uses tobacco, might as well try to fly as to be chaste in thought," he wrote.
Kellogg advocated against eating "spices, pepper, ginger, mustard, cinnamon, cloves, essences, all condiments, salt, pickles... fish, fowl, oysters, eggs, and milk."
"Stimulating drinks should be abstained from with still greater strictness," he added. "Wine, beer, tea, and coffee should be taken under no circumstances."
Partly to help his followers and patients stick to their bland, unstimulating diets, Kellogg and his brother invented corn flakes in 1878. For nearly twenty years, corn flakes were only available at Battle Creek, until the Kellogg brothers started the Sanitas Food Company and began to sell their cereal to the general public. Compared to the porridge and gruel commonly eaten around breakfast tables at the time, the crispy flakes were a hit, though they almost certainly didn't help to curb masturbation.
Today, of course, scientists agree that masturbation does lots of good, and little to no harm. It improves immune functioning, alleviates depression, and reduces the risk of prostate cancer, among many other benefits.
Despite Kellogg's quackery when it came to masturbation, as well as a few other pseudoscientific practices he recommended, he actually was far ahead of his time on some important healthy lifestyle issues. He contended that smoking caused lung cancer decades before the risks were conclusively known, and he recommended regular exercise. During an age when hypocrisy and corruption were rife among public figures, Kellogg practiced precisely what he preached, living his salubrious lifestyle to the ripe old age of 91.
(Image: Public Domain)
Approximately twenty million Americans visit a chiropractor each year, according to the American Chiropractic Association, making it the largest alternative medicine profession. But if those people were aware of these five facts about chiropractic, I wonder if they'd still be so keen to get their spines manipulated. If you haven't tried chiropractic, these facts might banish any desire to do so.
1. Chiropractic doesn't work. Thousands upon thousands of studies have placed chiropractic under the microscope, examining its effectiveness in treating conditions such as back pain, neck pain, infant colic, headache, and scoliosis. Some studies have found positive results, but many more have shown no effect whatsoever. When the jumble of mixed data is grouped together and examined, only one conclusion is warranted: "these data fail to demonstrate convincingly that spinal manipulation is an effective intervention for any condition."
2. There's a genuine risk of stroke. While spinal manipulation at the hands of a trained chiropractor is generally safe, there's a boatload of evidence to suggest that you should never let a chiropractor touch your neck. The vertebral artery, which supplies blood to the brain, passes through the top of your neck just below your skull. Abrupt manipulations of the cervical vertebrae in the neck can, and sometimes do, cause the artery to rupture, resulting in stroke, coma, or even death. As one would expect, the American Chiropractic Association denies the existence of these events.
3. Chiropractic's most fundamental theory is bunk. Chiropractic was founded on the idea that correcting misaligned vertebrae in the spine -- called subluxations -- could cure all forms of disease. "A subluxated vertebra ... is the cause of 95 percent of all diseases ... The other five percent is caused by displaced joints other than those of the vertebral column," D.D. Palmer, the creator of chiropractic, wrote. Most modern day chiropractors now admit that this is totally wrong.
In 2009, four curious chiropractors reviewed all available evidence on the topic and concluded, "No supportive evidence is found for the chiropractic subluxation being associated with any disease process or of creating suboptimal health conditions requiring intervention. Regardless of popular appeal, this leaves the subluxation construct in the realm of unsupported speculation."
4. Chiropractic's founder was probably crazy. D.D. Palmer created chiropractic back in the late 1800s, but if you asked him, he would say that he got the idea from a medical physician named Dr. Jim Atkinson. As humble as it is for Mr. Palmer to share credit, it's also a little strange, especially considering Jim Atkinson was dead, and according to Palmer, relayed the instructions for chiropractic from beyond the grave. According to B.J. Palmer, D.D.'s son, "Father often attended the annual Mississippi Valley Spiritualists Camp Meeting where he first claimed to receive messages from Dr. Jim Atkinson on the principles of chiropractic."
5. Chiropractic hurts. Simply put, spinal manipulation usually doesn't feel good. "It often involves a high velocity thrust, a technique in which the joints are adjusted rapidly, often accompanied by popping sounds," described Edzard Ernst, a Professor of Complementary Medicine at the University of Exeter. These disconcerting sounds are often harbingers of adverse side effects. Thirty to 61 percent of patients generally experience pain, numbness, stiffness, dizziness, tingling and headaches, which can persist for up to 48 hours after their appointment. These generally mild pains might be worth the discomfort if chiropractic actually worked in the long term. But it doesn't.
On December 20th, 2013, while waiting to board a flight to Cape Town, South Africa, 30-year-old Justine Sacco tweeted a mildly offensive joke to her 170 Twitter followers. The comment was so plainly absurd, she figured, that no one would take it literally:
Wheels down eleven hours later, she switched her phone off airplane mode to find that the joke had traveled far beyond her small circle of online followers. In fact, it had spread across the entire Internet! And people weren't laughing; they were outraged.
Mean-spirited tweets bombarded Justine from all angles. They demanded that she be fired from her job as senior director of corporate communications at IAC, the company that owns OKCupid and Vimeo. They publicly lambasted her for being racist, insensitive, stupid, and privileged. They threatened her with rape. Sacco was let go from her job soon thereafter, a livelihood erased by the clamorous whims of a social media mob.
"I cried out my body weight in the first 24 hours,” she later told journalist Jon Ronson. “It was incredibly traumatic. You don’t sleep. You wake up in the middle of the night forgetting where you are.”
How could thousands of people who never met Justine Sacco be so willing to ruthlessly destroy her?
Psychologists have actually known the answer to this question for decades. The human brain evolved to be social. Our ancestors who survived were the ones who stayed in a group, not strayed as individuals. As such, the desire to conform is hardwired, and it's an urge that can be harnessed both for good and for unforgiving brutality.
Psychological offshoots of our evolved sociality are groupthink, mob mentality, and the bandwagon effect, each of which has been repeatedly demonstrated in the scientific literature. All of these phenomena describe a propensity for individuals to conform to group beliefs and ideals. But as a side effect, they tend to evoke irrational behavior that goes against common sense and evidence. They hearken back to a more primal mode of thinking, one that still persists within us.
We've witnessed the results of such primitive thinking already, with casualties measured in lives, not jobs or Twitter followers. Roughly 40,000 to 60,000 "witches" were burned at the stake in Europe between 1450 and 1750, a mass murder resulting from hysteria and fear. These "witches" weren't actually casting black magic, of course. Often, they were old, cantankerous women viewed as nuisances by local nobility. Social media shaming is a modern day form of witch hunting, reinforcing ties within the group by expelling those who seem to go against social norms.
Though it may not go to the same extremes of physical brutality as witch-hunting, social media shaming can be even more excruciating mentally. The insults and comments that torture the targeted outcasts are regularly based on snap judgments made with little forethought. And yet, as vigilantes fire out their tweets and posts, they probably think they're doing some sort of good.
I wonder, did the bystanders who jeered as "witches" were burned alive think they were doing right by society as well?
Public shaming doesn't need to be eliminated -- if performed correctly, it can serve as a strong show of solidarity against unacceptable behaviors -- but it does need to be improved. The solution is civility and slow, measured scientific thinking. Wait for more evidence. Scrutinize that evidence. Did it come from a reputable source? Is it hard data or hearsay? Until there is evidence, don't draw conclusions. Above all, be empathetic. Give your fellow man and woman the benefit of the doubt.
With reason and thoughtful restraint, what happened to Justine Sacco need not befall anyone else.
Scarcely a week goes by without news of a blood shortage somewhere in the United States. Summertime in particular sees supplies on the wane. With families on vacation and schools out of session, the American Red Cross regularly witnesses a dip in donations.
But with one simple change, blood shortages in the United States could be drastically reduced, or perhaps eliminated entirely. It's a solution seemingly out of Count Dracula's playbook: drain blood from the dead.
Unpalatable and macabre at first glance, the idea actually makes a lot of sense. Roughly 15 million pints of blood are donated each year by approximately 9.2 million individuals. Over the course of the same year, about 2.6 million Americans will -- sadly -- pass away. If hospitals were to harvest the blood from a third of those people, roughly 4.5 million liters (about 9.5 million pints) would be added to the reservoir.
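That figure is easy to sanity-check with back-of-the-envelope arithmetic. Here's a minimal sketch; the ~5-liter average adult blood volume is my assumption, not a number from the article:

```python
# Rough check of the cadaver-blood estimate above.
# Assumption (mine, not the article's): an average adult body
# holds about 5 liters of blood, all of it recoverable.

ANNUAL_US_DEATHS = 2_600_000   # deaths per year, per the article
FRACTION_HARVESTED = 1 / 3     # the article's hypothetical
LITERS_PER_BODY = 5.0          # assumed average adult blood volume

recovered_liters = ANNUAL_US_DEATHS * FRACTION_HARVESTED * LITERS_PER_BODY
print(f"{recovered_liters / 1e6:.1f} million liters")  # prints "4.3 million liters"
```

At roughly 4.3 million liters, the sketch lands close to the article's "roughly 4.5 million"; the small gap suggests the original estimate assumed a slightly larger recoverable volume per body.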
Contrary to what you might think, blood from cadavers is not only usable, but quite safe.
"For six to eight hours, the blood inside a dead body remains sterile and the red blood cells retain their oxygen-carrying capabilities," Mary Roach reported in her book Stiff.
In fact, as Roach further described, "For twenty-eight years, the Sklifosovsky Institute [in Moscow] happily transfused cadaver blood, some twenty-five tons of the stuff, meeting 70 percent of its clinics' needs."
The idea has never caught on in the United States, however, primarily out of public distaste. Tampering with the body of a deceased individual frequently evokes ethical conundrums and moral aversions in the minds of many.
However, draining the blood from a body is hardly out of the ordinary; it's actually a regular part of the embalming process. To prepare a dead body for funeral services and eventual burial or cremation, morticians pump out all of the blood and interstitial fluids and replace them with an embalming solution, typically containing formaldehyde and methanol. Would it not make more sense to remove the blood at the hospital soon after death, rather than let it all go to waste?
Public opinion isn't the only hurdle to implementing this plan. Without a beating heart, blood does not flow, so hospital staff can't simply stick a syringe into the median cubital vein on an arm and expect blood to come spurting out. Nor can they necessarily use an embalming machine, which forces the blood out by suffusing the veins and arteries with fluid. That would likely contaminate the blood.
Instead, staff might have to be trained in a more primitive technique. After obtaining familial consent and conducting necessary tests, a larger needle attached to a more voluminous tube would be inserted into the jugular vein at the neck. Then the body would be tilted downward so the blood flows out with the aid of gravity. Simple, effective, yet perhaps a tad morbid...
According to the American Red Cross, someone in the U.S. needs blood every two seconds, and more than 41,000 donations are needed each day. Taking blood from cadavers could ensure that no patient is ever deprived of the life-giving blood they need.
With the "Dog Days of Summer" in full swing, school is the last thing on many young Americans' minds. But as sure as the changing of the seasons, school will return, along with homework, tests, and grades.
The demands of the college environment in particular quickly dispel the intellectual ease of summer. As students settle in for the semester's academic haul, some turn to "smart pills" to help shoulder the load.
These so-called "smart pills" are drugs intended to treat attention-deficit/hyperactivity disorder (ADHD). They include methylphenidate, best known as Ritalin, and mixed amphetamine salts, sold under the name Adderall. With prescriptions skyrocketing over the past few decades, these drugs have become incredibly easy for otherwise healthy college students to obtain. Studies examining their use for "nonmedical" purposes give wide-ranging estimates, but generally indicate that between five and thirty percent of college students have partaken. The most common reasons students report using the drugs are to boost concentration and attention, and to help study and memorize information.
But do the drugs actually do what students think they do? To the people I've surveyed who used the drugs, the answer is obvious: "Absolutely." But the science isn't so clear.
When reviewers at the University of Pennsylvania pored over the published scientific literature on the topic, they found mixed results, though they tentatively concluded that stimulant drugs like Adderall and Ritalin probably do enhance learning and working memory. The effects, however, would likely be small, and may not even translate to improved academic performance. The researchers also added that there are other, far safer and more effective methods of boosting brain power.
"Low-tech methods of cognitive enhancement include many components of what has traditionally been viewed as a healthy lifestyle, such as exercise, good nutrition, adequate sleep, and stress management," they wrote.
"Smart pills" present genuine risks, including irregular heartbeat and seizures for the occasional user, as well as dependence and depression for the frequent user. Their use is also commonly linked to perhaps the most counterproductive study practice: the all-nighter. Study after study after study has found that staying up all night to cram for an exam often yields little to no benefit. The loss of sleep hurts one's test scores as much as all-night studying helps.
"The rumored effects of 'smart drugs' may be a false promise," Shaheen Lakhan and Annette Kirchgessner, researchers at the Global Neuroscience Initiative Foundation, wrote in 2012.
If the "promise" is that a simple pill will boost one's grades, then it's definitely false. Our society seems ever on the hunt for miracle pills and magic cures, but rarely does one ever get pulled out of a hat. Louisiana State University psychologists presented a more nuanced view of "smart drugs" in 2013.
"Like many drugs, stimulants influence behavior in multiple ways," they wrote. "Depending on the circumstances, stimulants may, or may not, enhance cognition."
Immunologists are fond of making analogies with law enforcement to explain how the immune system works. Macrophages, cells which gobble up invading microbes, are often compared to beat cops, patrolling the neighborhood for any signs of trouble. Neutrophils, which my former graduate school advisor likens to "little hand grenades," are like miniature SWAT teams, rushing in with guns blazing, shooting first and asking questions later. T-cells, which coordinate the immune response, are akin to intelligence officers, while the antibody-producing B-cells, which target highly-wanted suspect pathogens, are similar to the FBI.
But none of these are the coolest cells in the immune system. That distinct honor goes to the natural killer (NK) cells, your body's very own "secret police."
Most of your body's cells, with the major exception of red blood cells, are covered with a sort of identity card called major histocompatibility complex (MHC) class I molecules. It is these proteins, along with their class II counterparts, that must be matched for an organ transplant to be successful. If there is too much of a mismatch between a donor's and a recipient's MHC proteins, the recipient's immune system will reject the organ.
This may, at first glance, seem unhelpful. Organ rejection is a bad thing. True, but our immune system didn't develop with organ transplants in mind. Instead, it evolved to fight off foreign invaders, and that is where the "identity card" function of the MHC proteins becomes invaluable.
NK cells keep vigilance by knocking on cells' doors, probably at midnight when they least expect it, asking for the secret password. Cells then flash their identity cards, i.e., the MHC class I molecules. Cells that are unable to show their identity cards are killed on the spot. No questions asked. (See figure.)
(Image: Janeway's Immunobiology. 8th Edition.)
As shown above, a healthy body cell expresses both an identity card (MHC class I) and an activating ligand. The latter tells the NK cell, "Kill me!" while the MHC class I molecule says, "I'm safe! Don't kill me!" If an NK cell receives both signals, it does nothing and lets the cell live. But cells that express an altered or damaged MHC class I molecule are unable to transmit the "Don't kill me!" signal. Thus, they are obliterated by the NK cells.
This method of law enforcement might seem rather harsh. However, NK cells know that tumor cells and cells infected with pathogens often are unable to properly express MHC class I molecules on their surfaces. The lack of an identity card is a telltale sign that something went horribly wrong inside the cell.
Though the term "secret police" makes us feel rather uncomfortable, at the microbiological level, this clandestine group of assassins is absolutely vital to our health. Indeed, the natural killer cells appear to be most active during the long lag when T-cells and B-cells are preparing to launch their own assault on a foreign invader.
Source: Murphy KM (2012). Janeway's Immunobiology. 8th Edition. New York: Garland Science.
The Affordable Care Act required many types of birth control to be made available to women free of charge, but not everyone can take advantage. Millions of women too poor to afford private insurance or working at employers granted a religious exemption to the law are still on the hook for contraception, the costs of which can range from $15 to $50 a month for pills to an $800 up-front cost for an intrauterine device (IUD).
There is, of course, a simpler solution to ensure that women have access to contraception: make it free for all low-income women. Publicly funded birth control.
Public birth control is an issue enveloped in ideology. Let's strip it all away and look at the evidence.
From 2007 to 2011, doctors at the Washington University School of Medicine in St. Louis provided free birth control and counseling services to 9,256 women and girls ages 14 to 45 in the St. Louis, Missouri area. The effort was a resounding success. Abortion rates among study participants were 62 to 78% lower than the national average. Critically, among girls ages 15 to 19 enrolled in the study, the annual birth rate was 6.3 per 1,000, well below the U.S. rate of 34.3 per 1,000 for girls the same age.
Researchers from the University of North Carolina conducted a similar study in 2010, providing free IUDs to low-income women and comparing their pregnancy rates to a control group of women without free birth control. Rates of pregnancy were significantly lower in the group given IUDs.
Will these successes bear out on a large scale? If Colorado is any indication, the answer is an emphatic "yes." In 2009, the state began offering IUDs to low-income women at low- or no-cost. Four years later, the state's teen birth rate had dropped 40%, and in the counties with the program in place, the teen abortion rate plummeted 35%. What's more, every dollar spent on the contraceptive program saved roughly $5.68 in Medicaid costs.
The United States currently spends $2.37 billion on family planning services, mostly through Medicaid. Many of these services offer birth control free of charge. The nonprofit Guttmacher Institute has quantified the benefits:
"In 2010, publicly funded contraception services helped women prevent 2.2 million unintended pregnancies; 1.1 million of these would have resulted in unplanned births and 760,000 in abortions. Without publicly funded contraceptive services, the rate of unintended pregnancies, unplanned births, and abortions would all be 66% higher; the rates for teens would be 73% higher."
The Guttmacher Institute also found cost savings similar to those seen in Colorado.
"In 2010, these services resulted in a net savings to the federal and state governments of $13.6 billion — $7 for every public dollar spent," Kinsey Hasstedt, a public policy associate at the Guttmacher Institute, wrote in the New York Times.
Teen birth rates have been steadily dropping for decades, but as RCS' Alex Berezow reported last year, they are still unacceptably high.
"While fewer than 1 in 20 white teens became pregnant, about 1 in 10 Hispanic teens and more than 1 in 9 black teens became pregnant. Fetal loss was more than twice as common among black and Hispanic teens than among white teens, possibly indicating poorer access to prenatal care. And, more than 1 in 3 black teen pregnancies ended in abortion."
These statistics represent real situations that need not occur. Free contraception, accompanied by counseling on both abstinence and safe sexual activity, is an evidence-based remedy.
Here at RealClearScience, we pride ourselves on five things: (1) Explaining the science behind complex topics; (2) Debunking bad science or pseudoscience; (3) Endorsing policies and opinions that we feel are best aligned with scientific evidence; (4) Having the site operated by people who were trained in science, not journalism; and (5) Remaining politically agnostic and as transparent as possible.
Dedication to these principles has led us to operate in ways that few other media outlets would dare imitate. For instance, in an article titled "What RealClearScience Is For and Against," we listed our position on every controversial scientific topic that is discussed in the media. Perhaps surprisingly, our entire editorial team was in 100% agreement on the list, though each of us comes from different ideological backgrounds.
How is that possible? Because, as we have written before, we are first and foremost dedicated to a fact-based worldview. If the evidence changes, our opinion changes. Adherence to that simple mandate is quite liberating: It allows us the freedom to be curious and to follow the scientific data wherever it may lead. That intellectual emancipation further allows us to remain open and honest and, most importantly, to shake off the rigid partisanship that has gripped too much of America's mainstream media.
So, in honor of that tradition of scientific transparency that we began nearly five years ago, we felt that it would be important to reveal exactly where each of us fell on the political spectrum. Each of us took a "political quiz" (courtesy of CelebrityTypes.com) -- which by our estimation appears fairly accurate -- and we have posted the results below. (How many other journalists do you think would be willing to do that?)
Detailed descriptions of the chart can be seen here, but here's a quick-and-dirty way to interpret it:
Left/Right = Applies to economic policy
Liberal/Communitarian = Translates to "individual rights" vs. "societal rights"
Red = Republicans (conservatism)
Yellow = Libertarians (libertarianism)
Blue = Democrats (social liberalism)
Green = Socialists (social democracy)
Editor Alex Berezow, who founded RealClearScience in October 2010, describes himself as a political centrist. His test result mostly confirms this description. He falls precisely halfway along the liberal/communitarian axis (meaning that he values individual and societal rights roughly equally), but he skews slightly to the right on economic policy.
Assistant editor Ross Pomeroy, who joined RealClearScience in June 2011, describes himself as a "lefty who is annoyed by most lefties." His test result mostly confirms the first part of that description, though it can't measure the latter.
Finally, our Newton Blog contributor and physics connoisseur, Tom Hartsfield, describes himself as a libertarian. His test result surprised him a bit; he was quite far to the right on economic policy (which is typical for a libertarian), but nearly centrist on the liberal/communitarian axis (which is not typical for a libertarian).
If we average our scores together, we obtain:
Left/Right: 33.3 (out of 100) to the Right
Liberal/Communitarian: 16.7 (out of 100) toward Liberal
Thus, as a whole, the RealClearScience editorial team is mostly centrist, with a very slight libertarian skew. And, we aren't afraid to admit it, because when it comes to science, we do our best to put all of our ideological baggage aside and analyze the data as objectively as we can.
Now, we challenge all media outlets to be as transparent as RealClearScience!
When free-reading time was announced during class trips to the elementary school library, I knew exactly where I was headed. Little eight-year-old me would stroll to a section in the shadow of a raised fort in the back corner. There, I'd pull out books on Bigfoot, UFOs, yetis, ghosts, the Loch Ness Monster, and the Bermuda Triangle, plop myself down on a bean bag chair, and flip through pages of what seemed to me to be real-life fantasy. Reading about sightings, disappearances, and unexplained occurrences, I was totally entranced. This, ironically, was where my interest in science truly began.
My experience is not unique. Paul Willis, director of RiAus, Australia's Science Channel, has noticed a similar thread among many of the scientists he's met during his long career.
"I've been struck by how many of my colleagues also shared an early interest in the pseudosciences," he wrote at ABC Science.
And why shouldn't this be the case? At the heart of the scientific endeavor is unbridled curiosity, a desire to seek out the strange and explain the unexplained. Without this flame, there is no fire. Fantastical pseudoscience can easily provide an initial spark. First enthralled by Bigfoot, budding zoologists might turn their attention to the no less amazing reservoir of undescribed species. Baffled by UFOs, young stargazers might seek out new life on other worlds, rather than wait to be "visited" on Earth.
To me, and to many scientists, pseudoscientific stories in our youth were enticing prospects, accounts we wanted to believe then, and still want to believe now. But there was a critical time when we learned that belief does not make something real. For that, we need evidence.
There's nothing sad about this evolution; it's simply a maturation of thought. There comes a time when reading slim accounts of mythical phenomena just isn't enough anymore. Scientists need to see those things, or even make them come alive. The fantastical is infinitely more amazing when it's genuinely real.
Purveyors of pseudoscience regularly accuse skeptical scientists of close-mindedness. The opposite is the case. Zoologists would be ecstatic if Sasquatch actually roamed the forests of the Pacific Northwest, or if a descendant of plesiosaurs dwelled in Scotland's Loch Ness. Physicists would love to find concrete evidence of ghosts, for their existence would surely indicate a new energy source, or even an undiscovered dimension. Perhaps the most famous of skeptics, astrophysicist Carl Sagan, openly admitted, "Nobody would be happier than me if we were being visited [by aliens]."
But, he added, "What counts is not what sounds plausible, not what we'd like to believe, not what one or two witnesses claim, but only what is supported by hard evidence, rigorously and skeptically examined."
That's how we know the difference between what is truly real and what can only be found on the stylized pages of a book.
Thomas Gold was a deep thinker. A fellow of eight distinguished scientific organizations and a winner of numerous prestigious prizes, the Austrian-born American astrophysicist poured his heart and mind into science. Working out of Cornell University for much of his career, Gold published dozens of papers in astronomy and geophysics, thought up many ingenious, often controversial theories, helped establish the now iconic Arecibo Observatory in Puerto Rico, and hired a budding 34-year-old astrophysicist by the name of Carl Sagan.
Noted for his willingness to venture down the metaphorical rabbit hole, Gold climbed down a particularly deep one in the latter half of his life. By the 1970s, scientists broadly accepted that oil was of a biological origin. Labeling oil a "fossil fuel," they theorized that deceased algae and plankton sank to the bottom of oceans and, over millions of years, were slowly transformed into oil by pressure and heat. Gold wasn't so sure.
Noticing that hydrocarbons, organic compounds of hydrogen and carbon, are present on and in other astronomical bodies, Gold reasoned that Earth's oil might originate from nonliving sources instead of living ones. He argued that an unfathomable amount of hydrocarbons were locked within Earth, and that as they seeped upwards, they were converted to oil and gas. He eventually outlined his theory in a book, The Deep Hot Biosphere, making two other fantastical propositions as well:
"that below the surface of the earth is a biosphere of greater mass and volume than the biosphere the total sum of living things on our planet's continents and in its oceans... [and] that the inhabitants of this subterranean biosphere are not plants or animals as we know them, but heat-loving bacteria that survive on a diet consisting solely of hydrocarbons that is, natural gas and petroleum."
In short, Gold claimed that within our world lies another world! It is an undeniably exciting prospect, but there is currently no meaningful way to explore it. Without evidence, a deep hot biosphere remains in the realm of fiction. We can, however, examine Gold's original claim: that oil and gas are not fossil fuels, that they are instead abiotic.
To his credit, Gold spent years molding and testing his idea. First conceiving of the theory in the 1950s, he let it simmer in his brain for around twenty years before finally fleshing it out. In numerous published works he contended that the movement of tectonic plates and faults allowed methane to migrate up through the mantle, where, through cooling and other processes, it would transform into crude oil.
In 1986, Gold used his influential standing to procure $40 million for a drilling expedition in a 360-million-year-old impact crater near Lake Siljan in Sweden. This would be his moment.
According to Gold's theory, abiotic oil should have been present below the crater in great quantities. It wasn't. Despite drilling down over six kilometers in two locations, the costly operation turned up just 80 barrels of oil over six years, and there was no definitive proof that it was abiotic in origin.
Empty-handed, and limited by the technology available, Gold could only return to theorizing. He speculated that the oil he recovered in Sweden may have started as hydrocarbons rising from deep within the Earth, which were then converted to oil by deep-dwelling thermophilic bacteria. Additional knowledge of the Earth's mantle procured years later would provide critical evidence against his theory of abiotic oil. According to the University of Tokyo's Geoffrey Glasby:
"Methane can only be converted to higher hydrocarbons at pressures >30 kbar corresponding to a depth of ~100 km below the Earth's surface. The proposed reaction of methane to produce higher hydrocarbons above this depth and, in particular, in the upper layers of the Earth's crust is therefore not consistent with the second law of thermodynamics."
Despite the evidence against Gold's theory of abiotic oil, it cannot be completely ruled out. Though the weight of evidence supports a fossil fuel origin of oil and gas, most geologists admit there's still a tiny chance it might be wrong. Abiotic oil could very well exist, perhaps in small quantities.