Over the past month, the first close-ups ever taken of Pluto were beamed back from the New Horizons spacecraft, and our best images of the dwarf planet made an enormous leap in detail.
NASA describes the bounty of new science seen in these photographs as "flowing ices, exotic surface chemistry, mountain ranges, and vast haze." The flowing ices behave like Earth's glaciers, but are made of frozen nitrogen instead of water. Other surface features include mountains likely made of water ice. Understanding the geology of Pluto is an entire scientific field that just turned three weeks old.
The New Horizons Pluto flyby is the latest of several space missions designed to reveal the little details of our solar system. We've known about the big obvious stuff for a long time. The Sun and each planet out to Saturn can be seen with the naked eye. These bodies, as well as the more distant gas giants, have been studied through telescopes for centuries. Between 1960 and 1990 we took a closer look: flying past, orbiting, crashing into, or landing on every planet.
Now we are exploring the smaller, more distant stuff: objects in and around the asteroid belt, the strange bodies that make up the Kuiper belt beyond Neptune, the comets that live farther out still (but occasionally plunge in past us), and the many intriguing moons that orbit the larger bodies. These are the details that transform a sketch of the solar system into a complete landscape.
The asteroid belt is home to a huge number of tiny rocks, but fully half of its mass is concentrated in just four larger bodies. NASA's Dawn mission orbited the second largest, Vesta, in 2011 and reached the largest, Ceres, this year. Ceres is recognizable as a nearly round dwarf planet, while Vesta is slightly too aspherical to qualify. A body whose gravity is strong enough to pull all of its mass into a nearly round shape is said to be in hydrostatic equilibrium, the property currently used to separate the eight major planets and the dwarf planets from smaller objects.
The Kuiper belt is home to Pluto, as well as at least three more dwarf planets similar in size to Pluto. We have yet to see Eris, Haumea or Makemake in detail. Dozens to hundreds more dwarf planets likely also exist in this region. These objects are hard to discover and even harder to measure because they are too distant to reflect enough sunlight to appear as more than mere specks.
Rosetta and Philae, the European Space Agency's comet probe and lander, have visited two more asteroids and touched down on a comet. The mission discovered what makes up a comet's surface and atmosphere, and also fueled some hilarious crackpot speculation.
We've also been checking out interesting moons. Saturn's moon Titan has been called the closest thing to Earth in our solar system after the Cassini space probe found a thick atmosphere, dunes, mountains, and lakes there. Flybys of Enceladus, Io, and Triton revealed hot, active internal geology and tectonics, punctuated by volcanoes erupting into space. Colder cryovolcanoes, ejecting liquid water or ammonia instead of lava, have been observed as well. A tiny moon (Dactyl) was even found orbiting a mid-size asteroid (243 Ida).
After Voyager 2 flew by Neptune in 1989, the first era of solar system exploration was in some sense complete. Now, we've finished much of the second phase. We've had a lot of success visiting dwarf planets, comets, asteroids and moons. What comes next? Manned exploration, perhaps? Hopefully we'll at least send some kick-ass robots.
By now, you've probably heard that Cecil, one of Africa's most beloved lions, was killed by a Minnesota dentist on a trophy hunt. Cecil, who dwelled in Zimbabwe's Hwange National Park and was closely monitored via GPS collar by scientists at Oxford, was apparently lured out of the wildlife refuge -- where hunting is illegal -- shot with a crossbow, stalked, and finally finished off with a rifle forty hours later. The incident, whose legality has yet to be conclusively determined, has already sparked international outrage. Many are calling for lion trophy hunting to be banned altogether.
Whatever your personal feelings about killing a majestic animal solely to have a trophy (I personally find it repugnant), reasoned discourse demands a dispassionate examination of the evidence, and the evidence suggests that, when managed properly, trophy hunting isn't a problem, and can actually help species recover.
African lions have taken a beating over the decades. While they numbered in the hundreds of thousands a century ago, today, between 23,000 and 39,000 animals remain, spread across just one-fifth of their historic territory. The International Union for Conservation of Nature lists lions as "vulnerable," one step less dire than "endangered." Habitat loss, disease, and human interference are the major reasons for the decline.
Considering those dire statistics, you might think that the IUCN would oppose trophy hunting. After all, the singular act of killing reduces a species' population. But the organization actually supports it.
"Trophy hunting is a form of wildlife use that, when well managed, may assist in furthering conservation objectives by creating the revenue and economic incentives for the management and conservation of the target species and its habitat, as well as supporting local livelihoods," the IUCN announced in a 2012 report.
That same report reveals two case studies where the establishment and proper regulation of trophy hunting grounds actually helped threatened animal populations recover. Nature writer Richard Conniff shared even more examples in a 2014 op-ed published in the New York Times, including that of Namibia, where lion populations are now increasing. In Conservation Magazine, Jason Goldman shared another instance:
"According to a 2005 paper by Nigel Leader-Williams and colleagues in the Journal of International Wildlife Law and Policy... the legalization of white rhinoceros hunting in South Africa motivated private landowners to reintroduce the species onto their lands. As a result, the country saw an increase in white rhinos from fewer than one hundred individuals to more than 11,000, even while a limited number were killed as trophies."
In a 2013 study published in PLoS ONE, an international team of researchers zeroed in on the trophy hunting of lions. They found that the number of hunting kills in Africa has fallen considerably, down to just 244 per year. That number was as high as 550 a decade ago. They also urged countries with legalized lion trophy hunting to restrict trophy kills to males six years of age or older, to ban the hunting of females altogether, and to require minimum hunt lengths of at least 21 days to ensure that hunters are properly selective. Most importantly, the researchers recommended that countries take an evidence-based approach to setting hunting quotas.
When choosing between bans and management reforms for lion trophy hunting, the authors opted for the latter.
"Reforms are arguably preferable to trade bans because they would provide scope for the retention of financial and economic incentives for the retention of land for wildlife and for tolerance of lions, while reducing the negative impacts on lion populations. Given the resilience of lions, populations affected by excessive trophy harvests would likely recover rapidly if lion hunting was managed more sustainably."
If properly wielded, trophy hunting can be a valuable tool for conservation. While Cecil the Lion's death is regrettable, let's make sure it doesn't result in knee-jerk decisions that could actually harm his species as a whole. That would make Cecil's demise even more of a tragedy than it already is.
(Image: Associated Press)
John Harvey Kellogg did not like sex. A Seventh-day Adventist since he was twelve years old, his distaste for procreation went far beyond even his religion's conservative stance. By all accounts, Kellogg abstained from the act his entire life, even through 41 years of marriage! He and his wife Ella slept in different rooms, and adopted all eight of their children.
Kellogg bore an even greater resentment towards masturbation, which he called the "solitary vice." As a prominent surgeon, speaker, and writer in the late 19th and early 20th centuries, he was in a solid position to make war on it.
Through books, speeches, and as chief medical officer of the Battle Creek Sanitarium, a wildly popular holistic health resort that attracted celebrities, presidents, and business moguls to its doors, Kellogg relentlessly lectured on the evils and risks of masturbation. Masturbation, he argued, led to impotence, urinary diseases, insanity, poor posture, acne, epilepsy, and blindness, as well as a host of other infirmities.
As Kellogg wrote in his book Plain Facts for Old and Young, to avoid the temptation of masturbation one must be wary of "sexual precocity, idleness, pernicious literature, abnormal sexual passions, exciting and irritating food, gluttony, sedentary employment, libidinous pictures, and many abnormal conditions of life..."
Kellogg particularly urged purity in diet.
"A man that lives on pork, fine-flour bread, rich pies and cakes, and condiments, drinks tea and coffee and uses tobacco, might as well try to fly as to be chaste in thought," he wrote.
Kellogg advocated against eating "spices, pepper, ginger, mustard, cinnamon, cloves, essences, all condiments, salt, pickles... fish, fowl, oysters, eggs, and milk."
"Stimulating drinks should be abstained from with still greater strictness," he added. "Wine, beer, tea, and coffee should be taken under no circumstances."
Partly to help his followers and patients stick to their bland, unstimulating diets, Kellogg and his brother invented corn flakes in 1878. For nearly twenty years, corn flakes were only available at Battle Creek, until the Kellogg brothers started the Sanitas Food Company and began to sell their cereal to the general public. Compared to the porridge and gruel commonly eaten around breakfast tables at the time, the crispy flakes were a hit, though they almost certainly didn't help to curb masturbation.
Today, of course, scientists agree that masturbation does lots of good, and little to no harm. It improves immune functioning, alleviates depression, and reduces the risk of prostate cancer, among many other benefits.
Despite Kellogg's quackery when it came to masturbation, as well as a few other pseudoscientific practices he recommended, he was actually far ahead of his time on some important healthy lifestyle issues. He contended that smoking caused lung cancer decades before the risks were conclusively known, and he recommended regular exercise. During an age when hypocrisy and corruption were rife among public figures, Kellogg practiced precisely what he preached, living his salubrious lifestyle to the ripe old age of 91.
(Image: Public Domain)
Approximately twenty million Americans visit a chiropractor each year, according to the American Chiropractic Association, making chiropractic the largest alternative medicine profession. But if those twenty million were aware of the following five facts, I wonder if they'd still be so keen to have their spines manipulated. And if you haven't tried chiropractic, these facts might banish any desire to do so.
1. Chiropractic doesn't work. Thousands upon thousands of studies have placed chiropractic under the microscope, examining its effectiveness in treating conditions such as back pain, neck pain, infant colic, headache, and scoliosis. Some studies have found positive results, but many more have shown no effect whatsoever. When the jumble of mixed data is grouped together and examined, only one conclusion is warranted: "these data fail to demonstrate convincingly that spinal manipulation is an effective intervention for any condition."
2. There's a genuine risk of stroke. While spinal manipulation at the hands of a trained chiropractor is generally safe, there's a boatload of evidence to suggest that you should never let a chiropractor touch your neck. The vertebral arteries, which supply blood to the brain, thread through the cervical vertebrae and pass just below the skull. Abrupt manipulations of the neck can, and have, caused an artery to rupture, resulting in stroke, coma, or even death. As one would expect, the American Chiropractic Association denies the existence of these events.
3. Chiropractic's most fundamental theory is bunk. Chiropractic was founded on the idea that correcting misaligned vertebrae in the spine -- called subluxations -- could cure all forms of disease. "A subluxated vertebra ... is the cause of 95 percent of all diseases ... The other five percent is caused by displaced joints other than those of the vertebral column," D.D. Palmer, the creator of chiropractic, wrote. Most modern day chiropractors now admit that this is totally wrong.
In 2009, four curious chiropractors reviewed all available evidence on the topic and concluded, "No supportive evidence is found for the chiropractic subluxation being associated with any disease process or of creating suboptimal health conditions requiring intervention. Regardless of popular appeal, this leaves the subluxation construct in the realm of unsupported speculation."
4. Chiropractic's founder was probably crazy. D.D. Palmer created chiropractic back in the late 1800s, but if you asked him, he would say that he got the idea from a medical physician named Dr. Jim Atkinson. As humble as it is for Mr. Palmer to share credit, it's also a little strange, especially considering Jim Atkinson was dead, and according to Palmer, relayed the instructions for chiropractic from beyond the grave. According to B.J. Palmer, D.D.'s son, "Father often attended the annual Mississippi Valley Spiritualists Camp Meeting where he first claimed to receive messages from Dr. Jim Atkinson on the principles of chiropractic."
5. Chiropractic hurts. Simply put, spinal manipulation usually doesn't feel good. "It often involves a high velocity thrust, a technique in which the joints are adjusted rapidly, often accompanied by popping sounds," wrote Edzard Ernst, a Professor of Complementary Medicine at the University of Exeter. These disconcerting sounds are often harbingers of adverse side effects. Thirty to 61 percent of patients experience pain, numbness, stiffness, dizziness, tingling, or headaches, which can persist for up to 48 hours after their appointment. These generally mild pains might be worth the discomfort if chiropractic actually worked in the long term. But it doesn't.
On December 20th, 2013, while waiting to board a flight to Cape Town, South Africa, 30-year-old Justine Sacco tweeted a mildly offensive joke to her 170 Twitter followers. The comment was so obviously absurd that she didn't think anyone would take it literally:
Wheels down eleven hours later, she took her phone off airplane mode to find that the joke had traveled far beyond her small circle of followers. In fact, it seemed to have spread across the entire Internet. And people weren't laughing; they were outraged.
Mean-spirited tweets bombarded Justine from all angles. They demanded that she be fired from her job as senior director of corporate communications at IAC, the company that owns OKCupid and Vimeo. They publicly lambasted her for being racist, insensitive, stupid, and privileged. They threatened her with rape. Sacco was let go from her job soon thereafter, a livelihood erased by the clamorous whims of a social media mob.
"I cried out my body weight in the first 24 hours," she later told journalist Jon Ronson. "It was incredibly traumatic. You don't sleep. You wake up in the middle of the night forgetting where you are."
How could thousands of people who never met Justine Sacco be so willing to ruthlessly destroy her?
Psychologists have actually known the answer to this question for decades. The human brain evolved to be social. Our ancestors who survived were the ones who stayed in a group, not strayed as individuals. As such, the desire to conform is hardwired, and it's an urge that can be harnessed both for good and for unforgiving brutality.
Psychological offshoots of our evolved sociality are groupthink, mob mentality, and the bandwagon effect, each of which has been repeatedly demonstrated in the scientific literature. All of these phenomena describe a propensity for individuals to conform to group beliefs and ideals. But as a side effect, they tend to evoke irrational behavior that goes against common sense and evidence. They hearken back to a more primal mode of thinking, one that still persists within us.
We've witnessed the results of such primitive thinking already, with casualties measured in lives, not jobs or Twitter followers. Roughly 40,000 to 60,000 "witches" were burned at the stake in Europe between 1450 and 1750, a mass murder resulting from hysteria and fear. These "witches" weren't actually casting black magic, of course. Often, they were old, cantankerous women viewed as nuisances by local nobility. Social media shaming is a modern-day form of witch hunting, reinforcing ties within the group by expelling those who seem to go against social norms.
Though it may not go to the same extremes of physical brutality as witch-hunting, social media shaming can be even more excruciating mentally. The insults and comments that torture its targets are regularly based on snap judgments made with little forethought. And yet, as vigilantes fire out their tweets and posts, they probably think they're doing some sort of good.
I wonder, did the bystanders who jeered as "witches" were burned alive think they were doing right by society as well?
Public shaming doesn't need to be eliminated -- if performed correctly, it can serve as a strong show of solidarity against unacceptable behaviors -- but it does need to be improved. The solution is civility and slow, measured scientific thinking. Wait for more evidence. Scrutinize that evidence. Did it come from a reputable source? Is it hard data or hearsay? Until there is evidence, don't draw unfounded conclusions. Above all, be empathetic. Give your fellow man and woman the benefit of the doubt.
With reason and thoughtful restraint, what happened to Justine Sacco need not befall anyone else.
Scarcely a week goes by without news of a blood shortage somewhere in the United States. Summertime in particular sees supplies on the wane. With families on vacation and schools out of session, the American Red Cross regularly witnesses a dip in donations.
But with one simple change, blood shortages in the United States could be drastically reduced, or perhaps eliminated entirely. It's a solution seemingly out of Count Dracula's playbook: drain blood from the dead.
Unpalatable and macabre at first glance, the idea actually makes a lot of sense. Roughly 15 million pints of blood are donated each year by approximately 9.2 million individuals. Over the course of the same year, about 2.6 million Americans will -- sadly -- pass away. If hospitals were to harvest the blood from a third of those people, roughly 4.5 million liters (about 9.5 million pints) would be added to the reservoir.
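The arithmetic above can be sanity-checked with a quick back-of-the-envelope calculation. This sketch assumes an average blood volume of roughly 5 liters per adult body, a figure not given in the article:

```python
# Back-of-the-envelope check of the cadaver-blood estimate.
deaths_per_year = 2_600_000   # approximate annual U.S. deaths (from the article)
harvest_fraction = 1 / 3      # share of the deceased donating (from the article)
liters_per_body = 5.0         # assumed average adult blood volume (an assumption)

total_liters = deaths_per_year * harvest_fraction * liters_per_body
print(f"{total_liters / 1e6:.1f} million liters")  # roughly 4.3 million liters
```

At about 4.3 million liters, the result lands close to the article's figure of 4.5 million; the small gap simply reflects the assumed per-body blood volume.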
Contrary to what you might think, blood from cadavers is not only usable, but quite safe.
"For six to eight hours, the blood inside a dead body remains sterile and the red blood cells retain their oxygen-carrying capabilities," Mary Roach reported in her book Stiff.
In fact, as Roach further described, "For twenty-eight years, the Sklifosovsky Institute [in Moscow] happily transfused cadaver blood, some twenty-five tons of the stuff, meeting 70 percent of its clinics' needs."
The idea has never caught on in the United States, however, primarily out of public distaste. Tampering with the body of a deceased individual frequently evokes ethical conundrums and moral aversions in the minds of many.
However, draining the blood from a body is hardly out of the ordinary; it's actually a regular part of the embalming process. To prepare a dead body for funeral services and eventual burial or cremation, morticians pump out all of the blood and interstitial fluids and replace them with an embalming solution, typically containing formaldehyde and methanol. Would it not make more sense to remove the blood at the hospital soon after death, rather than let it all go to waste?
Public opinion isn't the only hurdle to implementing this plan. Without a beating heart, blood does not flow, so hospital staff can't simply stick a syringe into the median cubital vein on an arm and expect blood to come spurting out. Nor can they necessarily use an embalming machine, which forces the blood out by suffusing the veins and arteries with fluid. That would likely contaminate the blood.
Instead, staff might have to be trained in a more primitive technique. After obtaining familial consent and conducting necessary tests, a larger needle attached to a more voluminous tube would be inserted into the jugular vein at the neck. Then the body would be tilted downward so the blood flows out with the aid of gravity. Simple, effective, yet perhaps a tad morbid...
According to the American Red Cross, someone in the U.S. needs blood every two seconds, and more than 41,000 donations are needed each day. Taking blood from cadavers could ensure that no patient is ever deprived of the life-giving blood they need.
With the "Dog Days of Summer" in full swing, school is the last thing on many young Americans' minds. But as sure as the changing of the seasons, school will return, along with homework, tests, and grades.
The demands of the college environment in particular quickly dispel the intellectual ease of summer. As students set in for the semester's academic haul, some turn to "smart pills" to help shoulder the load.
These so-called "smart pills" are drugs intended to treat attention-deficit/hyperactivity disorder (ADHD). They include methylphenidate, best known as Ritalin, and mixed amphetamine salts, sold under the name Adderall. With prescriptions skyrocketing over the past decades, these drugs have become incredibly easy for otherwise healthy college students to obtain. Studies examining their use for "nonmedical" purposes give wide-ranging estimates, but generally indicate that between five and thirty percent of college students have partaken. The most common reasons students report using the drugs are to boost concentration and attention, and to help study and memorize information.
But do the drugs actually do what students think they do? To the people I've surveyed who used the drugs, the answer is obvious: "Absolutely." But the science isn't so clear.
When reviewers at the University of Pennsylvania pored over the published scientific literature on the topic, they found mixed results, though they tentatively concluded that stimulant drugs like Adderall and Ritalin probably do enhance learning and working memory. The effects, however, are likely small, and may not even translate to improved academic performance. The researchers also added that there are other, far safer and more effective methods of boosting brain power.
"Low-tech methods of cognitive enhancement include many components of what has traditionally been viewed as a healthy lifestyle, such as exercise, good nutrition, adequate sleep, and stress management," they wrote.
"Smart pills" present genuine risks, including irregular heartbeat and seizures for the occasional user, as well as dependence and depression for the frequent user. Their use is also commonly linked to perhaps the most counterproductive study practice: the all-nighter. Study after study after study has found that staying up all night to cram for an exam often yields little to no benefit. The loss of sleep hurts one's test scores as much as all-night studying helps.
"The rumored effects of 'smart drugs' may be a false promise," Shaheen Lakhan and Annette Kirchgessner, researchers at the Global Neuroscience Initiative Foundation, wrote in 2012.
If the "promise" is that a simple pill will boost one's grades, then it's definitely false. Our society seems ever on the hunt for miracle pills and magic cures, but rarely does one ever get pulled out of a hat. Louisiana State University psychologists presented a more nuanced view of "smart drugs" in 2013.
"Like many drugs, stimulants influence behavior in multiple ways," they wrote. "Depending on the circumstances, stimulants may, or may not, enhance cognition."
Immunologists are fond of making analogies with law enforcement to explain how the immune system works. Macrophages, cells which gobble up invading microbes, are often compared to beat cops, patrolling the neighborhood for any signs of trouble. Neutrophils, which my former graduate school advisor likens to "little hand grenades," are like miniature SWAT teams, rushing in with guns blazing, shooting first and asking questions later. T-cells, which coordinate the immune response, are akin to intelligence officers, while the antibody-producing B-cells, which target highly wanted suspect pathogens, are similar to the FBI.
But none of these are the coolest cells in the immune system. That distinct honor goes to the natural killer (NK) cells, your body's very own "secret police."
Most of your body's cells, with the major exception of red blood cells, are covered with a sort of identity card called major histocompatibility complex (MHC) class I molecules. It is these proteins, along with their class II counterparts, that must be matched for an organ transplant to be successful. If there is too much of a mismatch between a donor's and a recipient's MHC proteins, the recipient's immune system will reject the organ.
This may, at first glance, seem unhelpful. Organ rejection is a bad thing. True, but our immune system didn't develop with organ transplants in mind. Instead, it evolved to fight off foreign invaders, and that is where the "identity card" function of the MHC proteins becomes invaluable.
NK cells keep watch by knocking on cells' doors, probably at midnight when cells least expect it, and asking for the secret password. Cells then flash their identity cards, i.e., the MHC class I molecules. Cells that are unable to show their identity cards are killed on the spot. No questions asked. (See figure.)
(Image: Janeway's Immunobiology. 8th Edition.)
As shown above, a healthy body cell expresses both an identity card (MHC class I) and an activating ligand. The latter tells the NK cell, "Kill me!" while the MHC class I molecule says, "I'm safe! Don't kill me!" If an NK cell receives both signals, it does nothing and lets the cell live. But cells that express an altered or damaged MHC class I molecule cannot transmit the "Don't kill me!" signal. Thus, they are obliterated by the NK cells.
This method of law enforcement might seem rather harsh. However, NK cells know that tumor cells and cells infected with pathogens often are unable to properly express MHC class I molecules on their surfaces. The lack of an identity card is a telltale sign that something went horribly wrong inside the cell.
Though the term "secret police" makes us feel rather uncomfortable, at the microbiological level, this clandestine group of assassins is absolutely vital to our health. Indeed, the natural killer cells appear to be most active during the long lag when T-cells and B-cells are preparing to launch their own assault on a foreign invader.
Source: Murphy KM (2012). Janeway's Immunobiology. 8th Edition. New York: Garland Science.
The Affordable Care Act required many types of birth control to be made available to women free of charge, but not everyone can take advantage. Millions of women too poor to afford private insurance, or working for employers granted a religious exemption to the law, are still on the hook for contraception, the costs of which range from $15 to $50 a month for pills to an $800 up-front cost for an intrauterine device (IUD).
There is, of course, a simpler solution to ensure that women have access to contraception: make it free for all low-income women. Publicly funded birth control.
Public birth control is an issue enveloped in ideology. Let's strip it all away and look at the evidence.
From 2007 to 2011, doctors at the Washington University School of Medicine in St. Louis provided free birth control and counseling services to 9,256 women and girls ages 14 to 45 in the St. Louis, Missouri area. The effort was a resounding success. Abortion rates among study participants were 62 to 78% lower than the national average. Critically, among girls ages 15 to 19 enrolled in the study, the annual birth rate was 6.3 per 1,000, well below the U.S. rate of 34.3 per 1,000 for girls the same age.
Researchers from the University of North Carolina conducted a similar study in 2010, providing free IUDs to low-income women and comparing their pregnancy rates to a control group of women without free birth control. Rates of pregnancy were significantly lower in the group given IUDs.
Will these successes bear out on a large scale? If Colorado is any indication, the answer is an emphatic "yes." In 2009, the state began offering IUDs to low-income women at low or no cost. Four years later, the state's teen birth rate had dropped 40%, and in the counties with the program in place, the teen abortion rate plummeted 35%. What's more, every dollar spent on the contraceptive program saved roughly $5.68 in Medicaid costs.
The United States currently spends $2.37 billion on family planning services, mostly through Medicaid. Many of these services offer birth control free of charge. The nonprofit Guttmacher Institute has quantified the benefits:
"In 2010, publicly funded contraception services helped women prevent 2.2 million unintended pregnancies; 1.1 million of these would have resulted in unplanned births and 760,000 in abortions. Without publicly funded contraceptive services, the rate of unintended pregnancies, unplanned births, and abortions would all be 66% higher; the rates for teens would be 73% higher."
The Guttmacher Institute also found cost savings similar to those seen in Colorado.
"In 2010, these services resulted in a net savings to the federal and state governments of $13.6 billion — $7 for every public dollar spent," Kinsey Hasstedt, a public policy associate at the Guttmacher Institute, wrote in the New York Times.
Teen birth rates have been steadily dropping for decades, but as RCS' Alex Berezow reported last year, they are still unacceptably high.
"While fewer than 1 in 20 white teens became pregnant, about 1 in 10 Hispanic teens and more than 1 in 9 black teens became pregnant. Fetal loss was more than twice as common among black and Hispanic teens than among white teens, possibly indicating poorer access to prenatal care. And, more than 1 in 3 black teen pregnancies ended in abortion."
These statistics represent real situations that need not occur. Free contraception, accompanied with counseling on both abstinence and safe sexual activity, is an evidence-based remedy.
Here at RealClearScience, we pride ourselves on five things: (1) Explaining the science behind complex topics; (2) Debunking bad science or pseudoscience; (3) Endorsing policies and opinions that we feel are best aligned with scientific evidence; (4) Having the site operated by people who were trained in science, not journalism; and (5) Remaining politically agnostic and as transparent as possible.
Dedication to these principles has led us to operate in ways that few other media outlets would dare imitate. For instance, in an article titled "What RealClearScience Is For and Against," we listed our position on every controversial scientific topic that is discussed in the media. Perhaps surprisingly, our entire editorial team was in 100% agreement on the list, though each of us comes from different ideological backgrounds.
How is that possible? Because, as we have written before, we are first and foremost dedicated to a fact-based worldview. If the evidence changes, our opinion changes. Adherence to that simple mandate is quite liberating: It allows us the freedom to be curious and to follow the scientific data wherever it may lead. That intellectual emancipation further allows us to remain open and honest and, most importantly, to shake off the rigid partisanship that has gripped too much of America's mainstream media.
So, in honor of that tradition of scientific transparency that we began nearly five years ago, we felt that it would be important to reveal exactly where each of us fell on the political spectrum. Each of us took a "political quiz" (courtesy of CelebrityTypes.com) -- which, by our estimation, appears fairly accurate -- and we have posted the results below. (How many other journalists do you think would be willing to do that?)
Detailed descriptions of the chart can be seen here, but here's a quick-and-dirty way to interpret it:
Left/Right = Applies to economic policy
Liberal/Communitarian = Translates to "individual rights" vs. "societal rights"
Red = Republicans (conservatism)
Yellow = Libertarians (libertarianism)
Blue = Democrats (social liberalism)
Green = Socialists (social democracy)
Editor Alex Berezow, who founded RealClearScience in October 2010, describes himself as a political centrist. His test result mostly confirms this description. He lands almost precisely halfway along the liberal/communitarian axis (meaning that he values individual and societal rights roughly equally), but he skews slightly to the right on economic policy.
Assistant editor Ross Pomeroy, who joined RealClearScience in June 2011, describes himself as a "lefty who is annoyed by most lefties." His test result mostly confirms the first part of that description, though it can't measure the latter.
Finally, our Newton Blog contributor and physics connoisseur, Tom Hartsfield, describes himself as a libertarian. His test result surprised him a bit; he was quite far to the right on economic policy (which is typical for a libertarian), but nearly centrist on the liberal/communitarian axis (which is not typical for a libertarian).
If we average our scores together, we obtain:
Left/Right: 33.3 (out of 100) to the Right
Liberal/Communitarian: 16.7 (out of 100) toward Liberal
Thus, as a whole, the RealClearScience editorial team is mostly centrist, with a very slight libertarian skew. And, we aren't afraid to admit it, because when it comes to science, we do our best to put all of our ideological baggage aside and analyze the data as objectively as we can.
Now, we challenge all media outlets to be as transparent as RealClearScience!
When free-reading time was announced during class trips to the elementary school library, I knew exactly where I was headed. Little eight-year-old me would stroll to a section in the shadow of a raised fort in the back corner. There, I'd pull out books on Bigfoot, UFOs, yetis, ghosts, the Loch Ness Monster, and the Bermuda Triangle, plop myself down on a bean bag chair, and flip through pages of what seemed to me to be real-life fantasy. Reading about sightings, disappearances, and unexplained occurrences, I was totally entranced. This, ironically, was where my interest in science truly began.
My experience is not unique. Paul Willis, director of RiAus, Australia's Science Channel, has noticed a similar thread among many of the scientists he's met during his long career.
"I've been struck by how many of my colleagues also shared an early interest in the pseudosciences," he wrote at ABC Science.
And why shouldn't this be the case? At the heart of the scientific endeavor is unbridled curiosity, a desire to seek out the strange and explain the unexplained. Without this flame, there is no fire. Fantastical pseudoscience can easily provide an initial spark. First enthralled by Bigfoot, budding zoologists might turn their attention to the no less amazing reservoir of undescribed species. Baffled by UFOs, young stargazers might seek out new life on other worlds, rather than wait to be "visited" on Earth.
To me, and to many scientists, pseudoscientific stories in our youth were enticing prospects, accounts we wanted to believe then, and still want to believe now. But there was a critical time when we learned that belief does not make something real. For that, we need evidence.
There's nothing sad about this evolution; it's simply a maturation of thought. There comes a time when reading slim accounts of mythical phenomena just isn't enough anymore. Scientists need to see those things, or even make them come alive. The fantastical is infinitely more amazing when it's genuinely real.
Purveyors of pseudoscience regularly accuse skeptical scientists of close-mindedness. The opposite is the case. Zoologists would be ecstatic if Sasquatch actually roamed the forests of the Pacific Northwest, or if a descendant of plesiosaurs dwelled in Scotland's Loch Ness. Physicists would love to find concrete evidence of ghosts, for their existence would surely indicate a new energy source, or even an undiscovered dimension. Perhaps the most famous of skeptics, astrophysicist Carl Sagan, openly admitted, "Nobody would be happier than me if we were being visited [by aliens]."
But, he added, "What counts is not what sounds plausible, not what we'd like to believe, not what one or two witnesses claim, but only what is supported by hard evidence, rigorously and skeptically examined."
That's how we know the difference between what is truly real and what can only be found on the stylized pages of a book.
Thomas Gold was a deep thinker. A fellow of eight distinguished scientific organizations and a winner of numerous prestigious prizes, the Austrian-born American astrophysicist poured his heart and mind into science. Working out of Cornell University for much of his career, Gold published dozens of papers in astronomy and geophysics, thought up many ingenious, often controversial theories, helped establish the now iconic Arecibo Observatory in Puerto Rico, and hired a budding 34-year-old astrophysicist by the name of Carl Sagan.
Noted for his willingness to venture down the metaphorical rabbit hole, Gold climbed down a particularly deep one in the latter half of his life. By the 1970s, scientists broadly accepted that oil was of a biological origin. Labeling oil a "fossil fuel," they theorized that deceased algae and plankton sank to the bottom of oceans and, over millions of years, were slowly transformed into oil by pressure and heat. Gold wasn't so sure.
Noticing that hydrocarbons, organic compounds of hydrogen and carbon, are present on and in other astronomical bodies, Gold reasoned that Earth's oil might originate from nonliving sources instead of living ones. He argued that an unfathomable amount of hydrocarbons were locked within Earth, and that as they seeped upwards, they were converted to oil and gas. He eventually outlined his theory in a book, The Deep Hot Biosphere, making two other fantastical propositions as well:
"that below the surface of the earth is a biosphere of greater mass and volume than the biosphere the total sum of living things on our planet's continents and in its oceans... [and] that the inhabitants of this subterranean biosphere are not plants or animals as we know them, but heat-loving bacteria that survive on a diet consisting solely of hydrocarbons that is, natural gas and petroleum."
In short, Gold claimed that within our world lies another world! It's an undeniably exciting prospect, but there is currently no meaningful method of exploring it. Without evidence, a deep hot biosphere remains in the realm of fiction. We can, however, examine Gold's original claim, that oil and gas are not fossil fuels, that they are instead abiotic.
To his credit, Gold spent years molding and testing his idea. First conceiving of the theory in the 1950s, he let it simmer in his brain for around twenty years before finally fleshing it out. In numerous published works he contended that the movement of tectonic plates and faults allowed methane to migrate upward from deep within the Earth, where, through cooling and other processes, it would transform into crude oil.
In 1986, Gold used his influential standing to procure $40 million for a drilling expedition in a 360-million-year-old impact crater near Lake Siljan in Sweden. This would be his moment.
According to Gold's theory, abiotic oil should have been present below the crater in great quantities. It wasn't. Despite drilling down over six kilometers in two locations, the costly operation turned up just 80 barrels of oil over six years, and there was no definitive proof that it was abiotic in origin.
Empty-handed, and limited by the technology available, Gold could only return to theorizing. He speculated that the oil he recovered in Sweden may have started as hydrocarbons from a deep biosphere, then been converted to oil by deep-dwelling thermophilic bacteria. Additional knowledge of the Earth's mantle procured years later would provide critical evidence against his theory of abiotic oil. According to the University of Tokyo's Geoffrey Glasby:
"Methane can only be converted to higher hydrocarbons at pressures >30 kbar corresponding to a depth of ~100 km below the Earth's surface. The proposed reaction of methane to produce higher hydrocarbons above this depth and, in particular, in the upper layers of the Earth's crust is therefore not consistent with the second law of thermodynamics."
Despite the evidence against Gold's theory of abiotic oil, it cannot be completely ruled out. Though the weight of evidence supports a fossil fuel origin of oil and gas, most geologists admit there's still a tiny chance it might be wrong. Abiotic oil could very well exist, perhaps in small quantities.
Migraines are just awful. It is very difficult to express in words to non-sufferers what a migraine is like. However, if you could imagine a vise rhythmically squeezing your brain every second for several hours (or days), that is roughly what a migraine feels like. For some migraine sufferers, such as your humble correspondent, it is even possible to feel your heartbeat in your brain, with each pulse bringing a throbbing pain.
Migraines appear to result from activation of the trigeminal nerve, located in the brain and face, which results in the release of small proteins (neuropeptides) that trigger inflammation and swelling in nearby blood vessels. The end result is pain. (See this excellent graphic summarizing the molecular mechanism of migraine.)
People who suffer from migraines usually have a war chest full of medications to help prevent or treat them. My personal toolbox, which I carry around in a little bag like a sickly octogenarian, contains Excedrin Migraine (which contains aspirin and acetaminophen), sumatriptan, and eletriptan. Though these drugs have different mechanisms of action, they share two major things in common: (1) They reduce inflammation (except acetaminophen) and, in the case of the two triptans, reduce vascular swelling; and (2) They are small molecule drugs.
Recently, some migraine research has shifted away from small molecule drugs to focus instead on antibodies, the immune system proteins typically thought of as microbe fighters. In fact, according to a recent article in Nature Biotechnology by Gunjan Sinha, four different monoclonal antibodies -- all of which block the same neuropeptide (CGRP) -- are in various phases of clinical trials. Because CGRP causes blood vessels to swell and is involved in pain transmission, blocking the protein (or its receptor) with an antibody should help stop migraines.
Results have been modest but encouraging. Unfortunately, small molecule drugs (which can be put into pill form) that target CGRP don't work. Thus, research into CGRP blockers will likely continue to focus on antibodies, which must be injected and will cost patients upwards of $5,000 annually. Consequently, only the most serious migraine sufferers will benefit from this line of investigation. "Amateur" migraine sufferers, such as me, will have to be satisfied with the tools we already have.
Source: Gunjan Sinha. "Migraine mAbs crowd into late-stage trials." Nature Biotechnology 33, 676–677 (2015). doi:10.1038/nbt0715-676c Published online: 08-July-2015.
Last time we shared a selection of hilariously stupid science questions, we swore it would be the last time. We didn't, however, anticipate that one of our readers would actually urge us to continue! (Thanks, Julian!)
To be honest, we didn't need much encouragement. Whether it's a slow news day or we just feel the need to lighten things up a bit, there will always be moments where hilariously stupid science questions are called for. And as long as there's a dedicated community of science comedians at Reddit, the questions will keep on coming.
Here are nine more.
If there is steel wool, why have I never seen a steel sheep in the wild?
Why do we freak out every time we find ice on another planet when we already have freezers?
Why does the world's oldest person keep dying?
Which would win in a fight: an Imperial Star Destroyer or a Metric Star Destroyer? (Any Star Wars fans out there?)
Why does smoking kill people but cure ham?
When we use ad-block software that blocks 99.99% of ads, are we not just selecting only the strongest and most resistant ads to repopulate the Internet?
If Jesus died for our Sins then who died for our Cos and Tans?
My wisdom teeth are coming in. How much longer until my GPA goes up?
I just read that 25% of women in the United States take medication for mental illness. That's scary! Why do we let 75% of them run around untreated?
(We apologize in advance for the last question.)
Periodically, the gullible science writers of the internet force me to grumpily emerge from the dark, musty cave from whence I write my dissertation. As my mole-like eyes squint at the forgotten sun, I snarl and gnash my teeth in order to explode some dubious claims. This time, it's the proclamation that there must be life on the unpronounceable comet to which we recently sent a ship and upon which we crash-landed a probe.
"Scientists say," they say. Well, who's the scientist?
The main person being quoted is Professor Chandra Wickramasinghe, a scientist in the controversial discipline of astrobiology. He is the father of panspermia, the theory that life on earth arrived sailing in from outer space on dust grains. It's a cute idea, but there's absolutely no reason to believe that it's true.
A further look at this scientist's career raises more red flags. Controversy in fact follows Dr. Wickramasinghe's every pronouncement these days. He led a team that purported to find extraterrestrial life on a meteorite. Closer inspection led to the conclusion that not only was there no E.T., it probably wasn't even a meteorite, and his team's investigation was a scientific joke.
Before that he published work suggesting that bizarre blood red rain that fell in India contained extraterrestrial cells. Turned out that this analysis was deeply unscientific as well.
Kookiness clings to Dr. Wickramasinghe's claims like stink on a pig. He once explained his theory that NASA has been covering up life on Mars for decades. He also suspects that SARS and the 1918 flu may have come from space. This is based on a scientifically dubious paper that claims to support life arriving in the upper atmosphere from space.
Science journalists, will you ever learn to stop listening to people who mislead us?
Star Trek: The Next Generation presents a harmonious view of interstellar spaceflight: a synchronized crew of diverse cultures and races exploring the galaxy.
Let's assume, for the moment, that we can reach such a future; that technology will one day allow us to warp spacetime and travel across immense distances while ignoring the effects of time dilation; that we will invent food replicators capable of transforming blocks of matter into any form of sustenance we desire; that linguists and technologists will team up to create a universal translator; and that physicists will discover a way to artificially control gravity. Despite all of those incredible assumptions, life aboard an interstellar starship would still present practical difficulties.
Although evolved sensibilities and communication technology will break down cultural barriers between species, physiological differences will be harder to conquer. A significant barrier is erected by biological time. To the best of our knowledge, almost all life comes equipped with an ingrained "master clock." Located within the brain, this "clock" -- built from a bundle of neurons -- coordinates groupings of interacting molecules spread throughout the body. These molecules control circadian rhythms: physical, mental, and behavioral changes that respond to light and dark cycles. When these rhythms aren't followed, problems arise. For example, humans forced to work at night and sleep during the day experience fatigue, general malaise, and increased rates of metabolic disorders like obesity and diabetes. A functional crew will need to minimize these issues. While Star Trek understandably glosses over the complications of circadian rhythms, on a real interstellar starship, they will need to be dealt with. Not all of the crewmembers will be from planets with 24-hour day-night cycles, so duty schedules will have to be individually tailored and coordinated to fit their biological clocks.
Atmospheric conditions present a higher hurdle. On the International Space Station, the matter is simple. Earth life is adapted to a pressure of roughly 14.7 psi (the pressure at sea level) and air composed of 78% nitrogen and 21% oxygen, so those are the conditions that suffuse the orbiting station. But what will the conditions be onboard a starship of diverse life forms? After all, some crewmembers might breathe fluorine, or chlorine, or nitrogen... Moreover, high pressures that would squish some life forms might be perfect for others, and lower pressures that would boil the water in human lungs might be business as usual for other species. The only sensible solution would be to set the ship's life support to a setting that would accommodate the most crewmembers. Everybody else, however, would need to don a spacesuit, though the atmosphere in their personal quarters could be tailored to their physiological needs.
A final practical difficulty for our future starship is gravity. Yes, gravity will be artificially controlled -- that's a gimme -- but again, not all species will have the same "gravitational needs," so to speak. Some may hail from smaller worlds where gravity is not as pressing. Others may originate from larger worlds where gravity is punishing. Each would have evolved to be suited to its own gravity. A life form from a larger planet living in a lower-gravity environment would see their tissues and bones waste away, while a life form from a smaller planet placed in a higher-gravity environment might be at an increased risk of fractures, or worse, have their bones (if they have bones) snapped like twigs.
Gravity probably presents the biggest conundrum of all. Common use areas would again have to be set to a level of gravity that's effectively a compromise -- suitable to the majority of the crew. Private quarters could be kept at varying levels of gravity. But despite the accommodations, it's difficult to imagine anybody serving onboard a starship for more than a few years. They'd need to return to their own planet for gravitational rehabilitation.
As the human race considers extending its reach to other worlds, brave spacefarers will be forced to exist and persist outside their most essential comfort zones. A Mars day is forty minutes longer than an Earth day, and its gravity is just one-third that of Earth's. To face, and overcome, those differences is not science fiction, it is a very real future.
Update 7/9: There's a terrific discussion in the comments about other potential issues and ways to solve them. Feel free to chime in with your thoughts!
You almost certainly know Newton's famous thought experiment about orbits, but just in case you don't, here it is, as presented by Universe Today's Fraser Cain:
Imagine you had a cannon that could shoot a cannonball far away. The ball would fly downrange and then crash into the dirt. If you shot the cannonball harder it would fly further before slamming into the ground. And if you could shoot the cannonball hard enough and ignore air resistance – it would travel all the way around the Earth. The cannonball would be in orbit. It’s falling towards the Earth, but the curvature of the Earth means that it’s constantly falling just over the horizon.
But what if that cannonball is instead a massless photon of light? Could it orbit around the Earth? The answer is no, but the reason isn't what you might think.
Is it because the photon has no mass? Surprisingly, no. To understand why, we have to forsake Newton's idea of gravity for Einstein's. Einstein describes space as a curved surface, warped by mass, and gravitational motion as simply objects tracing the shortest paths (called geodesics) through curved space. So, a massless photon can indeed feel the pull of gravity.
Though the photon does feel earth's gravity, the real problem is that it's moving at a great speed -- 299,792,458 meters per second -- and it can go no faster and no slower. While there are infinitely many possible orbits around a body like the earth, each orbit requires a precise velocity to maintain. If the object moves any slower, it crashes back down to the surface; any faster, and it takes off into space, lost forever. The photon simply moves too fast to stay in any possible orbit of the Earth.
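To put rough numbers on that, here's a back-of-the-envelope sketch in Python, assuming simple Newtonian circular orbits (v = sqrt(GM/r)); the constants and helper function are just for illustration:

```python
import math

# Circular-orbit speed v = sqrt(GM/r) -- a back-of-the-envelope sketch.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24     # Earth's mass, kg
R_EARTH = 6.371e6      # Earth's radius, m
C = 299_792_458.0      # speed of light, m/s

def circular_orbit_speed(r):
    """Speed (m/s) needed to hold a circular orbit of radius r around Earth."""
    return math.sqrt(G * M_EARTH / r)

print(circular_orbit_speed(R_EARTH + 400e3))  # ~7,670 m/s at ISS altitude

# Radius at which circular-orbit speed would equal c (Newtonian, for scale only):
r_for_light = G * M_EARTH / C**2
print(r_for_light)  # ~0.004 m, i.e. millimeters -- far inside the Earth
```

Even at the International Space Station's altitude, orbital speed is under 8 km/s; light moves nearly 40,000 times faster, and the (purely illustrative) radius at which Earth's gravity could hold something that fast is measured in millimeters.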
Photons move far too fast to orbit planets or even stars. There is, however, a stellar object so massive that it bends space enough to pull photons into orbit: a black hole.
You probably knew that black holes can swallow light. After all, a black hole is black because light cannot escape its immense gravitational pull. But, did you know that photons which pass a black hole close enough, at just the right trajectory, will neither be pulled in nor pass by; they'll enter into orbit, just like Newton's cannonball around Earth!
The ringed zone where this occurs is called the photon sphere. Theoretically, light locked within the photon sphere would orbit the black hole endlessly. In practice, the one possible speed of the photon -- the speed of light -- allows it only one single possible orbit, given by the following equation (where r is the radius in meters, G is the gravitational constant, M is the mass in kg, and c is the speed of light in vacuum): r = 3GM/c².
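As a rough check on the formula r = 3GM/c² for a non-rotating black hole, a few lines of Python (the helper name is made up) give the photon sphere radius for a black hole of one solar mass:

```python
# Photon sphere radius r = 3GM/c^2 -- a sketch, not production astro code.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0    # speed of light, m/s

def photon_sphere_radius(mass_kg):
    """Radius (meters) at which light orbits a non-rotating black hole."""
    return 3 * G * mass_kg / C**2

# For a black hole with the mass of the Sun (~1.989e30 kg):
r = photon_sphere_radius(1.989e30)
print(f"{r / 1000:.2f} km")  # about 4.43 km, 1.5x the Schwarzschild radius
```

A solar-mass black hole would trap light in a circle less than 5 kilometers in radius.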
It would be exceedingly rare for a photon to reach this orbit, but it could happen.
Practically impossible -- but definitely fun to discuss -- is a manned trip around the photon sphere in some futuristic starship. NASA's Robert Nemiroff described the scenery, a steady plane of view ahead, distorted, moving stars above, and only blackness below:
As you circle the black hole the sky appears to move in strange ways. Here an Einstein ring for background stars can be seen as an invisible line above the photon sphere horizon. Stars approaching the exact other side of the black hole... appear to approach this line, are greatly magnified, and move with high angular speeds. As one background star image is greatly magnified, so is another 180 degrees around the black hole - but with your current point of view you can see only one at a time. Your spaceship's motion can cause a star image below this Einstein ring to become very bright and shoot out of view - while a moment later the other bright image of this same star zooms into view above the Einstein ring and fades.
Another thought experiment provides further amazement. Since the photon sphere lies outside of the Schwarzschild radius, you could conceivably visit it. This would of course require an obscenely powerful science fiction spaceship, capable of accelerating away from the black hole as fast as the black hole's own gravitational acceleration pulled the ship toward it. Imagine you are now sitting precisely on the sphere, and you turn on a powerful flashlight. Photons from the light will circle the black hole and come back around to hit you from behind. Some of those photons will then reflect off of you and travel the same orbit back around to your eyes. You might be able to see the back of yourself!
(Images: Paramount Pictures, Reddit)
You're out walking around the park and suddenly you hear a strange buzzing sound. Your head snaps up and your eyes dart around the sky, looking for a bird, a plane, an injured man in a cape, something. After a moment of confusion, you spot the glint of an LED. Your eyes lock in on a tiny swooping object. It's like a loud, clumsy, hideous hummingbird. A drone!
Drones have become ubiquitous. People antagonize animals with them:
And spy on their neighbors:
"Personal stunt drones" capture skateboarding pre-teens:
Teenage BMX riders:
And grown men leaping off of tall things:
Five years ago, almost nobody had seen one of these tiny craft. Why the explosion in pint-sized private aircraft loaded with cameras and lights and accessories? Has some technology suddenly matured and revolutionized the industry?
Actually, yes. Smartphones and miniaturization.
In the past, small devices for controlling aircraft were the purview of the military. These military devices were expensive and mostly illegal to sell for private use. Additionally they were still a bit too bulky for use on aircraft as small as personal private drones. The massive boom in smartphones has changed that. There has been a rush to build ultra-small devices on tiny chips, capable of living inside a wafer-thin smartphone and not draining its diminutive battery.
Do you play games on your phone? Every time you lean the phone sideways to slide your character across the screen, a tiny accelerometer chip in the phone is measuring that tilt and reacting accordingly. (Look here for an in-depth discussion of how they work.) The short of it is that a technology called MEMS (microelectromechanical systems) has matured to the point where simple microscopic silicon machines can be built on the same chips as the electronics that read them. This means that instead of a bulky mechanical gyroscope, a chip smaller than a dime can easily measure acceleration in all three spatial directions.
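As a toy illustration of what software does with those readings (the function below is hypothetical, not any phone's real API), a little trigonometry turns two axes of measured acceleration into a tilt angle:

```python
import math

def tilt_angle_deg(ax, az):
    """Tilt about one axis, from accelerometer readings in units of g.

    When the phone lies flat, gravity registers entirely on the z axis
    (ax=0, az=1) and the tilt is 0 degrees; fully on edge (ax=1, az=0)
    it's 90 degrees. Hypothetical helper for illustration only.
    """
    return math.degrees(math.atan2(ax, az))

print(tilt_angle_deg(0.0, 1.0))    # 0.0 -- flat on the table
print(tilt_angle_deg(0.5, 0.866))  # ~30 -- leaned partway over
```

A drone's flight controller does essentially the same arithmetic, hundreds of times per second, to keep itself level.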
And, it's not just accelerometers either. GPS chips that allow for precise location of the craft have undergone a similar technological revolution. These chipsets used to be too large, power-guzzling, and hot. Now the same slim smartphones house GPS on another chip the size of a dime.
That tiny high-def camera that lives in your phone is at home on the drone too, right next to the minute Micro-SD card on which it stores images.
All these tiny devices weigh almost nothing and don't draw much power. This allows a drone to be flown and controlled by a tiny circuit board, so the craft can weigh almost nothing outside of the engines and the batteries that power them.
The drone population has exploded, and with it the potential for troublesome use. What is the legal situation? At this point it's all up in the air, so to speak. A 2007 FAA ban on commercial use will end this year. Regulators are currently working on new legislation for both commercial and private use. Until then, your best defense is a water hose:
The world runs on oil. According to the United States Energy Information Administration, in 2011, the 6.965 billion people on Earth collectively used about 3,669,353,105 gallons of the stuff each day, combusting it in cars as gasoline, laying it down in asphalt, and processing it into lubricants.
Our reliance on this energy-dense liquid prompts questions. For starters, what the heck is it? Oil consists of hydrocarbons -- compounds containing carbon and hydrogen -- and other carbon-containing (organic) compounds. When combusted, oil's hydrogen-carbon bonds split apart, releasing a large amount of energy, energy that can be harnessed.
Oil is cheered by some and maligned by others. Everyone seems to have an opinion on it. Out of the incessant discussion, myths have arisen. Here are five of them.
Myth #1: Oil is mostly dinosaur bones. Oil is a "fossil fuel," formed from the remains of organisms that died millions of years ago. Dinosaurs certainly fit this description, and we dig up their fossilized bones all the time! But though dinosaurs reigned for 135 million years, not many of them died in a position where they could be buried and crushed over the eons into coal, natural gas, or oil.
"If you took all of the dinosaurs that ever lived and... squished them up in order to get the oil out of them, we'd probably go through that oil in... a couple of days," paleontologist Jack Horner told Vsauce.
In actuality, the oil used to make the gasoline in your car almost certainly formed from oceanic microorganisms like plankton and algae that lived millions, if not billions, of years ago. When they died, they sank to the bottom of the ocean and began to decompose. Over time, they became buried. As more and more sediment formed on top of them, heat and pressure crushed them into fossil fuels.
Myth #2: Americans use the most oil. This is only partly true. By far and away, the U.S. consumes more oil than any other country. But on a per capita basis, Americans aren't the world-leading gas-guzzlers. We rank 22nd, behind countries like Singapore, Kuwait, Luxembourg, Bermuda, and our neighbor to the north: Canada.
Myth #3: All crude oil is black. When you think of oil, you probably picture a black sludge. Most oil is black, but it can be yellow, red, or even green in hue. Crude oil's color is a clear indicator of quality -- the more contaminants that are present, the darker it will be. The highest quality oil will actually resemble the vegetable or olive oil in your kitchen: amber or golden in color.
Myth #4. The first commercial oil well was in the U.S. Though Edwin Drake's relatives might claim otherwise, their ancestor's commercial oil well in Titusville, Pennsylvania was not the first of its kind. Wells in Russia, Poland, and Romania were already in operation. Drake's well did, however, attract the first great wave of investment into oil drilling and refining.
Myth #5. The world will run out of oil very soon. Oil's demise has been greatly exaggerated for decades. There's no question that fossil fuels are finite, but predicting when they will run dry is no easy task. Proven reserves continue to increase the more we explore and as technology advances. It may be more likely that humanity will phase out the use of fossil fuels before we even run out. But with demand still increasing, nobody precisely knows when that will be, either.
(Images: AP, Niagara)
Dioxygen difluoride sounds rather harmless: just two of what you breathe and two of what's in your toothpaste. It even has an adorable, cushy nickname: FOOF. But most sane chemists know dioxygen difluoride is not a chemical to be trifled with.
An orange-yellow solid, dioxygen difluoride melts at 109.7 K into an orange-red liquid. Note, that's Kelvin, not Celsius. That means FOOF melts at -262.2 °F! The chemical wouldn't even solidify on the coldest-known day on Earth, July 21, 1983, when the recorded temperature at the Soviet Vostok Station in Antarctica plummeted to −128.6 °F.
But a frigid melting point isn't the most exciting thing about FOOF. The most exciting thing is that it reacts violently with almost anything it comes into contact with, and by react, I mean explode. FOOF is one of the most furious oxidizers known to man -- it rips electrons from other compounds. Oxygen does the same thing to fuel combustion, but not quite so feverishly as FOOF.
Due to dioxygen difluoride's excitable nature, chemist Derek Lowe absolutely refuses to work with it, calling it "Satan's kimchi." He references a 1962 paper by one A.G. Streng as proof for his claim.
Streng was very likely the first chemist to explore and document FOOF's volatile nature. Though his report is characteristically dry, as one would expect for a paper published in the prestigious Journal of the American Chemical Society, its thesis is thrilling. As Streng discovered firsthand, FOOF explodes when mixed with just about everything, even at "cryogenic conditions." Derivatives of "violent," "vigorous," and "explosive" frequently appear throughout Streng's account of his experimental escapades, prompting the reader to wonder just how the man escaped with his life.
"If the paper weren't laid out in complete grammatical sentences and published in JACS, you'd swear it was the work of a violent lunatic. I ran out of vulgar expletives after the second page. A. G. Streng, folks, absolutely takes the corrosive exploding cake, and I have to tip my asbestos-lined titanium hat to him," Lowe remarked.
Fortunately (or unfortunately, depending upon how you look at it), you won't find FOOF in your run-of-the-mill chemistry lab. It requires storage below 100 K, and can only be created by mixing fluorine and oxygen at very low pressures and then running a current through the mixture, or by mixing the two elements in a stainless steel vessel at 77.1 K, or by heating them at 1,300 °F and subsequently cooling the reactants with liquid oxygen.
If there's one thing to remember about FOOF, it's that it goes poof!