HAVE YOU EVER felt hunger? Sure, we've all, at one time or another, felt irritable and impulsive after skipping lunch, but that's not true hunger.
First, there are the pangs, throbbing contractions of the stomach muscles triggered by high concentrations of the hormone ghrelin, a compound released when the body's blood sugar dips too low. For most of us in the developed world, that's as far as the symptoms extend, for they can be easily assuaged by a sandwich, salad, or soda pop. The 826 million undernourished people in the developing world aren't so lucky, however. Starvation comes next.
The body starts off by metabolizing whatever fat stores are available. Skeletal muscle gets consumed next, followed by the internal organs least essential to survival. Organs that at one point seemed vital, like the liver, kidneys, and pancreas, become food. To the famished body, everything looks appetizing. Only the heart and brain are left off the menu of this internal feast.
As the stomach wastes away to a shriveled mass, pains of hunger diminish, leaving the sufferer listless and numb. Apathy ensues, agonizing and blissful at the same time. The starving person knows that death is approaching, but is often indifferent. In such a state, being dead may be preferable to being hungry.
NORMAN BORLAUG became well acquainted with hunger at the age of 21. As a supervisor in the Civilian Conservation Corps -- a New Deal relief program that employed out-of-work, unmarried men in jobs related to the conservation of natural resources -- he met scores of destitute individuals, many of whom were starving when they joined, and watched how food and meaningful work renewed their confidence and transformed their lives. It was these indelible memories that were on Borlaug's mind when he completed the work that would spark a revolution in agriculture and later garner him the Nobel Peace Prize, the Presidential Medal of Freedom, and the National Medal of Science, among a long list of other honors.
In 1944, Borlaug made use of his degree in agriculture from the University of Minnesota and traveled to Mexico with a team whose goal was to boost wheat production. He'd work there for 16 years, toiling for endless hours in the lab and in the fields to breed a wheat plant that was resistant to disease, thick-stemmed, and enormously productive. He succeeded. By 1963, 95% of the wheat grown in Mexico was Borlaug's dwarf variety, and the country's overall yield was six times higher than in 1944, the year he arrived.
Over the coming decades, Borlaug's wheat would be sown in developing nations around the world. At the time, biologist Paul Ehrlich was predicting global famine. "The battle to feed all of humanity is over..." he wrote. "In the 1970s and 1980s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now."
Borlaug proved Ehrlich wrong. Global yields skyrocketed. Starvation rates decreased. Doom was postponed.
THE GREEN REVOLUTION, as it is now termed, has been credited with preventing over a billion deaths by starvation. You'd think that'd be something universally celebrated, but not everyone is happy. In her 1991 book, The Violence of the Green Revolution: Third World Agriculture, Ecology and Politics, physicist Vandana Shiva decried Borlaug's efforts as a hoax, no more than a selfish ploy of the Ag industry, saying that it destroyed India's crop diversity, left the country more susceptible to drought, and created a dependence on "poisonous" agrochemicals. She wasn't the only critic.
When Borlaug attempted to extend the Green Revolution to Africa in the 1980s, environmental lobbyists unified to stop him. Arguing that Borlaug's farming methods would despoil the continent's environment, they successfully persuaded the World Bank and the Ford Foundation to pull back almost all of their funding for Borlaug's efforts. Even the Rockefeller Foundation, which had originally funded Borlaug's wheat research in Mexico, withdrew monetary support.
Despite that incalculable setback, Borlaug tirelessly strove to feed Africa. His efforts helped Ethiopia, where 28% of all child mortality is associated with undernutrition, boost yields of its major crops to record levels. But ultimately, Africa remains swamped with malnutrition.
"Some of the environmental lobbyists of the Western nations are the salt of the earth, but many of them are elitists..." he told The Atlantic. "If they lived just one month amid the misery of the developing world, as I have for fifty years, they'd be crying out for tractors and fertilizer and irrigation canals and be outraged that fashionable elitists back home were trying to deny them these things."
Some environmentalists also fail to recognize that environmental conservation and agricultural advancement are not locked in a zero-sum contest. As Borlaug pointed out in 2000, if world cereal yields had remained unchanged since 1961, the world would have needed 850 million hectares of additional land to equal the 1999 harvest.
"Think of the soil erosion and the loss of forests, grasslands, and wildlife that would have resulted had we tried to produce these larger harvests with the older, low-input technology!" he wrote.
As with most debates, this one comes down to intrinsic values. From our lofty position in the developed world, we have the luxury of valuing the fallacious image of pristine, untouched nature over feeding ourselves. Hunger simply isn't something that most of us are familiar with.
"These people have never been around hungry people," Borlaug says of people like this. "They're Utopians. They sit and philosophize. They don't live in the real world."
Proselytizing is easy. But try doing it when you're starving.
Every year, kitchen counters across the United States are transformed from sanitary bastions to vulgar brothels. It's inevitable. Not even Lysol can stop it. Leave out a single piece of wayward fruit on a hot and humid day, and within hours, it will play host to an orgy so risqué that Hugh Hefner himself couldn't hope to top it.
Sure, those fruit flies in the picture are eating now. But they won't be in a minute. Let's take a closer look...
Still want to eat that banana?
Fruit fly procreation is a gluttonous free-for-all, both of eating and sex. Dozens of individuals can take part. First, males and females gobble up some of the fruit's fructose, then they pair off for a little hanky panky. After 15 to 20 minutes of copulation, the two separate and find new partners. This goes on again and again. They don't rinse, but they do repeat. Oh do they ever repeat. The entire process is both hectic and perilous, especially for females. Males have barbed penile spines to latch on to the ladies, and their sperm contains toxins designed to disable competitors' sperm, as well as hormones to make the female less receptive. Both the hormones and toxins can reduce the female's lifespan.
It was on this debauchery that Brian Hollis and Tadeusz J. Kawecki, both researchers in the Department of Ecology and Evolution at the University of Lausanne in Switzerland, recently focused their microscopes. Zeroing in on male fruit flies, they wanted to find out whether the insects' frenetic style of sexual selection in any way contributes to their cognitive abilities. Put more simply, they wondered what would happen to males' intelligence if they were restricted to monogamy rather than polygamy.
To answer this question, Hollis and Kawecki raised three populations of fruit flies -- specifically the species Drosophila melanogaster -- under conditions of strict monogamy. Fifty males were individually paired with fifty females in bottles for two days, their eggs were allowed to hatch, then fifty male and fifty female young were gathered. The process was repeated one hundred times. Three polygamous control populations underwent an identical process albeit with five males and five females per bottle.
After rearing the monogamous and polygamous flies, the doting Hollis and Kawecki placed individual males of each type in bottles with six females, five unreceptive to sex and one receptive. Under these conditions, the median monogamous male took nineteen minutes longer to successfully copulate than the median polygamous male -- a very large difference. For the most part, the monogamous males wasted time courting females that wanted nothing to do with them. (Sound familiar to any guys out there?)
Generally, unreceptive and receptive females emit dissimilar olfactory cues -- they smell different -- and naive males have to learn which is which. This is basic associative learning. So one explanation for monogamous males' sexual ineptitude is that they are poorer at learning by association.
To test this, Hollis and Kawecki removed sex from the equation, instead exposing both groups of flies to two separate odors, one accompanied by an electric shock and the other innocuous. An hour later, the flies were placed in a basic maze with the two odor sources placed in separate areas. Many more polygamous males opted for the innocuous smell than did monogamous males. Once again, monogamous males were sluggish learners.
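To get a feel for how such a difference shows up in this kind of assay, here is a minimal toy simulation -- my own sketch, using invented avoidance probabilities rather than the paper's measurements:

```python
import random

def maze_choices(p_avoid, n_flies=100, seed=1):
    """Count how many flies pick the innocuous arm of a two-odor maze,
    where p_avoid is the (assumed) chance a fly learned to associate
    the other odor with the electric shock."""
    rng = random.Random(seed)
    return sum(rng.random() < p_avoid for _ in range(n_flies))

# Invented learning rates, purely for illustration
print("Polygamous males avoiding shock odor:", maze_choices(0.75), "/ 100")
print("Monogamous males avoiding shock odor:", maze_choices(0.55), "/ 100")
```

Nothing in this sketch is biology, of course; it simply makes concrete what "many more polygamous males opted for the innocuous smell" means as a count.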
Interestingly, no such difference was found between monogamous and polyandrous females. If anything, the monogamous females were slightly more adept learners.
Hollis told RCScience that he and his partner aren't finished with this line of research. There are three questions in particular that the duo is currently exploring.
"Which genes are responsible for the differences we see in monogamous male behavior? How has female behavior changed in light of the lack of opportunity for choosing a mate? Are monogamous females less particular about who they mate with?"
An intriguing conclusion of the research is that sexual competition may drive the evolution of intelligence and learning.
Source: Hollis B, Kawecki TJ. 2014. Male cognitive performance declines in the absence of sexual selection. Proc. R. Soc. B 281: 20132873. http://dx.doi.org/10.1098/rspb.2013.2873
On June 7th, 2011, officials from Alachua County Animal Services and workers from the Humane Society entered the Haven Acres Cat Sanctuary near High Springs, Florida. What they found was anything but a sanctuary.
Shocked, the rescuers discovered dozens of cages, each crammed with five, ten, or even fifteen felines. Excrement littered the ground. Mountains of trash served as perverted cat towers upon which the animals climbed, and often toppled down from. In all, 697 cats were rescued. Most were diseased and emaciated; many more were disfigured in nightmarish ways. More than 60 had to be euthanized. It remains the worst recorded case of cat hoarding in the nation's history.
In the wake of the grisly finding, Humane Society officials teamed up with the University of Florida to nurse the felines back to health. Dozens of students and residents at the college of veterinary medicine worked around the clock to prepare the mistreated animals -- many of whom were infected with ringworm or stricken with diseases like feline infectious peritonitis and viral feline leukemia -- for adoption. Then, in assembly-line fashion, scientists anesthetized the cats and arrayed them on hygienic boards to minimize the risk of infection; de-wormed, vaccinated, and spayed or neutered them; then performed any needed surgeries. It was a heartwarming scene, beautiful in its healing efficiency.
The cats were later offered up for adoption for just $5. Hundreds were accepted into loving homes.
It's been two and a half years since animal rescuers and scientists teamed up to rectify an atrocious wrong in Florida. But two weeks ago, a familiar picture made its rounds on Twitter, albeit in a completely altered and duplicitous context.
When it comes to heated topics like animal testing, nuance is essential. Taking a photo out of context to prompt a knee-jerk emotional response is deceitful, and, in this case, dishonors the hard work of students, scientists, and rescuers. The people who tweeted and re-tweeted this photo could have taken the time to learn a few things. For example, animal research has directly or indirectly saved millions of lives. Moreover, scientists are charged with doing everything they can to reduce unnecessary pain or suffering and to utilize non-animal methods whenever possible. Instead, they re-tweeted a hoax. Nothing derails a productive conversation quite like misinformation. Every human and animal who has ever been involved with or benefited from animal testing deserves better.
H/T Museum of Hoaxes
Godwin's Law ought to be enshrined next to Newton's Laws or Kepler's Laws for all posterity. For the uninitiated, Godwin's Law states, "As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one." The concept was devised by Mike Godwin in 1990 and officially codified into law in a Wired article in 1994. Since then, the evidence for this law has only gotten stronger.
Because of the unquestioned veracity of Godwin's Law, it is perhaps inevitable that a journalist will, eventually, be compared to a Nazi. We could even formulate a corollary called Godwin's Law of Journalism: "As a journalist's career grows longer, the probability he or she will be compared to Nazis or Hitler approaches one."
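The "approaches one" phrasing isn't just rhetorical; it has a simple probabilistic reading. If each new comment independently carries some small chance p of invoking Hitler, then a thread of n comments contains at least one such comparison with probability 1 - (1 - p)^n, which creeps toward 1 as n grows. A quick sketch, with an invented p of 1%:

```python
# Chance that a discussion of n comments contains at least one Nazi
# comparison, assuming each comment independently has probability
# p of making one (p = 0.01 is an invented figure, for illustration).
p = 0.01
for n in (10, 100, 500, 1000):
    print(f"{n:>5} comments: {1 - (1 - p) ** n:.3f}")
```

With those toy numbers, a 10-comment thread has about a 10% chance of Godwinning itself; a 1,000-comment thread is all but guaranteed to.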
And I have evidence.
In a recent article by Terrell Clemmons, Celebrate Life Magazine -- a Catholic, pro-life publication -- compared me to a Nazi. And not just any run-of-the-mill, Springtime for Hitler kind of Nazi. Specifically, I was compared to the Nazis who operated the most notorious concentration camps:
That kind of science [ignorant of good and evil] was to novelist Mary Shelley, Frankenstein; to Lewis, The Abolition of Man; and to Jews in Nazi Germany, the death camps of Buchenwald and Auschwitz.
Buchenwald and Auschwitz? I contacted the editor-in-chief for comment, but she didn't respond. (Perhaps they have a "do not negotiate with Nazis" policy?)
So, what on Earth have I done to deserve such scorn? According to Ms. Clemmons, my support of "three-parent embryo" technology is evil incarnate.
If you're unfamiliar with the technology, here is a brief primer. In short, it allows a woman who has a mitochondrial disease to conceive a healthy baby using another, healthy woman's donated egg and standard in vitro fertilization. The resulting zygote would carry a tiny fraction of its DNA -- the mitochondrial DNA -- from the egg donor, hence the term "three-parent embryo."
Scientists support studying the procedure because it could help sick women have healthy children. Critics say it is unethical, and a subset of them apparently believe the technique is analogous to poisoning Jews with Zyklon B and disposing of their bodies in giant crematoria. How did I miss that obvious comparison?
When thinking rationally, critics do bring up three relevant points worth discussing: (1) This is a form of human experimentation (and is, therefore, undesirable); (2) The safety of the procedure is unknown; and (3) This is the first step on a "slippery slope" toward "designer babies."
In regard to #1: Yes, "three-parent embryo" technology would be a form of human experimentation. However, all clinical trials are a form of human experimentation. Do you take any prescription drugs? Those were first tested on humans. Did you have laser eye surgery? Yep, tested on humans first. Are you a cancer patient? You guessed it: chemotherapy was tested on humans. In fact, the term "clinical trial" is just a nice way of saying "human experimentation." Before any major medical treatment goes to market, it is first tested on willing human volunteers.
In regard to #2: It is true that the safety of "three-parent embryo" technology is unknown. That is why the technique should first be perfected in other mammals, particularly non-human primates, before moving on to human clinical trials.
In regard to #3: The "slippery slope" argument can be applied to any new technology. However, laws could be established to allow therapeutic genetic engineering, but to disallow "designer babies." Indeed, as conservative columnist George Will (who has a son with Down syndrome) recently commented on three-parent embryo technology, "If it is possible to draw a line, where you can stop on this slippery slope between therapy and the engineering of designer children, it is worth trying."
Mr. Will's position is, of course, a very sensible one.
I would, however, take his argument one step further: Denying parents the chance to have healthy children seems a rather cruel thing to do.
One final point: It's time for Godwin's Law to come to an end. Nothing on Earth, except perhaps the Kim regime in North Korea, even remotely compares to Nazi Germany. Anyone who cavalierly draws such comparisons betrays a profound ignorance of world history. Additionally, my paternal grandparents survived the Holocaust, and I find such commentary -- particularly from a Catholic magazine -- to be far beneath the magazine's dignity and, frankly, disgusting.
Just something for the self-appointed bioethicists to think about.
Of the roughly 200 different cell types in the human body, the adipocyte -- the humble fat cell -- is by far the most maligned. Vilified even more maliciously is the adipose tissue it forms, otherwise known as fat. All we seem to want to do is "shed," "cut," or "burn" it. Until the 1940s, scientists were indifferent to fat, simply characterizing it as a form of connective tissue. But in the past four decades, that view has evolved considerably.
Nearly 100,000 papers have been published on fat during that time. From these efforts, researchers have learned that fat, far from being dead weight, is an "endocrine organ at the center of energy homeostasis." Without it, the human body could not function properly.
Harvard biologists Evan Rosen and Bruce Spiegelman paid homage to fat in the January 16th issue of the journal Cell, revealing an array of incredible facts about the tissue and offering insight into the future of fat research.
"Adipose tissue is a remarkably complex organ with profound effects on physiology and pathophysiology..." they wrote. "It seems certain that we will discover much more about this highly complex and relatively unloved cell in the very near future."
Here are eight fascinating facts about fat:
1. Only vertebrates are known to have specialized fat cells. All eukaryotes -- organisms whose cells contain a nucleus -- can store fat in the form of lipid droplets, but only those organisms with a backbone -- vertebrates -- have adipocytes. Consider yourself special!
2. There are three, count 'em, three types of fat: white, brown, and beige! White adipocytes contain a single lipid droplet and are used to store excess energy. Brown adipocytes contain numerous droplets and a great many mitochondria, the cellular power plants. More closely related to muscle cells than to white adipocytes, brown adipocytes' primary purpose is to generate heat. They are primarily found around the back and shoulders. Beige adipocytes are very similar to brown adipocytes, except they are found scattered amongst white adipocytes and contain lower levels of a protein called UCP1.
3. You have a fixed number of fat cells as an adult. New fat cells are constantly created and destroyed, but the body always remains below a fixed limit, which is set during adolescence. This is a key reason why it's so important to prevent childhood obesity.
4. Fat cells are microscopic balloons. If the body has a fixed number of fat cells, then how have 1.7 billion people across the globe come to be obese? The answer is that fat cells are endowed with the incredible ability to grow well beyond their original size, soaking up extra energy -- in the form of lipids -- like a sponge. Under prolonged overeating, progenitor fat cells called preadipocytes can become full adipocytes. But once these are gained, they're almost impossible to lose. Weight loss reduces the volume of fat cells, but not their number.
5. Liposuction doesn't work. It's the most common cosmetic surgery in the world. And in the long-term, it's completely pointless. Removing a hefty 20% of total body fat may make you superficially slimmer, but it doesn't improve insulin sensitivity or reduce risk factors for cardiovascular disease. Moreover, studies have clearly demonstrated that unless patients take steps to live healthier, the excised fat mass will regrow in about a year.
6. You might joke that fat is great padding, but it really is. Fat protects delicate organs. "The eye, for example, is surrounded by fat in a manner analogous to the way one might pack a teacup in bubble wrap," Rosen and Spiegelman say. It also cushions body parts exposed to high amounts of stress. Your toe- and heelpads are composed of fat, and your buttocks are brimming with it.
7. Fat may boost your metabolism. Research has shown that transplanting fat taken from beneath the skin (subcutaneous) and placing it around internal organs can actually reduce the overall amount of fat in the body and balance glucose levels.
8. Seven out of every ten cells in fat tissue aren't fat cells. According to Rosen and Spiegelman, "every gram of adipose tissue contains 1–2 million adipocytes but 4–6 million stromal-vascular cells." Half of the latter cells are white blood cells. Under obese conditions, other types of immune cells latch on to the expanding adipocytes, contributing to insulin resistance and chronic low-level inflammation. Why they heap on is not entirely clear.
Source: Evan Rosen & Bruce Spiegelman. "What We Talk About When We Talk About Fat." Cell 156, January 16, 2014
Each and every day, Americans collectively fart between 2.5 and 6.3 billion times, unloading up to 466.7 million liters of gas into the atmosphere. Approximately 99% of an "anal gas evacuation" is composed of odorless gases, mostly hydrogen, carbon dioxide, and methane, along with smaller amounts of nitrogen and oxygen. The remaining 1% of compounds grant the fart its notorious scent. Hydrogen sulfide is the chief culprit.
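Those headline figures are easy to sanity-check with a little division. Assuming a U.S. population of roughly 314 million (my figure, not the article's), the per-person numbers come out as follows:

```python
# Back-of-the-envelope check on the national flatulence statistics
population = 314e6                     # approximate U.S. population (assumed)
farts_low, farts_high = 2.5e9, 6.3e9   # farts per day, nationwide
gas_liters = 466.7e6                   # liters of gas per day, nationwide

print(f"Farts per person per day: {farts_low / population:.0f} to "
      f"{farts_high / population:.0f}")
print(f"Gas per person per day: up to {gas_liters / population:.1f} liters")
```

That works out to roughly 8 to 20 farts and up to about 1.5 liters of gas per American per day.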
A fart's life begins with food. After entering your mouth and traveling down the esophagus, a meal makes its way to the stomach to be digested and, soon after, the small intestine, where nutrients are absorbed. Some dregs, however, survive the acidic gauntlet and continue to the next leg of the journey: the large intestine. There, what began as a meal for you becomes a feast for resident bacteria. They ferment the leftover food, releasing gas in the process, gas which must be expelled.
Flatulence's omnipresence, smell, sound, and social stigma make it a frequently explored topic in popular culture. Men gathered around restaurant feasts of beer, buffalo wings, and nachos perform much of the experimentation and discussion. Scientists' contributions, while noteworthy, pale in comparison. Sure, they've calculated the average volume of a fart (between 5 and 375 milliliters), identified two strains of bacteria to make beans "flatulence-free," and documented the causes of extreme flatulence, but they haven't characterized the magnificence and grandeur of a fart's flammability with anywhere near the precision of the common man equipped with a camera and a YouTube account.
With two new papers, one published in the journal Gut in June 2013, and the other just published in Neurogastroenterology and Motility, Spanish researcher Fernando Azpiroz takes the attention off of fart jokes (at least temporarily) and bolsters our scientific knowledge on passing gas.
Most recently, he tested how two different diets affected flatulence. For the longest time, experts have recommended foods to reduce gassiness, but surprisingly, no study had actually gauged how eating those foods affects the frequency of farting.
Until now, that is.
Azpiroz assigned a group of 15 subjects to a low-flatulence diet restricted to foods like meat, fish, fowl, eggs; lettuce, tomatoes, avocado, olives; rice, gluten-free bread, rice bread; dairy products; strained orange juice, berries; and sugar, chocolate, coffee, wine, vinegar, oil. He assigned 15 more subjects to a Mediterranean diet of 2-3 portions of meat, fowl, fish, or eggs; two portions of vegetables, salad or legumes; four portions of bread, rice, pasta, potatoes, or cereals; two portions of dairy products; two portions of fruit; and three portions of oil or butter. Subjects consumed their diets for seven days and used a counter to register "every passage of anal gas."
Azpiroz found that the low-flatulence diet reduced subjects' episodes of farting by 54%, while the Mediterranean diet reduced them by 28%. The low-flatulence diet offered a statistically significant improvement over subjects' base diet, but not over the Mediterranean diet. While the result wasn't a completely undisputed victory for so-called low-flatulence foods, Azpiroz noted that they do appear to be beneficial.
"In patients with gas-related symptoms, a low-flatulogenic diet produces immediate beneficial effects with digestive, cognitive, and emotive dimensions," he said. In other words, subjects reported that they felt better.
Last June, Azpiroz teamed up with 15 other doctors and researchers to examine which of the bacteria inhabiting our large intestine, if any, correlate to increased frequency of farting. Bacteroides uniformis, Bacteroides ovatus, and Parabacteroides distasonis were indicted, though no causal mechanism has yet been found linking them to the smelly crimes.
Such an examination is likely needed. A cursory Google search on the treatments for flatulence yields a range of recommendations, from diets, to probiotics, to woo. Most lack sufficient scientific substantiation, or any at all. If researchers could pinpoint the gut flora's role in farting, future treatment strategies may be revealed.
Choose one of the following: Life as we know it on Earth is coming to an end! Or, an entire scientific field is engaged in a giant conspiracy along with the media to perpetuate an enormous hoax on everyone!
Hyperbolic sentiments embody the blinding heat and viciousness of the current frenzied brawl over global warming.
You’re either with us or against us. Nuance? Please. There is no such thing. You must choose one extreme or the other, and become either a heretic or a hero. Choose wisely.
Given that there is no possible scientific benefit from this political impasse, is there a practical way we can work toward a détente? Here are a few ideas.
Everything starts with the researchers who propose their theories on climate change. They, like all scientists, must be absolutely forthcoming and transparent about admitting the limitations and uncertainties in their work. Too often, that is not the case: climate researchers frequently try to portray more certainty than their models actually allow.
As the brilliant physicist Freeman Dyson has pointed out, some models don't incorporate individual clouds; others don't factor in biological activity on the surface of the Earth. This isn't a knock on climate research; it's simply impossible to design a perfect model. And even if a perfect model could be theoretically constructed, the computer power required to run it would be far too great.
As a result of this inherent uncertainty, the scientific community -- and science journalists -- must listen to criticism and healthy skepticism and reply with facts, not insults, to those who disagree.
Also, science journalists need to stop reporting every weather event as if it is tied to climate change. Any randomly selected storm is not necessarily caused by climate change (or lack thereof), and we need to stop using these events as "evidence." This problem is exacerbated by the media's eye for scary, calamitous doomsday stories and political agendas.
Furthermore, every effort should be made to separate climate science from climate policy.
Having said all that, the skeptics need to cool down too.
Those outside the field should not demonize the scientific establishment, which is trying to uncover the truth, not scheming en masse to pull the wool over people's eyes. Angry, insulted researchers (and journalists) whose work has been trashed out of hand are likely to denigrate detractors and feel the need to hide the uncertainties in their work.
As conservative commentator Charles Krauthammer -- himself a climate change "agnostic" -- has worried, it likely is not good to dump lots and lots of CO2 into the atmosphere. It is worth considering the potential consequences of human activities on the Earth and taking action if necessary.
Most scientists (except for some bad seeds) are not fundamentally driven by a political agenda. Their original goal was not to crucify the coal industry or shut down global business.
Thus, in my estimation, the real blame for this pointlessly poisonous situation lies with the media. Their endless search for controversy has resulted in our toxic political climate.
Since life first sprang from the primordial ooze about 3.6 billion years ago, it has evolved a multitude of fantastical forms. Towering redwood trees soar to heights of 300 feet, while hundreds of microscopic green algae can fit on the tip of your finger. Yet both are plants! A single blue whale weighs in at 190 tons, equaling the mass of about 21,000 red foxes. Yet both are animals! Life's diversity is truly incredible.
No less amazing are the features that we share in common. For example, all life -- as we know it -- is composed of cells, can reproduce, and can adapt or react to its surroundings. You may have noticed that those traits are more descriptive than concrete. That's because life is quite tricky to define. One of existence's most beautiful ironies is that we still have no unequivocal definition of life.
Scientists are ever on the hunt for commonalities, however. One of the most intriguing that they've discovered is a simple equation: q0 ~ M^(3/4). Termed Kleiber's Law (named for the Swiss biologist Max Kleiber), the equation states that an organism's basal metabolic rate -- the amount of energy it consumes at rest -- is roughly proportional to its mass raised to the three-quarters power. That's quite impressive if you think about it, that such a simple equation can apply throughout the animal and plant kingdoms!
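A worked example shows how dramatic that three-quarters exponent is. The masses below are round, illustrative figures: an elephant outweighing a mouse by a factor of 250,000 should burn only about 11,000 times as much energy at rest, not 250,000 times.

```python
# Kleiber's Law: basal metabolic rate scales roughly as mass^(3/4).
# The masses are rough illustrative values, not measured data.
mouse_kg = 0.02
elephant_kg = 5000.0

mass_ratio = elephant_kg / mouse_kg   # 250,000x the mass...
rate_ratio = mass_ratio ** 0.75       # ...but far less extra energy
print(f"Mass ratio: {mass_ratio:,.0f}x")
print(f"Metabolic rate ratio: {rate_ratio:,.0f}x")  # ~11,180x
```

Gram for gram, in other words, the mouse is the far hungrier animal.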
Even more awe-inspiring is one explanation for why this is so. In 1999, a trio of scientists led by Geoffrey West from Los Alamos National Laboratory postulated that the answer lies in fractals -- self-similar, repeating, geometric patterns. If that description seems in any way confusing or vague, don't worry. Like the concept of life, experts disagree on the precise definition of fractals. Mathematician Benoît Mandelbrot, who coined the term, candidly characterized them only as "beautiful, damn hard, increasingly useful." But whether or not you know how to describe a fractal, you know one when you see it.
According to West and his colleagues, fractals are a facet of life.
"Unlike the genetic code, which has evolved only once in the history of life, fractal-like distribution networks that confer an additional effective fourth dimension have originated many times. Examples include extensive surface areas of leaves, gills, lungs, guts, kidneys, chloroplasts, and mitochondria, the whole-organism branching architectures of trees, sponges, hydrozoans, and crinoids, and the treelike networks of diverse respiratory and circulatory systems."
But why would life take on such a geometry? West explained that fractal-like shapes are terrific at maximizing surface area, which permits nutrients to be more efficiently transported within and throughout biological entities and structures.
"The vast majority of organisms exhibit scaling exponents very close to 3/4 for metabolic rate and to 1/4 for internal times and distances. These are the maximal and minimal values, respectively, for the effective surface area and linear dimensions for a volume-filling fractal-like network," West noted, before later coming to a spectacular conclusion. "Fractal geometry has literally given life an added dimension."
Understandably, not everyone was convinced by West's theoretical paper. Opponents of his claims countered that the role of fractal branching, particularly in the capillaries of the heart, is not fundamental to the 3/4 exponent. West later joined with some of his critics and admitted that fractals don't completely explain the 3/4 exponent, as it holds true even in organisms without apparent "fractality" in their internal forms.
Last Tuesday, an international team of biologists and physicists politely set aside West's fractal idea and tendered a new explanation for Kleiber's Law: the equation is a mathematical expression of an evolutionary fact, namely that animals have evolved to use energy as efficiently as possible.
"An organism is akin to an engine," they explained. "Part of the energy obtained from nourishment is used for organism function, growth, reproduction, while the rest is dissipated through its surface."
"Plant and animal geometries have evolved more or less in parallel," said University of Maryland botanist Todd Cooke, one of the authors. "The earliest plants and animals had simple and quite different bodies, but natural selection has acted on the two groups so the geometries of modern trees and animals are, remarkably, displaying equivalent energy efficiencies. They are both equally fit. And that is what Kleiber's Law is showing us."
Though compelling, Cooke and his colleagues' explanation of Kleiber's Law likely won't be the last.
Jayanth R. Banavar, Todd J. Cooke, Andrea Rinaldo, and Amos Maritan. "Form, function, and evolution of living organisms." PNAS 2014; published ahead of print February 18, 2014. doi:10.1073/pnas.1401336111
Geoffrey B. West, James H. Brown, and Brian J. Enquist. "The Fourth Dimension of Life: Fractal Geometry and Allometric Scaling of Organisms." Science 4 June 1999: 284 (5420), 1677-1679. doi:10.1126/science.284.5420.1677
Every year, about seven Americans come down with plague. Yes, that plague, the one known as Black Death and which wiped out about one-third of Europe's population in the mid-1300s.
Actually, the U.S. is lucky. Plague is a much bigger problem in many developing countries. For instance, a single outbreak in Madagascar in December 2013 killed 39 people, and from 2000 to 2009, more than 10,000 people died of plague in Congo. The reason plague is still around is that it lives in rodents, such as rats and prairie dogs, and is transmitted to humans by fleas. It would be impractical, to say the least, to vaccinate or eradicate the world's rodent community, so the next best thing is for doctors to be able to quickly identify patients infected with plague and to administer antibiotics.
People who are treated are far more likely to survive than those who are not (~90% vs. ~30% survival). The untreated pneumonic form of the disease (in which the bacteria are inhaled into the lungs) has a mortality rate close to 100% and can kill within 24 hours.
Let's pretend that you are one of those unlucky people who catches bubonic plague. Most likely, you were bitten by an infected flea, and the nasty bacterium which causes plague (Yersinia pestis) is now swimming around inside your body. What's going to happen?
Just like a spy entering enemy territory, one of the first things Y. pestis does is go undercover. One of the molecules in its outer membrane, called lipopolysaccharide*, is a dead giveaway to your immune system. So, the bacterium cleverly modifies the structure of this molecule so it no longer alerts the immune system. And how does it know when to do this? Y. pestis can detect the temperature outside. When it is around 37 degrees Celsius (98.6 degrees F.), the bacterium figures it must be inside a warm-blooded mammal.
The bacterium then makes a beeline for your lymph nodes, immune system outposts that are constantly searching for microbial intruders. Because it packs some serious heat, Y. pestis doesn't mind living there. During its journey, if the bacterium encounters any pesky immune cells, such as macrophages, it jams what looks like a hypodermic needle (called a "type III secretion system") into the cell and injects several toxins. The macrophage whimpers away with both its pride and ability to defend the body severely damaged.
Additionally, at some point, Y. pestis needs some iron. Your body has quite a bit of that, but it's wrapped up in hemoglobin and other proteins. One of these proteins, called transferrin, becomes the victim of a highway robbery. Y. pestis releases a molecule called yersiniabactin which has a high affinity for iron. In fact, it loves iron so much that it is able to physically rip it away from transferrin, and then it brings the iron back to Y. pestis.
Once safely inside the lymph node, Y. pestis replicates. Eventually, your immune system picks up on the fact that something is terribly wrong. The lymph nodes swell up, creating the nasty-looking "buboes" that are characteristic of bubonic plague. The bacteria then migrate through the blood to your lungs, at which point you're basically cooked.
But, how exactly do you die? Strangely enough, your body kills itself. The presence of so many bacteria in the bloodstream causes your immune system to freak out, triggering a condition called septic shock. Your body's blood vessels begin leaking, decreasing blood volume. This leads to abnormal clotting and multiple organ failure.
*Lipopolysaccharide was the subject of my PhD dissertation, though not the one from Yersinia pestis. I studied a much less harmful bacterium called Bacteroides.
Source: Salyers AA and Whitt DD (2002). Bacterial Pathogenesis: A Molecular Approach (2nd ed). Washington, DC: ASM Press.
"We are each of us a multitude. Within us, is a little universe." - Carl Sagan
On the outside, the human body appears simple enough, no more than a three-dimensional stick figure painted in pale, pastel shades of pink and white or tanned shades of brown. It can be trussed up in fancy clothes or augmented with hair and metallic bits, but those changes only run skin-deep. To witness the true grandeur of a human being, we must look under the skin.
There exists an amalgamation of life, a universe to rival the one we inhabit with stars and planets. We are a collection of squelchy tissues. "Gross!" some might say. "Amazing" is what they truly are. Each of us is composed of trillions of cells, the fundamental biological units of all living organisms. And each cell operates dutifully, whether it composes the liver, heart, lungs, or any of the other organs of the human body.
Every cell has its own small array of miniature "organs," more properly referred to as organelles, with their own specialized functions. These tiny structures are made up of carbohydrates, lipids, and proteins. One of the organelles, the nucleus, is the brain of the cell. Within it is stored the language of life: DNA. The DNA entwines into a double-helix structure, forming, as Carl Sagan described, "a machine with about a hundred billion moving parts, called atoms."
“There are as many atoms in one molecule of DNA as there are stars in a typical galaxy,” he added. Consider further that since we have 23 pairs of chromosomes, a human cell that has duplicated its DNA in preparation for division holds 92 chromatids -- molecules of DNA -- or 92 "galaxies." That makes the humble white blood cell, a microscopic soldier of the immune system, both an extraterrestrial and its own minuscule universe.
“These cells are part of us, but how alien they seem," Sagan remarked. "Within each of them, within every cell there are exquisitely evolved molecular machines, nucleic acids, enzymes, the cell architecture, every cell is a triumph of natural selection, and we’re made of trillions of cells."
About 100 trillion to be somewhat precise.
On its own scale, our body is as vast as the universe. Even in terms of looks, the comparison is unmistakable.
To be awed at the natural world, one doesn't need to gaze up at the heavens or view an earthrise from the lunar surface. One needs only to look in the mirror.
From San Francisco, to Seattle, to Denver, to Miami, all across the country, we are a nation of sports fans. Weeknights and weekends find countless Americans crammed on couches in front of big screen TVs or cheering at sports stadiums. As attendance figures and television viewership polls demonstrate, competition is a craving, and we are pretty much addicted.
Sports constitute a world of their own, one that's easy to get lost in. They've spawned reality television shows, 24-hour networks, and varieties of lingo that could almost be distinct languages. When you find yourself engrossed, though, it's important to keep your wits about you and remain skeptical. All the glorious entertainment and highlight reels come bundled with cognitive biases, and you should be aware of them, lest they rub off.
1. Availability Heuristic. In the 2012-2013 NFL season, Minnesota Vikings running back Adrian Peterson sprinted to 2,097 rushing yards and the league's Most Valuable Player award. With that performance fresh in mind, Peterson, along with a host of sports analysts, fans, and commentators, predicted that he would engineer a similar performance this year. He did not. No doubt Peterson and his supporters were fooled by the availability heuristic, the mental shortcut humans take that relies on events that are fresh in our minds to estimate future events or evaluate the present. They would have been wiser to make a prediction based on his career averages. Before this year, Peterson averaged 1,475 rushing yards per season and 98.4 yards per game. His final totals were much closer to those averages than to his breakout performance in 2012-2013: 1,266 yards over the course of the season, and 90.4 yards per game.
2. Illusory Superiority. This is a bias which causes people to overestimate their positive qualities and underestimate their negative ones. For the most part, everybody at least partly suffers from this, but it's particularly on display in professional sports. Seattle Seahawks cornerback Richard Sherman recently remarked that he's "the best corner in the game." He's by no means the first football player to claim ultimate greatness. Receiver Randy Moss and quarterback Joe Flacco have done the same.
3. Hot-hand Fallacy. Particularly prevalent in basketball, this is the idea that a player who has hit repeated shots in the past will have greater success in the near future, going on a "streak," so to speak. It makes intuitive sense. Success may cause players to become more confident and thus more relaxed. But the effect has not borne out in the vast majority of research. A player seems no more likely to hit a shot if he has made a previous one. Contrary to what you or the announcer might exclaim, a player cannot "catch fire." As stated by sports psychologists Jonathan Koehler and Caryn Conley, "Declarations of hotness in basketball are best viewed as historical commentary rather than as prophecy about future performance." (For a quick demonstration, see the simulation sketch after this list.)
4. Hindsight Bias. Yes, yes, yes stereotypical obsessive sports fan, you knew all along that the play call which lost your team the game was the "worst decision ever." Cue eye roll. It's easy to critique past decisions; it's much harder to make them at the moment that they are required. According to University of St. Thomas psychologist John Tauer, there is a danger to hindsight bias, especially when you apply it to your own mistakes. Instead of taking the time to learn why the mistake was made, you may fool yourself into assuming you knew the correct choice all along.
5. Cheerleader Effect. Consider the Laker Girls or the Dallas Cowboys Cheerleaders. To many, these women are beautiful and sexy. However, their perceived beauty is in part a visual illusion, created by the fact that cheerleaders appear as a group rather than as solo operators. Any one cheerleader seems far more attractive when she is with her team than when she is alone.
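As promised above, here is a minimal hot-hand simulation -- my own illustration, with an arbitrary 50% shooting percentage, not data from any study. A purely random shooter produces long, impressive-looking streaks, yet making a shot does not raise the chance of making the next one:

```python
import random

rng = random.Random(42)
p_make = 0.5  # fixed shooting skill; no "hotness" built in
shots = [rng.random() < p_make for _ in range(10_000)]

# Longest run of consecutive makes
longest = run = 0
for made in shots:
    run = run + 1 if made else 0
    longest = max(longest, run)

# Compare P(make | made previous shot) with overall P(make)
after_make = [b for a, b in zip(shots, shots[1:]) if a]
print(f"Longest streak: {longest}")  # typically 12 or more
print(f"P(make): {sum(shots) / len(shots):.3f}")
print(f"P(make | made previous): {sum(after_make) / len(after_make):.3f}")
```

Run it a few times with different seeds: the streaks look like "hotness," but the two probabilities come out essentially identical, which is just what Koehler and Conley's commentary-not-prophecy line predicts.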
WITH THREE SECONDS left in the game, Kevin Love knew the basketball was going to him. As the undisputed star player for the Minnesota Timberwolves, Love was the natural choice to attempt a game-tying shot. But it wouldn't be easy. No doubt the five players on the floor for the opposing Dallas Mavericks would do everything they could to protect their meager lead. A two would tie; a three would win. After a chirp of the whistle and an inbound pass from his teammate Ricky Rubio, the ball was in Love's tender hands. All he had to do was dodge, pull up, and shoot...
But as the final horn sounded, the ball was nowhere near the hoop. It never even came close. Shawn Marion made sure of that. The Mavericks' forward slapped Love's arm as he extended it to shoot, sending the ball wildly astray and into the arms of a defender. Though clearly a foul, there was no justice for Love or his Timberwolves. With clear views of the play, two NBA officials swallowed their whistles, electing inaction over action. For half a minute, Love stood in place with his arms at his side, smiling all the while. What could he do?
In the ensuing minutes, fans vocalized their disapproval with boos and yells, and dumbfounded commentators dissected the unfairness with hilarious outbursts. The next day, the National Basketball Association confirmed the obvious: "Love should have been awarded two free throws with one second left on the clock." And yet, throughout the controversy, nobody stated the obvious...
A robot would have made that call.
"TECHNOLOGY WILL SOON make officials at high-level sports events as obsolete as elevator operators," technology columnist Kevin Maney recently penned in Newsweek. "Referees, umpires, judges, linesmen - they're all toast, in every sport, some sooner than others."
If he's right, I mourn for that day. But I don't think he's right.
Make no mistake, robot referees are inevitable. Their forebears, in the form of precise, high-speed cameras, are already in use for simple "Who got there first?" and "in or out" affairs, in sports like car racing, sprinting, tennis, football, and baseball. The world's most popular sport, soccer, may soon follow suit, lending officials technological aid to review key offside or goal decisions.
"The binary calls are the easy ones for technology to take over. In or out. Ball or strike. Safe or out. There's no judgment required. It is or it isn't," Maney points out.
Programming robots to make judgment calls will be a far trickier task. But many thinkers are confident that providing powerful computers like IBM's Watson with immense amounts of data generated through omniscient motion-capturing systems will one day yield a program that can distinguish between calls and no-calls.
As of three years ago, roboticists were already on the trail. Engineers at the University of Chile created an automaton to referee games of robot soccer.
"It moves along one of the field sides, uses its own cameras to analyze the game, and communicates its decisions to the human spectators using speech," creator Matías Arenas described (PDF). The bot was capable of recognizing the robotic competitors, the ball, field landmarks, and was even able to make rudimentary decisions, though fouls like pushing were beyond its means to assess, as they required the judging of intention. As a cute added touch, Arenas and his colleagues endowed the robot referee with ability to express basic facial gestures, which, as Arenas boasted, are "very attractive" to the human spectators.
So the question is not whether we can make robot referees -- we undoubtedly can! The question is whether they will ever fully replace humans. The answer is an almost equally assured "no."
AT FIRST THOUGHT, switching to robot referees is a no-brainer. Humans are glaringly fallible and corruptible. Soccer referees subconsciously call more fouls when they see the action moving from right to left. Home soccer teams generally receive more favorable calls from officials. Match fixing scandals have occurred and will certainly reoccur. Wouldn't an objective android be preferable to a subjective human?
No, for many reasons. Machines can also be corrupted. The power to cheat will simply be transferred to their handlers. Match-fixing could be made even more subtle and insidious. Moreover, machines are not entirely infallible. Cameras can't always get in the middle of the action like a human referee. Nor can a robot react to any possible situation. Strange things happen in sports, and for these unforeseen occurrences, a thinking, reactive human official is necessary.
On a more philosophical note, I doubt that coaches, fans, or players will ever want a completely mechanical referee. Sports aren't just games. Every match, every competition is a theatrical spectacle, complete with a plot, heroes, and villains. In this drama, referees play starring roles, as antagonists or protagonists, or somewhere in between. To remove them would be to rewrite the story and lessen the excitement.
There's also a matter of scale to consider. Outside of the grandest echelons of professional sports, there are thousands upon thousands of leagues, from intramural, to amateur, to semi-professional. Will each be able to phase out human officials entirely? I doubt it. Perhaps when intelligent androids are ubiquitous and inexpensive. But by then, the human race may have other problems to worry about, like being enslaved by those referees.
Technology has an overt place in officiating. Features like Hawk-Eye in tennis and booth reviews in football are clear improvements. But human referees are here to stay, however "blind" they may be.
All too many people behave as if they are experts in everything. The internet is partially to blame. The widespread availability of information is both a blessing and a curse. Indeed, the adage "a little knowledge is a dangerous thing" has never been more true, especially as it has become painfully obvious that some people believe that reading the first paragraph in a Wikipedia entry will quickly bring them up to speed on complex topics. Really, who needs a PhD when you have five minutes to kill and access to Google?
Philosophy, in particular, has witnessed an obnoxious infestation of alleged experts. World-renowned physicist Stephen Hawking -- who has weighed in on everything from aliens destroying Earth to the existence of God -- recently declared that "philosophy is dead." How he is able to simultaneously practice philosophy while believing that philosophy is dead without experiencing cognitive dissonance is curious, but perhaps Dr. Hawking suffers from the philosophical equivalent of Cotard delusion.
The esteemed Dr. Hawking isn't the only person who has foolishly dabbled in philosophy. Several bloggers, specifically science and political bloggers, have made dubious contributions to the philosophical subdiscipline of logic. How so? By inventing logical fallacies (that you would never learn in a real logic class) and applying them to their political opponents.
The most popular one is false equivalence. (I wrote an entire chapter debunking false equivalence for my book, Science Left Behind -- now available in paperback at fine retailers everywhere!) From what I can gather, "false equivalence" is a fancier way of saying, "You're comparing apples and oranges." It's meant to convey the notion that two things being compared can't actually be compared.
In reality, this argument is used mostly by political hacks who are trying to rationalize hypocritical beliefs and behavior. For instance, readers of RealClearScience know that both sides of the political spectrum -- Republicans and Democrats -- will throw science under the bus whenever it is politically convenient. But partisans don't see it that way. In their minds, only the "other side" is unscientific, and any comparisons between the two sides immediately draw accusations of "false equivalence." Some writers have built their careers peddling this nonsense.
Outside of political bickering, another newly invented fallacy is called sunk cost. According to psychologist Daniel Kahneman, "We are biased against actions that could lead to regret." This explains why, for instance, students finish their PhDs even if they realize they don't really need one anymore.
But how is that fallacious thinking? Regret is a powerful emotion. Nobody likes feeling regretful. Additionally, many people enjoy finishing something they started. Proving one's ability to persist in the face of adversity could instead be interpreted as a positive character trait.
The growing list of fallacies goes on and on. Some are legitimate, but others are questionable. After reading the list, however, it is difficult to imagine how anyone could ever form an argument that wasn't fallacious!
The trouble with labeling everything a "fallacy" is that (1) not all poor reasoning is automatically fallacious, and (2) it implies that everybody would agree on everything if we could only think correctly.
Consider the following: Your child tells you that she wants a lollipop before bedtime. You, the supposedly enlightened adult, know that a dose of sugar for a toddler before bedtime isn't the best idea in the world. So you ask, "Well sweetie, why do you want a lollipop now?" She responds, "Because I want one." There's nothing wrong or fallacious about her argument. It is, however, very poor and immature reasoning -- something we would expect from three-year-olds or, perhaps, college students. No matter how many hard facts you use to convince her that you're correct, your little princess is going to disagree.
Why? Because people differ in priorities and value judgments. What is important to me isn't necessarily important to you. What your daughter finds important (a lollipop) isn't the same as what you find important (a good night's sleep and smaller dental bills). Fundamental differences in beliefs and values will forever prevent humanity from coming to 100% agreement on anything.
Blaming our disagreements, particularly political ones, on logical fallacies does nothing other than delude us into thinking that our opponents are illogical and that we are intellectually superior. And since when has that attitude ever succeeded in convincing anybody?
It's Valentine's Day! Yes, love is in the air! Food tastes sweeter. Roses are red and violets are... bloody annoying. Dinner is far from delectable. And it's best not to think about how much that piece of jewelry cost.
Sounds like an aphrodisiac is in order. Dark chocolate is nice, but there's nothing better than Nature itself. Kinky encounters from the Animal Kingdom ought to do the trick. A fair warning, however: they might also scar you for life.
1. Red Velvet Mites. Ever left a trail of roses for your significant other? It's dreamily romantic, if a little cliché. But guess what? The red velvet mite was doing it long before you saw American Beauty. Spinning a strand of silk, the male lays out an intricate path that leads back to his "love garden" (that's really what scientists call it). When a roaming female discovers the trail, she follows it back to the source. There, she finds that the male has been quite busy, artistically spraying sperm all over his sweet shag pad. While this trick may not work on human wives or girlfriends (and makes for a sticky, awkward cleanup), mite females don't seem to mind. If one of them deems the male to be virile enough, she'll dab up some of the sperm, thus fertilizing her eggs.
2. Salmon. Fresh, wild-caught salmon is a staple of Valentine's Day menus. The uncensored story of how they are spawned adds a bit of flavor to the meal. Their monumental upstream migration from the ocean to riverine origins is an elegant, storybook journey -- this much, you probably know. But what you may not know is that it ends in a nightmarish fashion. Neglecting to feed during their travels, salmon digest their own skin, and thus turn a sickly shade of red. When they reach the spawning grounds, females lay thousands of eggs, which males fertilize with a torrent of sperm. Both sexes then promptly die. Their decaying remains will provide nutrition for their young, one of which you might be eating tonight! Does that indirectly count as necrophilia?
3. Green Spoonworms. As CreatureCast's Alysse Austin describes, the green spoonworm's mating ritual begins with "the classic love at first sight scenario." When a young, sexless larva comes in contact with chemicals given off by a female worm, it quickly grows enamored and transforms into a male worm. With the male's metamorphosis complete, it enters the female through her proboscis (a mouth-like appendage) and takes up residence in her reproductive tract. The male spoonworm is 200,000 times smaller, you see, and once inside the female, it continually fertilizes her eggs, all while receiving sustenance and shelter.
4. Bees. Bees' honey is sweet. Their sex is not. In fact, for males, it's both explosive and deadly. Male bees, called drones, generally live colorless lives. Unlike worker bees, which collect nectar and pollen, drones loaf around the hive -- the insect equivalent of a basement child. But they do get their moment in the sun, and that moment is sex. When a bee queen is ready to mate, she flies out of the hive for all the world to see. Males don't waste their time ogling. They speedily fly to meet the queen, clamp on, then, in a process that takes all of two seconds, forcefully eject their penises into the queen and pump her full of semen. The force is so great that the connection to the penis is explosively severed, and the appendage remains in the queen like a cork. Males are left with an open wound through which their hemolymph (blood) slowly seeps out. Death follows. But hey, there are worse ways to go than suicide by sex.
In 1974, Richard Feynman delivered the commencement address at Caltech. With his typical aplomb and flamboyant hand gestures, the Nobel Prize-winning physicist shared his thoughts on science, pseudoscience, and learning how to not fool yourself.
Feynman discussed much in his speech, touching on topics from massage to social science. One could say he rambled a bit, but even the most ardent critic couldn't deny that Feynman's babbling was thought-provoking. He always had a knack for making even the most mundane subjects somehow meaningful.
Towards the end of his remarks, Feynman spoke on a topic still relevant today: the need for rigor in scientific research. His primary concern was that scientists were ignoring the integrity of their methods in favor of attaining publishable, flashy results. Publication bias was also on his mind. He worried that excellent papers were going unnoticed, or worse, unpublished, simply because their results weren't instantly attention-grabbing. To support his claim, Feynman referenced a little-known rat study conducted in 1937 by a man he referred to as "Mr. Young."
Back then, many psychologists considered the humble lab rat to be the gold standard for psychological research. Psychologist John B. Watson even suggested that you could learn everything you might want to know about human psychology by dropping a rat into a maze. Today, most experts agree that psychology experiments conducted on humans can't even tell us everything we might want to know about the human psyche!
Feynman described Young's experiment as such:
"He had a long corridor with doors all along one side where the rats came in, and doors along the other side where the food was. He wanted to see if he could train the rats to go in at the third door down from wherever he started them off."
But Young ran into a problem. Each time, the rats would simply go to the door where the food was previously.
"The question was," Feynman continued, "how did the rats know, because the corridor was so beautifully built and so uniform, that this was the same door as before?"
Young set about eliminating all the possible variables that would clue the rats in to their position in the alley, so that they'd have to rely purely on relational information.
"So he painted the doors very carefully, arranging the textures on the faces of the doors exactly the same. Still the rats could tell. Then he thought maybe the rats were smelling the food, so he used chemicals to change the smell after each run. Still the rats could tell. Then he realized the rats might be able to tell by seeing the lights and the arrangement in the laboratory like any commonsense person. So he covered the corridor, and still the rats could tell."
Young finally discovered that the rats could discern the previous door by the way the floor sounded as they ran over it! So he filled the corridor with sand, and was finally able to teach the rats to go to the third door down from their starting location.
You might think this was a pretty cool discovery, and that Young's methods were worthy of replication in any future rat-running experiment with similar aims. But Feynman was dismayed to find that the study was relatively forgotten. In fact, it may have never even been published.
Was this due to publication bias? Limited information exists as to the precise identity of Mr. Young, though it's likely that Feynman was referring to animal scientist Paul Thomas Young. Young did, in fact, work with rats, but no study like the one Feynman describes is listed among his published works. So we'll have to take Feynman's word that the study was indeed conducted. If so, the rat-running psychologists of old never heeded Young's methods.
"I looked up the subsequent history of this research," Feynman said in his address. "The next experiment, and the one after that, never referred to Mr. Young. They never used any of his criteria of putting the corridor on sand, or being very careful. They just went right on running the rats in the same old way, and paid no attention to the great discoveries of Mr. Young, and his papers are not referred to, because he didn't discover anything about the rats."
But what Young discovered, according to Feynman, was infinitely more vital.
"In fact, he discovered all the things you have to do to discover something about rats."
Proper science is not simply about attaining a result; it's about uncovering truth. Today, when almost 90% of results from preclinical cancer trials can't be replicated, and psychology is plagued in a similar fashion, it's clear that Feynman's lesson has been at least partially forgotten. Though scientists may be tempted by flashy findings, rigorous methods should always take precedence.
The Iditarod is known as the "Last Great Race on Earth." It may not be the last, but it certainly is great.
On March 1st, teams composed of 12 to 16 dogs and one human musher will race from Anchorage to Nome, Alaska, a hellish 975-mile journey through "a harsh landscape of tundra and spruce forests, over hills and mountain passes, and across rivers." They'll battle blizzard conditions and arctic temperatures. Gale-force winds with wind chills in the triple digits below zero will hamper their progress. The only certainty is that not every competitor -- man or dog -- will make it across the finish line.
The dogs that compete in the Iditarod are some of the finest athletes in the world. Inuit Sled Dogs were the original canines of choice, but now, Alaskan or Siberian Huskies have supplanted them as the gold standard. When trained, they sport an aerobic capacity about eight times that of an average human, and about three to four times that of an elite marathoner. Over the years, breeders have selected for traits ranging from webbed toes that tread more efficiently atop snow to a pulse rate that quiets at a moment's rest. But the most amazing biological advantage, as described by Sports Illustrated senior writer David Epstein in his book The Sports Gene, is the dogs' ability to adapt to training on the fly.
Intense exercise depletes the body's energy reserves and leaves tiny micro-tears in muscles. The ensuing soreness and fatigue is enough to hobble even the fittest humans, but not premier Alaskan Huskies. "[They] get fit while barely stopping to recuperate," Epstein writes.
Fairbanks, Alaska native Lance Mackey knew all about these performance sled dogs, and nursed a burning desire to breed and race them. But he was poor, at times to the point of destitution, so he had to try something different. Without the funds to afford elite dogs, Mackey instead bred dogs that "yearned for nothing more than to run, eat, and run some more." His huskies weren't as fast as the superior racing lines that had been honed for decades, but they would pull, pull, and pull some more until they couldn't pull anymore. And they'd love every moment of it.
In 2007, Mackey won the Iditarod. In 2008, he won it again. Three-peat and four-peat victories followed, the latter a feat that nobody had accomplished before. Mackey irrevocably changed sled-racing, though he insists that he just furthered a trend that had already been brewing.
"Yeah, thirty-eight years ago in the Iditarod there were dogs that weren't enthused about doing it, and that were forced to do it," Mackey told Epstein. "I want to be out there and have the privilege of going along for the ride because they want to go, because they love what they do... And that's what's happened over forty years of breeding. We've made and designed dogs suited for desire."
Mackey's sentimental assessment is completely accurate. Studies of the humble lab mouse reveal that motivation has a genetic component. For more than a decade, UC Riverside physiologist Theodore Garland has been breeding mice that love to run. Nothing gets them going like jumping on the wheel and motoring nonstop. Garland's data indicate the critters crank out seven or more miles each night -- that's a lot of squeaky revolutions of running wheels. Examining the rodents' brains, Garland has found that running spikes their levels of dopamine -- the neurotransmitter associated with the brain's reward system -- like nothing else.
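Just how many squeaky revolutions is that? Here's a quick back-of-the-envelope calculation. The wheel size is my own assumption (a typical lab running wheel is on the order of 12 centimeters across); only the seven-mile figure comes from Garland's data.

```python
import math

# Assumed diameter of a typical lab running wheel (not from Garland's study).
wheel_diameter_m = 0.12
circumference_m = math.pi * wheel_diameter_m  # ~0.38 meters per revolution

miles_per_night = 7                            # Garland's figure
meters_per_night = miles_per_night * 1609.34   # meters in a mile

revolutions = meters_per_night / circumference_m
print(f"~{revolutions:,.0f} revolutions per night")  # roughly 30,000
```

Thirty thousand revolutions, give or take, every single night.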
Though human studies aren't as in-depth, the evidence that scientists have mustered thus far supports the idea that desire is genetic. Identical twins have more similar workout habits than fraternal twins do, possibly because they share genes that predispose them to enjoy exercise. The desire for sex may also be tied to genes. Around 20% of humans carry one particular variety of a dopamine receptor in the brain that endows them with a more voracious sexual appetite.
What are the implications of this research? Does it mean that our likes and desires are programmed from birth? Not entirely. Predisposed is a better word. The rest is willpower. Thankfully, our lives are not as grueling as the Iditarod.
Primary Source: David Epstein, The Sports Gene: Inside the Science of Extraordinary Athletic Performance, 2013
(Images: AP, Wikimedia Commons)
Thus far, the biggest event of the year for the scientific community was the "Creation Debate" between Bill Nye ("The Science Guy") and Ken Ham, a Young-Earth Creationist and founder of the Creation Museum. Other than this 3-minute clip which summarized the essence of the debate, I didn't bother watching.
Why? For three reasons.
First, there are only 86 billion neurons in my brain and 24 hours in a day. Because my brain is limited by both space and time ("neural spacetime"?), I prefer to fill it with useful information. (Okay, sometimes my wife forces me to watch The Bachelor.) I haven't yet reached the point where I have forgotten that the Earth revolves around the sun, like Sherlock Holmes, but I'm sympathetic to his intellectual strategy. I try to limit my daily intake of inanity.
Second, debates with people who embrace anti-scientific beliefs only serve to lend them credibility. I recently made the mistake of engaging in an online debate regarding GMOs. Of course, the "winner" of the debate -- determined by online votes -- was a business law professor who has little, if any, understanding of biotechnology. I won't make that mistake again.
Third, and most importantly, the "Creation Debate" perpetuates the myth that science and religion are fundamentally in conflict. They are not.
The list of famous historical scientists who believed in God -- specifically the God of Christianity -- is very, very long. Harmony between science and religion still exists. Perhaps the most famous Christian practicing science today is Francis Collins, who helped sequence the human genome and is the current director of the National Institutes of Health. He has also published more than 500 papers and, as a result, is one of the most successful scientists alive. Even Richard Feynman, who described himself as an "avowed atheist," recognized that it was entirely possible for a scientist to rationally believe in God.
Also, it is worth noting that Mr. Ham doesn't represent the "Christian point of view," although, judging by his comments, it's clear that he thinks he does. As The Economist reported, Mr. Ham said before the debate, "I'm a Christian. I know God's word is true. Nothing [Mr. Nye] can say will cast doubt on that." He also rebukes Christians who disagree with him. However, he is conflating his personal interpretation of the Bible with what he believes to be "God's word."
The Catholic Church, for instance, accepts evolution. Many icons in the Christian Church, such as St. Augustine, John Calvin and John Wesley, also rejected a literal interpretation of Genesis. While Young-Earth Creationism is more common among American evangelicals, it is still only accepted by 54% of U.S. pastors. Among the laity, belief in creationism varies widely by denomination. For example, 64% of white evangelicals reject evolution, but 68% of white Catholics and 78% of white mainline Protestants accept it.
So, why does the myth of an intrinsic incompatibility between science and religion persist? Well, it persists largely because so many fundamentalists say that there is a conflict. And those fundamentalists come in two flavors -- both the religious and atheistic varieties.
For religious fundamentalists, science that challenges any aspect of their faith must be wrong. For atheistic fundamentalists, positivism is the only source of knowledge. Both adhere to a worldview that is dominated by a false dilemma: Either religion or science is true, and when a perceived conflict arises, there is simply no middle ground.
When one thinks about it, Young-Earth Creationists and fundamentalist atheists actually have quite a bit in common.
Britons insist that their teeth aren't bad; they just have character. But, like South Park's pudgy, misbehaved Eric Cartman insisting that he's not fat, he's just big-boned, that claim reeks of denial.
Long have Americans harped on our closest allies for their clichéd dental deficits. Are we too hard on them? To be sure, British teeth aren't as ghastly as is commonly stereotyped, particularly in North American comedies. Austin Powers' misshapen, yellow chompers are far from the norm. In fact, according to the most recent statistics from the Organisation for Economic Co-operation and Development (OECD), British children are tied with German children for the lowest average number of decayed, missing, or filled teeth, firmly trouncing American children. However, by adulthood, that lead is lost.
Poring over the data in the CDC's 2012 Oral Health Status Survey and the National Health Service's (NHS) 2011 Adult Dental Health Survey reveals that, by adulthood, Americans narrowly beat Britons in rates of adult tooth decay: 23.6% vs. 27.5%. Other measures, such as rates of tooth restoration and the percentage of adults who are edentate (meaning they have no natural teeth), are nearly identical.
Also comparable are the measures that Britons and Americans take to protect their teeth. 75% of Britons brush their teeth twice a day. Roughly 80% of Americans do. Our friends across the pond beat us in dental visits. 71% of British adults visit the dentist each year, while only 61.6% of Americans claim to do so.
Functionally, it seems British teeth are no worse off than American teeth. The major difference comes in form. While it's customary for almost all American children to don braces at some point, Britons commonly eschew the process. The disparity likely stems from the fact that Britain's NHS doesn't cover cosmetic dentistry unless it's deemed "clinically necessary." Even the most burning desire for a movie star smile doesn't cut it. People can still get braces, of course; they just have to shell out in excess of £2,000, approximately $3,300.
Besides the institutional barrier, there is a cultural one. Britons often frown on "artificially" white smiles.
"US teeth are sometimes whiter than it is physically possible to get in nature - there is a new reality out there. The most extreme tooth bleaching is terrifying, it looks like it's painted with gloss paint and has altered what people perceive as normal," Professor Jimmy Steele, of the School of Dental Science at Newcastle University, told the BBC.
As is apparent, this is where the numbers end and the cultural back-and-forth teasing begins. Sifting through the banter leads to a basic realization: Yanks are partial to their straight, pearly whites. Brits enjoy a more zigzag, earthy-colored set of chompers. Different strokes for different folks.
Meandering into the lecture hall, you take note of the atmosphere. The air is still. But for the faint sounds of shuffling pages, trackpad clicks, and anxiety-laced whispering, the room is silent. You take a seat, separated from your nearest classmate by an empty chair. At face value, the gulf seems superficial, and yet, when the tests are passed out, that distance will become insurmountable. Don't talk. That's cheating! It will be just you, the test, and the bubbles on the answer sheet. Those cursed bubbles...
Anyone who's ever taken a large science class in college is well acquainted with the multiple-choice test. Ingenious in its simplicity, the test comprises a set number of questions, each with a short list of responses. It's up to the test-taker to determine which is correct. Here's an example:
1. Which of the following is one of the major approaches to psychology?
a. Astrology
b. Phrenology
c. Behaviorism
d. New Age Movement
The testing strategy has been utilized for decades, with few alterations and a tacit resignation to the status quo. To professors, it's an easy, objective, and efficient way to gauge the material comprehension of large numbers of students. To students, though they may view the method as cold and unforgiving, it's a universal standard -- one they're accustomed to -- and it offers a genuine chance to guess the correct answer.
Critics contend that multiple-choice tests only encourage two things: rote memorization and hand-eye coordination. (Filling in tiny bubbles is deceptively difficult.) Since science is not about memorizing and regurgitating facts, why should future scientists be judged in such a fashion?
Professor Kathrin Stanger-Hall of the University of Georgia believes that, compared to memorization, critical-thinking skills are far more useful to aspiring scientists, and to students in general. But sadly, colleges are seriously inept at teaching these skills. A 2011 study found that 46% of college students did not gain critical-thinking skills during their first two years of college, and 36% had not gained them after four years. Stanger-Hall theorized that multiple-choice tests contributed to these dismal statistics. In 2012, she tried out a little experiment on two sections of her Introductory Biology class.
Though each section was taught in an identical fashion, one section (consisting of 282 students) was assessed using the traditional multiple-choice-only format, while another (192 students) was assessed with "mixed" mid-term exams of 30 multiple-choice questions and three to four constructed response questions, such as short answer, fill-in-the-blank, or diagram labeling. At the end of the year, each section took final exams that shared 90 of the same multiple-choice questions. Their scores on these questions were compared.
After correcting for students' grade point average, Stanger-Hall found that students in the "mixed" exam section scored significantly higher on the 90 multiple-choice questions than did students in the multiple-choice only section: 67.35% vs. 64.23%. Upon closer examination, Stanger-Hall determined that the difference was mostly due to the fact that students in the "mixed" section firmly outstripped those in the multiple-choice section on higher-level thinking multiple choice questions: 64.4% vs. 59.54%.
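What does "correcting for grade point average" mean in practice? One common approach is to put both section membership and GPA into a single regression, so the section effect is estimated with GPA held constant. The sketch below illustrates that general idea on made-up data; it is not Stanger-Hall's actual analysis, and every number in it is invented for illustration.

```python
# Sketch of "correcting for GPA": regress exam score on section and GPA,
# so the section effect is estimated with GPA held constant.
# All data here are synthetic -- this is NOT Stanger-Hall's analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 474  # 282 multiple-choice-only students + 192 "mixed" exam students
df = pd.DataFrame({
    "mixed_section": np.repeat([0, 1], [282, 192]),
    "gpa": rng.normal(3.0, 0.4, n).clip(0.0, 4.0),
})
# Invented relationship: a baseline, a GPA effect, and a ~3-point section effect.
df["score"] = 40 + 8 * df["gpa"] + 3 * df["mixed_section"] + rng.normal(0, 5, n)

model = smf.ols("score ~ mixed_section + gpa", data=df).fit()
# The coefficient on mixed_section is the section difference net of GPA.
print(model.params["mixed_section"])
```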
"The purpose of this study was to assess whether a multiple-choice-only exam format might hinder the development of higher-level (critical) thinking skills in introductory science students. The answer is a convincing yes," Stanger-Hall summed up (emphasis hers).
According to Stanger-Hall, replacing a significant portion of multiple-choice questions with constructed response questions would be a "cost-effective strategy to significantly improve the critical-thinking skills of college students." But her recommendation is not the only viable option. Social psychologist Joann M. Montepare -- who's taught college classes for 15 years -- urges a slightly different approach, one that she's already put into practice with great success. Multiple-choice tests, she says, are a great evaluative tool. But like any tool, they must be well crafted and correctly employed. Montepare described her creative assessment methods in the October 2005 edition of The Observer:
"Students come to class prepared as they would be for any other multiple-choice exam, take the exam, and then they take it home and review each question to assess whether their answer was indeed the best one. Students can use class notes, readings, and even discuss the questions with their classmates (indeed such collaboration is encouraged). As they do so, they can change their answers. Students return exams during the next class period and the self-corrected version determines their final grade, as follows. For each correct answer (no change) students receive full credit. For each corrected answer (wrong to right), students receive half-credit. Incorrect answers — originally wrong and unchanged, or changed to wrong — receive no credit."
Perhaps the largest benefit of Montepare's method is this: Instead of focusing on memorizing material beforehand, students actively research and collaborate to not only find, but also understand the answers. That sounds a lot more like how science is done.
Every winter, we southerners wish for snow. Sometimes, just when we think we’re going to get our wish, the conditions are slightly off and we get something weirder called freezing rain. Much of the continent has experienced this odd precipitation over the past month. It’s an icy nuisance, but it demonstrates something unusual: water dropping below 32 degrees Fahrenheit without freezing!
Three vertical atmospheric layers are required to produce freezing rain: cold air high up and near the ground, with warm air sandwiched between. Snow forms in the uppermost cold layer. Upon falling lower, the snow encounters the warmer air and rapidly melts into liquid droplets of rain. (Without this warmer layer, the precipitation would reach the ground as snow.)
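Condensed to its essentials, that three-layer logic reads like a simple decision rule. The toy sketch below is my own simplification, not an actual forecasting algorithm (it ignores sleet, for instance, which forms when the drops refreeze before landing):

```python
def precipitation_type(temp_aloft_f, temp_mid_f, temp_surface_f):
    """Toy classifier for the three-layer setup described above.
    Temperatures are in Fahrenheit, listed from high altitude down
    to the layer just above the ground."""
    if temp_aloft_f > 32:
        return "rain"           # too warm aloft for snow to form
    if temp_mid_f <= 32:
        return "snow"           # no warm middle layer to melt the flakes
    if temp_surface_f <= 32:
        return "freezing rain"  # melted drops supercool near the ground
    return "rain"               # melted drops stay liquid and above freezing

print(precipitation_type(temp_aloft_f=20, temp_mid_f=38, temp_surface_f=30))
# -> freezing rain
```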
As the melted liquid droplets fall out of the warm layer and into the cold layer just above the ground, something unusual can happen. If the temperature of the ground layer of air is roughly 29-31 degrees Fahrenheit, the droplets become supercooled. This means that, while they are cold enough to freeze solid, their molecules remain arranged as in a liquid. How can this be?
When matter changes from one state to another, physicists say that a phase transition has occurred. Phases of matter include the familiar liquid, solid, and gas. There are also more exotic phases: plasma, Bose-Einstein condensate, superfluid, glass, and amorphous solid. Supercooled rain sits strangely in between liquid and solid: liquid water chilled below its freezing point.
The phase transition of water from a liquid to a solid is a clear change in the position of the molecules. Liquid water molecules move and flow around one another. They are all pressed together, but with no particular order; stirring them rearranges that order easily. Solid water ice, however, is a rigid structural arrangement of molecules. This is called a crystal: trillions and trillions of molecules all lined up in harmony to form a nearly perfect repeating pattern.
How does a crystal begin to form? Nucleation triggers the process. After a few molecules form into a crystal structure, the remaining molecules snap into position almost instantly. If a water droplet cools to roughly 14 degrees F, nucleation occurs spontaneously. Otherwise, at warmer temperatures, it needs some help. That help can be a vibration that shakes the molecules into place, or contact with a solid particle.
Supercooled droplets are falling through relatively still, clear air, so nucleation isn’t occurring. Thus, they fall all the way to the ground as liquid. However, when they hit the cold ground, nucleation suddenly occurs. The water freezes instantly, often in interesting shapes.
The aftermath of freezing rain is beautiful: Clear, flowing, liquid-like ice coatings. Of course, it’s also an inconvenient and potentially dangerous mess.