What is it Good For? – The Roots of War

Anyone who’s spent time among the bickering cliques of a high school (or, for that matter, read William Golding’s “Lord of the Flies”) will be unsurprised by cognitive scientist Steven Pinker’s assertion that “chronic raiding and feuding characterize life in a state of nature.” Pinker is not alone among prominent minds in declaring war to be rooted deeply in human nature, but a recently released paper in the journal Science suggests that this view may be misguided. According to Douglas Fry and Patrik Söderberg of Åbo Akademi University in Finland, the evidence from the closest parallels to ancient humans, modern hunter-gatherer groups, does not support the hypothesis that war is integral to mankind.

Modern Gudigwa bushmen on the hunt, courtesy of John Gowdy

This is not to say that hunter-gatherer life is inherently peaceful. The researchers found 148 examples of “lethal aggression events” in the 21 groups they studied, as reported in ethnographic studies over the past century. War, however, is something more specific than violence alone; the authors separated the deaths into interpersonal events (what modern society would call homicide or murder) and intergroup events, which better follow an accepted definition of war as “actual, intentional, and widespread armed conflict between political communities.”

Following this division, only a third of the deaths could be classified as due to war, and when an outlier group (the Australian Tiwi people) was removed from the analysis, only 15 percent of killings involved group-on-group violence. Of particular interest to today’s struggles over oil or water is the finding that only two of the reported killings involved competition over resources; most were due to personal concerns like insults or infidelity. In Fry’s words, “When you look at these foraging groups, you see a great deal of cooperation. There are homicides on occasion, but generally people get along very well. Humans have a capacity for warfare — nobody’s denying that. But to make it a central part of human nature is grossly out of contact with the data.”

If war is not encoded in humanity’s genes, then it may be present in its memes. A meme, a term coined by Richard Dawkins, may be thought of as an idea that follows some of the principles of genetics. Nobel Prize winner Jacques Monod explains that “they tend to perpetuate their structure and to breed; they too can fuse, recombine, segregate their content; indeed they too can evolve, and in this evolution selection must surely play an important role.” Certain societies, such as the Eskimos and Lepchas, have no concept of warfare; it might be said that the memes of interpersonal violence have not mutated into those of intergroup conflict among these people. Where war does arise, however, it tends to spread quickly, as a group that finds itself under organized attack will be pressured to imitate the meme and organize itself in defense.

Regardless of its origins, war is a fact of modern society. But the knowledge that war is not biologically inherent to humanity gives hope to efforts of peace.


A Matter of Perspective – Virtual Avatars

The webcomic xkcd once did a take on a phenomenon most of us have experienced: the perceived difference between the actual size of a place and our childhood memories. The same playground that felt like an entire country to our grade-school selves feels smaller, even cramped, from our adult vantage points. A recent study by scientists at the University of Barcelona, however, managed to induce that feeling in reverse, making adults feel childlike again through the use of virtual reality (VR).

Courtesy of xkcd.com

Mel Slater and colleagues outfitted their participants with motion-capture suits, the same technology used to create movie performances like Andy Serkis’s Gollum in the “Lord of the Rings” films. The movements of each subject were then scaled down and mapped to a virtual model of a four-year-old, which the participant could see in a virtual “mirror” while wearing a head-mounted display. After adapting to the movements of this virtual body, the participants were asked to estimate the sizes of a number of cubes in the virtual environment, and those inhabiting the child avatars gave estimates nearly double those of a control group inhabiting adult avatars of the same height. Additionally, an implicit association test conducted after the VR experiment showed that participants in the child group had subconsciously begun to identify themselves with traditionally childlike traits and images.

Previous research has also established a physical link between virtual and real-world responses. An earlier study by Slater’s group had participants experience a virtual slap to the face, which generated the very real physiological response of reduced heart rate. A similar experience can be generated with a much lower-tech trick, the “rubber hand illusion.” In this setup, the participant’s real hand is placed under a realistic false hand that is “connected” through a sleeve to the rest of the body. Watching the false hand get stroked or hit generates a real sensation, and functional magnetic resonance imaging (fMRI) of the brain during these experiments has shown increased activity in regions associated with integrating sense information. As neuroscientist Arvid Guterstam explains, “It therefore seems as if these areas of the brain automatically associate the sight from the brush moving in empty space with the touch felt on the real hand, leading to the bizarre consequence that one feels touch in midair and perceive having an invisible hand in this location.”

True VR systems are likely to remain in the laboratory for some time due to their expense, but new technology like Google Glass may open at least some of their possibilities to a wider audience. Users should remain mindful of how immersion in a virtual world can have effects in the real one; in the case of Slater’s experiment, it’s possible that adults who have recently spent time in a child’s body might misjudge distances while driving and have an increased accident rate. Yet there are many possibilities for productive use: for example, the military has studied VR as a way to treat soldiers suffering from PTSD. The interface between real and virtual experience remains a relatively unexplored territory, but it is surely a fascinating one.

Restless Nights – The Evolution of Sleep

Last month, Marvin Anthony Alexander of Phenix City, Ala., drove his SUV over a guardrail, falling 40 feet to the highway below and killing three teenaged passengers. Police determined that he was not under the influence at the time, but instead had fallen asleep at the wheel after an extended drive back from a vacation. Alexander, although he experienced the worst of what sleeplessness has to offer, is far from alone in his condition: according to a recent poll by the Centers for Disease Control and Prevention, 32 percent of Americans admit to “drowsy driving” on at least a monthly basis, and the National Sleep Foundation reports that 20 percent of US adults get less than six hours of sleep nightly. The basic sleep need of adults is generally agreed to be in the range of seven to nine hours nightly, and most experts recommend that this sleep come in a single, unbroken block.

Yet according to Roger Ekirch, a historian at Virginia Tech, this recommendation may actually be counterproductive to achieving proper rest. In his book “At Day’s Close: Night in Times Past,” Ekirch argues that the ideal of eight hours of uninterrupted sleep is a modern invention, promoted by the availability of artificial lighting that disrupted natural sleep cycles. Light suppresses the production of melatonin, a hormone that regulates sleepiness and is of great importance to the body’s circadian rhythm, or “biological clock.” Before streetlamps and lightbulbs, the night was significantly darker, and historical records suggest that people divided the night into two periods of sleep, a pattern known scientifically as biphasic sleep. Between the “first” and “second” sleeps came an hour or two of wakefulness, during which people would do everything from pipe smoking to visiting neighbors to praying. As sleep scientist Thomas Wehr writes, “Waking up after a couple of hours may not be insomnia. It may be normal sleep.”

Sample sleeping schedules, courtesy of Chase Hamilton.

Some adventurous sleepers are trying to divide their rest even further, into patterns of polyphasic sleep, often with the goal of reducing the total time spent in bed. These patterns aim to maximize the proportion of rapid eye movement (REM) sleep, the stage associated with dreaming, which is considered most important in helping the brain cement what it has learned over the course of the day. One of the most extreme is the “Uberman” schedule, which calls for less than three hours of sleep a day, broken into 20-30 minute naps taken every four hours. Scientific evidence on the success of these patterns is limited, but the Internet abounds with polyphasic success stories (as well as a considerable number of failures).
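The arithmetic behind the Uberman schedule is simple enough to sketch in a few lines of code; the parameter values below are the figures cited above (a nap every four hours, 20-30 minutes each), with 25 minutes chosen as an illustrative midpoint:

```python
from datetime import timedelta

def uberman_naps(nap_minutes=25, interval_hours=4):
    """One day of Uberman nap windows: a short nap every interval_hours."""
    naps = []
    for i in range(24 // interval_hours):  # 6 naps in a 24-hour day
        begin = timedelta(hours=i * interval_hours)
        naps.append((begin, begin + timedelta(minutes=nap_minutes)))
    return naps

naps = uberman_naps()
total = sum((end - begin for begin, end in naps), timedelta())
print(len(naps), "naps,", total, "total sleep")  # 6 naps, 2:30:00 total sleep
```

Six 25-minute naps add up to just two and a half hours of sleep per day, which is what makes the schedule so extreme relative to the seven to nine hours experts recommend.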

The temptation to reduce the time spent asleep and channel it into more productive activity is understandable, but fails to recognize the basic needs of the body. Whether it be in an unbroken stretch or the two halves perhaps favored throughout history, adequate sleep is absolutely necessary for human health and well-being.

Circus of Value – The End of Chimpanzee Testing

American animal rights activists rejoiced late last month over a decision by the National Institutes of Health to retire most of the chimpanzees the agency currently uses for scientific research. The primates, which share an estimated 96 to 99.4 percent of their DNA with humans, have been employed as a model organism for medicine in the United States since psychobiologist Robert Yerkes bought “Chim” and “Panzee” in 1923. The NIH currently houses 360 chimps, of which 310 will be retired to the federal sanctuary system and 50 will be retained, but not bred, as possible subjects for future work. Said NIH director Francis Collins, “Chimpanzees are very special animals. They are our closest relatives. We believe they deserve special consideration.”

A young chimpanzee in its natural habitat, courtesy of National Geographic.

The bioethics of conducting science on animals is based on the “three Rs” framework: replacement, reduction, and refinement. Whenever possible, living systems should be replaced by in vitro (cell culture) models or computer simulations; if animals are absolutely necessary, less “advanced” organisms such as worms or other invertebrates are preferable. An animal study should then strive to reduce the number of individuals it involves while maintaining scientific validity. Good statistical analysis and experimental setups like the repeated measures design can help get the best results from the fewest animals. Finally, the procedures that are conducted should be refined to minimize the pain and distress experienced by the subjects. Measures like sedating research subjects before surgery may seem obvious, but stories like those of the unanesthetized baboons in the Experimental Head Injury Laboratory of the University of Pennsylvania should serve as constant reminders to those engaged in animal studies.

In the past, the closeness of chimpanzees to humans has made them a valuable resource for scientists. As the only animal model that can be infected by all strains of hepatitis, chimps were involved in the development of the hepatitis A and B vaccines, and the similarities in their immune systems made chimps a model for the study of HIV/AIDS. The most well-known chimpanzee experiment may be that of “Ham,” whose spaceflight paved the way for the first American astronauts. Yet these uses have been superseded by advances in technology: hepatitis vaccines can be produced in yeast, new HIV/AIDS work has progressed straight from in vitro experiments to human clinical trials, and humans are signing up at $250,000 a head to be shot into space. While future diseases may require chimps as the only viable research model, the present state of science has made them largely unnecessary, and their use therefore less morally defensible.

Recent neurobiology research has also suggested that non-human animals may experience significantly more consciousness than previously thought. The Cambridge Declaration on Consciousness, proclaimed at the Francis Crick Memorial Conference in the presence of luminaries such as Stephen Hawking, states that “the weight of evidence indicates that humans are not unique in possessing the neurological substrates that generate consciousness.” In other words, the similarities of chimpanzees to humans may extend to their feelings of pain, and scientists should ensure that any benefits they gain from chimp research are worth this moral burden.

Panopticon – NSA Surveillance and PRISM

Edward Snowden, the whistleblower behind last month’s massive leak of classified documents from the National Security Agency, is currently sitting in a Moscow airport unable to leave, a man without a country. This choice, Snowden claims, he made gladly: “I can’t in good conscience allow the U.S. government to destroy privacy, internet freedom and basic liberties for people around the world with this massive surveillance machine they’re secretly building.” While a number of South American countries, including Venezuela and Bolivia, have offered him asylum, his safety is far from assured; U.S. officials have promised that accepting Snowden will put any of these countries “directly against the United States.”

The most troubling project Snowden unveiled is officially designated US-984XN but has become more widely known by its government code name, PRISM. His leaked slides indicate that PRISM grants the NSA direct access to the data of several of the Internet’s most popular services, including Facebook, Google, Yahoo and Skype. Although the law that authorized the program mandated that surveillance targets be identified “with 51 percent certainty” as foreign nationals located outside the US, subsequent court orders have authorized the NSA to use “inadvertently acquired” information about US citizens, and the leaked slides state that targeting domestic citizens is “nothing to worry about.”

Technology experts have hypothesized that PRISM most likely operates by wiretapping Tier 1 network providers, companies like AT&T and Verizon that provide the majority of Internet infrastructure. Large Internet services such as Facebook are connected directly to Tier 1 networks through pieces of hardware called edge devices, which allow them to form “peering connections” with Internet service providers. These connections reduce the number of networks across which data must travel, decreasing the latency or “lag” a user experiences when connecting to a large site. By placing a physical tap on the wires between a company’s edge devices and the Tier 1 network, the NSA could “view and copy data transmitted over every single session from a user to an application in realtime.”

All of this data must be stored somewhere, and an NSA facility under construction in Utah is thought to be the location for that storage. Although the task seems daunting, mass storage is surprisingly inexpensive: by one estimate, all of the phone calls made in America over the course of a year could be captured in a warehouse of less than 5,000 square feet, at a cost of approximately $27 million. Storage capacity is actually becoming more affordable more quickly than computing power; the growth patterns of the two are estimated by Kryder’s Law and Moore’s Law, respectively. Phenomena such as giant magnetoresistance can be exploited to pack more data onto ever smaller surfaces.
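A rough back-of-envelope calculation shows why such an estimate is plausible. Every input below (call volume, call length, codec bitrate, storage cost) is an illustrative assumption of ours, not an official figure, but the result lands in the same order of magnitude as the $27 million estimate cited above:

```python
# Back-of-envelope: storing a year of compressed US phone audio.
# All input values are illustrative assumptions, not official data.

CALLS_PER_DAY = 3e9       # assumed daily US call volume
AVG_CALL_MINUTES = 2      # assumed average call length
BITRATE_KBPS = 8          # assumed low-rate speech codec (~8 kbit/s)
COST_PER_TB_USD = 100     # assumed hardware cost per terabyte

seconds_per_year = CALLS_PER_DAY * AVG_CALL_MINUTES * 60 * 365
bytes_per_year = seconds_per_year * (BITRATE_KBPS * 1000 / 8)  # bytes/sec
petabytes = bytes_per_year / 1e15
cost_millions = (bytes_per_year / 1e12) * COST_PER_TB_USD / 1e6

print(f"~{petabytes:.0f} PB per year, roughly ${cost_millions:.0f} million")
```

Under these assumptions the total comes to on the order of a hundred petabytes a year for tens of millions of dollars, a volume that fits comfortably in a single large data center.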

Aerial view of the NSA’s Utah Data Center, courtesy of National Geographic.

The NSA undoubtedly has the technological means to capture a significant portion of the traffic that passes through the Internet on a daily basis. Any restrictions to this capacity therefore must come from the legal sphere, and it appears that both Congress and the Foreign Intelligence Surveillance Court are granting the agency considerable leeway in its operations. PRISM provides a powerful tool, one that can be used to protect the nation or abused to place pressure on citizens with unpopular views. It is the responsibility of the government to ensure that this technology does not stifle the free expression and dialogue of ideas from which progress must arise.

Addendum: Readers concerned about surveillance may find this linked paper, published by the libertarian think-tank Sovereign Man, helpful in locating resources to strengthen their Internet privacy.

Slimy Yet Satisfying – Insects as Food

At first glance, the products designed by the United Kingdom-based startup Ento seem just like any other work of modernist culinary art. As might be expected, the plates are white, the lines are crisp, and the portions are small. Yet the company’s goal is unique among haute cuisine: introduce insects into the diets of Western diners. The four co-founders, students at the Royal College of Art and Imperial College London, believe that their approach can help overcome the cultural taboo surrounding insect consumption by focusing on taste and nutrition, but there are strong ecological reasons for eating insects as well.

An ento box, courtesy of foodrepublic.com.

The species of any given ecosystem form a food web, a complex series of interactions representing which organisms eat what other organisms in the environment. The place of a species in the food web can be defined by its trophic level, which indicates how many steps stand between the initial source of energy (usually, but not always, the sun) and that species. In a kelp forest, for example, herbivorous fish form the second trophic level, with only the seaweed standing between them and solar energy, while sharks might be found at the fifth or higher trophic level. A great deal of energy is lost from one trophic level to the next; the general inefficiency is given by the “ten percent rule,” which estimates that each step up the food chain loses 90 percent of the energy in the previous step.

Both cattle and grasshoppers are on the second trophic level, grazing on similar species of grass or grain, but the efficiency with which these organisms convert their food into biomass varies greatly. As warm-blooded animals, cattle require a great deal of food to maintain their body temperature, and it is estimated that only three percent of the energy in grass is used to fatten a cow. Grasshoppers require much less energy for metabolism and can put more of it towards growing larger; estimates of the food conversion efficiency for edible grasshopper species range from 10 to 15 percent, up to five times that of cattle.
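The ten percent rule and the feed-conversion figures above can be combined in a short sketch. The 3 percent and 15 percent efficiencies are the article's estimates; the starting pool of 1,000,000 kcal of solar energy is an arbitrary illustration:

```python
# Energy flow under the "ten percent rule", plus the cattle-vs-grasshopper
# conversion efficiencies cited above. Starting energy is arbitrary.

def energy_at_level(solar_kcal, trophic_level, transfer=0.10):
    """Energy reaching a trophic level, losing ~90% at each step above level 1."""
    return solar_kcal * transfer ** (trophic_level - 1)

plant_energy = energy_at_level(1_000_000, 2)  # herbivores sit one step up

# Conversion of plant energy into edible biomass:
cow_biomass = plant_energy * 0.03          # ~3% efficiency for cattle
grasshopper_biomass = plant_energy * 0.15  # up to ~15% for grasshoppers

print(f"cow: {cow_biomass:.0f} kcal, grasshopper: {grasshopper_biomass:.0f} kcal")
print(f"ratio: {grasshopper_biomass / cow_biomass:.1f}x")
```

The ratio works out to the "up to five times" advantage quoted above, and the same function shows why a fifth-level predator like a shark receives only a ten-thousandth of the energy available to an herbivore.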

The greatly superior efficiency of insects provides a possible solution to one of the most pressing issues of world nutrition, that of increased protein demand in developing countries. As nations such as China and India have become wealthier, their citizens have begun to consume more meat, which has led to increased stress on the environment both from methane emissions and habitat destruction for pasture land. As the United Nations has pointed out, the increased efficiency and decreased carbon impact of insects would fulfill the protein needs of the growing human population while minimizing its effects on the planet. Indeed, over two billion people already eat insects on a regular basis, mostly in Asia and Africa. Perhaps increased knowledge about the scientific benefits of insect consumption will help overcome long-standing taboos on the practice based on Biblical prohibition or the view of insects as agricultural pests. Of course, when insect-based meals look this good, no convincing may be required.

Crowdsourcing Roundup – Give a Grant

Early in the history of science, thinkers such as Galileo Galilei and Leonardo da Vinci couldn’t rely on cushy tenure appointments to pay the bills and fund their research. Instead, these visionaries relied on the system of patronage, in which wealthy individuals such as Cosimo II de’ Medici and King Francis I of France funded their work in return for both results and the social capital associated with supporting the progress of humankind. As scientists across the world, and in the United States in particular, are forced to cut back due to sequestration and a sluggish economy, some are turning back to this old model in a new way through the social funding platform Kickstarter. This blog has previously reported on the ARKYD space telescope, but there are many other science projects worthy of support on the site. Here are seven of the most interesting ways you can become a patron of scientific progress today!

Leaf and Liquid – With the recent rise in popularity of gardening and urban farming, it’s become more important for technologically-adept city dwellers to understand the factors involved in growing plants. This project aims to create an accurate pH measurement system that plugs directly into a smartphone, making this important soil test more widely available.

Float the Dover Bronze Age Boat – In the vein of the famed voyage of the Kon-Tiki, this experimental archeology project hopes to test the seaworthiness of a boat recovered from under the streets of Dover in England. Backers could get a place on the first crew of one of the oldest known sea-going ships in history.

A Pregnancy Test For An Endangered Species – By combining a biosensor for spawning hormones with radio transmitters, Jim Garvey of Southern Illinois University has created a clever way to determine exactly where sturgeon go to reproduce. These data will give better focus to conservation efforts for this commercially important group of fish (and for enough money, one of the research subjects will be named after you!).

Library for All – The developing world is in dire need of science education, and although sites like Coursera and Khan Academy are valuable resources, they rely on technological access many may not have. Library for All is designing a system that works on practically all devices and bandwidth levels, opening scientific knowledge to even the least privileged.

Infragram – Near-infrared photography has become a widely-used technique in plant science, and this project wants to make it more accessible for the amateur scientist. Even artists may want to get their hands on the technology for the strikingly unusual colors that it produces.

Fermostat – For science that directly improves your life, look no further than this ingenious piece of tech, which simplifies the precise control of homebrewing beer. As the great Benjamin Franklin is (falsely) said to have written, “In wine there is wisdom, in beer there is freedom, in water there are bacteria.”

The RoboRoach – Perhaps the creepiest project of the bunch, this effort to create the first “commercially available cyborg” will allow you to turn a living cockroach into a controllable puppet by electrically stimulating the neurons that detect obstructions in the insect’s path. While potentially amusing to anyone, the project also provides an inexpensive way for students to learn the basics of behavioral neuroscience.