The Unz Review - Mobile
A Collection of Interesting, Important, and Controversial Perspectives Largely Excluded from the American Mainstream Media
Peter Frost Blogview


Last year, around this time, friends and acquaintances offered me all sorts of religiously neutral salutations: Season's Greetings! Happy Holidays! Joyeuses fêtes! Meilleurs vœux! Only two people wished me Merry Christmas.

One was Muslim, the other was Jewish.

They meant well. After all, isn’t that the culturally correct greeting? In theory, yes. In practice, most Christians feel uncomfortable affirming their identity. And this self-abnegation gets worse the closer you are to the cultural core of Anglo-America. Immigrants of Christian background enjoy being wished Merry Christmas. Black people likewise. Catholics seem to split half and half, depending on how traditional or nominal they are.

But the WASPs. Oh, the WASPs! With them, those two words are a faux pas. The response is usually polite but firm: “And a very happy holiday season to you!”

Things weren’t always that way. The situation calls to mind a Star Trek episode where Capt. Kirk persuades an alien robot to destroy itself. “That which excludes is evil. If you affirm your identity, you are excluding those who don’t share your identity. You are therefore evil.”

I could question this logic. What about other cultural groups? Why single out just one? But I’ve heard the answer already. WASPs and their culture dominate North America. The path to power, or simply a better life, runs through their institutions. Minorities can affirm their own identities without restricting the life choices of others, but the same does not hold true for WASPs. Their identity affects everyone and must belong to everyone.

I’m still not convinced. Yes, WASPs did create the institutions of Anglo-America, but their influence in them is now nominal at best. The U.S. Supreme Court used to be a very WASPy place. Now, there’s not a single White Protestant on it. That’s a huge underrepresentation for a group that is still close to 40% of the population. We see the same thing at the Ivy League universities, which originally trained Protestant clergy for the English colonists. Today, how many of their students have a Christian European background of any sort? The proportions are estimated to be 20% at Harvard, 22% at Yale, and 15% at Columbia (Unz, 2012).

Sometimes reality is not what is commonly believed. WASPs are not at all privileged. In fact, they have been largely pushed aside in a country that was once theirs.

Whenever this ethnic displacement comes up for discussion (it usually doesn’t), it gets put down to meritocracy. In the past, WASPs were the best people for the job of running the country. Now it’s a mix of Jews, Asians, and other high-performing groups. A cynic might ask whether merit is the only factor … and whether the U.S. is better run today than it was a half-century ago. Indeed, the latest Supreme Court appointee had little experience as a solicitor general and a scanty record of academic scholarship.

Merit isn’t the whole story. There is also networking. In most parts of the world, an individual gets ahead in life by forming bonds of reciprocal assistance with family and kinfolk. “You scratch my back, I’ll scratch yours.” That’s how most of the world works.

But not all of the world. Northwest Europeans have diverged the most from this pattern, at least since the 12th century (Macfarlane, 1978a, 1978b, 1992, 2012). Their kinship ties have been weaker and their sense of individualism correspondingly stronger. As a result, their cultural evolution has to a large degree been emancipated from the restraints of kinship, and this emancipation has facilitated other ways of organizing social relations: the nation-state, ideology, the market economy … not to mention the strange idea of personal advancement through personal merit alone. This model of society has succeeded economically, militarily, and geopolitically, but it's vulnerable to people who don't play by the rules, since the threat of kin retaliation is insufficient to keep them in line. Societal survival is possible only to the extent that rule-breakers are ostracized and immigration restricted from cultures that play by other rules.

This brings us to the dark side of traditional WASP culture: the busybodiness, the judgmentalism, the distrust of foreigners no matter how nice or refined they may seem. That mentality still exists, but it has been turned against itself. The people to be excluded are now those who exclude. The cultural programming for survival has been turned into a self-destruct mechanism … as in that Star Trek episode.

Even if we could somehow abort this self-destruct sequence, it’s hard to see how WASPs can survive on the current playing field. WASPs believe in getting ahead through rugged individualism. Most of the other groups believe in using family and ethnic connections. Guess who wins.

Anyway, I wish all of you a merry end of 2014! Far be it from me to exclude anyone from the merriment.


Unz, R. (2012). The myth of American meritocracy, The American Conservative, November 28.

Macfarlane, A. (2012). The invention of the modern world. Chapter 8: Family, friendship and population, The Fortnightly Review, Spring-Summer serial

Macfarlane, A. (1992). On individualism, Proceedings of the British Academy, 82, 171-199.

Macfarlane, A. (1978a). The origins of English individualism: Some surprises, Theory and Society, 6, 255-277.

Macfarlane, A. (1978b). The Origins of English Individualism: The Family, Property and Social Transition, Oxford: Blackwell.


Subjects identified the left-hand image as a woman and the right-hand one as a man. Yet the two images differ only in skin tone. Study by Richard Russell, Sinha Laboratory for Vision Research, MIT.


Skin color differs by sex: women are fairer and men browner and ruddier. Women also exhibit a greater contrast in luminosity between their facial skin and their lip and eye areas. These differences arise from differing concentrations of three skin pigments: melanin (brown); hemoglobin (red); and carotene (yellow). The cause is ultimately hormonal, as shown by studies on castrated and ovariectomized adults, on boys and girls during puberty, and on digit ratios (Edwards and Duntley, 1939; Edwards and Duntley, 1949; Edwards et al., 1941; Frost, 2010; Kalla and Tiwari, 1970; Manning et al., 2004; Mesa, 1983; Omoto, 1965; Porcheron et al., 2013). Women are fairer than men in all human populations. The difference is greatest in people of medium color and least in very dark- or very fair-skinned people, apparently because of “floor” or “ceiling” effects (Frost, 2007).

This sex difference is used by the human mind for sex recognition. In fact, it’s more important for this purpose than other visual cues, like face shape. When subjects are shown an image of a human face, they can tell whether it is male or female even if blurred and differing only in hue and luminosity. Hue provides a “fast channel” for sex recognition. If the facial image is too far away or the lighting too dim, the mind switches to the “slow channel” and relies on luminosity (Bruce and Langton, 1994; Dupuis-Roy et al., 2009; Frost, 2011; Hill et al., 1995; Russell and Sinha, 2007; Russell et al., 2006; Tarr et al., 2001; Tarr et al., 2002).

Age differences

Skin color also differs by age. It can be used to distinguish younger from older women, since the contrast in luminosity between facial skin and the lip/eye areas decreases with age (Porcheron et al., 2013). It can also be used to recognize infants. All humans are born with very little melanin, and the resulting pinkish-white skin is often remarked upon in different cultures.

This is especially so where adults are normally dark-skinned, in striking contrast to newborns. In Kenya, the latter are often called mzungu (‘European’ in Swahili), and a new mother may ask her neighbors to come and see her mzungu (Walentowitz, 2008). Among the Tuareg, children are said to be whitened by the freshness and moisture of the womb (Walentowitz, 2008). The situation in other African peoples is summarized by a French anthropologist: “There is a rather widespread concept in Black Africa, according to which human beings, before ‘coming’ into this world, dwell in heaven, where they are white. For, heaven itself is white and all the beings dwelling there are also white. Therefore the whiter a child is at birth, the more splendid it is” (Zahan, 1974, p. 385). A Belgian anthropologist makes the same point: “black is thus the color of maturity [...] White on the other hand is a sign of the before-life and the after-life: the African newborn is light-skinned and the color of mourning is white kaolin” (Maertens, 1978, p. 41).

This infant/adult difference is evolutionarily old, being present in nonhuman primates. In langurs, baboons, and macaques, the newborn’s skin is pink while the adult’s is black. This visual cue not only helps adults to locate a wayward infant but also seems to induce a desire to defend and provide care (Jay, 1962). Humans may respond similarly to the lighter color of infants and women. This would be consistent with a tendency by the adult female body to mimic the newborn body in other ways: face shape; pitch of voice; amount of body hair; texture and pliability of the skin; etc. Over time, women may have come to resemble this ‘infant schema’ because it is the one that can best reduce aggressiveness in a male partner and induce him to assume a provider role.

The sun-tanning fad: An aesthetic revolution

The sex-specific aspects of skin color have influenced the development of cosmetics in many cultures. Even in ancient times, women would use makeup to increase the natural contrast in luminosity between their facial skin and their lip/eye areas (Russell, 2009; Russell, 2010). They would also make their naturally fair complexion even fairer by avoiding the sun and applying white powders or bleaching agents.

This feminine aesthetic changed radically in the 1920s with the sudden popularity of sun-tanning throughout the Western world, initially as a health fad. Tanned skin then entered women’s fashion and became part of the flapper image, along with bobbed hair, broad shoulders, a relatively flat chest, narrow hips, and long legs. This fashion image was intended to be hermaphroditic, the aim being to energize the erotic appeal of the female body by masculinizing some of its key features (Bard, 1998; Segrave, 2005).

Has the tanning fad un-gendered skin color? Not wholly, to judge by the above studies on sex recognition. There still seems to be a tendency to prefer a lighter skin tone for women than for men. This was the conclusion of a recent study on how people perceive two skin pigments: melanin (brown) and carotene (yellow). When shown facial images with varying concentrations of melanin and carotene, the subjects had a greater preference for carotene than for melanin. This preference was stronger for female faces than for male faces, irrespective of the observer’s sex. Nonetheless, for both male and female faces, the preferred skin color was much darker than it would have been a century ago (Lefevre and Perrett, 2014).

Again, this aesthetic norm has darkened only in the Western world. The tanned look had some popularity among Japanese women in the postwar era up to the 1970s, but it has since gone out of fashion (Ashikari, 2005). It never did catch on elsewhere in Asia (Li et al., 2008). In North America and Europe, the tanned look seems much more persistent, and this persistence suggests that it is locked into place by something else in our cultural environment.


In my last post I discussed recent research on mental differences between Europeans and Chinese people. The latter are less prone to boredom. They think less abstractly and more relationally. They’re less individualistic, and less likely to punish friends for dishonesty. Mental differences also seem to exist within China, depending on whether one comes from a region that historically grew rice or one that historically grew wheat. Chinese from wheat-growing regions are closer to Europeans in mentality.

Are these differences inborn? Or are they due to upbringing? The second explanation is hard to reconcile with the fact that the regional differences within China involved urban residents who had never lived on a farm of any sort.

Almost a half-century ago, these questions interested the American psychologist Daniel Freedman and his wife Nina Chinn Freedman. They examined 24 Chinese-American and 24 Euro-American newborns whose parents were otherwise similar in age, economic class, and number of previous children. The two groups nonetheless behaved differently. The Euro-American babies cried more easily, were harder to console, and would immediately turn their faces aside if placed face down on a sheet. In contrast, the Chinese-American babies accepted almost any position without crying or resisting. When a light was shone in their eyes, the Euro-American babies would continue to blink long after the Chinese-American babies had stopped blinking (Freedman and Freedman, 1969; Freedman, 2004).

These findings were partially replicated by another American psychologist, Jerome Kagan, who found that Chinese 4-month-olds cried, fretted, and vocalized less than Euro-American infants. At older ages, however, the pattern reversed with Chinese Americans fretting and crying more when separated from their mothers (Kagan et al., 1978; Kagan et al., 1994).

Is this response specific to Chinese? Or does it apply to East Asians in general? In a study of Euro-American, Japanese, and Chinese 11-month-olds, the last group was the least expressive one, being least likely to smile or cry. The Japanese babies either fell between the two other groups or were like the Euro-American babies (Camras et al., 1998). When another study looked at Japanese and British newborns, the latter actually showed more self-quieting activity (Eishima, 1992).

On the other hand, Navaho babies are even calmer and more adaptable than Chinese babies (Freedman, 2004). Some anthropologists have attributed this finding to a traditional practice of tying the baby to a cradleboard. As Freedman pointed out, however, this practice is now only sporadic among the Navaho.

Freedman attributed his Chinese and Navaho findings to a general Mongoloid temperament. If that were the case, infants should behave similarly in other North American native peoples. A study of Alaskan Inupiaq found young children to be shy but otherwise no different from Euro-American children. These subjects were, however, older than Freedman’s, being 3 to 6 years of age (Sprott, 2002).

It may be that the Navaho differ from other North American native peoples in this respect. Perhaps, in the past, mortality was higher among those babies who resisted the cradleboard; over time, they and their temperament would have been steadily removed from the gene pool. As Freedman noted, “most Navaho infants calmly accept the board; in fact, many begin to demand it by showing signs of unrest when off.” When Euro-American mothers tried using the cradleboard, “their babies complained so persistently that they were off the board in a matter of weeks” (Freedman, 2004).

Infant calmness can thus arise in relatively simple societies, and not just in advanced ones as I had argued in my last post. In the Navaho case, there may have been some kind of parental selection, i.e., through their child-rearing practices, parents influence what sort of children survive and what sort don’t. In other simple societies, such as among the Australian Aborigines, infant behavior is much less calm and compliant (Freedman, 2004).

Behavior can likewise differ between infants from different complex societies. We’ve seen this with Chinese-American and Euro-American babies, the latter having a less easy temperament. A difficult temperament (colic, excessive crying) is also much more common in babies of Greek or Middle Eastern origin than in babies of Northwest European or Asian Indian origin (Prior et al., 1987).

In the future, it would be interesting to find out whether infants differ in temperament within China, such as between rice-growing and wheat-growing regions.

But will there be more research?

There seems to be less and less interest in this area of research, particularly within the United States. I can point to several reasons:

- The behavioral differences between Chinese and Japanese babies must have arisen over a relatively short span of evolutionary time. Many researchers, even those who are receptive to HBD thinking, have trouble accepting fast behavioral evolution, especially below the level of large continental races.

- American researchers are increasingly interested in the possibility that early parental interaction, such as reading to children, can stimulate brain development. Although it is doubtful that parental interaction can explain differences in newborn behavior, this assumption seems to make people dismissive of Freedman’s work.

- Since the 1970s, and throughout the Western world, academia has become more hostile to the possibility of genetic influences on human behavior. This trend is self-reinforcing, since hiring decisions are biased toward candidates who believe in environmental determinism.

The last two points apply much less to East Asian scholars … or American ones who are willing to do some of their work offshore.

Right now, we need to identify the genetic causes of these differences in infant behavior. One cause may be the 7R allele of the D4 dopamine receptor gene, which is associated with Attention-Deficit/Hyperactivity Disorder (ADHD) and is very rare in East Asians (Leung et al., 2005). Nonetheless, as with differences in intellectual capacity, we're probably looking at an accumulation of small effects at many different genes. Natural selection acts on what genes produce, and not directly on genes, so there is no reason to believe that a single behavioral outcome has a single genetic cause. That would be too convenient.


Camras, L.A., H. Oster, J. Campos, R. Campos, T. Ujiie, K. Miyake, L. Wang, and Z. Meng. (1998). Production of emotional facial expressions in European American, Japanese, and Chinese infants, Developmental Psychology, 34, 616-628.


All humans were once hunter-gatherers. Back then, versatility came with the territory. There were only so many game animals, and they differed a lot in size, shape, and color. So you had to enjoy switching back and forth from one target animal to another. And you had to enjoy moving from one place to another. Sooner or later you’d have to.

Beginning 10,000 years ago, farmers made their appearance. Now monotony came with the territory. A plot of land wasn’t something you could forget while you took off somewhere else. It needed constant care. The tasks were also more repetitive: ploughing, sowing, harvesting …

Things worsened as farming became more advanced. You had to focus on one crop and a limited number of key tasks.

Different means of subsistence have selected for different mental traits, and this selection has had genetic consequences. Monotony avoidance has a heritability of 0.53 (Saudino, 1999). This predisposition has usually been a handicap in modern societies, so much so that it often leads to criminality. Males with a history of early criminal behavior tend to score high on monotony avoidance, as well as on sensation seeking and low conformity (Klinteberg et al., 1992).

Today, if you have trouble fitting into your society, you might still survive and reproduce. In the past, you probably wouldn’t. Other people would take your place in the gene pool and, over successive generations, their mental makeup would become the norm.

That’s gene-culture co-evolution. We have reshaped the world we live in, and this human-made world has reshaped us. After describing how our ancestors radically changed their environment, Razib goes on to write: “We were the authors of those changes, but in the process of telling that story, we became protagonists within it” (Khan, 2014).

China: a case study

Advanced farming—intensive land use, task specialization, monoculture—has profoundly shaped East Asian societies, particularly China. This is particularly so for rice farming. Because the paddies need standing water, rice farmers must work collectively to build, dredge, and drain elaborate irrigation networks. Wheat farming, by comparison, requires no irrigation and only half as much work.

Advanced farming seems to have favored a special package of predispositions and inclinations, including greater acceptance of monotony. This has been shown in two recent studies.

The first one was about boredom and how people experience it in their lives. The results from the 775 Chinese participants were then compared with those from a previous survey of 572 Euro-Canadians. The Chinese participants were less likely to feel bored in comparable situations. They also seemed to value low-arousal states (calm, relaxation), whereas the Euro-Canadians valued high-arousal states (excitement, elation) (Ng et al., 2014).

The authors attributed their findings to cultural learning. One may wonder, however, why preference for low arousal persists in the face of China’s massive influx of high-arousal Western culture.

Relational thinking, collectivism, and favoritism

The second study had the aim of seeing whether the sociological differences between rice farmers and wheat farmers have led to differences in mental makeup. When 1,162 Han Chinese performed a series of mental tasks, the results differed according to whether the participants came from rice-farming regions or wheat-farming regions (Talhelm et al., 2014).

When shown a list of three items, such as “train”, “bus”, and “tracks”, and told to choose two items that pair together, people from rice-farming regions tended to choose “train and tracks,” whereas people from wheat-farming regions tended to choose “train and bus.” The former seemed to be more relational in their thinking and the latter more abstract. This pattern held up even in neighboring counties along China’s rice-wheat border: people from the rice side of the border thought more relationally than did people from the wheat side.

A second task required drawing pictures of yourself and your friends. In a prior study, Americans drew themselves about 6 mm bigger than they drew their friends, Europeans drew themselves 3.5 mm bigger, and Japanese drew themselves slightly smaller. In the present study, people from rice regions were more likely than people from wheat regions to draw themselves smaller than they drew their friends. On average, people from wheat regions self-inflated by 1.5 mm, while people from rice regions self-deflated by 0.03 mm.

A third task required imagining yourself doing business with (i) an honest friend, (ii) a dishonest friend, (iii) an honest stranger, and (iv) a dishonest stranger. This person might lie, causing you to lose money. Or this person might be honest, causing you to make money. You could reward or punish this person accordingly. A previous study found that Singaporeans rewarded friends much more than they punished them. Americans were much more likely to punish friends for bad behavior. In this study, people from rice regions were more likely to remain loyal to friends regardless.

Interestingly, these findings came from people with no connection to farming at all. They grew up in a modern urban society, and most were too young to have known the China that existed before the economic reforms of the late 1970s. It looks like rice regions have favored hardwiring of certain psychological traits: less abstract thinking and more relational thinking, less individualism and more collectivism, and less impartiality toward strangers and more favoritism toward kin and friends.

Why farming sucks, for you but not for me

These findings corroborate the ethnographic literature on the differences in mentality between hunter-gatherers and farmers. Hunter-gatherers typically see farming as a kind of slavery, and they have trouble understanding well-meaning outsiders who want to turn them into land-slaves.

Yes, for the same land area, farming can produce much more food. But it’s hard work, not only physically but mentally as well. Humans had to undergo a change in mentality before they could make the transition from hunting and gathering to farming.

Those humans ended up transforming not just their physical landscape but also their social and cultural landscape … and ultimately themselves. By creating new values and social relations, they changed the rules for survival and reproduction, thereby changing the sort of mentality that future generations would inherit.

Humans transformed the world through farming, and the world returned the favor.


Khan, R. (2014). Our cats, ourselves, The New York Times, The Opinion Pages, November 24

Klinteberg, B., K. Humble, and D. Schalling. (1992). Personality and psychopathy of males with a history of early criminal behaviour, European Journal of Personality, 6(4), 245-266.


Are liberals and conservatives differently wired? It would seem so. When brain MRIs were done on 90 young adults from University College London, it was found that self-described liberals tended to have more grey matter in the anterior cingulate cortex, whereas self-described conservatives tended to have a larger right amygdala. These results were replicated in a second sample of young adults (Kanai et al., 2011).

The amygdala is used to recognize fearful facial expressions, whereas the anterior cingulate cortex serves to monitor uncertainty and conflict (Adolphs et al., 1995; Botvinick et al., 1999; Critchley et al., 2001; Kennerley et al., 2006). Perhaps unsurprisingly, these findings were changed somewhat in the popular press. “Conservatives Big on Fear, Brain Study Finds,” ran a headline in Psychology Today. The same article assured its readers that the anterior cingulate cortex “helps people cope with complexity” (Barber, 2011).

A study on 82 young American adults came to a similar conclusion. Republicans showed more activity in the right amygdala, and Democrats more activity in the left insula. Unlike in the English study, however, the anterior cingulate cortex did not differ between the two groups (Schreiber et al., 2013).

It would seem, then, that conservatives and liberals are neurologically different. Perhaps certain political beliefs will alter your mental makeup. Or perhaps your mental makeup will lead you to certain political beliefs. But how can that be when conservatism and liberalism have changed so much in recent times, not only ideologically but also in their electorates? A century ago, English “conservatives” came from the upper class, the middle class, and outlying rural areas. Today, Britain’s leading “conservative” party, the UKIP, is drawing more and more of its members from the urban working class—the sort of folks who routinely voted Labour not so long ago. Similar changes have taken place in the U.S. Until the 1950s, white southerners were overwhelmingly Democrats. Now, they’re overwhelmingly Republicans.

Of course, the above studies are only a few years old. When we use terms like “conservative” and “liberal” we refer to what they mean today. Increasingly, both terms have an implicitly ethnic meaning. The UKIP is becoming the native British party, in opposition to a growing Afro-Asian population that votes en bloc for Labour. Meanwhile, the Republicans are becoming the party of White Americans, particularly old-stock ones, in opposition to a Democrat coalition of African, Hispanic, and Asian Americans, plus a dwindling core of ethnic whites.

So are these brain differences really ethnic differences? Neither study touches the question. The English study assures us that the participants were homogeneous:

We deliberately used a homogenous sample of the UCL student population to minimize differences in social and educational environment. The UK Higher Education Statistics Agency reports that 21.1% of UCL students come from a working-class background. This rate is relatively low compared to the national average of 34.8%. This suggests that the UCL students from which we recruited our participants disproportionately have a middle-class to upper-class background. (Kanai et al., 2011)

Yes, the students were largely middle-class, but how did they break down ethnically? Wikipedia provides a partial answer:

In 2013/14, 12,330 UCL students were from outside the UK (43% of the total number of students in that year), of whom 5,504 were from Asia, 3,679 from the European Union excluding the United Kingdom, 1,195 from North America, 516 from the Middle East, 398 from Africa, 254 from Central and South America, and 166 from Australasia. (University College London, 2014)

These figures were for citizenship only. We should remember that many of the UK students would have been of non-European origin.

We know more about the participants in the American study. They came from the University of California, San Diego, whose student body at the time was 44% Asian, 26% Caucasian, 10% Mexican American, 10% unknown, 4% Filipino, 3% Latino/Other Spanish, and 2% African American (Anon, 2010). This ethnic breakdown mirrors the party breakdown of the participants: 60 Democrats (72.5%) and 22 Republicans (27.5%).

Affective empathy and ethnicity

In my last post, I cited a study showing that the amygdala is larger in extraordinary altruists—people who have donated one of their kidneys to a stranger. In that study, we were told that a larger amygdala is associated with greater responsiveness to fearful facial expressions, i.e., a greater willingness to help people in distress. Conversely, psychopaths have a smaller amygdala and are less responsive to fearful faces (Marsh et al., 2014).

Hmm … That’s a tad different from the spin in Psychology Today. Are liberals the ones who don’t care about others? Are they … psychopaths?

It would be more accurate to say that “liberals” come from populations whose capacity for affective empathy is lower on average and who tend to view any stranger as a potential enemy. That’s most people in this world, and that’s how most of the world works. I suspect the greater ability to monitor uncertainty and conflict reflects adaptation to an environment that has long been socially fragmented into clans, castes, religions, etc. This may explain why a larger anterior cingulate cortex correlated with “liberalism” in the British study (high proportion of South Asian students) but not in the American study (high proportion of East Asian students).

As for “conservatives,” they largely come from Northwest Europe, where a greater capacity for affective empathy seems to reflect an environment of relatively high individualism, relatively weak kinship, and relatively frequent interactions with nonkin. This environment has prevailed west of the Hajnal Line since at least the 12th century, as shown by the longstanding characteristics of the Western European Marriage Pattern: late age of marriage for both sexes; high rate of celibacy; strong tendency of children to form new households; and high circulation of non-kin among families. This zone of weaker kinship, with greater reliance on internal means of behavior control, may also explain why Northwest Europeans are more predisposed to guilt than to shame, whereas the reverse is generally the case elsewhere in the world (Frost, 2014).

All of this may sound counterintuitive. Doesn’t the political left currently stand for autonomy theory and individualism? Doesn’t it reject traditional values like kinship? In theory it does. The reality is a bit different, though. When Muslims vote Labour, it’s not because they want gay marriage and teaching of gender theory in the schools. They expect something else.

The same goes for the political right. When former Labourites vote UKIP, it’s not because they want lower taxes for the rich and offshoring of manufacturing jobs. They expect something else. Are they being delusional? Perhaps. But, then, are the Muslims being delusional?

The Child at Your Door (c. 1917-1919). Credit: Wikimedia Commons

In a previous post, I discussed why the capacity for affective empathy varies not only between individuals but also between populations. First, its heritability is high: 68% (Chakrabarti and Baron-Cohen, 2013). So natural selection has had something to grab hold of. Second, its usefulness varies from one culture to another. It matters less where kinship matters more, i.e., where people interact mainly with close kin and where non-kin are likely to be enemies. The threat of retaliation from kin is sufficient to ensure correct behavior.

Affective empathy matters more where kinship matters less. This is a situation that Northwest Europeans have long known. Historian Alan Macfarlane argues that kinship has been weaker among the English—and individualism correspondingly stronger—since at least the 12th century and perhaps since Anglo-Saxon times (Macfarlane, 2012; Macfarlane, 1992, pp. 173-174). A weaker sense of kinship seems to underlie the Western European Marriage Pattern (WEMP), as seen by its defining characteristics: late age of marriage for both sexes; high rate of celibacy; strong tendency of children to form new households; and high circulation of non-kin among families. The WEMP has prevailed since at least the 12th century west of the Hajnal Line, a line running approximately from Trieste to St. Petersburg (Hallam, 1985; Seccombe, 1992, p. 94).

Can natural selection specifically target affective empathy?

If affective empathy helps people survive and reproduce, there will be more and more of it in succeeding generations. If not, there will be less and less.

But what exactly is being passed on or not passed on? A specific capacity? Or something more general, like pro-social behavior? If it’s too general, natural selection could not easily make some populations more altruistic than others. There would be too many nasty side-effects.

Although pro-social behavior superficially looks like affective empathy, the underlying mental processes are different. Pro-social behavior is a willingness to help others through low-cost assistance: advice, conversation, a helping hand, etc. The logic is simple: give some help now and perhaps you’ll receive a lot later from the grateful beneficiary. By the same logic, you may stop helping someone who seldom reciprocates.

Affective empathy is less conscious. It seems to have developed out of cognitive empathy: the ability to simulate what is going on in other people’s minds, but not necessarily for the purpose of helping them. Con artists have plenty of cognitive empathy. Empathy is affective when you not only simulate how other people feel but also experience their feelings (Chakrabarti and Baron-Cohen, 2013). Their wellbeing comes to matter as much as your own.

Empathy of either sort relies on unconscious mimicry: “empathic individuals exhibit nonconscious mimicry of the postures, mannerisms, and facial expressions of others (the chameleon effect) to a greater extent than nonempathic individuals” (Carr et al., 2003). The ability to mimic is key to the empathic process of relaying information from one brain area to another via “mirror neurons”:

- The superior temporal cortex codes an early visual description of another person’s action and sends this information to posterior parietal mirror neurons.

- The posterior parietal cortex codes the precise kinesthetic aspect of the action and sends the information to inferior frontal mirror neurons.

- The inferior frontal cortex codes the purpose of the action.

- Parietal and frontal mirror areas send copies of motor plans back to the superior temporal cortex in order to match the visual description of the person’s action to the predicted sensory consequences for that person.

- The mental simulation is complete when the visual description has been matched to the predicted sensory consequences (Carr et al., 2003).

By simulating the sensory consequences of what someone does or intends to do, we gain an understanding of that person that goes beyond what our senses immediately tell us.

[...] we understand the feelings of others via a mechanism of action representation shaping emotional content, such that we ground our empathic resonance in the experience of our acting body and the emotions associated with specific movements. As Lipps noted, “When I observe a circus performer on a hanging wire, I feel I am inside him.” To empathize, we need to invoke the representation of the actions associated with the emotions we are witnessing. (Carr et al., 2003)

Affective empathy exists when this mental representation is fed into our own emotional state. We feel what the other person feels and we act appropriately. This is much more than pro-social behavior.

From psychopaths to extraordinary altruists

The capacity for affective empathy varies from one person to the next. It is least developed in psychopaths:

Psychopathy is a heritable developmental disorder characterized by an uncaring nature, antisocial and aggressive behavior, and deficient prosocial emotions such as empathy, guilt, and remorse. Psychopaths exhibit consistent patterns of neuroanatomical and functional impairments, such as reductions in the volume of the amygdala and in the responsiveness of this structure to fear-relevant stimuli. These deficits may underlie the perceptual insensitivity to fearful facial expressions and other fear-relevant stimuli observed in this population. (Marsh et al., 2014)

Mainstream opinion accepts that psychopaths are heritably different because they are “sick.” Heritable differences are thus thought to be unusual and even pathological. “Normal” individuals may vary in their capacity for affective empathy, but surely that sort of variability is due to their environment, isn’t it?

Who were the first Europeans? We now have a better idea, thanks to a new paper about DNA from a man who lived some 38,700 to 36,200 years ago. His remains were found at Kostenki, a well-known Upper Paleolithic site in central European Russia (Seguin-Orlando et al., 2014).

Kostenki Man tells us several things about the first Europeans and, more broadly, the first non-African humans:

The Neanderthal encounter

Modern humans received their Neanderthal admixture when they were just spreading out of Africa some 54,000 years ago. At that time, they had not yet encountered the Neanderthals and were entering the territory of the Skhul/Qafzeh hominids, a semi-archaic people of the Middle East. So we may have got our Neanderthal admixture indirectly. The Skhul/Qafzeh hominids had probably interbred with their Neanderthal neighbors to the north, and our ancestors may have then picked up this admixture while in the Middle East.

When our ancestors spread farther north into Europe, some 45,000 to 42,000 years ago, they could have interbred directly with Neanderthals, but they didn’t. Perhaps the two groups were just too different. They seem to have intermixed only via a third party that was neither fully modern nor fully archaic.

A strange detour … and then another!

There was initially a large continuous population across northern Eurasia, perhaps composed of nomads who pursued wandering herds of reindeer across the European Plain and its eastward extension into central and northern Asia.

Not long before the time of Kostenki Man, these Northern Eurasians began to split into three regional groups: Western Eurasians, Eastern Eurasians, and the ancestors of Middle Eastern farmers. The degree of reproductive isolation is unclear, however, and gene flow may have continued between all three groups until the onset of the last ice age some 25,000 years ago. This may be why Kostenki Man does not fit perfectly into any of the three groups, although he is genetically closest to Western Eurasians.

Yes, Northern Eurasians were ancestral to the early farming peoples of the Middle East. It seems that early modern humans had to head north, learn to hunt reindeer, and then head south again before they could start farming. Sounds like a strange detour. Wouldn’t it have been easier to stay put and do it locally? You know, Middle-Eastern hunter-gatherers becoming Middle Eastern farmers? Apparently not.

It gets even more convoluted. After some of those Northern Eurasians had gone south to the Middle East, some of their farming descendants “returned” to Europe and partially replaced its hunter-gatherers, particularly in southern and central Europe. This second detour has been greeted with disbelief. Dienekes (2014), for instance, has written: “I don’t think many archaeologists would derive European farmers from Russia (Russia is actually one of the last places in Europe that became agricultural).”

True, but farming requires a mindset that may have come from those northern hunters (Frost, 2014). When Piffer (2013) looked at human variation in alleles at COMT, a gene linked to executive function, working memory, and intelligence, he found that northern hunting peoples had more in common with farming peoples than with other hunter-gatherers, “possibly due to the higher pressure on technological skills and planning abilities posed by the adverse climatic conditions.”

That mindset made farming possible, but the first steps toward farming could not be taken in a cold climate. They had to be taken in a place with a long growing season and a wide variety of domesticable plants and animals, such as in the Middle East. Once farming had developed there, it could move back north, while taking along its technologies, its food crops, and its livestock species.

Farming can develop in the tropics with a “tropical” mindset, but it looks very different. The farming that arose in West Africa is overwhelmingly women’s work and seems to have wholly developed out of female plant gathering. The guinea fowl is the only animal that has been domesticated for food consumption in sub-Saharan Africa.

The Ice Age was not so bad

The Upper Paleolithic humans of northern and eastern Europe did not die out during the last ice age, as was commonly thought. They survived the glacial maximum intact.

The European phenotype came later

Kostenki Man was dark-skinned, dark-eyed, and rather short. These details, curiously enough, appear not in the paper but in a review of the paper, published by the same journal, as well as in an interview with one of the authors (Associated Press, 2014; Gibbons, 2014).

So we now have an upper bound for the emergence of the European phenotype, i.e., light skin and a diverse palette of hair and eye colors. The lower bound has been set by the remains of a Swedish hunter-gatherer, dated to 8,000 years ago, who had the “European” allele for light skin at the gene SLC24A5 (Skoglund et al., 2014).


My main criticism centers on the dating to 38,700 – 36,200 years ago. At the Kostenki site, the radiocarbon dating used to be some 10,000 years younger. It was then recalibrated to an older range of dates when a layer of volcanic ash at the site was attributed to a volcano that had erupted in southern Italy some 39,000 years ago. This recalibration was initially controversial, but the controversy has since subsided (Sinitsyn and Hoffecker, 2006). I would not rule out a subsequent re-recalibration.

By retrieving ancient DNA from an early modern human, we have made a key advance in human paleogenetics, perhaps more so than by sequencing the Neanderthal genome. We again see that evolution did not slow down with the emergence of anatomically and behaviorally modern humans some 60,000 years ago. It actually began to speed up, as humans began to enter not only new natural environments but also new cultural environments of their own making.


Associated Press (2014). DNA study dates Eurasian split from East Asians, The Columbus Dispatch, November 6

Dienekes (2014). Genome of Kostenki-14, an Upper Paleolithic European (Seguin-Orlando, Korneliussen, Sikora, et al. 2014), Dienekes’ Anthropology Blog, November 7

Frost, P. (2014). The first industrial revolution, Evo and Proud, January 18

Throughout the world, kinship used to define the limits of morality. The less related you were to someone, the less moral you had to be with him or her. We see this in the Ten Commandments. The phrase “against thy neighbor” qualifies the commandment against bearing false witness and, implicitly, the preceding ones against killing, adultery, and stealing. For the modern reader, “thy neighbor” is helpfully explained as meaning “the children of thy people” (Leviticus 19:18).

In some cases, this kin-based morality gradually ceased to apply the farther away one went from home and from immediate kith and kin. Usually, however, the limits of one’s moral community coincided with some kind of boundary: a geographic barrier, a political border, and/or an ethnic frontier. Beyond lay the world of “strangers.”

Toward a universal morality

The first efforts to universalize morality—to create a single moral system that could apply to everyone—”arose simultaneously around 500 BCE in various parts of the world, from China in the Far East to Southern Italy in the West” (Assmann and Conrad, 2010, p. 121). These efforts were initially driven by the need to form alliances between different peoples:

Alliance – the formation of treaties – proved the most important instrument of internationalism. Forming an alliance required mutual recognition of the deities which served as patrons. The treaties which these empires formed with each other and with their vassals had to be sealed by solemn oaths invoking the gods of both parties. The list of these gods conventionally closes the treaty [...]. They had to be equal in their function and rank. Intercultural theology became a concern of international law. (Assmann and Conrad, 2010, p. 125)

As ancient empires expanded and absorbed different peoples, this intercultural theology became useful for internal peace, notably with the Hellenistic empires that arose in the wake of Alexander the Great’s conquests. By affirming that different religions are interchangeable, it became possible to create a common civic culture for diverse peoples:

Hellenization had two faces. On the one hand, it referred to the diffusion of Greek language, ideas and customs all over the Ancient World; on the other hand, it appeared to be more of a construction of a ‘common culture’, suggesting a similar change in Greece as in the other cultures. Flavius Josephus did not speak of ‘Greek’ but of ‘common culture’, ho koinos bios, as the goal of Jewish assimilation or reform in the Hellenistic age. (Assmann and Conrad, 2010, p. 127)

One result would be the emergence of a universal religion. We like to associate this development with the teachings of Jesus, but a kind of proto-Christianity was already emerging near the end of the pre-Christian era. At that time, many Jews were adapting their belief in one God to the universal worldview of Hellenistic culture:

Thus, while biblical universalism was founded on a notion of the mission of Israel to save all of humanity and bring them to the true worship of the only God, Hellenistic notions of universalism involved the assumption that all the gods were really different names for one God. (Boyarin, 1994, chap. 3).

The two belief-systems merged among the increasingly Hellenized Jews of the eastern Mediterranean, thus setting the stage for Jesus and making it easier for his movement to succeed.

The Christian impulse

This new religion became a vehicle not only for moral universalism but also for belief in human equality. For if morality is universal, all humans must have the same capacity to follow its rules. In Christ, asserted Paul, there is neither Jew nor Greek, neither slave nor free, neither male nor female (Galatians 3:28).

While Christianity would steer people in the direction of universalism, there were limits to how far it could go. Theologians sometimes spoke of the need to set lower aims for average people and higher aims for saintly men and women. We see this realism in Augustine’s position on prostitution: “If you do away with harlots, the world will be convulsed with lust” (De Ordine ii, 4). The same could be said for the Church’s position on war, slavery, prejudice, and other manifestations of human inequality. These were the realities of an imperfect world.

Such imperfections nonetheless became harder to accept over the following centuries. First, there was “mission creep.” Once the Church had established certain ideals, there was continual pressure to bring human behavior into line with them. Second, the geocenter of the Church was shifting away from the eastern Mediterranean, where the absolute morality of Christianity had been constrained by the relative morality of kinship. Farther north and west, beyond the Hajnal Line, kinship ties were weaker and people more receptive to universal principles. There was thus a “fruitful encounter” between the Christian faith and these northwest Europeans who were more willing to internalize such principles and apply them more thoroughly (Frost, 2014a).

Within this region, Catholicism would radicalize to the point of splitting away and becoming Protestantism. Here, too, Christian ideals would increasingly be taken to their logical conclusion.

The Abolitionist movement

Abolitionism began in the 17th century among English Quakers as a movement to abolish the slave trade. Over time, it grew more radical, seeking not only to free black slaves but also to extirpate racial and ethnic prejudice. Although “antiracism” did not yet exist as a word, its form and substance were already recognizable by the early 19th century. This was particularly so in the American northeast, where radical abolitionists denounced not only slavery but also fellow abolitionists who wanted to settle freed slaves in Africa. “In the 1830s, for the first time in American history an articulate and significant minority of Americans embraced racial equality as both a concept and a commitment” (Goodman, 1998, p. 1). This militant minority wanted more than simply an end to slavery:

Believing that racial prejudice underpinned slavery, abolitionists committed themselves not just to emancipation [...] “Our prejudice against the blacks is founded in sheer pride; and it originates in the circumstance that people of their color, only, are universally allowed to be slaves,” Child argued. “We made slavery, and slavery makes the prejudice.” Color phobia, abolitionists contended, is irrational, wicked, preposterous, and unmanly. It is contrary to natural rights and Christian teaching, which recognizes no distinctions based on color. Race prejudice, Elizur Wright Jr. exploded, is “a narrow, bitter, selfish, swinish absurdity.” (Goodman, 1998, p. 58)

Decline … and resurgence

Growing up in rural Ontario, I would talk with older folks about politics. A favorite topic was Quebec, and how those selfish French Canadians wouldn’t fight in the Boer War, the First World War, and the Second World War. Later, as a student in Quebec City, I would hear the other side. French Canadians saw those wars as foreign entanglements of no concern to them. They were willing to fight and die, yes, but only for their own soil. That may seem selfish, but so were we with our slavish loyalty to the British Empire.

The folks back home would have disagreed. The Empire wasn’t just for the British or even for Europeans in general. It was for people of all races and religions. It was an instrument for raising everyone up to British standards of fair play, morality, and civilization. In short, for making the world a better place. Take up the White Man’s burden …

Such talk puzzled me, even as a kid. The sun had long ago set on the British Empire. There was the Commonwealth, but why would its leaders defend our imperial heritage? Most of them had fought for independence from the Empire. They valued the British connection only to the extent that it was useful to themselves and their people.

Some Commonwealth leaders wouldn’t even be that generous. When Robert Mugabe dispossessed the British farmers remaining in his country, we could only look on helplessly. A century ago, people called the Ottoman Empire the “sick man of Europe.” Today, that title surely applies to the remnants of the British Empire.

There is a difference, though. The Ottomans were militarily helpless. We are ideologically helpless. Our universal morality has been turned against us, and it is in the name of our notions of fair play that we’re giving everything up, often to people, like Robert Mugabe, who make no pretence of believing in fair play. And we accept the logic of the situation. We think it normal to judge ourselves by a harsher standard and others by a more permissive one.

Double standards normally work the other way. Normally, one judges people of another kind by a harsher standard. They are less likely to share the same notions of right and wrong. They are also less likely to feel the sort of kinship affinity that makes people want to help each other and forgive minor wrongs, or even major ones.

But we’re doing the reverse. That kind of situation is inherently unstable, even self-destructive. No other human society has ever attempted such a thing.


All of this seems obvious to me. Why is it less so to other people? The question crosses my mind when I see how thinking men and women respond—or rather fail to respond—to the Rotherham sex-abuse scandal. In an English town of some 250,000 people, at least 1,400 school-age girls were “groomed” for prostitution by gangs of Pakistani origin. Grooming begins with seduction and ends in abduction, trafficking, and confinement. This final stage apparently explains why some 500 girls were missing from the town’s 15 to 19 age group at the last census.

This went on for years without anything being done and little being said. From time to time, the parents of the girls would complain, and the police would immediately investigate … the parents. Finally, in August of this year, a long report broke the logjam of silence by officials and the media (Jay, 2014). There is still a pervasive bias against this news item, as seen in coverage by three online magazines. Slate ran one story about Rotherham and four about Jennifer Lawrence. Jezebel had one story about Rotherham and six about Jennifer Lawrence. Feministing made a passing reference to Rotherham and ran two stories about Jennifer Lawrence (Durant, 2014).

Who is Jennifer Lawrence? She’s an American actress, and last August someone leaked nude photos of her online. That’s why she matters so much more to thinking men and women.

It gets weirder. Social media have become overwhelmingly opposed to quarantining of the Ebola outbreak (Alexander, 2014). At one time, quarantines were considered a progressive measure, the sort of thing you would support as a thinking man or woman. If you didn’t, people would assume you were a fool who knew nothing about modern science.

So what makes the Ebola outbreak different? The difference is simple. Quarantining means that light-skinned people will be detaining dark-skinned people. So we just can’t do it. Because? Because.

The same applies to Rotherham, which was about dark-skinned men seducing, confining and, ultimately, enslaving light-skinned women. That, too, triggers the same mental lockdown—Don’t go there! That’s how thinking men and women unthinkingly respond—or almost anyone who has gone to college and watches TV. The response seems almost Cartesian: I try not to think, therefore I am a moral person.

Unfortunately, we cannot make unpleasant truths go away by ignoring them. Sooner or later, we will have to confront them. We will especially have to confront our universal morality, including the assumption that only light-skinned folks have moral agency and only they are to be held accountable for their actions.

Please don’t get me wrong. I’m not arguing for a new improved universal morality. Morality can never be universal. It is a product of local conditions—to be specific, it arises from a co-evolving system of cultural, historical, and genetic factors. If forced to choose between saving one or the other, we should first save this foundational system. Anyhow, that’s all we can really save. Morality has no existence above and beyond the humans who act it out in their daily lives.

That’s a hard message to swallow, but we will have to. Eventually.


Alexander, S. (2014). Five case studies on politicization, Slate Star Codex, October 16

Durant, J. (2014). John Durant compares coverage of Rotherham abuse vs. Jennifer Lawrence nudes, Twitchy Media, September 3

Jay, A. (2014). Independent Inquiry into Child Sexual Exploitation in Rotherham 1997-2013

Are men and women more alike in some populations than in others? It’s possible. First, boys and girls differentiate from each other to varying degrees during adolescence, and this process of sexual differentiation is genetically influenced. There are even conditions, like Swyer syndrome, where an individual is chromosomally male (46, XY) and yet develops externally into a woman.

Second, men and women don’t have the same sex roles everywhere. According to a survey of 93 nonindustrial cultures, men were expected to dominate their wives in 67% of them, the sexes were expected to be about equal in 30%, and women were expected to dominate their husbands in 3% (Whyte, 1978). Sex roles differ to varying degrees even among hunter-gatherers, who correspond to the earliest stage of cultural evolution. In the tropics, women provide more food through gathering than men do through hunting. The reverse is true beyond the tropics, where women have few opportunities to gather food in winter (Kelly, 1995, pp. 128-132; Martin, 1974, pp. 16-18).

There has thus been a potential for gene-culture co-evolution. Wherever men and women behave more alike, natural selection will tend to level any innate behavioral differences between them. This can come about in several ways, but a particularly common one is to reduce the sex difference in prenatal hormonal exposure, i.e., the ratio of testosterone to estrogen in the uterine environment of the developing fetus.

We have a “handy” way to measure this prenatal influence. It’s called the digit ratio: the length of your index finger divided by the length of your ring finger. The lower your 2nd digit to 4th digit ratio (2D:4D), the more you were exposed to testosterone in the womb and the less to estrogen.
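The calculation itself is trivial. As a minimal sketch (the finger lengths below are hypothetical, chosen only to illustrate the two sides of the threshold):

```python
def digit_ratio(index_length_mm: float, ring_length_mm: float) -> float:
    """2D:4D ratio: length of the index finger (2nd digit)
    divided by length of the ring finger (4th digit)."""
    return index_length_mm / ring_length_mm

# Hypothetical measurements. A ratio below 1.0 indicates greater prenatal
# testosterone exposure (a more "masculine" profile); a ratio above 1.0
# indicates greater prenatal estrogen exposure.
print(digit_ratio(72.0, 75.0))  # below 1.0
print(digit_ratio(75.0, 74.0))  # above 1.0
```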

English psychologist John T. Manning has pioneered the use of this digit ratio as a way to measure how prenatal male and female hormones influence various behavioral traits. In a recent study, he looked at how prenatal hormones might influence gender equality in different populations. After measuring the digit ratios of participants from 29 countries, his research team averaged the score for each country and compared it with indices of gender equality: women’s share of parliamentary seats; women’s participation in the labor force; women’s educational attainment level; maternal mortality rates; and juvenile pregnancy rates. To ensure comparability, all of the participants were of European descent.

The results?

In short, the more similar the two sexes were in 2D:4D, the more equal were the two sexes in parliamentary and labor force participation. The other variables were not as strongly correlated. (Manning et al., 2014)
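The kind of analysis described above, correlating each country's sex difference in mean 2D:4D with an equality index, can be sketched as follows. The country values here are invented for illustration and are not taken from Manning et al. (2014):

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical per-country values: sex difference in mean 2D:4D
# (male mean minus female mean) and women's share of parliamentary seats (%).
sex_diff_2d4d = [-0.020, -0.015, -0.010, -0.008, -0.005]
parliament_pct = [25.0, 30.0, 33.0, 38.0, 42.0]

# A positive r means that as the sex difference shrinks toward zero,
# parliamentary participation by women rises.
print(pearson_r(sex_diff_2d4d, parliament_pct))
```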

In general, women from Northwest Europe have more masculine digit ratios, whereas women from farther east and south have more feminine digit ratios. This geographical trend is more pronounced for the right hand than for the left hand. Since the right-hand digit ratio is associated with social dominance, Northwest Europeans may be less sexually differentiated for that particular trait, as opposed to being less sexually differentiated in general.

Presumably, this isn’t a new tendency. Women must have been more socially dominant among Northwest Europeans even before the late 19th century and the earliest movements for women’s suffrage. So how far back does the tendency go? To medieval times? To pre-Christian times? It seems to go back at least to medieval times and, as such, forms part of the Western European Marriage Pattern:

The status of women differed immensely by region. In western Europe, later marriage and higher rates of definitive celibacy (the so-called “European marriage pattern”) helped to constrain patriarchy at its most extreme level.

[...] In eastern Europe however, the tradition of early and universal marriage (usually of a bride aged 12-15 years, with menarche occurring on average at 14) as well as traditional Slavic patrilocal customs led to a greatly inferior status of women at all levels of society. (Women in the Middle Ages, 2014)

Does this geographic tendency go back to pre-Christian times? There is little consensus on this point, as noted in a study of women in Old Norse society:

The conversion of Iceland raises the problem of the impact of Christianity on the female half of the human race. This, in fact, is one of the most controversial issues in women’s history. One point of view argues that Christianity was deeply imbued from the beginning with Jewish and Roman patriarchy, which became intensified by an all-male clergy and resulted in misogyny as the most lasting and profound legacy of Christianity for women. An opposite argument claims that the Christian message was fundamentally a liberating force that included women as well, and although the original radicalism of Jesus on this issue, as on so many others, became diluted with time, women were better off during the Christian period and in Christian countries than they had been before and elsewhere. (Jochens, 1995, p. 2)

Perhaps both arguments are true. As I have argued elsewhere, there may have been a “fruitful encounter” between Christianity and pre-existing behavioral tendencies in Northwest Europe, the result being a significantly different form of Christianity (Frost, 2014).


Frost, P. (2014). A fruitful encounter, Evo and Proud, September 26

Jochens, J. (1995). Women in Old Norse Society, Cornell University Press.

Kelly, R.L. (1995). The Foraging Spectrum: Diversity in Hunter-Gatherer Lifeways, Washington: Smithsonian Institution Press.

Manning, J.T., B. Fink, and R. Trivers. (2014). Digit ratio (2D:4D) and gender inequalities across nations, Evolutionary Psychology, 12, 757-768.

Martin, M.K. (1974). The Foraging Adaptation – Uniformity or Diversity? Addison-Wesley Module in Anthropology 56.

Women in the Middle Ages. (2014). Wikipedia

Whyte, M. K. (1978). The status of women in preindustrial societies, Princeton, NJ: Princeton University Press.