The Unz Review - Mobile
A Collection of Interesting, Important, and Controversial Perspectives Largely Excluded from the American Mainstream Media
Razib Khan
Gene Expression Blog
Cognitive Psychology


A paper on the psychology of religious belief, Paranormal and Religious Believers Are More Prone to Illusory Face Perception than Skeptics and Non-believers, came onto my radar recently. I used to talk a lot about the cognitive psychology of religion years ago, but my interest faded when it seemed that empirical results were relatively thin in relation to the system building (Ara Norenzayan’s work being an exception to this generality). The theory is rather straightforward: religious belief is a naturally evoked consequence of the general architecture of our minds. For example, gods are simply extensions of persons, and make natural sense in light of our tendency to anthropomorphize the world around us (this may have had evolutionary benefit, in that false positives for detection of other agents were far less costly than false negatives; think of an ambush by a rival clan).*


But enough theory. Are religious people cognitively different from atheists? I suspect so. I speak as someone who never really believed in God, despite being inculcated in religious ideas from childhood. By the time I was seven years of age I realized that I was an atheist, and that my prior “beliefs” about God were basically analogous to Spinozan Deism. I had simply never believed in a personal God, but for many of my earliest years it was less a matter of disbelief than that I did not even comprehend, or cogently elaborate in my mind, the idea of this entity, which others took for granted as self-evidently obvious. From talking to many other atheists I have come to the conclusion that atheism is a mental deviance. This does not mean that mental peculiarities are necessary or sufficient for atheism, but they increase the odds.

And yet after reading the above paper my confidence in that theory is reduced. The authors used ~50 individuals, and attempted to correct for demographic confounds. Additionally, the results were statistically significant. But to me the theory above should make powerful predictions in terms of effect size. The differences between non-believers, the religious, and those who accepted the paranormal were just not striking enough for me.
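To make concrete why significance alone leaves me cold, consider a back-of-the-envelope calculation. The numbers below are illustrative, not taken from the paper: with roughly 25 subjects per group, a moderate standardized difference clears the p < .05 bar even though the two distributions overlap heavily.

```python
import math

# Why "significant" need not mean "striking": with ~25 subjects per group,
# a standardized mean difference of d = 0.6 is enough for p < .05, yet the
# two group distributions still overlap heavily. Illustrative numbers only.
def normal_cdf(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

d, n = 0.6, 25                         # standardized effect size, per-group n
t = d * math.sqrt(n / 2)               # two-sample t statistic (equal n, equal sd)
overlap = 2 * normal_cdf(-abs(d) / 2)  # overlapping coefficient of two unit normals

print(round(t, 2))        # ~2.12, past the ~2.01 two-tailed cutoff at df = 48
print(round(overlap, 2))  # ~0.76: the groups share ~76% of their distributions
```

In other words, a result can be real in the statistical sense while the believer and skeptic distributions remain mostly indistinguishable person to person.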

Because of my theoretical commitments my prejudiced impulse was to accept these findings. But looking deeply within, they just aren’t persuasive in light of my prior expectations. This is a fundamental problem in much of social science. Statistical significance is powerful when you have a preference for the hypothesis forwarded. In contrast, the knives of skepticism come out when research is published which goes against your preconceptions.

So a question for psychologists: which results are robust and real, to the point where you would be willing to make a serious monetary bet on it being the orthodoxy in 10 years? My primary interest is cognitive psychology, but I am curious about other fields too.

* In Gods We Trust and Religion Explained are good introductions to this area of research.

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Cognitive Psychology, Psychology 

This is a datum which you can dine out on, The Bias You Didn’t Expect:

It turns out that legal realism is totally wrong. It’s not what the judge had for breakfast. It’s how recently the judge had breakfast. A new study (media coverage) on Israeli judges shows that, when making parole decisions, they grant parole about 65% of the time just after meal breaks, with the rate dropping almost all the way down to 0% right before breaks and at the end of the day (i.e. as far from the last break as possible). There’s a relatively linear decline between the two points.


About 20 years ago I lived for a year in a rural area where Amish were a common feature of country roads and farmers’ markets. My parents, being Muslims, would sometimes buy chickens from the local Amish and slaughter them according to halal. We had a relationship with a particular family. They were nice people, though I have to admit that their chickens were a bit tougher than I was used to. In many ways the Amish lived predictably parallel lives to those of the “English” (we referred to them as “Dutchees”), but they’d always pop up from the background in unexpected ways. Amish don’t seem to have a problem with modern medicine, so we’d run into them at the hospital sometimes. Whenever my father saw an Amish fruit or vegetable stand on a country road he’d pull over, because they’d often let us sample a bit before we purchased (we always purchased watermelons from the Amish for this very reason). It’s been a long time, so I haven’t thought about the Amish in much depth. Living on the West coast you don’t run into their kind very often (I don’t recall ever running into Amish on the West coast in fact). But it turns out that the number of Amish in the United States of America has more than doubled in the past 20 years. Their population went from 123,000 in 1991 to 249,000 in 2010. The fertility of the traditionalist Old Order Amish is 6.2. Here are the Old Order Amish fertility rates in an international perspective:

Fertility rate, 2005-2010
Niger 7.19
Guinea-Bissau 7.07
Afghanistan 7.07
Burundi 6.80
Liberia 6.77
Congo 6.70
East Timor 6.53
Mali 6.52
Sierra Leone 6.47
Uganda 6.46
Angola 6.43
Chad 6.20
Old Order Amish 6.20
Somalia 6.04
Burkina Faso 6.00
Yemen 5.50
Pakistan 3.43
Bangladesh 2.74
Mexico 2.21
USA 2.05
Germany 1.36

Assuming current rates of increase, there should be 7 million Amish in 2100, and 44 million Amish in 2150. I don’t believe these numbers will pan out. I will explain why later in detail, but first, let’s look at the paper from which I extracted these statistics and projections. Religion, fertility and genes: a dual inheritance model:

Religious people nowadays have more children on average than their secular counterparts. This paper uses a simple model to explore the evolutionary implications of this difference. It assumes that fertility is determined entirely by culture, whereas subjective predisposition towards religion is influenced by genetic endowment. People who carry a certain ‘religiosity’ gene are more likely than average to become or remain religious. The paper considers the effect of religious defections and exogamy on the religious and genetic composition of society. Defections reduce the ultimate share of the population with religious allegiance and slow down the spread of the religiosity gene. However, provided the fertility differential persists, and people with a religious allegiance mate mainly with people like themselves, the religiosity gene will eventually predominate despite a high rate of defection. This is an example of ‘cultural hitch-hiking’, whereby a gene spreads because it is able to hitch a ride with a high-fitness cultural practice. The theoretical arguments are supported by numerical simulations.

Let’s be frank: some of this is “no shit.” “Provided the fertility differential persists” is a central assumption of their argument. The paper is full of mathematical models, simulations, and inferences from those models tweaking the parameters. It also uses both haploid and diploid frameworks of inheritance, as well as placing the argument in a broader evolutionary framework of how religion could emerge and spread. You can read the whole paper online because it is open access.
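To see the hitch-hiking dynamic in action, here is a toy haploid simulation. This is my own sketch, not Rowthorn’s actual equations, and every parameter is invented: carriers of a hypothetical religiosity allele defect from religion less often, the religious out-reproduce the secular, and defectors’ lineages stay secular for good.

```python
# Toy haploid "cultural hitch-hiking" simulation (illustrative parameters only):
# the allele spreads because it rides along with the high-fertility religious
# lifestyle, despite constant defection every generation.
def simulate(gens=300, f_rel=1.3, f_sec=1.0, d_carrier=0.05, d_noncarrier=0.30):
    # population shares keyed by (has_allele, is_religious)
    pop = {(True, True): 0.01, (True, False): 0.00,
           (False, True): 0.49, (False, False): 0.50}
    for _ in range(gens):
        new = {k: 0.0 for k in pop}
        for (allele, rel), share in pop.items():
            kids = share * (f_rel if rel else f_sec)
            if rel:
                d = d_carrier if allele else d_noncarrier
                new[(allele, True)] += kids * (1 - d)  # stay religious
                new[(allele, False)] += kids * d       # defect to secular
            else:
                new[(allele, False)] += kids           # secular lineages stay secular
        total = sum(new.values())
        pop = {k: v / total for k, v in new.items()}   # renormalize to shares
    return pop

pop = simulate()
allele_freq = pop[(True, True)] + pop[(True, False)]
print(round(allele_freq, 3))  # starts at 0.01, ends near 1.0 despite defection
```

The allele starts at a frequency of 1%, yet ends up nearly fixed: religious non-carriers defect too often to keep pace, and the secular groups simply reproduce more slowly than the religious carriers who replenish them.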

The meat of the argument corresponds well with Eric Kaufmann’s Shall the Religious Inherit the Earth. Being an article in a journal it obviously isn’t going to be as empirically “thick” as the Kaufmann book. Instead the authors take some of Kaufmann’s empirical insights and turn them into a more rigorous quantitative system; an analytic engine for more precise prediction. Of particular importance, Shall the Religious Inherit the Earth generally worked with the assumption that religion was a cultural trait, and that its success was dependent on heritability of culture. Here the authors suggest that a predisposition toward religiosity may be rooted in genetic factors which are heritable. This adds a major twist: even with high defection rates from fecund religious groups, if religiosity is heritable many secular people will inherit a predisposition toward religion or religious modes of thought.

Below are the primary tables and illustrations.

[zenphotopress album=247 sort=sort_order number=4]

As you can see, the big picture is this: religion wins! Assuming…. And what are the assumptions?

1) Religious people have higher fertility. This is true!

2) Religiosity exhibits heritability. This is true!

Combine these two, and you have an excellent ingredient for natural selection to drive a trait and its underlying alleles to fixation. The question is not if, but when. But it’s more complicated than that. The author covers many of the nuances within the paper, so I don’t want to be unfair here. I’m expanding on much of what they give a nod to. For example, one can’t presume that religious people are always going to be more fertile than irreligious people. As Eric Kaufmann observed, the gap is largest in secular societies. And secular societies are very new indeed. In the pre-modern world there was very little variation in the trait value. If there isn’t variation in the trait value, how is natural selection supposed to “see” the variance in the genes? But just as the past is not a good guide to the present, so we should be careful about the present being a good guide to the future. Or more precisely, the medium-to-far future.

As noted within the paper religiosity as a trait may exhibit density dependence. In other words, as the proportion of the trait increases its rate of increase may decrease. This is not an unjustified assumption. Around 1955 ~5% of South Koreans were Protestant or Catholic. By 1985 the proportion had risen to ~20%. That’s a fourfold increase in 30 years. In 2005, the proportion was ~30%. There has clearly been a leveling off, as the exponential curve begins to look like a logistic.
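The arithmetic of that leveling off is easy to sketch. Below is a logistic curve whose ceiling and growth rate I chose by hand to roughly reproduce the three South Korean figures above; the parameters are illustrative, not estimates from any dataset.

```python
import math

# Logistic sketch of the South Korean Christian share: ~5% in 1955,
# ~20% in 1985, ~30% in 2005. K (the ceiling) and r are hand-picked
# illustrative values, not fitted estimates.
def christian_share(year, K=0.35, p0=0.05, r=0.0693, t0=1955):
    A = (K - p0) / p0
    return K / (1 + A * math.exp(-r * (year - t0)))

for y in (1955, 1985, 2005):
    print(y, round(christian_share(y), 2))
# early on the curve is indistinguishable from exponential growth;
# as the share approaches the ceiling K, growth levels off
```

The same data that look like runaway exponential growth in 1985 are equally consistent with a process that saturates at about a third of the population.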

A particular parameter need not be fixed in time. Many projections assume fixed parameters, and then allow time to extend outward toward infinity. This seems to be not robust when it is not trivial, and trivial when it is robust. To understand why, look back to the past. If religiosity is heritable at around 0.50, that is a strong signal that there hasn’t been enough positive directional selection on the genes which control variation in religiosity to exhaust that variation. Traits which are very strongly associated with fitness, which to some extent religiosity is today, invariably show low heritabilities. This is because selection is so good at weeding out unfit variants when fitness differentials are great. Traits which exhibit higher heritabilities (and for a behavioral trait ~0.5 isn’t bad; it is on the same order as I.Q.) may not have direct fitness implications, or may be buffeted by stabilizing selection. But we don’t need to rely on what we know about the genetic architecture of traits. Secularization has been proceeding for nearly two centuries, and has accelerated only recently. Atheists have been present in the literary record since the 6th century B.C.E., starting with the Greeks, but also occurring among Chinese and Indian philosophers. Any sufficiently complex hierarchical society does seem to produce dissenters from the reigning metaphysical orthodoxy. If those societies do not tolerate public and open dissent, as has been the case in the past for Christian cultures, and to some extent is still the case in the world of Islam, dissenters may simply ensconce themselves in the institutions of orthodoxy so as to obtain cover for their deviations.
One of the more blatant examples of this which I stumbled upon was an 18th-century French cleric, Étienne Charles de Loménie de Brienne, who was blocked from becoming archbishop of Paris by Louis XVI because of his atheism (he was made archbishop of Sens and eventually became a cardinal, before finally openly repudiating Catholicism in 1793).

Of course just because the past cannot predict the present does not mean that the present cannot predict the future. So why do I think there won’t be 7 million Amish in 2100? The culture that is the Old Order Amish exists only within the context of the current United States of America, where they are a trivial minority. Today the Amish are 0.1% of the USA’s population. Some demographers predict ~1 billion Americans by 2100. I’m skeptical of this figure, but assuming that it is correct, 7 million Amish would be 0.7% of America’s population. 44 million Amish in 2150, if the USA had a population of 1.5 billion, would be 3% of America’s population. I don’t think that the Amish lifestyle can take over whole societies. It isn’t economically scalable (I doubt it’s religiously scalable either, but I will avoid a digression into the nature of very radical Protestantism and its repeated failures in Europe). The Amish could adapt, as the New Order Amish have, but then their fertility starts dropping. There are structural constraints on fertility beyond simple idealized preference. Arab fertility in Israel crashed in the 2000s due to the removal of some child benefits. People do respond to incentives. Or at least some people do. It turns out that ultra-Orthodox Jews continue to have large families because of stronger ideological commitments. But at some point ultra-Orthodox labor force participation in Israeli society will have to increase. At that point I predict that fertility will begin to drop. In Europe the incredibly high fertility of Roma minorities in relation to the non-Roma is only feasible given the current economic surpluses of those societies. As the proportion of Roma increases, the parameters resulting in incentives toward higher fertility will shift, both for the Roma and for the society in which they are embedded.
The authors make a nod to the idea that religion may have spread through group selection; but this is also an argument for why very fertile and religious groups such as the Amish and Roma will reach their “limits to growth.” If they persist in their atypical lifestyles their host societies will simply collapse. Or at least restructure in a fashion that makes extremely high endogenous growth of minorities impossible.
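For what it’s worth, the projections quoted above are just compound doubling. A minimal sketch, inferring the doubling time from the two census figures in this post (123,000 in 1991, 249,000 in 2010) and extrapolating:

```python
import math

# Amish counts quoted in the post; doubling time inferred from them.
n_1991, n_2010 = 123_000, 249_000

# N(t) = N0 * 2**(t / T)  =>  T = years elapsed / log2(growth factor)
T = (2010 - 1991) / math.log2(n_2010 / n_1991)  # ~18.7 years

def project(year, n0=n_2010, t0=2010):
    """Naive constant-doubling extrapolation from the 2010 count."""
    return n0 * 2 ** ((year - t0) / T)

print(round(T, 1))
print(round(project(2100)))  # ~7 million, matching the paper's figure
print(round(project(2150)))  # ~45 million, close to the paper's 44 million
```

That is all the projection is: a doubling time of roughly 19 years run forward for a century and a half, with every structural constraint discussed above assumed away.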

Before I move on to a big picture “far future” implication, I want to address the issue of genetic architecture. The paper operates with a monogenic model. That is, there is a “God gene” which controls religiosity and irreligiosity. This is for reasons of tractability of the model. In broad brushes it doesn’t really matter if the trait is controlled by variance on 1 gene, or 1,000, for the main portion of their argument. But reality is more than just their model. Religiosity is heritable, which suggests it has many genes controlling most of the variance. Like many behavioral or psychological traits “religion” is a somewhat fuzzy category, and the competencies which produce religious phenomena are almost certainly implicated in many other core cognitive functions. The ability to represent entities, gods, which you’ve never seen, and which have superhuman powers, leverages a host of human-specific mental skills. Not only that, but these abilities may exhibit variation from person to person. The cognitive psychologist Paul Bloom has observed that American atheists are somewhat atypical. If the traits which combine to produce religiosity and irreligiosity are correlated with a host of other traits, then these correlations may serve as a brake on the margins which would prevent one “morph” from driving another out of the population. The authors note that one way irreligiosity may survive is through heterozygote advantage of those with a religious allele and an irreligious allele. But, they argue, “…heterozygote advantage is merely a theoretical possibility and there is currently no evidence to support it.” There are some theoretical reasons to be skeptical of heterozygote advantage being common. But this is not the only way that stability could be preserved on a trait. The authors are only operating within the monogenic framework, a framework whose existence is contingent on the tractability of the model for their own purposes!
The reality is that religiosity is probably polygenic, and there may be constraints of correlated response and frequency dependencies on the margins of many of the other traits controlled by the same genes. Bloom notes that atheists exhibit some psychological deviancies (correctly to my mind). But in a society deviated sharply toward the religious side you may start seeing other sorts of problems as well in terms of the mental state of those at the extreme end of the tail of the trait distribution.
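For reference, heterozygote advantage in the standard one-locus diploid framework really does maintain both alleles: with genotype fitnesses 1−s, 1, and 1−t, the equilibrium frequency of the first allele is t/(s+t). A toy iteration of the textbook recursion, with arbitrary illustrative values of s and t (not estimates for any real “religiosity” locus):

```python
# Overdominance toy model: fitnesses w_AA = 1-s, w_Aa = 1, w_aa = 1-t
# maintain a stable polymorphism at p* = t / (s + t). Standard one-locus
# selection recursion; s and t are arbitrary illustrative values.
def next_p(p, s=0.1, t=0.2):
    q = 1 - p
    w_bar = p*p*(1 - s) + 2*p*q + q*q*(1 - t)       # mean fitness
    return (p*p*(1 - s) + p*q) / w_bar              # allele A after selection

p = 0.5
for _ in range(500):
    p = next_p(p)
print(round(p, 3))  # converges to 0.667, i.e. t / (s + t)
```

So stability is at least possible in the simple framework; my point above is only that it is one of several mechanisms, and polygenic constraints of correlated response are another.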

Finally, what about the idea that selection dictates that fertility should bounce back after the demographic transition? R. A. Fisher outlined this model in The Genetical Theory of Natural Selection. The authors don’t directly address the heritability of fertility, but if the biological or psychological traits which predispose one to large families are heritable, then over time those traits will spread in the population. The demographic transition is very new on an evolutionary time scale. Arguably the first population to go through the transition was that of France in the early 19th century, with Britain next. These assumptions drive Robin Hanson to predict a Malthusian future:

But as long as enough people are free to choose their fertility, at near enough to the real cost of fertility, with anything near the current range of genes, cultures, and other heritable influences on fertility, then in the long run we should expect to see a substantial fraction of population with an heritable inclination to double their population at least every century. So if overall economic growth doubles less than every century, as I’ve argued it simply must in the long run, income per capita must fall over the long run, a fall whose only fundamental limit is subsistence; we can’t have kids if we can’t afford them.

Hanson’s argument is that eventually population growth will catch up with economic growth, as it has for all of history. The reason that we live in an age of affluence is not just economic growth. It is that the more wealth a society has today, the lower the population growth rate is, meaning that per capita wealth is increasing as the growing pie is not eaten up by more mouths. This is abnormal for all of human history, and in fact all of the biological domains. Populations tend to exhibit long term equilibrium with their carrying capacity. The key is long term. Some biological populations go through booms and busts every few generations. For humans the process may run over centuries. There was a massive ramp up of European population right up until 1300. This squeezed peasants on the margin. But the Black Death of the 14th century drastically reduced aggregate population, resulting in a situation of land to labor surplus. Not only did this change the structure of European society (with labor shortages there had to be a modification to feudal arrangements; there weren’t enough serfs to go around, so some landlords would steal labor via inducements toward better treatment, more freedom, and even pay), but the median health and diet of the peasant improved. It took a few centuries, but eventually the Malthusian grind swallowed up the surplus land which had been freed up by the Black Death. Only the lift off of the past two centuries, the concomitant speed up of economic growth and collapse of births per woman, has been the exception to this overall pattern.
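Hanson’s arithmetic can be sketched in a few lines. If the fastest-growing heritable lineage doubles every century while total output doubles more slowly, per capita income must eventually fall toward subsistence. The doubling times below are placeholders, not forecasts:

```python
# Hanson's argument in miniature: population doubling every 100 years vs.
# output doubling every 150 years. Both doubling times are illustrative
# placeholders; only the inequality between them matters.
def income_per_capita(centuries, econ_doubling=150.0, pop_doubling=100.0):
    years = centuries * 100.0
    output = 2 ** (years / econ_doubling)
    population = 2 ** (years / pop_doubling)
    return output / population

for c in (0, 2, 4, 8):
    print(c, round(income_per_capita(c), 3))
# because population doubles faster than output, income per head
# declines monotonically toward the subsistence floor
```

Whatever the exact numbers, as long as the population doubling time is shorter than the output doubling time the ratio shrinks without bound, which is the whole of the argument.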

So the big question is: are we simply living in a transient, like the peasants of the 15th century? In the long run everything is a transient. And there are structural changes which mean that the past may not be a good guide to the future. The emergence of multicellular life was a revolution in the parameters of diversification of this planet’s biosphere. The arrival onto the scene of a protean cultural animal, our own species, was also a revolution in terms of the shocks which an organism can induce upon the biosphere (though not as much of a shock as oxygen-producing cyanobacteria). Finally, the shift toward exponential economic growth over a few generations was also a radical change from the past. Five years ago I read The Robot’s Rebellion: Finding Meaning in the Age of Darwin. The argument presented within that book is that modern humanity has the ability, the choice, to break free from the iron laws of the past. We need to be careful about glib projections. I think there is still a high probability that the vast majority of the existence of this universe until final heat death or inflation will be characterized by the ubiquity of von Neumann machines. But between then and now may be a long time indeed. Until the iron laws of evolution, and ultimately the cosmos, snap their jaws shut we may have the opportunity to choose between many paths. The painting we project upon that canvas of time may be diverse, detailed, and unpredictable.

Citation: Rowthorn R (2011). Religion, fertility and genes: a dual inheritance model. Proceedings of the Royal Society B. PMID: 21227968

Image credit: Magnus Manske


Sometimes books advertise themselves very well with their title. The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us is one of those books. Alternatively it could have been titled: “Giving thinking a second chance.” Or, with an eye toward pushing copies: “Why everything Malcolm Gladwell tells you is crap.” And finally, a more highbrow possibility: “Reflection: man’s greatest invention.”

The “hook” for The Invisible Gorilla is the experiment which goes colloquially by the same name. The authors of the book, Christopher Chabris and Daniel Simons, actually wrote the paper Gorillas in our midst: sustained inattentional blindness for dynamic events (though they note that the basic insight goes back to the 1970s). Here’s a YouTube clip illustrating Chabris & Simons’ setup. Despite the eye-catching way the authors grab your attention, the core message of The Invisible Gorilla is often very Plain Jane: thinking is hard, it yields real results, and beware of shortcuts. Many sections of the book read as counterpoints to the counterintuitive defenses of intuition which Malcolm Gladwell presents in Blink: The Power of Thinking Without Thinking (Gladwell as it happens played a key role in popularizing knowledge of “the invisible gorilla” phenomenon). Despite being “sexed up” in the past few decades the defense of intuition, of “gut,” has a long intellectual history. For every Kant there is a Wang Yangming. And yet the borderlands between intuition and deduction, reflex and reflection, can often be gray. I would argue that much of human culture actually emerges from rational extensions of intuition. David Hume famously asserted reason’s slavery to passion, but I think a less grand way of characterizing the nature of different aspects of cognition is that they complement and supplement each other (see How We Decide).

Chabris and Simons cover more than their famous invisible gorilla experiment. Rather, it just serves as a doorway into a whole world of cognitive distortions, illusions, and fallacies. Many of these emerge from what we think we know about ourselves, and an inattention to the reality of the shortcomings of our own minds. These deceptions are explored across six chapters, but it is the conclusion where I think one can illustrate the deep insight The Invisible Gorilla can give us into leading fuller lives. Telling us how to think, not just how we think. Specifically, Chabris and Simons give an illustration of a circumstance where intuitions are of use, and why they are of use. They recount an experiment where students were given five different strawberry jams of very different quality (as rated by Consumer Reports) and asked to list their likes and dislikes. Then they were asked to rate the quality of the jams on a scale of 1 to 9. In a separate treatment the students were given the jams to taste, and then asked to write about why they chose their major in college. Then they rated the jams again, 1 to 9. In the second condition the students gave ratings much closer to the experts than in the first. This seems to be a case where thinking hard and deeply about the issue only muddled the outcomes, while intuition was validated. What’s going on? First, Chabris and Simons suggest that the students were not adding any new relevant information by listing their likes and dislikes. Second, one’s perception of taste is primarily, though not exclusively, related to a visceral emotional reaction. In matters of personal taste it makes good sense to go with one’s gut, especially if the taste has a strong necessary connection to the real gut!

In contrast, deliberate rational thought will beat intuition when you have more conscious access to the data for any given task. Both authors of The Invisible Gorilla were chess players at a high level, and their narrative is laced with their own reminiscences about the chess world. They note that expert chess players can usually beat lesser players even if they give their opponents far more time to think about their moves. One model which can explain this is that elite chess players have an intuitive grasp of the game and can “recognize” the pattern of play without having to engage in ratiocination. Chabris and Simons show in fact that under conditions of time pressure excellent players increase their error rate by 36 percent. This hints at the likelihood that there are finite cognitive resources being brought into play which are reflective, not rapid-fire reflex, in moments of decision making in chess. Of course the distinction between reflex and reflection may seem academic from the outside, but there is a difference when it comes to how one learns, and the preconditions of virtuosity.

At the end of the day The Invisible Gorilla reinforces some very old-fashioned maxims about the importance of hard work, care in system-building, and intellectual humility. These are the exact virtues which are theoretically baked into the cake of modern institutional science. Chabris and Simons reveal that the emperor of intuition often has no clothes, but they give us hope because reflective cognitive processes have a track record of building upon themselves, and extending into directions which have allowed us to construct the material civilization we see all around us. In contrast, in areas where an intuitive aesthetic sense is at a premium I do not have great confidence that moderns have exceeded the ancients.


I highly recommend this discussion between Paul Bloom & Robert Wright. The topic under consideration is the psychology of pleasure, as reviewed in Bloom’s new book How Pleasure Works: The New Science of Why We Like What We Like. You can also find out about Bloom’s ideas in this exchange in Slate. The essentialism examined in Descartes’ Baby is being taken for another spin, though with a more precise focus. The bottom line is that pleasure is often contingent on more than proximate empirical sensory input; it depends on what you perceive to be the essence of the object of pleasure, even its history (or, more crassly, its price). This truth may make the calculation project of the utilitarian heirs of Gottfried Leibniz pragmatically impossible.

About Razib Khan

"I have degrees in biology and biochemistry, a passion for genetics, history, and philosophy, and shrimp is my favorite food. If you want to know more, see the links at"