The Unz Review - Mobile
A Collection of Interesting, Important, and Controversial Perspectives Largely Excluded from the American Mainstream Media
Gene Expression Blog

The New York Times has a very long and detailed article titled Norway Offers Migrants a Lesson in How to Treat Women. Here’s the primary issue:

Henry Ove Berg, who was Stavanger’s police chief during the spike in rape cases, said he supported providing migrants sex education because “people from some parts of the world have never seen a girl in a miniskirt, only in a burqa.” When they get to Norway, he added, “something happens in their heads.”

The statistics are pretty straightforward, and some are outlined in the article. This is a robust and replicated dynamic in Scandinavia: people of “migrant background” are over-represented in rape statistics. It’s an open secret, in that when push comes to shove the authorities tend to make excuses rather than lie about it. Though Scandinavians maintain public norms of political correctness, their revealed preferences in terms of self-segregation and ubiquitous “white flight” illustrate that everyone knows the reality, even if politeness keeps them from addressing it.

Migrants themselves can often be quite frank and astute observers of cross-cultural differences:

“Men have weaknesses and when they see someone smiling it is difficult to control,” Mr. Kelifa said, explaining that in his own country, Eritrea, “if someone wants a lady he can just take her and he will not be punished,” at least not by the police.

Norway, he said, treats women differently. “They can do any job from prime minister to truck driver and have the right to relax” in bars or on the street without being bothered, he added.

Mr. Isdal, the Stavanger psychologist, said refugees, particularly those traumatized by war, represent a “risk group” that is not predestined to violent crime but that does need help to cope with a new and alien environment.

Unfortunately it’s not surprising that the “professionals” are making excuses for these men. All of a sudden males, who are sometimes portrayed in feminist literature as “natural born rapists,” become traumatized by war and are no longer responsible for their actions (or at least not as culpable). This turns “victim blaming” on its head. But in a world where white males are the fount of all evil, non-white males are denied any agency (i.e., conflict and trauma can always be blamed on Western nations somehow).

The reality is that the attitudes expressed by Mr. Kelifa are not that atypical over recent human history. What’s atypical is the sort of gender egalitarianism which is normative in Scandinavia, and to a lesser extent in much of the West and other parts of the developed and developing world. My own suspicion is that in small hunter-gatherer bands the worry of violent rape at the hands of strangers was not a concern, because there were no strangers, and women were often in the close presence of males who were either relatives, or males with whom they were bonded (in the case of patrilocal societies). Norms of extreme sex segregation and minute physical control of women by groups of men probably arose in agricultural societies, where contact with strangers became more common, and powerful patriarchies became organized and standardized.

Individualistic Western norms, which are slowly spreading throughout the world, are in some ways a reversion to the norms of the hunter-gatherer period. What we are seeing today is a slow unwinding of the institutional and social scaffolds that arose as cultural adaptations during the long period between the Pleistocene and modernity, when what had been hunter-gatherer clans were thrown together without the individual-scale innate cognitive tools to enable social cohesion. But the older cultural norms persist in many contexts even in the West. See the video below, which perhaps should be adapted for migrants….

• Category: Science • Tags: Culture, Rape 

The vast majority of the phone conversations I have with people are either on cell phones or via Skype. One of the consequences of this is the changing of the norms and expectations which accrued around telephone usage over the 20th century. For example, I don’t really know anyone’s number (does anyone know anyone’s “Skype number”?). Another dynamic I’ve noticed is the phenomenon of “sticky area codes.”

I’m going to be in Baltimore starting Wednesday and into Friday for ASHG. So I decided to email an old friend whom I know from the Bay Area, but who is actually a friend from my elementary school years back in the Northeast. When we both lived in the Bay we had atypical area codes. Mine was from the Northwest. His was a New York City area code (from where he went to medical school). It was absolutely no surprise to me that though he now lives in Baltimore he still has the same area code. I have other friends who have spent more than 10 years in the Bay Area who retain their New York City area codes.

Here’s my hypothesis: the churn in area codes on cell phones over the past 15 years or so is basically modeled as neutral, with the exception of a few “prestige” area codes. As Dave Baltrus pointed out, until recently area code portability wasn’t without friction. In particular you might have had to get a new code (aligned with your residence) if you switched carriers and such. But all that said, the prediction is rather straightforward: over time the proportion of prestige area codes should increase when you control for confounds. In regards to confounds, prestige area code residents were probably early adopters of cell phone technology. But now that the market has saturated and area code transition is relatively frictionless, my prediction is that there is a notable bias in transitions away from non-prestige area codes, while prestige code holders are less likely to “mutate.”
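The neutrality-with-biased-mutation prediction can be sketched as a toy two-state calculation. To be clear, the rates below are invented for illustration, not estimates of actual area-code churn:

```python
# Toy two-state model of area-code "mutation": prestige vs. non-prestige.
# The rates are hypothetical, chosen only to illustrate the predicted dynamic.
def prestige_fraction(f0, lose_rate, gain_rate, steps):
    """Iterate the expected fraction of prestige area codes.

    lose_rate: chance per step that a prestige-code holder switches away.
    gain_rate: chance per step that a non-prestige holder acquires one.
    """
    f = f0
    history = [f]
    for _ in range(steps):
        f = f * (1 - lose_rate) + (1 - f) * gain_rate
        history.append(f)
    return history

# Biased "mutation": prestige codes are sticky (rarely lost), so their share
# rises over time even from a small starting fraction, converging toward
# gain_rate / (gain_rate + lose_rate).
traj = prestige_fraction(f0=0.2, lose_rate=0.02, gain_rate=0.05, steps=50)
print(round(traj[0], 3), round(traj[-1], 3))  # share rises toward ~0.714
```

With any asymmetry of this kind the equilibrium share of prestige codes exceeds the starting share, which is the testable prediction: controlling for confounds, the prestige fraction should drift upward over time.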

• Category: Science • Tags: Culture 
OKCupid results

And Sarah saw the son of Hagar the Egyptian, which she had born unto Abraham, mocking.

Wherefore she said unto Abraham, Cast out this bondwoman and her son: for the son of this bondwoman shall not be heir with my son, even with Isaac.

- Genesis

The above is from a relatively widely circulated post from OKCupid. It has been argued that this post saved the dating website OKCupid, and launched the book Dataclysm. Over five years on, the underlying biases have not changed, and if anything have become more pronounced. I think the fact that OKCupid has become a more popular service probably explains this. From what I understand OKCupid had a more “hip” clientele in the late 2000s in comparison to the big dating sites, so it stands to reason that as its user base increased many times over it would become more typical. This result is not isolated; it is replicated in other surveys and in experimental dating situations.

Unsurprisingly much of the male bias in race when it comes to dating comes down to perceptions of physical attractiveness. Once that is “corrected” for, the bias becomes very small. In contrast, this does not occur in women.* You can spin this in two ways. First, women are more racist. Or, alternatively, women are less shallow, in that they are fixated on things beyond physical attraction. Though ironically that would definitely include physical appearance as it relates to race.
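The “correcting for” here is just statistical adjustment for a covariate. A minimal sketch of the idea, with numbers invented purely for illustration (none of these values come from the OKCupid data):

```python
import numpy as np

# Invented toy data: ratings are driven entirely by an attractiveness score,
# with group membership (0/1) carrying no independent effect by construction.
attractiveness = np.array([1.0, 2.0, 3.0, 4.0])
group = np.array([0.0, 0.0, 1.0, 1.0])
rating = 2.0 * attractiveness  # no direct group effect

# Naive comparison: a large apparent group gap in ratings.
naive_gap = rating[group == 1].mean() - rating[group == 0].mean()

# Adjusted comparison: regress ratings on both attractiveness and group;
# the group coefficient is the gap "corrected" for attractiveness.
X = np.column_stack([np.ones_like(group), attractiveness, group])
beta, *_ = np.linalg.lstsq(X, rating, rcond=None)

print(naive_gap)  # 4.0: the gap before adjustment
print(beta[2])    # ~0: the gap after controlling for attractiveness
```

The finding described above is that for men the analogue of `beta[2]` shrinks toward zero once attractiveness is in the model, while for women it does not.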

I thought of this when reading this piece in The Washington Post, Punjabi Sikh-Mexican American community fading into history. What happened is that, because of anti-miscegenation laws and bans on the arrival of Indian women, Punjabi farmers in the Central Valley of California married Mexican American women. The children had something of a hybrid identity, but they are slowly being absorbed into the Mexican American and Punjabi Sikh communities. This section jumped out at me because it seems an instance of a general pattern:

And when Punjabi women began coming to the United States, the Punjabi-Mexican community confounded them, Leonard said.

“They even kicked out the Mexican women from the gurdwara, even though those Mexican women helped fund it,” Leonard said.

This reminds me of what occurred at Fort Astoria: as white women arrived, the native women and their mixed-race offspring were quickly marginalized. In South Asia the same occurred with the ancestors of the Anglo-Indian community. For reasons of caste and religion they were excluded from assimilating into the native population (pairings between elite individuals, as depicted in White Mughals, differed from the majority of instances, where common soldiers and lower-caste women made arrangements which drew some censure from their respective communities), while the arrival of white women meant that British men serving in India now had their preferred mating partners, and recreated England overseas in insular enclaves.

There seem to be two stylized extreme positions when it comes to cultural transmission as it relates to sex bias. One model holds that women are the fundamental culture bearers. In the United States, for example, children in mixed marriages are more likely to adhere to the religion of the mother. But there is another view, illustrated by the Islamic practice whereby men, but not women, could marry out. This is because it was presumed that culture would be passed down the paternal lineage. Not an unreasonable proposition in a hyper-patriarchal society. In the case of the New World, the mestizo populations clearly inherited language and religion from their male ancestors, but other aspects of their culture are indigenous (e.g., food). Though broad empirical patterns are interesting, the general expectations contingent upon theory are important in light of what we now know about mass migration in ancient history. The skewed distribution of Y and mtDNA lineages seems to imply that migration which cannot be modeled as isolation-by-distance diffusion tended to be male-mediated, in the past as it is now. What does the uptake of Neolithic “First Farmer” mtDNA tell us about the dynamic of how the Corded Ware integrated with the local substrate, for example?

* By this, I mean that even when women give high ratings of attractiveness to men of other races, they still do not reciprocate in dating entreaties.

• Category: History, Science • Tags: Culture 

From the comments:

I would be interested in more of your thoughts on western popular culture. Thanks for keeping up with this blog, I have learned a lot! For now, I quietly wait for a post that I can add insight to

Having lived basically my entire adult life in overwhelmingly liberal coastal US cities, I’m not personally very familiar with “Red America” (though long-time readers will be aware that my adolescence was spent in the inter-montane West, in a town not too different from what you might see in Napoleon Dynamite). So I’ve become fascinated recently by what one can learn by watching “bro-country” videos. Compare, for example, the video above of Florida Georgia Line’s Cruise, remixed from the original with the rapper Nelly, to the original below:

Even more explicit in terms of the cultural values which are at the heart of “bro-country” is their video for “Dirt”:

We tend to view past cultures through their production of literature and visual arts. The music of the United States naturally maps onto subcultural divisions, and seems to me a great way to explore that diversity. Though I’m not sure how “cross-over” productions like the one above with Nelly turn out; often the values and ethos are dissonant.

If you are new to this area, this critique of “bro-country” hits many of the tropes of the genre:

• Category: Miscellaneous • Tags: Culture 

I guess this is taking “world music” to the next level, going back to the ancient Mesopotamians. The artist is Stef Conner, and you can read about how this reconstruction was done over at Newsweek, where there is a SoundCloud preview of her full album, The Flood. I’d actually purchase it if I could find a full digital copy, but I don’t see one out there right now (the article says it will be on iTunes next month). You can buy a physical copy at her website, but the last time I purchased a CD was probably in the early 2000s, so that’s not happening. Anyway, do listen to the preview on SoundCloud. The drinking song above is probably not representative.

• Category: Miscellaneous • Tags: Culture 
Aeneas Flees Burning Troy

I found the old edition of The New Republic under Marty Peretz a bit too smug, not being as heterodox or unpredictable as it fancied itself. But the new Chris Hughes-owned version does make me miss the old TNR sometimes. It’s now predictably liberal, a more high-toned and moderate sibling of The Nation. Not that there’s anything wrong with that, but that space is more consumer-driven (i.e., people want to confirm their priors), and it would have been nice if a billionaire like Hughes had been more open to looking at viewpoints in a genuinely original manner. That being said, the Hughes-owned TNR has expanded its cultural coverage in interesting directions. But sometimes the results are uneven. This piece cross-posted from The New Statesman, Bored With Hollywood Blockbusters? Blame Digital Piracy, seems to be very close to trolling.

The author makes some valid points, in particular about the ad hoc nature of many arguments for the principled usage of file sharing. But the headline itself is grossly misleading. The domination of comic-themed blockbusters at the movies, with thin plots and zero characterization, derives from many factors. The international consumer market (read: China) in media is obviously the biggest driver of the change. The rise of the lowest-common-denominator popcorn film is a function of the expanding audience which Hollywood feels it needs to satisfy. Headline aside, there is also the argument that free music downloads are driving musicians into penury. My question: when has the modal musician ever not been economically marginal? Back in the days of CDs some musicians became very wealthy, but it was still a winner-take-all game. Probably harder hit by the collapse of revenues in the music industry has been the ancillary employment around the music itself. And these arguments in favor of intellectual property always strike me as peculiar, because they often don’t grapple with the historical reality that intellectual property rights have been absent for almost the whole of human history, and somehow humans did produce works of great artistry.

To be fair, the author has a whole book, Freeloading, in which he expounds on this topic. There might be more there than can be distilled into a magazine piece. But let’s take the author’s argument at face value: that creativity will be abolished by the lack of enforcement of intellectual property.* Is that disastrous in all domains? It looks as if piracy has resulted in a recession in the porn industry. What if all porn production ceased today? Wouldn’t the “back catalog” suffice? There are only so many things you can do in porn, so a lot of the new production must be driven by “enthusiasts” who are looking for the next new star. But is this the standard consumer of porn? I suspect most individuals are not particularly discerning in what they whack off to, certain preconditions being met. Similarly, in music we have Beethoven, Ella Fitzgerald, the Beatles, Nirvana, and N.W.A. Is something in the future going to be that much better? In fiction there is also an enormous back catalog. The vast majority of Victorian fiction is out of print, and for some of it there may even be no extant copies. What a waste. But how many ways can you tell a story? You get the picture.

My point with this thought experiment is to suggest that human creativity exists to fulfill particular urges, on the part of both producers and consumers, and any particular institutional scaffold around the process of production and consumption is a historical contingency. Going back to the headline of the TNR piece, the early 1980s saw the collapse of “New Hollywood” for reasons of capitalism. The rise of superhero films in the teens of the 21st century is simply the next stage in this process. I suspect in the future if you want to produce novel high-concept art you are unlikely to be able to do so via the conventional capitalist means of production. And why would we be surprised by that? The production of high art in the past was often underwritten by elites, from Virgil’s Aeneid to Beethoven’s symphonies. As for low mass art, that will always have an audience, so capitalism will suffice.

* A minor sidenote not acknowledged in this piece is that the heyday of file sharing in the 2000s is in abatement due to the emergence of services such as Netflix and Spotify, which provide streaming for a modest fee. Of course the fees on a per-unit basis are not particularly high, and so may have much the same effect.

• Category: Miscellaneous • Tags: Culture 

Over on Twitter the always interesting W. Bradford Wilcox highlights the fact that children of more educated parents watch less television. I stumbled upon this datum via the social conservative blogger Rod Dreher, under the heading How Idiocracy Perpetuates Itself. If you read this blog you are aware of Wilcox’s oeuvre. It is of the nature that both Left and Right can take some comfort in it, as it serves as an informational balm upon their own particular presuppositional sores. And so here I am to suggest that the data do not declare what you fear, sirs!

First, let me put into the record another surprising datum: the average American watches 34 hours of television per week. That’s well near a full-time job. But wait, remember that an average means a large number watch more than 34 hours. Who are these people? Do you know them? By the fact that you read this weblog I suspect many of you do not. Certainly when I ask my friends they’re agog at this much television watching.

Let me try to intuit what a traditional conservative would take away from this level of watching television. “The people have lost their moral center, and lack appreciation for the edifying arts of yore, debasing themselves to partake of the passive hedonism of our fallen age.” Perhaps I stated it pompously, but I suspect you get the picture. What about a liberal? “The people lack the disposable income to avail themselves of the outdoors and finer pleasures of life, and so must make do with the accessible joys of television.” In other words, for the conservative the passive television watching public have missed the mark of their own free will, they have sinned against what their life was meant to be. For the liberal television occupies the role in the lives of the proletariat it does because they lack the economic wherewithal to enjoy all the finer things they obviously must yearn for.

New York Public Library

But there may be a different answer. The people watch television because they prefer television to what the cultural elites, Left and Right, would term the “higher arts.” The soul of man is not noble, and it is not made in the image of a divine being on high. It is that of a squalid savanna ape rutting in the open and greedily thrusting sweets into its mouth until waves of satiety wash over its corpulent physique. I certainly watched television when I was a younger person, from Saturday Morning Cartoons such as The Smurfs in the 1980s to reality television in the 2000s. In 2004 as a household we stopped paying for cable, and soon enough managed to fob off our television on a friend. It was not because we disdained television, but because we found ourselves gluttons for it too often. I still remember a Saturday dissipated by a Newlyweds marathon in July of 2004. It was indeed a temptation which we had to toss out of our house, lest we be swallowed by its demands upon our attention.

And yet obviously there was another part of me, which the cultural snobs would praise. I’ve always been ravenous for books, and as a small child had a stack of library books in my bedroom as a matter of course. The habit continues into adulthood. I don’t read because I believe it is morally edifying, or because I have the disposable income to engage in leisure reading. I read because I am. It is my nature; I can do no more, and no less. Public libraries are free, but most humans need television at minimum as an essential supplement to, and in most cases a total replacement for, the joys of books. Quantities of library books which one can have for long periods of time for free, and which in ages past would have been out of the reach of even the moderately wealthy, are simply not attractive to most, who would rather pay the sums demanded by cable television monopolies. For them a life of sensory reception, not one of contemplation or reflection.

Perhaps television can be analogized to sugar, ideally designed to trigger all of our most primal instincts. For the average human, even the not-so-average human, the passive pleasures of television are without parallel. And yet I believe television is likely far less dangerous than sugar: the average human has long been rather dull in comparison to the grand visions which the cultural elites held to be the exemplar of human existence. Their nature is pedestrian, but it can be called to decency. Dullness of nature should not be confused with low moral character. The simple pleasures of television are not fundamentally harmful, and I doubt they leave the mind or soul any lesser than they were beforehand. More likely the ultimate abomination of television in the eyes of modern elites rooted in egalitarian presuppositions is that it gives the lie to the argument that all men are the same in their wants and preferences. Whether through self-cultivation or fundamental nature, those of power, status, and wealth have aesthetic preferences which move in different directions than those of the mainstream, and they define their own goals as the true goals, as the good goals. The organized religions which emerged in the Axial Age have long been controlled at the highest levels by these classes, so the pleasures of the peasants have always been demonized as ungodly, base, and sinful. Even when the priestly class killed their own gods, they continued to uphold that old morality.

Despite what I say above, attempting to step outside of the circle of judgment, it is clear that I cannot help but pass judgment, and elevate that which I hold to be true, good, and beautiful to what is objectively true, good, and beautiful. Intellectually I dispute such a proposition as self-evident, but emotionally it is difficult to shake that feeling in your bones. Our elites, from Left to Right, are arrogant creatures because they are elite, and wish to reshape all toward universals of their own ends. So let us move to the domain of data. The General Social Survey has a variable, TVHOURS, which asks respondents how many hours of television they watch per day.

Has there been a major drop in the watching of television over the past generation? Not according to these data.


There has been some recent talk about surveys which show that older people watch a lot more television than younger people. Does this show up in the data? From now on I am going to limit the data to the year 2008 or later, unless specified. In any case, yes, older people do watch more television.


How about education? This goes back to the result that W Bradford Wilcox shared.


No surprise. But how about IQ (measured via WORDSUM, a vocabulary test)?


The results speak for themselves. Confirming our prejudices, the dull watch more TV than the bright, though even here you notice that the higher orders watch a fair amount of television. But some of you might wonder about confounding with education.


What this tells us is that both education and intelligence have an effect on hours of television watched, though education matters a great deal (multiple regression seems to suggest that education has the bigger effect). So that readers can inspect the sample sizes and look at the confidence intervals, here is how to reproduce the result. First, go to the GSS. Under ANALYSIS, select COMPARISON OF MEANS. For DEPENDENT enter TVHOURS. For ROW enter WORDSUM(r:0-4;5;6;7;8-10). For COLUMN enter DEGREE(r:0;1;3-4). Finally, for SELECTION FILTER(S) enter YEAR(2008-*).
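For those who prefer a script to the web interface, the same comparison-of-means can be sketched in pandas. The rows below are invented stand-in data, not an actual GSS extract; a real analysis would download an extract with the TVHOURS, WORDSUM, and YEAR columns from the GSS site:

```python
import pandas as pd

# Hypothetical stand-in rows, for illustration only.
df = pd.DataFrame({
    "year":    [2008, 2010, 2010, 2012, 2012, 2012],
    "wordsum": [3, 3, 6, 6, 9, 9],
    "tvhours": [5, 4, 3, 3, 2, 1],
})

# Recode WORDSUM into the same bins used above: 0-4, 5, 6, 7, 8-10.
bins = pd.cut(df["wordsum"], bins=[-1, 4, 5, 6, 7, 10],
              labels=["0-4", "5", "6", "7", "8-10"])

# Apply the year filter, then compare mean TVHOURS across the bins.
recent = df[df["year"] >= 2008]
means = recent.groupby(bins, observed=True)["tvhours"].mean()
print(means)
```

With a real extract, `means` reproduces the row of the COMPARISON OF MEANS table; adding a DEGREE recode as a second grouping key reproduces the columns.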

How about income? Here I used the REALINC variable to establish some rough thresholds. You get the idea from this chart….


How about labor force participation? (No idea why “Keeping House” is still around as a variable in 2008 to 2012, but there it is.) Nothing surprising here.


There’s really no significant difference when it comes to politics.


Finally, how about hours worked? Limiting the sample to full-time workers, the correlation between the number of hours worked last week and hours of television watched is -0.12. Expanding the data back to 1972, the correlation is -0.10. Not large, but it’s robust.
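For reference, the statistic here is just a Pearson correlation. A minimal sketch with invented numbers (not the GSS values):

```python
import numpy as np

# Invented toy data: more hours worked, fewer hours of TV watched.
hours_worked = np.array([30.0, 40.0, 50.0, 60.0])
tv_hours = np.array([5.0, 4.0, 3.0, 2.0])

# Pearson correlation, read off the off-diagonal of the correlation matrix.
r = np.corrcoef(hours_worked, tv_hours)[0, 1]
print(r)  # ≈ -1.0 for this perfectly linear toy example
```

On the actual survey data the relationship is of course far noisier, which is why the observed correlation is only about -0.12 rather than anything near -1.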

You can explore the data yourself with the GSS.


• Category: Miscellaneous • Tags: Culture 

After reading Ancestral Journeys, I decided to get J. P. Mallory’s The Origins of the Irish. A bit on the academic side for some, but definitely a good dive into the literature. Mallory is well aware of the latest genetic research, so this is as up-to-date as it gets. It’s a good case study in how multidisciplinary prehistoric studies should be done.

As I’ve suggested earlier, prehistory looks to be a good deal more complex than we had previously thought, so expanding beyond single methodological perspectives is probably essential if we really care about truth.

In other news, a short piece in The New York Times refers to Salafis as ‘ultraconservative.’ I think this misleads most people about the nature of Salafism: it is a radical utopian system which recently arose out of Islam’s confrontation with Western derived modernity. It isn’t conserving anything. This aspect of Salafism explains why Saudi Arabia condones the bulldozing of Muhammed’s tomb and celebrates modern monumental architecture in Islam’s holy city.

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Culture, Open Thread 

I’ll be on the Kathleen Dunn Show on Monday, 2 to 3 PM Central Standard Time. You’ll be able to listen to the show after it airs on the website. It will be about my piece in Slate.

I recently read Ancestral Journeys: The Peopling of Europe from the First Venturers to the Vikings by Jean Manco. You can find more information at her website, but I would pretty much recommend this book to all my non-scientist readers. I’d recommend it to many of the scientists too, if you are rather weak on archaeology, because that’s where Manco’s knowledge is really impressive. It’s not a perfect book, and I don’t agree with all the details, but it’s a very detailed, dense, and fast read.

There was a question below in regards to the Fast Company profile of 23andMe and what they’re trying to do. A major ethical issue brought up is whether it is acceptable to type children and disclose possible disease risk later in life. As an extreme case, what if you find out that your child is going to develop a life-threatening disease by the time they’re 40? My own perspective as a parent is that I’d like to know, and I’d probably want to tell my child as soon as I think they can handle it. The reason is simple: you base your life decisions on various aspects of life expectancy. People put things off, or forgo consumption, all the time.

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Culture, Open Thread 

I am old enough to remember card catalogs. They did not make me happy. As a small child I noticed omissions and incorrect classifications so often that for long periods of time I would simply avoid the catalog, and methodically consume books from whole sections of the public library in line with my preferences through tedious manual browsing. I am also old enough to remember when the internet was still primitive in its data organization and storage capacity (i.e., pre-Google, pre-Wikipedia), and the library was the first, last, and best recourse for retrieving data. When Braveheart was released in 1995 I ran down to the local university library to see if I could find more about the protagonist’s biography than was present in Britannica. By chance there was a book available on the life and times of William Wallace, but it was checked out, and there were more than 10 holds ahead of me! This was not an uncommon occurrence in the age before the data-rich internet. The reality is that what I wanted to know about Wallace is probably found in the Wikipedia entry, but then there was no Wikipedia! These are just a few of the reasons that I have little patience for neo-Luddites such as Nicholas Carr. When I read Carr’s “old man” jeremiads I always wonder, “son, were you even around back in my day?”*

This line of thought crossed my mind as I was sitting in the audience at BAPG IX**, the ninth Bay Area Population Genomics meeting. Started by Dmitri Petrov at Stanford, it brings together research groups from Petrov’s institution, Berkeley, UCSF, and UC Davis which work at the intersection of population genetics and genomics (I noticed a non-trivial UC Santa Cruz contingent this time around, so I suspect it’s getting more popular). BAPG illustrates the fact that the internet changes the way we communicate and consume information, but in a synergistic, not antagonistic, fashion in relation to traditional person-to-person interactions. The core elements of the meetings would be recognizable to someone from 1990 (perhaps replace PowerPoint with transparencies?): presentations and posters. But these two informational centerpieces are embedded in a scaffold of richer transmission and dissemination modalities. I first heard about BAPG through an email notice. Many people no doubt became aware via Twitter and blogs (which may have triggered word of mouth or emails). Prominent researchers in population genomics such as Dmitri Petrov, Graham Coop, and Carlos Bustamante have robust and accessible internet presences, so you can hear about what their labs are doing from the horse’s mouth, so to speak. Not only can events like BAPG be organized rapidly with minimal overhead because of the ease of the spread of information, but the proceedings are often relayed on Twitter in real time.

And yet the fact that the BAPG meetings are “in the flesh” reflects two realities. The prosaic one is that this sort of meeting is probably only feasible in the San Francisco Bay region, due to the existence of the Berkeley-Stanford axis, as well as concentration of private sector genomics related firms.*** The deeper truth though is that even academics as comfortable with computation and information technology as population genomicists still thrive on interpersonal contact face to face. They are human. Despite all the worries about the ubiquity of smart phones, tablets, and notebooks (all on display at the meeting!), and their impact on social interaction, there was a copious amount of old fashioned free spirited conversation. WALL-E is not real. Yet.

The human element cannot be abolished, only modified. And unfortunately the scarcity of human interest is often the shortcoming, not the technology. I thought of this while taking in Kelley Harris’ fascinating presentation on the distribution of mutations across the genome. Using 1000 Genomes data Harris found that point mutations are not randomly distributed; they cluster. And they exhibit unexpected patterns in their proportions of transversions to transitions. As I said, these were very interesting results. But I wondered why it was Harris who had to discover this. As a Harvard and Berkeley educated mathematical biologist she obviously has some skills and aptitudes, but it isn’t as if there aren’t many competent thinkers across the Bay Area, working at places such as Google, who could have done the same in their spare time. The data are there. It reminds me of a conversation I had with a very prominent statistical geneticist whom I visited in Cambridge. He wondered why bright minds around the world weren’t excavating the same mathematical and computational gold freely available to his research group.

I hold that the problem here has to do with the humans, and not what technology is doing to the humans. As outlined in works such as The Lunar Men it seems that the diversion of the energies of those with leisure and inclination toward intellectual pursuits is subject to the whims of fashion and cultural Zeitgeist. The problem is not the technology. Technology is used by people, to serve their interests and preferences. If you have a problem with the preferences of the human race, take that up with the human race, don’t crucify technology for the sins of humanity.

* I am joking, as I know Carr is older than I am.

** For your amusement. The keynote speaker was rattling off a series of population genetic parameters. So the individual directly in front of me was furiously Googling, and for about 30 seconds he was browsing one of my blog entries! I was tempted to tap him on his shoulder and inform him that the author was sitting directly behind him, but I did not do so.

*** I am aware there is now a southern California equivalent.

(Republished from Discover/GNXP by permission of author or representative)

Unless you are under a rock you are aware of the controversy around the sound that the fox makes. But did you know about Stonehenge?

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Culture 

Update 2: No longer accepting comments on this post. Please stop submitting. Thanks.

Update: Due to the vociferous and emotive nature of many comments, I am not publishing over half submitted on this post. Just so you know your chances…

One thing that I have read repeatedly is that circumcision rates in the United States have fallen over the past generation. For non-Americans in the readership, yes, American males are customarily circumcised even if they are not from a religious or cultural tradition where this is the norm (i.e., they are not Muslim, Jewish, or East or West African). For Americans, yes, circumcision has nothing to do with Christianity (something that would be obvious if more Americans actually read the New Testament, instead of just quoting selective passages from it). But looking more closely at the data it seems that the decline in circumcision is predominantly a function of its collapse as a normative practice in the western states!

One might think that this is due to demographic changes in the West, as Hispanics have lower rates of circumcision than non-Hispanics (black or white). But while California had circumcision rates of 22% in 2009, Washington state’s was 15%. It seems that Medicaid coverage has a strong effect, but this can’t explain all of the variation. In the late 1970s the western states had the same circumcision rates as the northeastern states. Today northeastern states have circumcision rates two to three times higher than in the west. And it doesn’t map onto politics either. Extremely conservative (and western) Utah has circumcision rates of 42%. Blue Rhode Island has rates of 76%.

Finally, I want to observe here that the males who were born during the era of diverging circumcision rates are now entering sexual maturity en masse. This is going to shape the expectations of both sexes, and perhaps result in some surprises for those who relocate to the other coast as they transition to adulthood….

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Circumcision, Culture, Health 

Sorry, this was late. Was out of town, showing my daughter the sights of the world….

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Culture, Open Thread 

Update: Also see, Pimpin’ the ghetto:

Pimps who run what little economy exists in the ghetto. They control the humanities ghetto, have old boys patronage networks to fall back on, and have a great deal in a slummy part of town. In other words, folks who get tenure-track PhDs at research universities.

The American Historical Association is run by pimps for pimps — by professors at research universities, for professors at research universities. That their policy does not help the public or most PhD graduates of history programs is besides the point. They are an old boys network protecting themselves.

The AHA isn’t out to protect disaster tourists, or losers, or escapees. The AHA is by, for, and of pimps.

This isn’t to criticize pimps — if you actually love the ghetto, why not be successful in it? — but to say that not everything they do is in your best interests.

If you are in the AHA, here is your choice: You can like that, or you can get out.

To extend the analogy, do pimps facilitate good healthy sex for society, or do they encourage the spread of unpalatable contagion by perpetuating the ghetto and its conditions? You know where I stand….

End Update

In relation to the AHA’s bizarre embargo policy Patrick Wyman left a long comment which I think is worth promoting up. Observe that some of the same could be applied to the natural sciences (recall the Carl Sagan fiasco). So here you have it….

I’m a historian close to completing my dissertation, and this is pretty indicative of the general state of affairs in the discipline. Academic history, as a profession, has essentially abdicated any responsibility in educating the broader public about its research. There’s a reason most popular history is written by journalists and not professional historians: there’s absolutely zero positive incentive for historians to write for a broader audience, and in fact it’s actively discouraged by everyone from department chairs to AHA directors to academic publishers to advisors gleefully shooting down PhD students who had eventually hoped to do so.

This is despite the voracious appetite of the reading public for historical fare. People want to know about the past, and if historians aren’t interested in providing it, they’ll read whatever’s available without evaluating the source. The end result is that professional historians have made themselves irrelevant in the public sphere. What’s really shocking about the whole thing is that most historians treat this as a net positive: now, you see, they can really focus on their research without wasting time catering to people who can’t speak the specialized cant of the discipline.

Most dissertations don’t become books, and more importantly, most recipients of doctorates in history (anywhere from 85-95 percent) never even sniff the tenure-track job for which a tenure book is necessary. This statement is basically a paean to a time when the AHA could more effectively pretend (or if you want to be charitable, fool itself into thinking) that there’s really a future in academia for most of those who complete dissertations. It’s a cruel irony that the historians whom this policy hurts the most – everyone other than the students of the best-known historians at the top 5-10 institutions, who are massive favorites to get jobs anyway – would actually benefit professionally from the exposure that open dissertation access provides. If this policy becomes the norm, the vast majority of the research that’s conducted will never see the light of day.

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Culture 

Why the title? Read it for yourself: American Historical Association Statement on Policies Regarding the Embargoing of Completed History PhD Dissertations. Here’s the conclusion:

By endorsing a policy that allows embargos, the AHA seeks to balance two central though at times competing ideals in our profession–on the one hand, the full and timely dissemination of new historical knowledge; and, on the other, the unfettered ability of young historians to revise their dissertations and obtain a publishing contract from a press. We believe that the policy recommended here honors both of these ideals by withholding the dissertation from online public access, but only for a clearly stated, limited amount of time, and by encouraging other, more traditional forms of availability that would insure a hard copy of the dissertation remains accessible to scholars and all other interested parties.

I’m going to try hard not to go “Michael Eisen” on this: did the AHA just compare the dissemination of knowledge with careers? It strikes me that if you do scholarship of any sort the discovery and dissemination of knowledge is all, it is the summum bonum. All else is secondary and marginal. As it is the academic job market is brutally Darwinian in the most extreme sense, and more so for humanities scholars. Can you truly push this thread any further by open access requirements for dissertations? I doubt it. But let’s test this proposition.

Note: I am granting many of the premises of the argument in the statement for the purposes of this post. Even allowing for those premises, when a scholarly discipline goes too far down the careerist rabbit-hole, it is time for people to start thinking about becoming actuaries to put bread on the table.

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Culture 

My personal preference would have been that they somehow extend the plot line of The Chronicles of Riddick. As Brian Switek suggested this seems like Pitch Black, but with much better special effects. Though how many they have left after this trailer, I don’t know.

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Culture 

Non-Hispanic White vote for John McCain 2008 according to National Exit Polls
Red = 100% for McCain
Blue = 100% for Obama

As we come up to the day celebrating American independence from Britain there will be the standard revelries and reflections. Personally, I have no problem with that. A modicum of patriotism seems healthy in all, and if appropriately channeled a surfeit is often useful in the populace as a way to maintain civic engagement. That being said, I must admit that in the positive and descriptive sense I am far more ambivalent about the consequences and rationale for the rebellion than I was as a child. I don’t accept that the American revolution was indisputably about Virginia gentry who wished to avoid financial ruin, New England fundamentalists yearning for oppression of Quebecois Catholics, or upcountry Scots-Irish chafing at the bit to explode into the western hinterlands, heretofore restrained by the Empire. But I believe that this narrative is as true as the story I was told as a child about an unjust and oppressive British monarchy battling against the cause of freedom and liberty. When Patrick Henry declared ‘Give me liberty, or give me death!’, it was not a universal declaration. It was implicitly a call to arms for the rights of white male property holders in the context of colonial Virginia. This is not a palatable message for elementary school age children, so such subtle but true details are neglected in the standard narrative.

But the point of this post is not to re-litigate the American revolution. Rather, looking at the comments below I think it is time to reemphasize that American history needs to be thought of in plural terms. There was no one American revolution, but American revolutions. Without acknowledging this reality a plausible representation of the past cannot be constructed. Our comprehension is limited by the tendency to project a relatively homogeneous and unitary contemporary cultural and political union back two centuries. But to understand the disparate revolutions one must understand the disparate Americas.

In 2013 when we talk about “many Americas” we often conceive of it in coarse racial or regional terms. There is a “black America” and a “white America.” There is the South and the North. With the emphasis on racial identity politics, and to a lesser extent class, in elite discourse, the deeper strands of historical difference rooted in the foundations of the original American colonies have been hidden from us. These older filaments of identity are outlined in historical works such as David Hackett Fischer’s Albion’s Seed: Four British Folkways in America and Kevin Phillips’ The Cousins’ Wars: Religion, Politics, Civil Warfare, And The Triumph Of Anglo-America. A true typology of socio-cultural difference is essential to understanding how and why the past unfolded as it did, and it is also illuminating in relation to patterns of the present.

For example, Colin Woodard’s American Nations: A History of the Eleven Rival Regional Cultures of North America is a contemporary updating of the standard geographic typology. The map I generated above from exit poll data outlines broadly a major consequence of the past and present fissures of the American nationality: white Americans tend to vote very differently. In the Deep South, to a good approximation, to be white is to be a Republican, and to vote for Republicans. In contrast, in Greater New England there is a slight tilt toward the Democratic party among white voters. When you aggregate white voters nationally there is a tendency for them to lean toward the Republican party, but this masks deep regionalism. In Vermont 31% of whites voted for John McCain in 2008. In Alabama that figure was 88%.

And so it has always been. In the 1856 election the Republicans contested for the presidency, and as you can see on the map to the left only the Yankee regions supported their candidate. The waxing and waning of the political power of the various American parties over time has to a large extent been a function of shifting alliances between distinct “sections” of the American nation. In the period before the Civil War Greater New England was isolated by an alliance between the South and portions of the Lower North bound together by culture and economics. Illinois, Indiana, and Ohio might have notionally been Midwestern Northern states, but they were divided between “Yankee” and “Butternut” (from the Upper South) cultural zones. It was in the Butternut regions of these border states that much of the anti-war sentiment in the North was localized during the Civil War. In contrast, New York City may not have been settled from the South, but its cosmopolitan mercantile elite had long had a tense relationship with the New Englanders who had begun to dominate much of upstate New York and had pushed into Long Island as well as elements of Manhattan society. On top of that the port of New York had a relatively close economic relationship with the South.

In other words, to understand the true texture of regional alliances and dynamics one must be cognizant of both deep historical contingencies rooted in cultural affinity, and, the exigencies of contemporary economic needs. It is difficult for me to believe that New England’s ultimately successful challenge of Southern political hegemony leading up to 1860 was not bound up in its economic dynamism, which began to tear apart the north-south connections which tied states such as Pennsylvania with the Upper South, and replaced them with east-west lines of transport and communication via rail, canal, and telegraphy. Similarly, the rise of the “Sunbelt” in the 20th century was contingent upon technological and medical revolutions which closed the quality of life chasm between North and South.

All this is not to deny a common American sense of nationhood which has evolved since the tenuous links of the days of the Articles of Confederation. But regionalism, which has both a physical and temporal aspect, is neglected at one’s peril in terms of understanding the political and social patterns of the American republic. There are two ways in which regionalism was often transcended. One was via class, as populists attempted to overcome ethnic and regional divisions against robber barons and bourbons alike. But another was race. The 1830s saw the rise of a Democratic hegemony in national politics, based in the South and its Butternut Diaspora, but with northern auxiliaries of immigrant white ethnics in large cities (German Catholics and the Irish) and the non-Yankee zones of settlement in Pennsylvania and New York. The Democratic party in this period was simultaneously both populist and racialist, expanding voting rights to all white males, but in some cases explicitly barring blacks in Northern states from the right to vote (as opposed to the implicit bar via property qualifications). The modern American cultural consensus which speaks of a white America and black America is in some ways a morally inverted resurrection of this concept, where whites are viewed as a homogeneous whole to a rough and ready approximation.

Credit: Matthew Hutchins

The problem with this view is that it is wrong in both a descriptive and a moral sense. It is wrong descriptively because while black Americans have a dominant coherent national culture with ultimate roots in the South (though there have long been Northern black communities, these populations have been reshaped by the Great Migration out of the South), whites do not. To put it plainly, a privileged White Anglo-Saxon Protestant born into an upper middle class family in the northern shore suburbs of Boston is fundamentally different from a White Anglo-Saxon Protestant born into a working class family in rural West Virginia. And it is unjust because assuming a uniformity and interchangeability of all white Americans neglects the reality that the privileges accrued to the former are not accrued to the latter. In the end what is true of whites is also true of non-whites. It seems blind to assume that a demographically expansive “Hispanic” population will remain as politically and socially homogeneous as black Americans, given its original regional and cultural diversity (e.g., Texas Hispanics and California Latinos have long had distinct subcultures).

Of course don’t tell this to the standard press and pundit class, who remain wedded to cartoonish cultural and historical algebras.

(Republished from Discover/GNXP by permission of author or representative)
• Category: History, Science • Tags: Culture, Regionalism 

Independence Day is coming up. Very excited to celebrate with my daughter. She may be old enough not to be frightened by the noise. On the other hand I came to the conclusion a few years back that the world might not be a worse off place if the American colonies had remained colonies for a while longer (to be honest my thoughts were triggered over a decade ago when I watched The Patriot and reflected on its misrepresentation of the British armed forces for dramatic effect).

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Culture, Open Thread 

The other day my office mate wondered “what ever happened to Lady Gaga?” Obviously Lady Gaga is still around and making plenty of money, but it doesn’t seem like she’s the pop culture phenomenon she once was. Of course you can live for decades off your early notorious culture-changing explosion onto the scene. Madonna is proof of that. But it’s still of interest to know when someone is, or isn’t, the “It” thing. I don’t follow pop culture that closely, but I can say that I remember Gaga before she was Gaga. I was hanging out on the Lower East Side in January of 2006, and there were Gaga posters announcing her opening a show somewhere nearby. One of my friends was freaking out, because she was apparently a big deal. A few years later she did become just that. But somewhere along the way it seems that Lady Gaga has gone from the foreground to the background.

As is my wont I wanted to quantify this. I pulled Google Trends data for Lady Gaga, Katy Perry, Taylor Swift, and Adele from July 2008 to January 2013. I then plotted them with a loess smoothing function. You can see the results below. Nothing that surprising (though if you limit the search results to the United States, Taylor Swift becomes a much bigger deal):
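For the curious, the smoothing step can be approximated in a few lines. This is a hypothetical stand-in for the loess call I used (`loess_smooth`, the tricube weighting, and the synthetic series are my own sketch of local weighted linear regression, not the actual Google Trends data):

```python
import numpy as np

def loess_smooth(x, y, frac=0.3):
    """Loess-style smoother: for each point, fit a line to the nearest
    frac of the data, weighted by the tricube kernel, and evaluate it there."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    k = max(3, int(np.ceil(frac * n)))  # neighbourhood size
    fitted = np.empty(n)
    for i in range(n):
        dist = np.abs(x - x[i])
        idx = np.argsort(dist)[:k]                          # k nearest points
        w = (1 - (dist[idx] / dist[idx].max()) ** 3) ** 3   # tricube weights
        sw = np.sqrt(w)
        A = np.column_stack([np.ones(k), x[idx]])           # local linear model
        beta, *_ = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)
        fitted[i] = beta[0] + beta[1] * x[i]
    return fitted

# E.g., smooth ~4.5 years of noisy monthly "search interest" values:
months = np.arange(55)
interest = 50 + 30 * np.sin(months / 9) + np.random.default_rng(0).normal(0, 5, 55)
trend = loess_smooth(months, interest, frac=0.3)
```

One weighted least-squares fit per observation is quadratic-ish in cost, but that is irrelevant for the few dozen monthly points a Google Trends series contains.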

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Culture 

He liked a good brew!

Kevin Zelnio recently made me aware of this fascinating piece in The New York Times, For Its Latest Beer, a Craft Brewer Chooses an Unlikely Pairing: Archaeology. Here’s the catchiest aspect: a microbrewery is attempting to recreate the taste of ancient Sumerian beer! Why? Though it’s purportedly educational, obviously it’s also the “cool” factor which is at the root of this enterprise. The brewery doesn’t aim to sell this. I say why not!

A few years ago Paul Bloom wrote the book How Pleasure Works: The New Science of Why We Like What We Like. This may seem like a trivial exploration of a topic; after all, who doesn’t know how pleasure works? But when you plumb the depths of genuine hedonism there are often rapidly diminishing marginal returns when you simply apply a robotic calculus of more sensory vividness. Rather than a stronger chocolate, sometimes you want a finer chocolate. But what does that even mean? One thing that a standard hedonistic account of pleasure often underplays is that it is not a simple totting up of sensory qualities. Rather, it is the essence of the thing that matters.

Lover of fine things!


An example will suffice to illustrate what I’m talking about. There is a bizarre story in the media right now about Vladimir Putin being involved in stealing a Patriots Super Bowl ring. I haven’t followed the story closely. But I can tell you that the reason this is a story is not the physical value of the ring. It is because it is a Super Bowl ring. Rationally as human beings we understand that things are reducible to quarks and leptons, but hundreds of millions of years of evolution have hard-wired us with a sort of essentialism which tells us deep in our bones that there is a fundamental ineffable ontology to particular objects in the world. The nature of these objects is tied up not just in what they are in a proximate sense, but in where they have been.

How does this apply to Sumerian beer? I believe one of the appeals of the Paleo diet is that it is purportedly the diet of our ancestors, and that has an innate appeal, because it feels deeply authentic.* This has pleasurable consequences. Similarly, the idea of drinking like the Sumerians has genuine value in and of itself in terms of hedonism, no matter the quality of the beer. Because of the downsides of modern processed foods one might argue that a fad for retro ancient food rooted in irrational instincts may actually be greatly beneficial to our society.

Delenda est processed food!

Of course the qualifier here is that we’d eat like a prosperous Roman, not the marginal peasant subsisting on gruel. But though there is an aspect of contemporary culinary arts which tends toward futuristic sophistication, such as molecular gastronomy, there is also a strain which leans upon simple and spare preparations. It may benefit American public health and our gustatory experience if an industry arose which marketed itself not as “health food,” but as “authentic food.” Eat a hearty Roman meal worthy of Cato the Elder, and wash it down with a beer which would have brought a smile to stern Hammurabi’s face! Silly, but is it sillier than a Twinkie?

* I do not wish to get into discussing the Paleo diet, but I do think that empirically it is beneficial for many people because it gets them away from loading up on processed sugar rich foods.

(Republished from Discover/GNXP by permission of author or representative)
• Category: Science • Tags: Beer, Culture, Food, Sumerians 