A few separate pieces that I read today came together thematically for me in an odd confluence. First, an article in The Straits Times, “Prof, no one is reading you”, repeats the shocking statistics about the nature of modern academic intellectual production that you may already be aware of. Here’s the important data:
Even debates among scholars do not seem to function properly. Up to 1.5 million peer-reviewed articles are published annually. However, many are ignored even within scientific communities – 82 per cent of articles published in humanities are not even cited once. No one ever refers to 32 per cent of the peer-reviewed articles in the social and 27 per cent in the natural sciences.
If a paper is cited, this does not imply it has actually been read. According to one estimate, only 20 per cent of papers cited have actually been read. We estimate that an average paper in a peer-reviewed journal is read completely by no more than 10 people. Hence, impacts of most peer-reviewed publications even within the scientific community are minuscule.
What ever happened to the “republic of letters”? Are humanists reading, but not citing, each other? Or is it that humanistic production has basically become a matter of adding a line to one’s c.v.? So the scholar writes the monograph which is read by their editor, and then published to collect dust somewhere in the back recesses of an academic library.
Second, an article in The New York Times, Philosophy Returns to the Real World, declares the bright new world in the wake of the Dark Ages of post-modernism. Through a personal intellectual biography the piece charts the turn away from the hyper-solipsistic tendencies in philosophy exemplified by Stanley Fish in the 1980s, down to the modern post-post-modern age. Operationally I believe that Fish is a human who reconstitutes the characteristics of the tyrannical pig Napoleon in Animal Farm. Despite all the grand talk about subjectivism and a skepticism about reality which would make Pyrrho blanch, Fish did very well for himself personally in terms of power, status, and fame by promoting his de facto nihilism. Money and fame are not social constructs for him, they are concrete realities. Like a eunuch in the Forbidden City ignoring the exigencies of the outside world, all Fish and his fellow travelers truly care about are clever turns of phrase, verbal gymnastics, and social influence and power. As the walls of the city collapse all around them they sit atop their golden thrones, declaring that they are the Emperors of the World, but like Jean-Bédel Bokassa are clearly only addled fools to all the world outside of the circle of their sycophants. After all, in their world if they say it is, is it not so? Their empire is but one of naked illusions.
Finally, via Rod Dreher, a profile of David Brooks in The Guardian. He has a new book out, The Road To Character. I doubt I’ll read it, because from what I can tell and have seen in the domain of personal self-cultivation of the contemplative sort our species basically hit upon some innovations in the centuries around 500 B.C., and has been repackaging those insights through progressively more exotic marketing ploys ever since. Xunzi and Marcus Aurelius have said what needs to be said. No more needed for me.
But this section jumped out:
“I started out as a writer, fresh out of college, thinking that if I could make my living at it – write for an airline magazine – I’d be happy,” says Brooks over coffee in downtown Washington, DC; at 53, he is ageing into the amiably fogeyish appearance he has cultivated since his youth. “I’ve far exceeded my expectations. But then you learn the elemental truth that every college student should know: career success doesn’t make you happy.” In midlife, it struck him that he’d spent too much time cultivating what he calls “the résumé virtues” – racking up impressive accomplishments – and too little on “the eulogy virtues”, the character strengths for which we’d like to be remembered. Brooks builds a convincing case that this isn’t just his personal problem but a societal one: that our market-driven meritocracy, even when functioning at its fairest, rewards outer success while discouraging the development of the soul. Though this is inevitably a conservative argument – we have lost a “moral vocabulary” we once possessed, he says – many of the exemplary figures around whom Brooks builds the book were leftists: labour activists, civil rights leaders, anti-poverty campaigners. (St Augustine and George Eliot feature prominently, too.) What unites them, in his telling, is the inner confrontation they had to endure, setting aside whatever plans they had for life when it became clear that life had other plans for them.
Many of the ancients argued for the importance of inner reflection and mindful introspection. Arguably, the strand of Indian philosophical thought represented by the Bhagavad Gita was swallowed up by this cognitive involution, as one folds in upon one’s own mind.
But let me tell a different story, one of the outer world, though not one of social engagement but of sensory experience of the material domain in an analytic sense. Science. A friend of mine happens to be the first to use next-generation sequencing technologies to study a particularly charismatic mammal. I reflected to her recently that she was the first person in the history of the world to gaze upon this particular sequence, to analyze it, to reflect upon the natural historical insights that were yielded up for her by the intersection of biology and computation. It is highly unlikely that my friend will ever become a person of such eminence, such prominence, as David Brooks or Stanley Fish. Feted by her fellow man. But my friend will know truth in a manner innocent of aspirational esteem, totally alien to the meritocratic professionals David Brooks references. On the day that you expire, would you rather be remembered for a law review article, or discovering something real, shedding light on some deep truth (as opposed to “truth”)?
This perhaps offers up a possibility for why humanists don’t cite each other. Too many have been poisoned by the nihilism of the likes of Stanley Fish. They do not see any purpose in the scholarship of their peers, because humanistic scholarship of the solipsistic sort is primarily an interior monologue with oneself. The experiments of English professors always support their hypotheses. Their struggle is to feed their egos, they wrestle with themselves, Jacob’s own angel as a distillation of their self-essence. The limits of their minds are the limits of their world.
Finally, this filament threaded through, of a reality out there, the possibility of being made aware of it, even through the mirror darkly, is why I continue to do what I do, and aspire to what I aspire to. The truth is out there. It does not give consideration to our preferences. But it is, and we can grasp it in our comprehension. Over the past ten years in the domain of my personal interest, and now professional focus, genomics, we’ve seen a sea change. That which we did not even imagine has become naked to us. Before the next ten years is out who knows what else we’ll discover?

Aren’t you uncomfortable at these claims of “truth”.
The history of science has been unkind to previous iterations of truths (ask, and I’ll provide a list).
Isn’t science essentially the scientific method, which provides mechanisms for counteracting the universal weakness for belief in the plausible, or at the very least, interim technologies?
Contrary to the title of the piece, I think we’ve found alarmingly often that the so-called truth does actually forsake you.
Was your friend the first to reflect on certain “natural historical insights” or was it rather “natural historical hypotheses”?
Even the notion that our “comprehension” is a litmus test for “truth” seems archaic considering the kinds of tasks evolution used to hone what we call “comprehension”.
It’s fine to bring up solipsism but even the frailty of causality might do with a bit of genuflection ever since David Hume.
This isn’t a big criticism, just wondering if a bit more stylistic humility may not better withstand the tests of time.
Are we really peddling truth, or just the best temporary explanation?
fly on a plane. you won’t be uncomfortable.
Isn’t science essentially the scientific method, which provides mechanisms for counteracting the universal weakness for belief in the plausible, or at the very least, interim technologies?
i’m coming to the conclusion that unfortunately the method is less important than what the method is applied to. many social scientists supposedly hew to the method, but they have not been as successful as physicists
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0010068
Are we really peddling truth, or just the best temporary explanation?
newtonian mechanics is a temporary explanation. if that’s all you want to label it as, that’s your prerogative. i don’t think the human race should be stylistically humble about newton and his heirs’ achievements, even if superseded in time by a more precise and accurate model.
The history of science has been unkind to previous iterations of truths (ask, and I’ll provide a list).
also, can we dispense with this sort of sophistry? this stylized assertion totally sidesteps the fact that unlike many other intellectual disciplines science has actually been both contingent and progressive. fashion influences it, but over the long term (or at least on a lifetime scale) it can not bend it in an arbitrary manner.
I was a historian years and years ago before I embarked on other careers. One of the distressing things to me about academic research in the humanities and social sciences was the ever irrelevant descent into the exotic and the minute in the constant quest for “original” work. Frankly, much of academic writing in my field and in most other social scientific endeavors was esoteric and uninteresting. Big, important themes were discouraged by advisors mainly because they were difficult to do in an original, “new” manner. To tackle such ideas was seen as un-rigorous, egotistical, and even grandiloquent. So we got research like “a feminist perspective on water resource management in a small South American village” (translation: how did old women divide well water where no one speaks English) and the ilk. Who would cite that except for a handful of people (if even that) who might be interested in something similar?
And yet, there was also a great deal of conservatism that discouraged “out-of-box” thinking. In my days as a Ph.D. student, I tried to introduce elements of wargaming into the methodology of my thesis. By this, I don’t mean games in the commercial market, but serious wargaming used at the Carlisle Barracks/Army War College. I exchanged much correspondence with and visited people at US TRADOC, Army wargaming specialists, pioneers in the field, etc. I thought I was being quite innovative and original, coming up with ways to increase the predictive value of certain theories.*
Nope. I was told not to “waste” my time with games. I ended up incorporating econometrics, which, frankly, glazed over my eyes the whole time I worked on it.
*It’s interesting that defense contractors took all that work very seriously much later in my life. Apparently it wasn’t rigorous enough for the academic world, but it was rigorous enough for the real world of men in combat.
Mr. Khan, there is certainly *enormous* value to the kind of work you do and others of the “hard” science persuasion. And I certainly agree that there is great danger in falling into “the solipsistic sort” of a search that is “primarily an interior monologue with oneself.” And yet, don’t you think we need both? A search into the truth that is “out there,” but also continuing inquiry into who we are? I don’t think Xunzi or Marcus Aurelius is the end of the inquiry into virtue or philosophy in general. Indeed, as we discover more about what’s out there, I would think we would need to continue to assess where that places us and who we are and how we are to adjust to the new worlds we have discovered.
I suppose this has much to do with one’s perspective about human beings and the communities they create. If you saw people as merely a very advanced and evolved form of amoeba, it would be very limiting indeed to focus on such inner deliberations (which are, I suppose, merely incidental to the biological drive for survival and replication under such a utilitarian perspective). But those of us with a more theistic bent who see the divine spark in our souls tend to think that our minds and bodies are a kind of a nexus between a world to be colonized on the outside and a limitless, creative urge and drive that God has given us that, to this day, defies comprehensive understanding. In other words, does science not require wisdom?
One of the distressing things to me about academic research in the humanities and social sciences was the ever irrelevant descent into the exotic and the minute in the constant quest for “original” work.
the tsunami of lines afflicts CVs in natural sciences now too. probably not as bad.
I don’t think Xunzi or Marcus Aurelius is the end of the inquiry into virtue or philosophy in general.
suggestions?
In other words, does science not require wisdom?
if science is modeling the world out there, probably not. automatons could do it. but science is obviously intersected with other things of humane concern. that’s what the humanities were presumably for, though that’s fallen by the wayside among the elites in the west.
Oh, my goodness. Where to start? But given that you appear to be an extremely well-read scientist, not just in your field, but in humanities as well, I am a bit at a loss.
How about this? Have you read Étienne Gilson? Especially his influential “The Spirit of Medieval Philosophy”? And as follow-ups, some of Thomas Merton’s writings? (And I recommend Merton despite my conservative/orthodox Catholicism in which circles Merton is held at, er, less than stellar repute).
A relatively modern writer-social scientist whose writing is deceptively simple but profound (at least to me) is Robert Nisbet and his “The Quest for Community.”
thanks. i’ll put it into my stack.
Razib, your writing is something I look forward to every day.
With so many scientific articles being published, reading every article you might find informative is every bit as difficult as listening to every pop song you might enjoy. With popular music, we know that it isn’t just intrinsic artistic value that determines which songs become hits and which musicians become superstars. Although talent certainly does play a role, there is also a very large element of randomness.
I imagine something similar must be going on with scientific publications. It isn’t enough to be good; you also have to be lucky. Once you catch a break, the Matthew effect (“them that has, gets”) might take over for you, but you’ve got to get noticed first. With both music and science, the things that can get you noticed are not always the things that are most conducive to musical or scientific excellence.
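A minimal sketch of that cumulative-advantage dynamic (my own toy model, not anything from the post or its links; every number is invented for illustration): papers of identical quality end up with wildly unequal citation counts when each new citation simply favors papers that are already cited.

```python
import random

def simulate_citations(n_papers=1000, n_citations=5000, seed=42):
    """Toy Matthew-effect model: each new citation picks a paper with
    probability proportional to (its citations so far + 1)."""
    random.seed(seed)
    counts = [0] * n_papers
    papers = range(n_papers)
    for _ in range(n_citations):
        weights = [c + 1 for c in counts]  # early luck compounds over time
        winner = random.choices(papers, weights=weights, k=1)[0]
        counts[winner] += 1
    return sorted(counts, reverse=True)

counts = simulate_citations()
top_decile_share = sum(counts[:100]) / sum(counts)
never_cited = sum(1 for c in counts if c == 0)
print(f"top 10% of papers take {top_decile_share:.0%} of citations; "
      f"{never_cited} of 1000 papers are never cited")
```

Even with every paper starting out identical, the skew and the never-cited tail emerge from the feedback loop alone.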
Interesting to note that the natural sciences do not come off scot-free here, though the linked article does not make clear what exactly is meant by “read”.
Many papers in the sciences can be valuable (and cited) just for the procedures or protocols described, and extracting these nuggets of information doesn’t usually require a cover-to-cover reading.
“Or is it that humanistic production has basically become a matter of adding a line to one’s c.v.? ”
Bingo. Publication value has been quantified and for most PhDs, the more publications the better. There are jigs and gimmicks showing how to multiply one valid publication into 3-5 publications. Lots of logrolling: let’s get money to start a bulletin so we can publish one another.
There are also ways of quantifying papers by number of times cited, and to a degree this separates the productive scholars from the timeservers. But a high proportion of scholars trying to get tenure or promotions are not often cited.
There’s also extreme timidity about topics and methodologies, especially when there’s relevance to political issues. Most people here will think of this as enforced liberalism / leftism, and in many areas it is, but in some disciplines (international relations, economics) the tendency enforced is not left at all, and the leftism enforced anywhere is very narrow and skewed toward gender and identity studies.
Or is it that humanistic production has basically become a matter of adding a line to one’s c.v.?
Thoughts on academic publishing (my tuppence):
1) It’s not just the humanities, it is pervasive throughout academia.
2) It is a matter of incentives or, equivalently, Goodhart’s law. Once upon a time, in that glorious golden age, scientists pursued knowledge for its own sake, and published their results both to increase their glory and to increase the general body of knowledge. Publishing, formal communication, was the final step in adding value. As the number of “scientists” grew and “scientist” became first a career and then an occupation, it became necessary to measure output. And so on. Before too long, the goal became not to work for the pursuit or advance of knowledge, but to maximize publications because status and income depended on them.* We know this. Introduce money and status into the mix, and things begin to go very haywire.
3) Again, the problem is not just social sciences or humanities (see John Ioannidis’s work) where the costs are likely (though not guaranteed) to be relatively small, but also in biology. And not just in academia. Much of the time when pharmaceutical companies sponsor research, the goal is not to find a new drug that will treat some condition, but to demonstrate that a new drug treats some condition and to do so in a way that both satisfies the FDA and the medical establishment (or prospective patients/customers). Because money.
4) Of the social sciences, it is likely most serious in economics, since “results” here are often used to support (or oppose) some policy proposal that can impose substantial costs (or generate large benefits) for someone or some group. Cui bono? or, once again, because money.
*I had a friend in graduate school 30 years ago who joked that his goal was not to maximize the number of publications per idea, but (what in other contexts should lead to the equivalent result) to minimize the number of ideas per publication. Unfortunately, he said, he had found that there is a lower bound to the number of ideas per publication (either 0 or 1, take your pick).
I’m a law professor at a moderately prestigious institution. Here’s a story for you. A colleague of mine was being pestered by student editors to improve an article of his that had been accepted for publication. He related that he laughed them off and assured them that the article was fine as is. And as he explained, “Let’s face it, our colleagues can count, but they can’t read.” I.e., count lines on the resume denoting number of published articles. He was not being quite fair to his colleagues. They can read the name of the journal that published the article.
“she was the first person in the history of the world to gaze upon this particular sequence”
My mantra is that this is true for every minute of my life. Even more poignant, unlike the gene sequence, I will also be the last. Helps curtail time wasting on the internet (somewhat).
There is a lot of crap published. Who could disagree? OTOH, very few people reading papers indicates hyper-specialization, and specialization has its benefits as well as its costs. If the goal of humanities is a well-rounded person, then the costs are doubtless greater in humanities. But I can see the point of some historical researcher who just wants to find something in some documentary archive that no one discussed before, even if there are at most 2 other people in the world who could get excited about the same thing. Is that different (or worse) than an expert on some obscure species of termite?
You seem to be making a philosophical point, and I am not exactly sure what it is. Academic philosophy has its faults, but post-modernism is not one of them. Very few English-speaking universities have ever had more than a handful of self-avowed post modernists in their philosophy departments. No doubt philosophers have engaged in many arid controversies that have no impact on the real world, but post-modernism is a sin of comparative literature.
You say, “Money and fame are not social constructs for [Fish], they are concrete realities.” I don’t get this. If money and fame aren’t “social constructs”, what are? They obviously have reality, but they don’t have reality independent of social convention. When someone claims that something is a social construct, that’s all they are saying. The World Cup and the Chief Justiceship of the US Supreme Court are uncontroversially social constructs, but that isn’t going to stop people from lusting after them.
The instrumentalist claim about science (which seems to be the target of your ire) is that scientific truth is a social construct, like money or law. It’s useful, there are rules about what counts as valid moves within the game, and not everyone is capable of making progress within those rules. Instrumentalists can agree that science only “works” because of something about the way the world is (same with money and law). But instrumentalists would claim that if we have a model that helps us organize our perceptions and predict what will happen, then we add no further praise when we say the model represents the way the world really would be if no scientific enterprise had ever developed.
I am not particularly sympathetic to instrumentalism, but it is no threat to science. Science is still superior to law (for example) in terms of the extent to which it has mechanisms to counteract psychological and social bias. To a greater extent, what matters is what you know, not who you know — although this is obviously an idealization of how science works. The more specialist science gets, the more society needs to rely on a pre-rational respect for the authority of scientists.
The dominance of logos over ethos is the glory of natural science, in comparison with more fashion-dominated fields, but it is orthogonal to the ontological issues of instrumentalism vs. realism. Math is even better on this score, but most people would not say it describes an external world of mathematical objects. Platonists do believe this, but I don’t think you are one.
Are you implying the method is not necessary for the conduct of science?
I’m not sure the cited study proves anything. It compares the results of studies in soft and hard sciences, and concludes that soft science studies are more likely to confirm the study hypothesis. One possible explanation (among several) may be that publishers of hard science are more willing to publish studies that “fail”. This would bias the data selected for inclusion in the meta analysis.
We now know that Newton’s laws don’t work for the very large, the very small and the very fast. This is typical of how science actually works. Ultimately, it doesn’t prove anything, it only disproves. It’s the reason why I balk at your use of the word “truth”.
https://www.fourmilab.ch/fourmilog/archives/2014-08/001530.html
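On the publication-bias point, here is a hedged back-of-the-envelope sketch of how far that selection effect alone could go (the probabilities below are invented for illustration, not taken from the PLoS ONE paper): hold the underlying science fixed and vary only how willing journals are to print null results.

```python
def published_positive_rate(true_positive_rate, p_publish_positive, p_publish_null):
    """Fraction of *published* studies that confirm their hypothesis,
    given how often positive vs. null results make it into print."""
    pos = true_positive_rate * p_publish_positive
    neg = (1 - true_positive_rate) * p_publish_null
    return pos / (pos + neg)

# Identical underlying science (half of all studies "work"), different editorial filters:
print(published_positive_rate(0.5, p_publish_positive=0.9, p_publish_null=0.6))  # ~0.60
print(published_positive_rate(0.5, p_publish_positive=0.9, p_publish_null=0.2))  # ~0.82
```

Under those made-up numbers the second literature looks far more confirmatory even though the labs are doing exactly the same work, which is the confound I am pointing at.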
What are the other intellectual disciplines that you are comparing to science?
We now know that Newton’s laws don’t work for the very large, the very small and the very fast.
i don’t have time right now to reply to everything. but why exactly are you just restating what you yourself quoted me saying? unless you’re a moron you are well aware that newton’s laws break down in certain conditions. obviously you think this is a big deal, which is fine. that’s your prerogative. but don’t talk again like it’s something that everyone doesn’t know already and you are telling us something amazingly insightful (everyone = readers of this weblog, not the general public).
and yes, i’m aware of hume. ‘david hume’ is a pseudonym i use on another blog i contribute to. you don’t need to belabor the point.
this is starting to feel like someone who wants to discuss qualia when i assert that human feces tastes like crap 😉
There’s a slow sea change occurring in the “digital humanities.” Franco Moretti’s Distant Reading (as well as his Stanford Lit Lab pamphlets) is an attempt to put the study of literary history on empirical, quant.-driven ground. Matthew Jockers’ Macroanalysis tries to do the same. It’s an exciting sub-field to be actively following and working in because it has actually inspired ongoing projects and unresolved questions. One recent topic, for example, has been how to mathematically model the emotional trajectory of novel plots.
Of course, plenty of humanists see no value in this work, which is why the most interesting DH discussions are occurring in blogspace instead of in journals.
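For a sense of what “mathematically model the emotional trajectory of novel plots” can mean in practice, here is a minimal sketch in the spirit of that work (my own illustration, not Moretti’s or Jockers’ actual code; the tiny valence lexicon, the window size, and the toy plot are all invented): score each sentence against a word list, then smooth the scores over narrative time.

```python
import re

# Invented toy lexicon; real projects use much larger sentiment dictionaries.
VALENCE = {"love": 1.0, "joy": 1.0, "hope": 0.5, "grief": -1.0, "death": -1.0, "fear": -0.5}

def sentence_scores(text):
    """Crude per-sentence emotional valence from a word-list lookup."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [sum(VALENCE.get(w, 0.0) for w in re.findall(r"[a-z']+", s.lower()))
            for s in sentences]

def smooth(scores, window=2):
    """Moving average over narrative time: the 'plot arc'."""
    out = []
    for i in range(len(scores)):
        lo, hi = max(0, i - window), min(len(scores), i + window + 1)
        out.append(sum(scores[lo:hi]) / (hi - lo))
    return out

toy_plot = ("They met and fell in love. Hope filled the town. "
            "Then grief and death arrived. Fear ruled the streets. "
            "In the end joy returned.")
print(smooth(sentence_scores(toy_plot)))
```

The interesting arguments in the field are about what happens after this step: which smoothing to trust, and whether the resulting arcs cluster into a small number of canonical plot shapes.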
There’s a motte and bailey dynamic at play here. Certainly, some humanists (like Bruno Latour) use “social construct” to mean something like, in your words, “models that help us organize our perceptions and predict what will happen.” This is uncontroversial and most scientists, I imagine, would agree enough with the sentiment not to bother arguing against it even if they think (rightly, IMO) that the “models” are hard won and deserve a little more descriptive respect.
But then there’s a definition of “social construct” that inches closer to your “something that has no reality independent of social convention.” That’s a big claim. I don’t think your examples—“money” especially—have no reality independent of social convention. They have no reality independent of human action but that’s not at all the same thing, because human action is always connected to and influenced by implacable materialities.
The point is, “social construct” is itself a slippery construct. But there is a difference between using it uncontroversially to denote that humans can only ever make sense of things in human terms versus using it as a proxy for “your models are only in your head, you silly empiricists!”
http://secularright.org/SR/wordpress/i-am-who-i-am/
As a challenge, in 5 years might you reapply to the NYT as Mr. Juan Hume?
Or, is that name too tacoish and brown and Marco Rubish like?
Perhaps as Mr. Jamal Hume?
Is that name perhaps too chocolaty and black and more for The Root? But with such a name, the NYT editors could see you have repented and add you to their token list.
The penultimate selection might be Gideon Humeberg to appeal to the ultra right Jewish readers of the NYT such as Sheldon Adelson.
How about giving your readers a chance to suggest a newer and more debonair Razib nom de plume?
Some of us would also like to see you harpoon a NYT writer such as Brooks at least once a month.
Can’t speak to other fields, but this problem has seemed to me conspicuously bad in academic philosophy since I started paying attention seven years ago. I came across a glaring omission on Stanford’s online philosophy encyclopedia the other day, which I intend to spring on them the next time something significant gets added to the page. And just today, I see that Colin Wilson apparently had no idea who Paul Tillich was when he wrote The Outsider in 1956. Of course, who can blame one for not knowing everything, and I’ve long thought Nietzsche was only novel to the extent his ignorance allowed his mind the room to roam. The basic truth that specialization in a field like philosophy comes to screen out is that there is nothing new under the sun. Even when Wittgenstein said the deepest truths can only be spoken as jokes he was reformulating a line I read by G.K. Chesterton somewhere, and the fact that I can’t remember where I read it sums up the problem.
I think it’d be more accurate to say that modern science doesn’t seek to explain anything or prove and disprove things, but rather seeks to describe, specifically mathematically describe and model, things and make reliable predictions of experiments:
https://www.fourmilab.ch/fourmilog/archives/2014-08/001530.html
You are touching on a very important issue, and that is honour. Totally forgotten in the sadly decadent country I live in. We fucked up big time. Too much money, so now everybody is a victim and proud of it. Blame somebody else seems to be the motto.
I’ve tried to do some humanities (history) at the local “University” but I only ended up scaring them and was told to write “so your mother could read it”. Kind of impossible when I’m quoting Khan, Hawks and Cochran. But will try.
Thanks for inspirational ideas, we need guys like you.
Peace
“Now I’m heading up the river in a boat with no paddle.”
I see very little in the humanities to validate this claim that post-modernism is over, or on the defensive.* I would say the opposite is true. Certain ideas associated with postmodernism have become an accepted part of the bedrock in the humanities. Crudely, there are two main PM strands: one with a focus on language and another with a focus on politics. Fish and Rorty’s strand of language-relativism postmodernism is currently losing ground, partly because it is seen to be apolitical. The other strand, in which New Left ideology is entwined with a curious belief that whilst social constructionism explains why everything is bad, it won’t stop everything becoming good, is if anything growing. The name on everyone’s lips is the execrable Pierre Badiou.
Post-modernism itself is a lazy, dumbed down, melange of different positions in continental philosophy. For example, in France Derrida and Foucault came from different generations, had different interests and approaches and hated one another. In the kind of introduction most liberal arts students receive all of this is lost. The irony is that Derrida himself was actually appalled at the lazy thinking, relativism & lack of knowledge about the history of philosophy demonstrated by many Anglo-Sphere thinkers who champion PM.
*”associate professor of philosophy at Dickinson College in Carlisle” says it all.
You are welcome. By the way, I’d like to recommend a couple of other books from my old field of military history. One is an oldie but a goodie, “The Face of Battle” by John Keegan. The other is Martin van Creveld’s “The Transformation of War,” which is pretty much the latter-day bible of the RMA/4G warfare crowd. The latter ended up becoming something of a future history, because van Creveld accurately predicted (in 1991 at the height of the Gulf War I victory euphoria) the asymmetric warfare between the state and the non-state forces to come in the 2000’s and on.
I stopped at Boethius (which this post paraphrases quite well IIRC). They said it all pretty much back then and *much* shorter. I suspect much of the later increase in length is simply camouflage.
On the death-bed point my guess is shallow people will be happy with shallow memories while others won’t so it comes down to “Know Thyself” (as multiple beardy old guys said long ago).
Far be it from me to defend the usefulness of the term “social construct”, especially since it makes anyone who uses it non-ironically sound like an obtuse Women’s Studies major from 1993. Around these parts, “social construct” is often taken to imply some etiological conclusion that there is no evolutionary or genetic causal influence on the convention. It doesn’t really imply that. As I understand it, the US Supreme Court is a social construct in the sense that if there were no conventions about federal US courts and their hierarchies, it wouldn’t exist. That’s consistent with saying that the evolution of such a convention was the inevitable product of physical forces unleashed by the big bang.
No one is saying scientific models are only in scientists’ heads, or that the ontological status of scientific models, whatever it might be, makes them silly. Let’s take Richard Rorty, who would seem like an exemplar of the bad guys Razib is railing against. Rorty denied that there was a difference in principle between the discovery in the nineteenth century that life evolves through natural selection and the discovery in the same century that slavery is wrong.
In both cases, there are “implacable materialities” that make these discoveries true. If people didn’t feel pain or have aspirations for autonomy, then maybe slavery wouldn’t be wrong. If human biology did not imply more wants than can be compossibly satisfied, we wouldn’t have money. Rorty certainly didn’t deny that evolution-through-natural-selection is the most useful model because of how the world is. He certainly didn’t think that scientific models are in individual scientists’ heads, since he thought they were social realities. Least of all did he think that a good scientific model is silly, since what makes it good is that it is the most useful way of organizing its domain.
Rorty did deny that there is a relationship of correspondence or representation between the model and the world, or at least that claiming such a relationship adds anything to saying the model is the most useful way for humans to organize and predict experience. Rorty thought this was important because he was worried about scientism and positivism driving out any non-scientific modes of understanding. I don’t know that this is a real worry, since scientific thought is so unnatural for human beings and since the domain of genuine scientific discovery is never going to include everything we are inevitably interested in.
I have never heard of “Pierre Badiou” and google doesn’t seem to know anybody famous by that name either. Do you mean Alain Badiou? I would be willing to bet that you can’t find a philosophy department in North America where that obscurantist old commie is taken seriously. You would find a lot more people interested in experimental psychology or neuroscience.
You are definitely right that neither Derrida nor Foucault disliked each other. As far as I know, they never referred to themselves as postmodernists.
Reading the NYT article, it looks like Fish did perpetrate a confusion on the impressionable Mr. Sartwell. It is true that the difference between a ball and a strike is dependent on the conventions of baseball. It is also true that in those conventions, an umpire’s call is taken to be final. It doesn’t follow that an umpire call is never wrong. That’s obviously fallacious.
Sartwell confuses the reader about what Rorty actually said. He didn’t say it is impossible to describe a non-linguistic reality; he said it is impossible non-linguistically to describe reality. If a model is useful for your purposes, that’s all you need to know about it. That is probably still wrong, but it isn’t quite so stupid.
Anyway, I suspect that most philosophers have always been scientific realists. But I agree that there has been more of a turn towards developments in cognitive science and experimentalism in the last few decades.
Rorty denied that there was a difference in principle between the discovery in the nineteenth century that life evolves through natural selection and the discovery in the same century that slavery is wrong.
for me to agree with this we’re going to move up the chain of presuppositions of the world around us to the point where it’s a trivial assertion. as a practical matter they are very different assertions.
from what i know it is true btw that postmodernism has not been particularly influential within english speaking philosophy, where analytic traditions reign supreme. but this sort of woolly thinking has become quite normal in areas outside of the narrow purview of academic philosophy, such as literary theory. by normal, i’m just talking about people who actually study topics at the graduate level who i occasionally encounter.
when talking to these people it doesn’t strike me that they understand or love their topic of study as much as they do broader social-political implications. or at least scholarship on a fundamental level is not quite all-absorbing. why? i wonder if it has to do with the fact that they’re not sure what they’re doing really says anything definitive. this is where i think there’s a contrast with scientists. science is hard, frustrating, and grad school is grueling and dispiriting. but no one doubts that, despite all of the flaws of scientific culture, people are actually striving to understand something real about the world around us.
p.s. my attitude toward science is actually on the instrumentalist side. but psychologically the pay-dirt of insight and understanding that you very rarely get in science is something that gives the psychological illusion that for a moment you have a clear window into god’s book (borrowing from erdos).
Yes sorry Alain (confused with Pierre Bourdieu). That his seminal work Being and Event was only translated into English recently is an indication of renewed interest, and just because he’s an elderly apologist for the cultural revolution won’t stop people lauding him. In recent times I have heard a lot of people breathlessly talking about his work but who knows (Harvard University Press has published a lot of books about Badiou in recent years & he is a visiting professor at Columbia). The irony is people seem to like him because he uses mathematical language that they don’t understand.
I don’t mean necessarily philosophy departments, as in many ways they did a better job of ignoring/rejecting these ideas than English or History or Anthropology departments. Someone like Judith Butler who considers herself to be a philosopher is more likely to be a professor of English or Cultural Studies or the oxy-moronical Critical theory. None of this blunts the general influence of an approach that valorises “subjectivism and a skepticism about reality”. Judith Butler is vastly more influential than, say, Peter Turchin.
PM is a term that can be used to describe a specific movement or used to cover everything from the work of Fish & Rorty or Bloom & Paul de Man or Derrida & Foucault or Deleuze & Baudrillard. Strictly speaking, neither Derrida nor Foucault is post modern in the first sense; however, the phrase “there is nothing outside the text”* from Of Grammatology is a foundational part of the problem Razib alludes to.
Sometime in the future we will see an end to the influence of these ideas derived from 60’s-70’s French thinking. However it won’t necessarily be because everyone has accepted the limitations of linguistic relativism or social construction-ism. It may even be something worse.
*Derrida later claimed everyone had misunderstood him.
Hardly anyone reads anything longer than a paragraph of nonfiction anymore, even smart people. It’s because the distracting nature of hypertext has caused our reading muscles to atrophy. The medium is the message, as was once said. If you look at a middle school textbook now, there’s no continuity to the text; it’s disconnected chunks of information placed in colorful boxes. Those schoolbook publishers know how people actually read, and they arrange their books accordingly. I know that I write differently since the Internet came of age. My writing is a lot choppier than it used to be and I have to struggle against it.
At some point it’s going to occur to people that the book-length treatise has a real purpose, and it’ll be revived, if it’s not too late.
Speculating a little, I think this explains a lot of the silly arguments you see online over minutiae like whether it’s racist for a white person to wear a dashiki, ignoring much more important and clear instances of racism elsewhere. People have lost the ability to see the big picture; they only see individual facts in isolation, so whatever they just read seems like the most important thing in the world.
The elite of intellectuals in my country awarded this French fog prince 577,663.67 dollars for his impressive toilet-paper writings. Way to go!
“The Holberg Prize 2013 was awarded to the French anthropologist and sociologist Bruno Latour.
Citation of the Holberg Prize Academic Committee:
‘Bruno Latour has undertaken an ambitious analysis and reinterpretation of modernity, challenging the most fundamental categories such as the distinction between modern and pre-modern, nature and society, human and non-human.
He has completely re-imagined Science Studies, pioneering new ethnographic methods and introducing new concepts and possibilities of communication to engage in collective research projects. He combines empirical methods and observation with the unsettling of concepts, reconfiguring the organization of knowledge and inviting participation. His influence has been felt internationally and well beyond the social study of science, in history, art history, philosophy, anthropology, geography, theology, literature and legal studies.
The only thing that comes to my mind is Ol’ Dirty Bastard’s Dog Shit.
These felines would totally agree with Mr Khan’s announcement.
Yes, to blindly touch the face of God. Nothing can stand beside it.
No source given for the citations data tho.
I’d also like to know how many of the papers that are cited are only cited by the authors themselves (alone or as one of multiple).