On productive criticisms

I have been in a couple of discussions recently in which someone opined that certain sorts of objections or criticisms are not “productive”. The basic idea is that a criticism should not be wholly negative; it should point out a possible correction, or a new way forward, or something constructive. It should not merely have the form “That is wrong/stinks/has no justification”. One occasion, in fact, was a classroom discussion of Socrates in the Apology. Socrates, of course, insists that he does not know what virtue is, or justice. He merely has discovered that all of the supposed experts he has encountered also do not have this knowledge. He points this out to them, causing no small amount of embarrassment – and he does not follow up this criticism with any suggestions of his own. In this way, according to some students, his criticisms are not “productive”.

Socrates would insist otherwise. We do take steps forward when we realize that all the steps we have taken up to this point have been in the wrong direction. Sure, it’s not a happy discovery – all that wasted effort, and now no clear idea of where to go next – but at least we now know to stop and to reconsider. A negative result is indeed productive, as it stops us from continuing in a nonproductive direction.

This is a simple point, but I think it touches on a psychological element that underlies our efforts. Humans, particularly in a society like ours, place great value in work, forward movement, progress, and general “bettering”. Anytime we are not engaged in forward movement, we feel our time is being wasted. Consider this situation: you need to drive to the movie theater, and your current route is blocked by very slow-moving traffic. At this rate, it will take you thirty minutes to get where you’re going. But you could take the long way around, sure to have little traffic, which will also take thirty minutes. Which do you choose? We all know that it doesn’t matter. But many of us – myself included – would rather take the long way around, since faster forward movement feels better than slower movement. We feel more productive along the way, even though clearly we’re not: the same destination is reached in the same time.

Transfer this point to the situation where we are trying to discover what truth is, or what virtue is. Someone articulates a project or theory that gives us a lot to talk about, and we get busy raising criticisms, amendments, further applications, complications, and so on. Then someone like Socrates raises an objection that reveals a deep inconsistency in the foundations of the project. What are we supposed to do? We are already invested in the project, busily making what seemed to be progress with it. Now this critic points out our efforts are deeply ill-conceived – but the critic has no new project to offer us, no new efforts on which to spend our energy. Some of us will call the criticism “unproductive”, for it leaves us with nowhere to go. But: what? Would it be better to continue to work on an ill-conceived project? Would it be better for the critic to launch some new project that, for all anyone knows, isn’t any less ill-conceived? We feel frustrated, to be sure; but scapegoating the critic for our folly won’t make our efforts any less wasted.

The policy that results from the “that’s not productive” criticism of criticism is that critics should not raise criticisms of a project unless they have a new project to offer. But what if all the critic knows is that the current project is doomed? Should that information be withheld until some new outlet for the wanton expenditure of energy is identified? To go back to the driving example: if I am a passenger, and I calculate that, no matter which way we go, we’ll be late for the movie – or if I somehow figure out that the movie is cancelled – should I shut up about it until I come up with an alternative idea of what we can do? If I make my observation without supplying that alternative, should the other passengers chastise me for being “unproductive”? That would be mad.

Clearly there are unproductive criticisms: criticisms which do nothing more than provide needless complications to projects whose overall intelligibility remains intact. If I insist that, for all we know, the traffic problems might get worse, and the alternative route might be equally congested, this really does not help us in our situation, particularly if I’m just spinning out possibilities without any reason for thinking they are actual. My criticisms (pointing out ways in which our knowledge is incomplete), though they are true, don’t give us any reason for thinking our current efforts won’t succeed, and they only clutter up our efforts with truths we can’t use productively. It occurs to me now that it may be that the students criticizing Socrates thought this was what he was doing, perhaps because they were confusing (a) cluttering up our efforts with truths we can’t use productively, with (b) proving our efforts to be wasted with truths we cannot use productively. That seems to me a distinction worth preserving.

Posted in Items of the academy / learning | Leave a comment

Truth: an initial stab at the thing

On campus we are having a series of discussions under the title of “facticity.” No, it’s not a headlong plunge into German idealism and the impossible task of capturing the brute “thatness” of what experience coughs up. Instead, it is about (and ultimately against) a perniciously widespread notion that agents can believe whatever they want, and enlist any old set of “facts” to support those beliefs. In short, there are no “alternative facts”; there are just facts, and though they may be difficult to discover, and tricky to work into a theory, they are stubborn and unforgiving things, and not up for negotiation.

The vast majority of the participants and audience have no background in philosophy, so I often find myself pressing my lips together tightly while others breeze obliviously past distinctions I want to make and low-hanging counterexamples. For it is a worthwhile discussion to have, even (or especially) outside of a rigorous philosophical context. Truth is for everyone, I want to say, and they need to find their own ways to make it their own. I pop in now and again where I think I have something useful to say.

I have wanted to supply some sort of easy-going, introductory account of truth for participants and attendees to read, but so far have had no luck in finding one. So, faute de mieux, I’ll try to offer one myself –


For starters, there’s nothing so easy to define as truth. Aristotle put it most economically: When I say of what is, that it is, and say of what is not, that it is not, then I speak the truth. Truth, then, is a matter of putting into words how things really are. This is a perfectly good workaday definition of truth, the one we use most commonly when we are trying to find the truth of the matter: we are looking for the best description or explanation of how things really are.

But truth is like money: it’s one thing to define the concept, and another matter entirely to get as much of it as you can. How can we tell when something we say is true? Aristotle’s definition would suggest that we listen to a string of words, and then we take a peek at the world to see if that’s how things really are. But taking a peek at the world turns out to be a tricky business. All we ever see are events, and we can’t see events except from some perspective, through the lens of some interpretation, under certain lighting conditions, and so on. Events are flavored by our own preconceptions and circumstances. Moreover, usually the “truths” we are interested in go beyond the facts: they state generalities, or causal relationships, or matters from the past or future, and so a mere collection of facts is not enough, even if we could see them clearly.

The fancy term for claiming that facts are flavored by the condition of the observer is the “theory-ladenness of observation.” The fancy term for claiming that the truths we are interested in go beyond the mere facts is the “underdetermination of theory by the facts.” When observations are theory-laden, and when theories are underdetermined – which is basically all the time – there is plenty of room for all kinds of unwelcome garbage to flow in. Theories come to be shaped by class, economics, prejudice, politics, and personality – nasty features of the human experience which, on the whole, tend to cloud our visions of what’s true.

I’ll give the briefest sketch of an example. In the 17th century, a bunch of Brits were arguing about whether there could be a vacuum, a space empty of all matter. Robert Boyle and his pals thought they could produce a vacuum with their new-fangled air pump. They would place a mouse or a lit candle under a bell jar, turn on the pump, and watch the mouse die and the flame go out. But Thomas Hobbes did not believe the space in the jar was empty. If it were, then there would be a nothing that had a determinate size and shape, and that was metaphysically ridiculous. Instead, Hobbes believed, what Boyle’s air pump showed was that there is some further material substance which slips in through tiny pores in the glass, and which is insufficient for supporting the life of mouse or candle.

Hobbes’s worry was that Boyle & Co. were making themselves out to be experts of invisible things (i.e., vacua). But England had recently suffered a terrible civil war, fought mainly on religious grounds, between two opposing teams of experts on invisible things. Not good, thought Hobbes; let’s rather stick to tangible things. Boyle, on the other hand, was helping to found a scientific institution, the Royal Society of London, which would consist of Scientific Experts who could Float Free from Politics and Engage in Disinterested Inquiry into Truth. Such experts, one might expect, would discover truths which even kings must recognize. That’s what Hobbes feared, as he believed that challenges to authority almost always result in war.

Anyway, what this sketch of an example (and it’s only a sketch; the longer story is more complicated) is meant to show is that usually humans don’t transition smoothly from observed facts to scientific theory. There are many ingredients in the mix, and most of them complicate things considerably.

This leads me to another dimension of truth, the affective one. Socially, when we insist that some claim is true, or that something is a fact, we are really saying: “Don’t argue about this with me; this is not up for negotiation.” This is a truth, this is a fact: the fricatives themselves (f, ct, th) disclose the threat hidden in our meaning. “Fact” means “do not f*ck with me.” Apart from the important intellectual work of figuring out the process by which facts support or fail to support theories, we need to be aware of the affective dimension of truth, or how claims to truth operate as billy-clubs by which we assert dominance over others. Indeed, this may be a big part of why the controversy over facts vs. “alternative” facts causes such dramatic rises in people’s temperatures: it is not just truth that is at stake, but power, and dominance.

But it would be the highest folly to believe that the affective dimension of truth is all there is to truth. And it must be admitted that, in some quarters of the university, under the aegis of Foucault, people who should know better have succumbed to this folly. Yes, there can be no doubt that claims to truth have been used to put people in their place, so to speak. But one cannot make this claim and at the same time insist that that’s all there is to truth – unless they think of themselves as using this claim just to put people in their place. As hard as they may be to discover and describe, there are facts, and reality is not merely constructed out of human relations and conflicts, though these complications surely play important roles. Getting to the truth may be hard, and even at times impossible, but this is not to say that anything goes. Some things go, and others do not; the task of discerning between the two is what should keep all of us busy.

Posted in Historical episodes, Items of the academy / learning, Meanings of life / death / social & moral stuff, Uncategorized | 4 Comments

Access Utah interview

Host Tom Williams played a game of “let’s see what we can throw at Charlie” on the radio yesterday. The recording is here. It was a very fun conversation, as always.

Posted in This & that in the life of CH | 2 Comments

Putting history into history of philosophy

If we wish, however, to arrive at an interpretation of a text, an understanding of why its contents are as they are and not otherwise, we are still left with the further task of recovering what the author may have meant by arguing in the precise way he argued. We need, that is, to be able to give an account of what he was doing in presenting his argument: what set of conclusions, what course of action, he was supporting or defending, attacking or repudiating, ridiculing with irony, scorning with polemical silence, and so on and on through the entire gamut of speech-acts embodied in the vastly complex act of intended communication that any work of discursive reasoning may be said to comprise. – Quentin Skinner

We are coming up on the 500th anniversary of Martin Luther making 95 posts to the Church’s Facebook page, and in recognition my home institution is sponsoring a small symposium on the Reformation. A colleague in History put out a call for abstracts, offering the following list for inspiration:

Topics include but are not limited to:

  • Medieval religious reform movements, heresy, or Inquisition
  • The role of language, art, or material culture in reform
  • Environment and reform (e.g. architecture and landscape)
  • Book and manuscript studies
  • Global Reformation
  • Gender and the Reformation
  • Royal Courts and the Reformation
  • Lived experience of ordinary people
  • Reformation and the family
  • Geography and the Reformation

There is something here for any humanist – historians, obviously, but also art historians, literary scholars, librarians, and the broad array of scholars falling under religious studies, European studies, global studies, and gender studies. It is meant to be a big tent, for we’re a friendly bunch out in the rural west, and we don’t like to leave anyone out.

But as a philosopher, even as a supposed historian of philosophy who covers periods from 1500 forward, I saw in this list no obvious entry point, no topic with my name on it, as it were. This isn’t just reflective of my own narrowness and provincialism, but of the narrowness and provincialism of my discipline as a whole. Philosophers, basically, rarely play with others; and when they do, they tend to play with scientists, and possibly the occasional social scientist; rarely if ever will they play with a humanist.

This is because philosophers generally conceive philosophy as having everything to do with arguments, and having nothing to do with context. This claim will sound nonsensical to a humanist – for how on earth are we supposed to understand an argument without finding it in some text that was written on some occasion with some audience in mind and some motivation behind it? A philosopher’s response to this question might be, “Of course that is so. But if the argument makes sense only in its historical situation, we’re not interested. On the other hand, if it can be airlifted out of that context and explored with our own arsenal of distinctions and analytical tools, then we are all eyes and ears.”

The most thorough articulation of this attitude has been presented by Jonathan Bennett in what he calls “the collegial approach” to the history of philosophy. In this approach, we treat historical philosophers as colleagues down the hall, whose arguments should be subjected to the same scrutiny as any piece published a day or two ago. After all, we don’t explain away our colleagues’ arguments as products of their time, class, and circumstance; instead, we deal with their claims and arguments directly, and judge whether they have gotten things right. In doing this, Bennett urges, we accord historical philosophers the greatest respect: as colleagues in the very same effort of discerning truth.

Such an approach presupposes that philosophers from different times and places have equal access to a shared domain of ideas, and in this shared domain there are relations of implication and consistency which hold (or fail to hold) objectively. It is a bit like when historians reading Newton’s alchemical works turn to chemistry to help them figure out what Newton was talking about when he referred to the green dragon or the blood of the whore of Babylon. The chemical elements haven’t changed between then and now; so when we head into the lab and see what happens, we are seeing what Newton himself saw. Philosophers head into the concept lab, and see what implies what and what doesn’t, and when they do, they are seeing just what Locke or Leibniz saw when they were working away in their concept labs. Moreover, if Locke or Leibniz were thinking about free will or the nature of matter, then there is the odd chance that what they saw will help us along in our own thinking about these subjects.

This expresses the main paradigm for Anglophone scholars in the history of philosophy over the years 1960 to 2000. It is still very strong today, though now there are more historians of philosophy turning their attention to historical contexts, if only to gain better interpretations of the texts they’re reading. They are still not delving into the economics, politics, and culture of earlier times, but they are heading into the archives and gaining firmer knowledge of the biographies of the philosophers, and so to that extent they are more historically attuned. But Bennett’s collegial approach, or something near to it, still holds sway over the subdiscipline – at least for now, so far as I can see, judging from relevant conferences and journals.

I regard this as unfortunate. I’m not ready to dismiss the notion of relations of implication and consistency among ideas holding objectively across time and space; for if we abandon that, then we really have no hope of understanding the texts we confront. But, obviously, philosophical thought is shaped by historical circumstance, and what we learn by understanding those circumstances is more valuable than what we might glean from the more austere collegial approach. We learn that philosophy involves not just human minds, but human lives. The quote from Quentin Skinner at the beginning of the post gets things just right: there are many ways to read a text, and ignoring context is a needless handicap philosophers place upon themselves.

Historians of philosophy – like me – should be able to look at calls for abstracts like the one given above and see good entry points. But to put ourselves in a position to do so, we need to start playing with the humanists. In my experience, they’re a fun bunch of people, and they have a lot to teach us. And philosophers – once we catch up on a dialogue we have ignored for too many years – may have a lot to offer in return.

Posted in Historical episodes, Items of the academy / learning | 7 Comments

The Cold War’s shaping of American philosophy

John McCumber, Time in the Ditch: American philosophy and the McCarthy era (Northwestern UP 2001)

George Reisch, How the Cold War Transformed Philosophy of Science (Cambridge UP 2005)

Whether inclined toward socialism in the 1930s or defending itself against anticommunism in the 1940s and 1950s, logical empiricism was neither apolitical in its values and ambitions nor an unpolitical community of scholars, somehow insulated from Cold War pressures.  (Reisch, p. 373)

(“Truth dollars” advertisement, from Reisch, p. 354)

According to McCumber and Reisch, as the logical positivists moved from Europe to the U. S. and formed a loose alliance with the pragmatists, they retreated into safer political territories in order to protect themselves against the nefarious forces of McCarthyism. And even further: the “scientific philosophy” they developed, forsaking traditional moral philosophy for rational choice theory, provided a theoretical backdrop to America’s cold war temperament (and in several cases was funded directly by the RAND corporation). As a result, several generations of professional philosophers in America cut themselves off from topics of any social or cultural relevance, and also actively resisted meaningful connections to the rest of the humanities or social sciences.

(A short version of McCumber’s view – the one that got me into reading further on the topic – can be found in an Aeon essay here.)

Both studies trace the gradual transformation of academic philosophy in the U. S. into a discipline of thought that worked hard to free itself from both metaphysics and ethics (at least into the 1970s). Metaphysics, according to the logical empiricists, was just loopy, untethered science, never held in check by actual experimental results. Ethics, on the other hand, was all fine and good, but also not empirical, and guided ultimately by whatever values an individual might happen to have. The resultant philosophy is a perfect fit for a society that aimed at producing technical know-how and left morality to individuals as a matter of personal preference. (One thinks immediately of Tom Lehrer: “Once the rockets are up, who cares where they come down? That’s not my department, says Wernher von Braun.”)

It is an odd result for philosophy, which one might otherwise believe to be a realm where questions of value (and critical assessments of knowledge, scientific and otherwise) should be the meat-and-potatoes of scholarship.

Of the two books, Reisch’s pays closer attention to details and the changing relations among the actors involved. One of several interesting episodes he recounts is a struggle at the University of Chicago, where the University’s president, Robert Maynard Hutchins, did indeed think that philosophers should be more than adjuncts to the RAND corporation. In the 1930s, philosopher Charles Morris came to Chicago with the intention of building its philosophy program in a direction that would blend Dewey’s pragmatism with logical empiricism and bring the resulting amalgam to bear on American culture. But Hutchins did not share Morris’s enthusiasm, as he was drawn more toward Mortimer Adler’s neo-Thomistic vision for philosophy (which holds science at arm’s length, and Thomas close to the heart). The more that Morris tried to connect the department’s efforts to the rising tide of logical empiricism, the more opposition he met from Hutchins and Adler. Indeed, it seems that both Morris and to some extent John Dewey were interested in finding ways to make logical empiricism actually connect with the sorts of practical, political, and cultural problems the U. S. was experiencing over some intensely unsettling decades (though eventually these two also fell out with one another). Over time, Morris was marginalized (and Dewey died), and logical empiricism followed its evolution into analytic philosophy, partitioning off any social concerns as valuable but not properly scientific, and so not within philosophy’s proper scope. Hutchins and Adler never had much influence beyond “great books” curricula at Chicago, Columbia, and several small colleges.

McCumber’s proposed solution to this disciplinary dead-end is basically less Carnap, more Hegel. Hegel knew concepts to be historically conditioned, and envisioned philosophy as an age grasping itself through concepts. To do this right, one has to be a careful student of history and an astute observer of contemporary society, and one has to think our culture through down to its deepest features. This is what an ordinary person might suspect philosophers are supposed to be doing anyway; but then an ordinary person has probably not spent much time around a university department of philosophy, which more often than not strives to talk about stuff no one outside of their membership can understand as meaningful or relevant.

McCumber obviously has some personal axes to grind: at some point he left academic philosophy so that he could pursue his own studies more freely in a department of Germanic studies. But his critique rings true. I had no idea of the connections between the analytic philosophy of the 50s and 60s and McCarthyism, and at first I was doubtful, but I now think Reisch and McCumber make a compelling case. And the weaker thesis – that American philosophy has striven to be irrelevant – does seem quite evidently true. I can remember years ago reading the Library of Living Philosophers volume on Quine, and coming across an article that criticized Quine for not engaging actively with the big philosophical questions of human understanding, social morality, and the meaning of human existence. I suspect the editors included this fellow’s essay just to give Quine the chance to smack back – and smack back he did, writing that if this author had any good proposals for making the world a better place, then he should get on with it. Then and now I imagined readers of the volume cheering Quine on, saying, “Way to go, Van! Tell that ninny to take his big questions elsewhere! We have problems of linguistic reference to sort out!”

I do agree with McCumber’s proposed solution: philosophers need to be better-equipped to apply philosophical thinking to questions and problems that matter to people. As I’ve argued before, there should be at least some graduate programs in philosophy that aim to prepare young philosophy PhDs for the actual array of courses they are likely to teach, and which nourish and support the big-picture enthusiasms that attract many students to philosophy in the first place. Some philosophers should write some best-selling books on subjects that interest a broader swath of readers, if they have the talent for it; for if it can be done in subjects like particle physics and economics, it can likely be done in philosophy. (And if it can’t, that’s more evidence that something has gone terribly wrong.) And undergraduate philosophy programs should strive to find ways to integrate with professional degree programs, as we can all agree that the world would be a better place if the people pushing the buttons had some training in thinking philosophically – meaning, with open hearts and critical minds.

Posted in Books, Historical episodes, Items of the academy / learning | Leave a comment

Mars teleporter essay on Aeon

I am stranded on Mars. The fuel tanks on my return vessel ruptured, and no rescue team can possibly reach me before I run out of food. (And, unlike Matt Damon, I have no potatoes.) Luckily, my ship features a teleporter. It is an advanced bit of gadgetry, to be sure, but the underlying idea is simplicity itself: the machine scans my body and produces an amazingly detailed blueprint, a clear picture of each cell and neuron. That blueprint file is then beamed back to Earth, where a ‘new me’ is constructed using raw materials available at the destination site. All I have to do is step in, close my eyes, and press the red button…

The rest here.

Posted in Metaphysical musings | Leave a comment

Handcarts, beer, and apes

To the rest of the world, today is an ordinary Monday – people are going to work, the mail is being delivered, the media focus on the latest outrages issuing from politicians, and so on. But here in Utah, it is Pioneer Day, a holiday bigger than the Fourth of July. Pioneer Day marks when Mormon settlers completed their arduous trek from Illinois to the Salt Lake Valley, thus entering into their Promised Land and escaping the hegemony and oppression of their tyrannical overlords – this being the U. S. government.


(from the Salt Lake Tribune, illustration by Francisco Kjolseth) 

Like any such holiday, it’s more hype than history, and it tends to drive non-Mormon Utahns (called “gentiles” in these parts) straight up the wall. And so they celebrate their own holiday – “Pie ‘n’ Beer” Day, trumpeting the fact that they prefer beer to celebratory parades of handcarts. It is all meant in good fun, and most Mormons take it in stride. But, beneath the humor and irony, Pie ‘n’ Beer day is a way for Utahn gentiles to celebrate the ways in which they can escape the hegemony and oppression of their tyrannical overlords – this being the Church of Jesus Christ of Latter-Day Saints.

My family and I tend to be on the reclusive side, and so we will avoid any handcart parades or parties featuring pie and beer (separately delicious, but a most unfortunate combination, to my way of thinking). Instead, we have taken this holiday weekend to watch the latest re-boots of the Planet of the Apes movies.

The movies (Rise of the Planet of the Apes and Dawn o.t.P.o.t.A.) are better than the original movies in every conceivable way, and all the credit goes to CGI and the amazing Andy Serkis. The core thrill of the films is to see apes – mistreated and tortured by greedy and violent humans – rise in intelligence and power until they can break free from their bondage and create a civilization of their own, while the human civilization goes down in flames.


Andy Serkis as Caesar

It’s puzzling why my family, a small band of human primates, should cheer while watching our kind get pounded by another branch of primates. But I think it is because the films highlight ways in which we know our civilization has gone wrong – the capitalistic enterprises of pharmaceuticals and genetic engineering, the cruelty of those enterprises, and the broad human disposition toward war and devastating weaponry. The apes, led by the forward-thinking Caesar, represent at least the possibility of a different path – though one, as it turns out, that ends up facing the same problems of greed, power, treachery, and tragic misfortune. By cheering for the apes, we are cheering for some fantasy in which we can wipe the slate clean and establish a new society, thus celebrating the thought of escape from the hegemony and oppression of our tyrannical overlords – in this case, our own species.

Plus, the apes are wicked cool as they swing through the trees and roar and tumble. They are delightful films to watch with a beer in hand – saving the pie for later.

Posted in This & that in the life of CH | Leave a comment