What’s the point of knowledge?

November 18th, 2012

As always, my posts shouldn’t be taken to be representative of Chris’s views, and remember to help out the Values in Action project, which will be packing 40,000 meals today for food-insecure children.

Zach Alexander left a few lengthy comments in response to my discussion of his review of Chris’s book. I don’t think either of us was convinced by the other, so rather than enter into a lengthy back-and-forth in the comments section, I decided it would be more productive to take a few of the points he addresses and open them up to broader discussion.

To counter Zach’s still unfounded idea that Chris doesn’t (Edit: seriously) value truth, I wrote:

Zach complains that Chris focuses too much on eliminating suffering but not ignorance, and to that I ask, if eliminating ignorance doesn’t make the world a better place then what’s the point?

Zach says that “these don’t sound like the words of someone with a passion for knowledge and truth.” He writes:

Eliminating ignorance is an inherent good. That doesn’t mean it’s the only good, or that it outweighs everything else. But it is a good in its own right, contrary to the implication of your question. I would therefore go even further – we should reduce ignorance even at the cost of slightly increasing the suffering in the world. Better to be Socrates dissatisfied than a fool satisfied; better to gain knowledge of the universe and be a bit depressed by its vastness, coldness, and ultimate meaninglessness, than to be ignorant and a bit happier. If that statement sounds bizarre (which seems like your favorite word), I rest my case. 

Putting aside the slight mixture of condescension and presumptuousness (I studied almost exclusively philosophy and psychology in college, am doing neuroscience research now, and plan to go to graduate school to pursue a career in academia. If Zach doesn’t think I value truth or knowledge, he may be overestimating the pay and prestige that come with a job as an assistant professor), there’s an interesting point raised.

Saying knowledge is an inherent good certainly sounds appealing, but it’s much more difficult to justify in practice. Having anything as an inherent good is a tough sell in a naturalistic framework. Most atheists seem to subscribe to some kind of utilitarianism, though, where something like pleasure, well-being, or preference-satisfaction serves as the fundamental good–the thing that is good in and of itself–and what is moral is what maximizes that good. In this type of system, anything else can only be instrumentally good–not inherently good, but good because it tends to bring about the fundamental good. It seems clear to me that knowledge is an instrumental good in this case, which means we should only be pursuing it if it makes the world a better place (lucky for us, recent history seems to validate this for the most part, but of course we have limits–things like ethics rules for research exist for a reason). Knowledge as a fundamental good is a hard case to argue, and I’m having trouble thinking of any ethical system that might allow it. As always, I’m curious and open to being proven wrong.

But there is something uncomfortable about living, in Zach’s words, as a “fool satisfied,” rather than “Socrates dissatisfied.” I think part of this can be explained by noting that curiosity and a striving for self-improvement are instrumental goods and thus should be encouraged, but there are nonetheless some tough cases, like the experience machine or willfully accepting a comforting illusion. But I think a lot of this confusion boils down to our ability to decide for ourselves. There’s something noble about a scientist or philosopher forgoing material wealth and happiness to uncover some deep truth about the universe, but it seems perverse to force that decision onto someone else. That is to say, I’m more than happy to live a life as a scientist and accept all that comes with it, but it feels wrong to knowingly make someone’s life worse just so that they can have less ignorance. It just doesn’t seem like our choice to make (whereas I would have no qualms at all about going out of my way to help make a stranger’s life better).

I think this has implications for how we enter into debates or arguments–we should always be striving to make the person we debate with better off. Though I still struggle to apply this in my own life, it seems clear to me that we shouldn’t be arguing to boost our ego or make ourselves feel smart. Rather, we should aim to sincerely help and better our partner. In this case, I think some of Chris’s arguments against a subset of New Atheists hold: it’s not hard to find blog posts or submissions to r/atheism that seem to aim primarily to degrade believers, rather than address them with their well-being in mind.

Epistemic concerns are obviously important, but, to me at least, they seem necessarily grounded in ethics. That Chris and I might put moral concerns prior to epistemic concerns isn’t a bug that displays a disdain for knowledge, but rather a feature that properly grounds knowledge in human well-being. And I don’t see anything wrong with that.

Vlad Chituc is a lab manager and research assistant in a social neuroscience lab at Duke University. As an undergraduate at Yale, he was the president of the campus branch of the Secular Student Alliance, where he tried to be smarter about religion and drink PBR, only occasionally at the same time. He cares about morality and thinks philosophy is important. He is also someone you can follow on Twitter.

9 Responses to “What’s the point of knowledge?”

  1. Zach Alexander Says:

    Vlad,

    It’s difficult to continue this conversation if you are unable or unwilling to accurately characterize what I’m saying.

    If you’d like to debate the strawperson that you or Chris “doesn’t value truth,” full stop, that’s fine, but that’s not what I’ve been saying. I’ve been very careful to virtually always qualify that statement, because it’s not black and white, but a question of degree.

    Most recently, here is what I said in the comments to your last post. Every single statement is carefully qualified:

    * Highlighting the difference “in how much we value”, not whether we both value epistemic matters.
    * Questioning whether he “highly values” them, not whether he values them at all.
    * Saying he “almost completely ignored” epistemic values in the book, not that he totally ignored them.
    * And saying they seem to be “extremely low on his list of priorities,” not completely absent.

    In the original review, every time I directly make the claim, I qualify it. Even most of the times when it is merely implied or stated indirectly, I still qualify it.

    * I called him a “post-truth atheist” (not an anti-truth one)
    * I then said he seems to place “almost no value” on epistemology (not no value)
    * And compared him to a restaurant critic who is “mostly indifferent” to the quality of food (not wholly indifferent)
    * And I questioned whether he gave some specific, low amount (“half a fig”) of care (not whether he cares at all)
    * I suggested he doesn’t see “much value” in the epistemic goals (not no value)
    * I suggested matters of truth and knowledge “play little role” in how Chris operates (not that they play no role)

    The same goes for my comments about you personally in the previous thread:

    * I observed that you don’t seem, from your previous blog post, to have a “passion” for truth and knowledge (not that you don’t care about them at all)
    * And I suggested that you don’t seem to be someone who “significantly” or “particular[ly]” cares about them (not that you don’t care about them at all)

    I understand that when a conversation gets heated it’s easy to gloss over the nuances.

    But there’s little point in even reading the rest of your piece (though I did skim it) if your starting premise is a gross mischaracterization.

  2. Vlad Chituc Says:

    It was hardly a starting premise. In fact, it doesn’t really figure into the rest of what I wrote at all. I can change “doesn’t value truth” to “doesn’t seriously value truth” if you’d like, but that seems a minor complaint that doesn’t appreciably affect the point of my piece at all, or for that matter how absurd your claims about how much we value epistemology are.

  3. Zach Alexander Says:

    PS – In the spirit of finding common ground, I can explicitly say that yes, I believe you and Chris do care about matters of truth and knowledge to at least some degree.

    And I can even apologize for the extent to which my review, despite the myriad qualifications, conveyed a more black-and-white impression to the contrary.

    Now can we please talk about the claim I’m actually making?

  4. Zach Alexander Says:

    I would appreciate that change, actually.

  5. Vlad Chituc Says:

    “I’m not saying you don’t value truth, just that you hardly value it at all!” is hardly a much better or less absurd claim to make. I’ve still yet to see any argument from you to take it seriously.

    “Now can we please talk about the claim I’m actually making?”

    What claim are you actually making? Particularly since the only thing you addressed about this post was a minor and inconsequential sentence at the beginning.

  6. Vlad Chituc misses the mark badly. Says:

    [...] the mark badly. November 19, 2012, by JT Eberhard: The ever dishonest Vlad Chituc has a post up over at Chris Stedman’s blog in response to Zach Alexander’s review of Stedman’s book, in which the bulk of [...]

  7. Zach Alexander Says:

    First, thanks for the clarifying edit.

    If you don’t see much difference, then I think you should look harder. Subtle differences matter. The difference between not valuing truth very much and not valuing it at all is small but important, like being almost out of gas vs. being out of gas. It’s the difference between Chris being someone who can be reasoned with, and someone who can’t.

    And it’s the difference between putting into my mouth an extreme claim I didn’t actually make, and which is unsupportable from Chris’s text, vs. accurately characterizing the more moderate claim which I did in fact make.

    And which is, in fact, supported by the text. You keep saying it’s “still unfounded” and you’ve “yet to see an argument…”, but that doesn’t make it true. I made a very thorough, careful case to that effect from his book. (Is it true of the man? I’m still persuadable. But it was a book review, not a review of his entire œuvre and unpublished thoughts.) You tried to refute it, and failed. So if you still think my claim is unfounded, we’re still waiting to see your evidence. I’ll check the comments on that post every so often in case you come up with something.

    Or perhaps you can instead take a deep breath, concede that I have a point, think about it, and explain where exactly you think I’m onto something, and where you think I’m going too far. That would be a more productive conversation than concede-nothing, challenge-everything apologetics.

    Moving on –

    I meant no condescension, and there’s nothing presumptuous about telling you how your words come across – especially since I prefaced it by saying I’m open to persuasion on what you care about, but FWIW…

    But anyway, I think we’re making progress here.

    Because your parenthetical comment (about your studies and job) made me realize I’ve been a bit too broad in how I’m saying what I’m saying. I haven’t been intending to say that you or Chris don’t care about knowledge and truth, in the sense of finding them interesting or thinking they’re important. Some of the phrasing I’ve been using can indeed be interpreted this way, so you’ve got me there.

    What I have been intending to say is a narrower claim: that you don’t have normative opinions in these domains that are, I think, sound and sufficiently strong. In other words, yes, certainly – if you’re an academic, you probably care a great deal about knowledge, and probably also some definition of truth. But that doesn’t necessarily mean you have strong normative opinions about them. You can get a PhD in epistemology, and still be an epistemological relativist. You can care a lot about food, and still not have strong normative opinions about what makes food good or bad. You can study art (as in, art criticism or art history), and still have few strong artistic opinions, or bad ones. They’re just two totally distinct things. One is descriptive; the other is normative.

    So if you’re a budding neuroscientist, I believe that you’re interested in knowledge, truth, and discovery on some level, and perhaps to a great degree. And I could take your word that you do, in fact, have normative opinions on these things as well. But if you have no objection to the concept of faith (do you not?) – belief in the absence of evidence, or even contrary to evidence – then however great your interest in knowledge and truth as objects of study, it’s hard for me to see your opinions about them as sound or appropriately strong.

    Anyway, that’s all by way of clarification of my main claim.

    A brief response to the main new point this post makes – yes, I can concede that in some limited cases, it does feel wrong to “make someone’s life worse just so that they can have less ignorance.”

    When my Christian mother was dying, my godless (AFAIK) older sister comforted her by telling her things like she was going to go to heaven to be with Jesus. As repulsive as it is to knowingly reinforce ignorance or superstitious beliefs like that, the value of my mother having a more intellectually defensible understanding of what was to come (namely, nothing) didn’t seem worth the pain it would cause to undermine what for her was a source of emotional support. And if I had gone through the hospice telling random strangers they weren’t going to heaven, that would’ve been even more cruel.

    But I can think of very few other cases where I would agree with you. People have a duty to be reasonable. Ignorance is often a bad thing. Absurdities do facilitate atrocities. So if a little bit of suffering – not deathbed despair, not physical pain, not severe depression, but mild emotional distress, essentially – is the price of rationality and knowledge, so be it. Losing my faith was excruciatingly painful. But I’m a better person for it. And society is better off with one more person who is trying to live in the real world instead of an imaginary one. The same, I think, could be said for Chris.

  8. What’s normative about epistemology? | NonProphet Status Says:

    [...] travelling for the Thanksgiving holiday and catching back up with my work, so I’ve neglected a rather long comment by Zach Alexander on my post about how the pursuit for knowledge necessarily (as far as I can tell) [...]

  9. Vlad Chituc Says:

    Hi Zach, so I touched on some of the broader points and continued our discussion in a longer post, but I just wanted to touch on a few of the smaller things here:

    First, I think you should maybe look a bit harder. There’s certainly a difference between “none” and “very little,” but the point is we aren’t on the “very low” end of things. If you ace a test, the difference between the claim that you got a 0 and the claim that you got an F doesn’t really seem very substantive at all. But that’s minor.

    And your last paragraph strikes me as confused. Why do we have a duty to be reasonable? If you’re just going to state it without justifying it, then you’re begging the question. If the duty is grounded in well-being, then how can you possibly value it over well-being? Your comments on your mother seem to suggest you realize that it doesn’t outweigh well-being, so I’m confused what you’re arguing for exactly. No one is under any confusion over whether believing the earth is flat is a good thing, but this conversation is about why it’s a bad thing, and you haven’t really given an answer.
