Talks Not About Info

You can often learn about your own world by first understanding some other world, and then asking if your world is more like that other world than you had realized. For example, I just attended WorldCon, the top annual science fiction convention, and patterns that I saw there more clearly also seem echoed in wider worlds.

At WorldCon, most of the speakers are science fiction authors, and the modal emotional tone of the audience is one of reverence. Attendees love science fiction, revere its authors, and seek excuses to rub elbows with them. But instead of just having social mixers, authors give speeches and sit on panels where they opine on many topics. When they opine on how to write science fiction, they are of course experts, but in fact they mostly prefer to opine on other topics. By presenting themselves as experts on a great many future, technical, cultural, and social topics, they help preserve the illusion that readers aren’t just reading science fiction for fun; they are also part of important larger conversations.

When science fiction books overlap with topics in space, physics, medicine, biology, or computer science, their authors often read up on those topics, and so can be substantially more informed than typical audience members. And on such topics actual experts will often be included on the agenda. Audiences may even be asked if any of them happen to have expertise on such a topic.

But the more that a topic leans social, and has moral or political associations, the less inclined authors are to read expert literatures on that topic, and the more they tend to just wing it and think for themselves, often on their feet. They less often add experts to the panel or seek experts in the audience. And relatively neutral analysis tends to be displaced by position taking – they find excuses to signal their social and political affiliations.

The general pattern here is: an audience has big reasons to affiliate with speakers, but prefers to pretend those speakers are experts on something, and they are just listening to learn about that thing. This is especially true on social topics. The illusion is exposed by facts like speakers not being chosen for knowing the most about a subject discussed, and those speakers not doing much homework. But enough audience members are ignorant of these facts to provide a sufficient fig leaf of cover to the others.

This same general pattern repeats all through the world of conferences and speeches. We tend to listen to talks and panels full of not just authors, but also generals, judges, politicians, CEOs, rich folks, athletes, and actors. Even when those are not the best informed, or even the most entertaining, speakers on a topic. And academic outlets tend to publish articles and books more for being impressive than for being informative. However, enough people are ignorant of these facts to let audiences pretend that they mainly listen to learn and get information, rather than to affiliate with the statusful.

Added 22Aug: We feel more strongly connected to people when we together visibly affirm our shared norms/values/morals. Which explains why speakers look for excuses to take positions.

  • efalken

    Because reputation lags achievement, we should expect people to reach the zenith of their reputation well past the zenith of their productive output.

    • Ari


      As a related note, Robin Hanson is the walking Plato among us. If history is just (and we won't kill ourselves with nukes etc), he has to be there with Popper, Newton, Leibniz etc.

      • efalken

        He should aspire to Aristotle, not Plato. Aristotle was better on a lot of stuff and happier.

    • Reputation doesn’t necessarily lag achievement. Often our responses to achievements are impressionistic: what’s happening now is most important. Most thinkers we regard as important today will be forgotten in a few generations. (As in: who was Talcott Parsons?)

    • UWIR

      A corollary of the Peter Principle?

  • VK

    Not surprising. I only became really interested in saying and knowing true things after I started reading OB and LW, since I realized there was a group of people who achieved status by actually saying and knowing true things rather than by just sounding interesting. I always knew deep down that I was winging it. I'm so glad I got on the right path, but it doesn't happen for many.

    • I don’t know if even those people have their status so closely tied to speaking the truth. Is anyone keeping track of how accurate everyone’s predictions have been?

  • Lord

    I think this runs deeper than that in a form of pastiche development. Areas central to interests and story are developed while lesser areas are borrowed for background. Some of my pet peeves are asteroid belts so dense and slow moving they would have coalesced millennia ago, never hiding or defending by putting a moon or planet in between, and those marvelous electrical systems that continue to work with more than half the ship, building, or city destroyed, but then the future is fantastic.

  • Robert Koslover

    OK, so just who are these genuine “experts” about future society that should be speaking instead of popular science fiction (SF) authors? Economists? Politicians? Social scientists? Do they also have to have somewhat charismatic/entertaining personalities (it would sure seem to help, right)? Do ALL people who have published non-fiction books and/or articles about the future qualify as experts? Do their works have to appear in refereed journals to qualify? And even so, if they prove to be wildly wrong in their predictions over and over again (Paul Ehrlich comes to mind), can/should we then reject them as “experts” even when compared to utterly-novice SF authors? Does your book, the Age of Em, prove right now that you are an expert about the future or do we have to wait and see whether you prove to be at least somewhat accurate? Can we consider your other credentials and accomplishments and declare you an expert based on your overall resume? Can you really make a solid case that you are, in fact, an expert about the future? And if so, is it based almost entirely on you being someone who is able to secure payment in exchange for providing your opinions about the future (sort of like how dividing up “professional” vs. “amateur” sports is all based on money)? If so, it would seem that both you and the SF authors that you mentioned (at least, the ones who get paid) qualify.

  • Robert Rounthwaite

    It seems to me that the following statement would describe things as well as does yours:

    “When discussing subjects where experts have demonstrable expertise, like physics, medicine, and computer science, authors look to these experts to guide their thinking. On topics where experts have no demonstrable track record to prove their expertise, like social, moral or political topics, authors prefer to trust their own thoughts and judgments based on their own experience, or to take positions that signal their affiliations or personal morality.”

    Do you believe genuine demonstrated experts in these other topics exist? The fact that your first list is of concrete specific topics while your second list is fuzzy and descriptive without naming topics is suggestive.

    I actually think there are numerous topics — such as economics — where sf authors have historically been quite ignorant and expertise exists. But I have found that authors (SF or otherwise) are as likely to have insight into moral questions as experts on ethics, to take one example.

    • Yes there really are experts who know more than amateurs about many social topics.

      • Where is convincing evidence of this? And if these experts exist what’s a reliable means of identifying them? The replication crisis, and the well documented left leaning bias in the social sciences, makes me skeptical that the obvious experts are actually experts, and I don’t see an easy way to identify experts outside of that.

        Storytelling ability seems as good a heuristic as any.

        More broadly speaking, if you are a person who's attained a position of high status, that's demonstrable proof that you know something about how human society works. You mention “generals, judges, politicians, CEOs, rich folks, athletes, and actors”. All of those jobs except athlete basically require you to be an expert with people to attain them. And I don't see people pretending that athletes are experts on how society works.

      • UWIR

        If the thesis is that people pay to see famous people talk because they wish to be associated with famous people, and not because they are seeking the most informative speakers, then the question is not so much whether there are people with more expertise than the famous people, so much as whether there are people with greater claim to expertise.

      • Robert Rounthwaite

        Either way — whether the decision is explained by the quality of experts in the area or something else — your post has made me mull over the areas where I do and don’t trust experts to inform my thoughts.

  • marshall bolton

    Such a cynical analysis would do wonders for solving your puzzle about unequal love.

  • Vítor Margato

    Though I think your explanation is probably right, other things play a role. The weight we assign to expert opinions is a function of both the expected precision of those positions and that of our own intuitions. It may well be that, for most “social” topics, we think our judgements are more precise, and the experts’ less so, than it’s the case in physics or medicine. Note we don’t have to find our judgements *more precise* than that of the experts for that to be true (and they almost certainly aren’t).

  • Romeo Stevens

    The beginning of The Glass Bead Game describes exactly this phenomenon.

  • Thomas Wilson

    This is the perfect post, sums up the SFF crowd perfectly.

  • free_agent

    There’s a favorable mention of “Showing That You Care: The Evolution of Health Altruism” in

  • Cambias

    In defense of myself and fellow SF writers who do convention panels, the expertise we bring to discussions of the future is our skill at speculation, and on how fiction (it’s a science FICTION convention, after all) can make use of things. So if we’re talking about the future of, say, wearable technology, I’m the one who points out that people might take to “self-surveillance” wearing body cameras in all interactions as a safeguard, with the resultant decline in the value of privacy. And that in turn leads me to think about how one would write a murder mystery in that kind of setting. I don’t know any more than the average WIRED reader about wearable tech, but I can make up stuff, and that’s what the audience pays to hear.

    • If the panels on topic X mainly discussed how X might be used when writing a story, I’d agree with you. But they mostly talk just about X.

      • It sometimes adds to a discussion of X to include individuals particularly gifted with imaginative capacity.

  • arch1

    “You can often learn about your own world by first understanding some other world, and then asking if your world is more like that other world than you had realized.”

    Amen, brother. Douglas Hofstadter’s 1985 essay “A Person Paper on Purity in Language” used this approach very effectively to shed light on the sexism of typical English usage, vividly depicting a parallel world whose English is black/white racist to roughly the same degree that *this* world’s English was (and largely still is) female/male sexist.

  • vivaran

    There does need to be a balance between how much one understands about a particular topic and the implications of the interpretation drawn from that understanding. Sometimes even hard-core researchers make the mistake of communicating findings that are based purely on their interpretation and less on the actual results of their experiments. On the other hand, Jules Verne wrote about Captain Nemo and his submarine in 1870, far before its time. I am just relaying my fears about interpretations and how thoughtful one must be, whether an expert or not.
