A World Without Lies?

Among the many provocative answers to this year's Edge question, "What Will Change Everything?" my favorite was Sam Harris' "True Lie Detection":

Deception commends itself, perhaps even above violence, as the principal enemy of human cooperation. Imagine how our world would change if, when the truth really mattered, it became impossible to lie. … Reliable lie-detection will be much easier to achieve than accurate mind reading. … We will almost surely be able to determine, to a moral certainty, whether a person is representing his thoughts, memories, and perceptions honestly in conversation. Compared to many of the other hypothetical breakthroughs put forward in response to this year's Edge question, the development of a true lie-detector would represent a very modest advance over what is currently possible through neuroimaging. …

The greatest transformation of our society will occur only once lie-detectors become both affordable and unobtrusive. Rather than spirit criminal defendants and hedge-fund managers off to the lab for a disconcerting hour of brain scanning, there may come a time when every courtroom or boardroom will have the requisite technology discreetly concealed behind its wood paneling. Thereafter, civilized people would share a common presumption: that wherever important conversations are held, the truthfulness of all participants will be monitored. Of course, no technology is ever perfect. Once we have a proper lie-detector in hand, we will suffer the caprice of its positive and negative errors. … But some rate of error will, in the end, be judged acceptable.
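
Harris's aside about positive and negative errors is worth making concrete. Here is a minimal Bayes-rule sketch; the base rate and accuracy figures are illustrative assumptions of mine, not numbers from his essay. Even a quite accurate detector, aimed at settings where lies are rare, would flag mostly honest statements.

```python
# How damning is a positive reading? The 1% base rate and the 95%/98%
# accuracy figures below are illustrative assumptions, not from Harris.

def posterior_lie(base_rate, sensitivity, specificity):
    """P(lying | detector flags the statement), by Bayes' rule."""
    true_pos = base_rate * sensitivity               # liars correctly flagged
    false_pos = (1 - base_rate) * (1 - specificity)  # honest speakers flagged
    return true_pos / (true_pos + false_pos)

# If 1% of monitored statements are lies, a detector catching 95% of lies
# and clearing 98% of honest statements still flags mostly innocents:
print(posterior_lie(0.01, 0.95, 0.98))  # ~0.32
```

What error rate is "judged acceptable" would thus depend heavily on how common lying is in the conversations being monitored.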

I'm more skeptical about developing unobtrusive detectors soon, but even cheap obtrusive detector-caps would change a lot; by refusing to put one on you'd be admitting you expected to lie. Of course, by asking someone to put one on you'd be admitting you don't trust them, but such admissions are already pretty common today.

I'm also more skeptical that "lying" is such a clear category of mind states. Many people seem to find it relatively easy to find a state of mind where they can "honestly" say whatever is in their interest to say, no matter what other beliefs their minds may hold. A world of cheap "lie" detectors would reward people with good self-deception abilities, and encourage others to train such abilities. Perhaps we could also develop self-deception detectors, but I expect a murky mess of an arms race to follow. Still, this is indeed one of the biggest changes likely to come in the next twenty years.

  • frelkins

    I imagine that people will simply fight back with “brain detectors,” or even “brain veils,” the way people now use the common but illegal devices to find police speeding radars. One company sells a coating that supposedly absorbs the radar.

    Lower-tech ways might include learning certain types of meditation to suppress tell-tale brain activity in the relevant regions. Researchers are apparently finding signs that meditation can change activity patterns in brain regions.

    But of those little essays, I most noted Brian Eno’s on technological optimism, and I was also interested in the one by Oliver Morton on geo-engineering, which seemed OB-related. Perhaps the one that saddened me most was by the artist Tino Sehgal, who seemed to feel that in the future it would be better if masculinity were just abolished. Why do even men hate men nowadays?

  • http://partiallystapled.com/ gxti

    Minds are a poor place to find truth. Even without self-deception it is common for people to recall things incorrectly. To put faith in a lie detector for even something as clear-cut as “did you kill Joe Smith” is to assume that the person was not mentally unstable to the point where they may legitimately not remember, which is a poor assumption considering they are accused of murder. No matter how advanced lie detectors get they will never find more than is within the mind of the person they scan, and a single mind is hardly trustworthy.

  • http://zbooks.blogspot.com Zubon

    I always say that if you can stop the stupid and the incompetent, you will eliminate 90% of the problem. I am open to the claim that this would worsen total harm by helping good liars achieve better deception. But I think we can instill the lesson that “some liars can still get through” more effectively if that is the only thing people need to worry about, rather than their also having to weigh many false positives.

  • http://profile.typekey.com/sentience/ Eliezer Yudkowsky

    This technology really worries me. It’s a way to increase the power of the powerful over the powerless. Who wants to bet that it gets used to interrogate suspects, but somehow, politicians never seem to go under the veridicator? That employees somehow get interrogated more often about whether they’re working at all hours, than CEOs get interrogated about whether they really have the company’s best interests at heart?

    And it might not be a good thing if politicians did go under the veridicator. We could end up with real religious fanatics in charge, instead of intellectuals faking it.

    And China would just be screwed. Forever, more or less.

    You can see how this could turn out well, but human history does not give me reason to be optimistic.

    Self-deception detectors would be a much more hopeful technology, if they could be built.

  • Carl Shulman

    “That employees somehow get interrogated more often about whether they’re working at all hours, than CEOs get interrogated about whether they really have the company’s best interests at heart?”

    CEOs are already legally required to make various attestations with criminal penalties for falsehood, and a federal requirement (with an amnesty for past wrongdoing) for lie-detector verification of Sarbanes-Oxley statements would benefit from the same political forces that created the requirements in the first place. On the market side, making declarations under a lie-detector should slash financing costs and raise stock prices, so activist shareholders (VCs, private equity, Icahn, etc) would be able to make massive amounts of money by requiring that CEOs be tested. With the pooling equilibrium broken, holdout companies would be subject to extreme suspicion.
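
    A toy sketch of that unraveling logic (the CEO “honesty” values below are invented): if investors price untested firms at the average of whoever remains untested, every above-average holdout prefers to test, and the pool unravels down to the worst type.

    ```python
    # Unraveling of a pooling equilibrium once verification is available.
    # The ten "honesty" types are invented for illustration.

    honesty = [round(0.1 * i, 1) for i in range(1, 11)]

    untested = sorted(honesty)
    while untested:
        pooled_price = sum(untested) / len(untested)  # market's guess for holdouts
        leavers = [h for h in untested if h > pooled_price]  # they test and leave
        if not leavers:
            break
        untested = [h for h in untested if h <= pooled_price]

    print(untested)  # [0.1]: only the single worst type can stay untested
    ```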

    “Who wants to bet that it gets used to interrogate suspects, but somehow, politicians never seem to go under the veridicator?”

    Existing politicians would try to say that veridicator testing was an insult to their integrity, or that private providers could not be trusted to be unbiased. On the other hand, any candidate losing a race would have strong incentive to use the verifier on at least some self-selected statements, and things could snowball rapidly from there.

    “And it might not be a good thing if politicians did go under the veridicator. We could end up with real religious fanatics in charge, instead of intellectuals faking it.”

    Yes, ‘forbidden truths’ could dangerously penalize rational leaders. On the other hand, many ‘forbidden truths’ would be revealed as such by experts who could no longer pretend to believe approved lies, and moderate voters would no longer be able to dismiss the horrific promises of extremists as ‘cheap talk.’

    “And China would just be screwed. Forever, more or less.”

    Lie detectors also enable effective verification systems for international arms control and other treaties. A ruling elite secured by lie detectors might be capable of doing whatever it wants to the rest of the population without fear of successful revolt, but it could also be trusted to honor bargains to treat subjects well, provided the bargain was backed by lie detection. On the other hand, those elites would have to be paid, and they would have effectively appropriated most of the potential wealth of their subjects.

    “Self-deception detectors would be a much more hopeful technology, if they could be built.”

    Yes, although the right sets of detailed questions on a lie-detector (questions about time spent studying an issue, calibration statistics from past mistakes, etc) might go rather far in this direction.
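
    As a sketch of what such calibration statistics might look like (the forecast record below is invented), one could bucket a speaker’s past stated confidences against outcomes:

    ```python
    # Calibration check: did "90% sure" claims come true 90% of the time?
    # The forecast record is invented for illustration.
    from collections import defaultdict

    forecasts = [(0.9, True), (0.9, True), (0.9, False),
                 (0.6, True), (0.6, False), (0.6, False)]

    buckets = defaultdict(list)
    for confidence, came_true in forecasts:
        buckets[confidence].append(came_true)

    for confidence, outcomes in sorted(buckets.items()):
        hit_rate = sum(outcomes) / len(outcomes)
        print(f"claimed {confidence:.0%}, actual {hit_rate:.0%}")
    # claimed 60%, actual 33%; claimed 90%, actual 67% -> overconfident
    ```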

  • http://timtyler.org/ Tim Tyler

    There’s a well-known book about this very topic: http://en.wikipedia.org/wiki/The_Truth_Machine

  • http://www.mccaughan.org.uk/g/ g

    Further to Eliezer’s comments: one consequence of the fact (pointed out by Robin) that “lying” isn’t likely to be a simple thing neurologically is that, to function effectively as a lie detector, a device might need to do much more. Everything Eliezer said applies at double strength to readily available mind-reading technology. (And Carl Shulman’s hopes seem too optimistic to me, though I’m going on pure prejudice here.)

    Like Zubon, I wonder what will happen as people adapt their lying (using these devices) to make it harder to spot. Will that lead to an increase in self-deception?

  • http://www.mccaughan.org.uk/g/ g

    Er, for “will” read “would”; I’m not taking any particular position as to whether such devices will actually become available. (Though if I had to guess I’d guess that they will.)

  • http://profile.typekey.com/robinhanson/ Robin Hanson

    Tim, yes that was a good book. I vaguely recall there was another novel on the subject as well.

    On average I’m guessing cheap lie detectors would be a good thing, but with Eliezer I worry most about getting sincere politicians.

  • Carl Shulman

    “(And Carl Shulman’s hopes seem too optimistic to me, though I’m going on pure prejudice here.)”

    I raised a set of optimistic-seeming arguments to round out Eliezer’s statement of risks, but I didn’t say I was optimistic on net.

  • vroman

    Effective, ubiquitous lie detection would be a fantastic invention. Surely Robin Hanson, as an economist, can appreciate the high cost of information gathering as one of the major barriers to trade. If I don’t trust you, and it’s expensive to acquire sufficient evidence to disprove my suspicions, then I’m just not going to do business with you, even if in reality you are perfectly honest. Any technology that lowers transaction costs in this manner will increase trade, and thus raise the standard of living.
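
    A toy model of this transaction-cost point (all numbers are illustrative assumptions): a mutually beneficial trade happens only if its surplus exceeds the cost of verifying the counterparty’s honesty, so cheaper verification directly increases how many trades get done.

    ```python
    # Cheaper verification -> more of the potential gains from trade realized.
    # The surplus distribution and costs are invented for illustration.
    import random

    random.seed(0)
    surpluses = [random.uniform(0, 100) for _ in range(10_000)]  # gains per trade

    def trades_executed(verification_cost):
        # A trade happens only if the gains exceed the cost of checking honesty.
        return sum(1 for s in surpluses if s > verification_cost)

    for cost in (50, 10, 1):  # cheap lie detectors push this cost down
        print(cost, trades_executed(cost))
    # roughly 5000, 9000, and 9900 of the 10000 potential trades happen
    ```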

  • ad

    Having recently reread 1984, I am sure that the Party would be delighted.

  • billswift

    People who whine about not being trusted shouldn’t be trusted. I know I’m trustworthy, so it doesn’t bother me if people I deal with take precautions. I also know most people are not trustworthy most of the time. Even when they’re not consciously lying they are self-deceiving – a lie detector that doesn’t detect self-deception is only doing half (or less) the job. Also, as several commenters have pointed out, who controls the technology? This would be another problem like surveillance technology, and the only potentially stable long-term situation I can see there, despite my dislike of the solution, is Brin’s “Transparent Society”: make sure no one has anywhere near a monopoly.

    The same is the only realistic “solution” I can see to any potentially threatening technology. I do not believe provably “Friendly” AI is possible.

  • Tom Breton

    even cheap obtrusive detector-caps would change a lot; by refusing to put one on you’d be admitting you expected to lie.

    Yes, if the only thing it did was give a yes/no vote on whether you’re lying. [1]

    But it seems likely that any such device would pick up a lot more information than yes/no lying. Even if it couldn’t pick up any real details, it might pick up broader signals: anxiety, surprise, inattention, that sort of thing. Detecting those is already possible today. And there seems no reason that detecting them is contingent on you answering a question. You’d give info away even if you “took the Fifth”.

    Your interrogator could promise that the truth hat wouldn’t report those things. But how do you know it’s not secretly reporting them? You could, I suppose, have your interrogator wear the truth hat and ask. He or she would probably respond with a prepared legalistic formulation that didn’t quite promise anything. At the very least, you’d have to work to close all the loopholes. That also leaves you vulnerable to someone who dupes the interrogator, and to any freak interrogator who can fool the truth hat.

    [1] …and you knew this and the other party knew you knew it, and you were prepared to refuse to answer out-of-bounds questions. I think the existence of the truth hat would turn the encounter into something very like a deposition.

  • Tom Breton

    Addendum: That all assumes that the interrogator doesn’t just surreptitiously turn the truth hat off when she’s wearing it and turn it back on when you are.

  • Snark

    Cheap obtrusive detector-caps would rock the foundations of dating.

  • http://cabalamat.wordpress.com/ Cabalamat

    billswift: “I also know most people are not trustworthy most of the time.”

    Do you? I find I can trust most of the people I know personally most of the time. If you find you can’t, maybe you just hang out with the wrong people.

  • http://profile.typekey.com/robinhanson/ Robin Hanson

    George, yes, claims are likely to arrive well ahead of real useful tech.

    Tom, yes, we’d want standardized tech that is easy to check for tampering.

    Bill and cabal, friends should usually be trustworthy on things you can easily check, but should be tempted to be otherwise when checking is rare or takes a long time. On those topics, it should be hard to tell if they are trustworthy or not.

  • frelkins

    @Bill, Cabal, & Robin

    friends should usually be trustworthy on things you can easily check, but should be tempted to be otherwise when checking is rare or takes a long time. On those topics, it should be hard to tell if they are trustworthy or not.

    I have thought about trust for a long time and have read several books on trust (like Fukuyama) and also game theory. Indeed, Chris Hibbert and I have talked about this.

    Credibility can be built in eight ways, I posit: establish a reputation for trustworthiness & then use it; develop your reputation through teamwork with the other party; rely on a mutually trusted intermediary to vouch for your reputation; move in small steps at the start, then larger and larger ones to cement your reputation; destroy your own retreats; post collateral; voluntarily deprive yourself of valuable information first by sharing information; when all else fails, write contracts.

    High trust usually returns high trust, at least in high trust societies, which the US isn’t anymore. Still, I always start with high trust. I found this highly effective in the UK, for example, where there is still the idea of real manners. Publicans will take the opportunity to be a “true English gentleman” when offered.

    I have personally found that by being open to people first, they are more likely to respond “better” because they want to think of themselves as “nice.” Then, when they are nice, praise them for their niceness. Tell them they are more trustworthy than others. This accords with what most people already think of themselves (we’re all above average, right? the Lake Wobegon bias).

    By confirming their own beliefs, you make them like you even more; they offer you increased reliability because they are loath to destroy their own sense of being nice and above average. Reinforce by occasionally telling them stories of your other horrible acquaintances who aren’t as nice and trustworthy as they are – this raises their sense of relative status, which is how people usually judge themselves. Rinse and repeat.

    Once we understand what our common biases are, we can use some of them successfully to help ourselves and other people too! By the way, I have often wondered if one of the many purposes of gossip is to serve as a tool of the kind of checking Robin mentions. I may not personally be able to check X, but someone on the grapevine may have, so we all spread information for verification purposes?

  • billswift

    I wasn’t talking about friends. I usually start out trusting people, within reasonable bounds, until they demonstrate where they can’t be trusted. And I don’t “hang out” with anyone, I’m just talking about people I’ve seen and have worked with. Gossip is only as trustworthy as the **least** trustworthy person who passed it on. Something that is not trustworthy is not necessarily inaccurate – it means that you cannot **rely** on either its accuracy or inaccuracy, its truth or falsity.

  • loqi

    by refusing to put one on you’d be admitting you expected to lie

    Aren’t you dismissing the full game-theoretic possibilities here? Those of us who cooperate against such a technology will always refuse to don the truth hat. By our own action, we reduce the certainty of such a conclusion in proportion to the number of us who cooperate. I am my own existence proof of someone who would unconditionally refuse the hat (subject, of course, to sufficiently extreme coercion), so I expect at least a few others to cooperate.
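
    A quick Bayes sketch of this point (all rates invented for illustration): refusal rapidly loses its evidential force as the bloc of principled refusers grows.

    ```python
    # P(intends to lie | refuses the hat), when some honest people also refuse.
    # All rates below are invented for illustration.

    def p_liar_given_refusal(p_liar, p_refuse_if_liar, p_refuse_on_principle):
        liar_refusals = p_liar * p_refuse_if_liar
        principled_refusals = (1 - p_liar) * p_refuse_on_principle
        return liar_refusals / (liar_refusals + principled_refusals)

    # Suppose 10% of people intend to lie and 95% of them refuse the hat:
    for principled in (0.01, 0.10, 0.50):  # growing bloc of principled refusers
        print(principled, round(p_liar_given_refusal(0.10, 0.95, principled), 2))
    # 0.01 -> 0.91, 0.1 -> 0.51, 0.5 -> 0.17: refusal becomes weak evidence
    ```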

  • http://antipolygraph.org George Maschke

    CBS 60 Minutes ran a relevant story this past Sunday (4 January): “How Technology May Soon ‘Read’ Your Mind.”

  • Aaron

    Eliezer,

    I’m at a loss as to why you see this as troubling. Aren’t deep brain scans precisely what’s needed for full-brain emulation? The only troubling aspect I see is our biases in interpreting the data into clear-cut categories, like Robin was saying.

    Aaron

  • Rich Rostrom

    This is indeed one of the possible “big inventions” of the next 25 years or so.

    I don’t know that it would generate quite as much trust as suggested. Politicians, for instance, are very good at evading inconvenient questions. In many business/financial situations, correct but deceptive answers are possible.

    But it would have very radical effects on the criminal justice system. It would make it far easier to solve the great majority of crimes, with benefits to the innocent suspect as well as the authorities.

    Even a fairly expensive hat-of-truth would pay for itself.

    Of course there are questions about what it could actually measure – many people self-deceive, or are simply mistaken. Children as witnesses are always problematic, even when it is assumed they aren’t lying.

    Still, a device that could detect conscious deception would be worth a lot.

  • Cyan

    In many business/financial situations, correct but deceptive answers are possible.

    It seems likely that any lie detection device based on brain states will actually detect intent to deceive, and not lies per se.

  • http://blog.greenideas.com botogol

    Robin, I remember that last year you posted up your own answer to that year’s Edge question. What would be your answer to this one?

  • Cameron Taylor

    Another arms race that began thousands of generations ago makes a move from biological to technological. With all the effort we humans already spend lying to ourselves to circumvent intuitive detection measures, it is amazing we get anything done as it is.
