27 Comments

Another arms race that began thousands of generations ago is moving from the biological to the technological. With all the effort humans already spend lying to ourselves to circumvent our intuitive detection measures, it is amazing we get anything done as it is.

Robin, I remember that last year you posted your own answer to that year's Edge question. What would be your answer to this one?

In many business/financial situations, correct but deceptive answers are possible.

It seems likely that any lie detection device based on brain states will actually detect intent to deceive, and not lies per se.

This is indeed one of the possible "big inventions" of the next 25 years or so.

I don't know that it would generate quite as much trust as suggested. Politicians, for instance, are very good at evading inconvenient questions. In many business/financial situations, correct but deceptive answers are possible.

But it would have very radical effects on the criminal justice system. It would make it far easier to solve the great majority of crimes, with benefits to the innocent suspect as well as the authorities.

Even a fairly expensive hat-of-truth would pay for itself.

Of course there are questions about what it could actually measure - many people self-deceive, or are simply mistaken. Children as witnesses are always problematic, even when it is assumed they aren't lying.

Still, a device that could detect conscious deception would be worth a lot.

Eliezer,

I'm at a loss as to why you see this as troubling. Aren't deep brain scans precisely what's needed for full-brain emulation? The only troubling aspect I see is our biases in interpreting the data into clear-cut categories, as Robin was saying.

Aaron

CBS's 60 Minutes ran a relevant story this past Sunday (4 January). See "How Technology May Soon 'Read' Your Mind."

by refusing to put one on you'd be admitting you expected to lie

Aren't you dismissing the full game-theoretic possibilities here? Those of us who cooperate against such a technology will always refuse to don the truth hat. By our own action, we reduce the certainty of such a conclusion in proportion to the number of us who cooperate. I am my own existence proof of someone who would unconditionally refuse the hat (subject, of course, to sufficiently extreme coercion), so I expect at least a few others to cooperate.
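The inference that refusal implies intent to lie can be made precise with Bayes' rule, and the weakening effect of principled refusers falls out directly. A minimal sketch in Python, with entirely hypothetical probabilities:

```python
def p_liar_given_refusal(p_liar, p_refuse_if_liar, p_refuse_if_honest):
    """P(intends to lie | refuses the hat), via Bayes' rule.

    p_liar: prior probability the person intends to lie
    p_refuse_if_liar: chance a would-be liar refuses the hat
    p_refuse_if_honest: chance an honest, principled refuser refuses anyway
    All numbers used here are hypothetical illustrations.
    """
    p_refuse = p_liar * p_refuse_if_liar + (1.0 - p_liar) * p_refuse_if_honest
    return p_liar * p_refuse_if_liar / p_refuse

# With almost no principled refusers, refusal is damning evidence;
# as more honest people refuse on principle, the inference weakens.
print(p_liar_given_refusal(0.2, 0.9, 0.01))  # ~0.96
print(p_liar_given_refusal(0.2, 0.9, 0.30))  # ~0.43
```

The more honest people who precommit to refusing, the larger `p_refuse_if_honest` becomes and the less a refusal tells the interrogator.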

I wasn't talking about friends. I usually start out trusting people, within reasonable bounds, until they demonstrate where they can't be trusted. And I don't "hang out" with anyone, I'm just talking about people I've seen and have worked with. Gossip is only as trustworthy as the **least** trustworthy person who passed it on. Something that is not trustworthy is not necessarily inaccurate - it means that you cannot **rely** on either its accuracy or inaccuracy, its truth or falsity.
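The "least trustworthy person who passed it on" rule amounts to taking a minimum over the relay chain; a toy sketch in Python, with made-up trust scores in [0, 1]:

```python
def chain_trust(relayer_scores):
    """Trust you can place in gossip is bounded by the least
    trustworthy person who relayed it (scores are in [0, 1])."""
    if not relayer_scores:
        raise ValueError("need at least one relayer")
    return min(relayer_scores)

# A single careless relayer caps the whole chain.
print(chain_trust([0.9, 0.95, 0.4]))  # 0.4
```

A low score here does not mean the gossip is false, only that you cannot rely on it either way, which matches the distinction drawn above.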

@Bill, Cabal, & Robin

"friends should usually be trustworthy on things you can easily check, but should be tempted to be otherwise when checking is rare or takes a long time. On those topics, it should be hard to tell if they are trustworthy or not."

I have thought about trust for a long time and have read several books on trust (like Fukuyama) and also game theory. Indeed, Chris Hibbert and I have talked about this.

Credibility can be built in 9 ways, I posit: establish a reputation for trustworthiness and then use it; develop your reputation through teamwork with the other party; rely on a mutually trusted intermediary to vouch for your reputation; move in small steps at the start, then larger and larger ones to cement your reputation; destroy your own retreats; post collateral; voluntarily deprive yourself of an advantage by sharing valuable information first; and, when all else fails, write contracts.

High trust usually returns high trust, at least in high-trust societies, which the US isn't anymore. Still, I always start with high trust. I found this highly effective in the UK, for example, where there is still the idea of real manners. Publicans will take the opportunity to be a "true English gentleman" when offered.

I have personally found that when I am open to people first, they are more likely to respond "better" because they want to think of themselves as "nice." Then, when they are nice, praise them for their niceness. Tell them they are more trustworthy than others. This accords with what most people already think of themselves (we're all above average, right? the Lake Wobegon bias).

When you confirm their beliefs about themselves, they like you even more and offer you increased reliability, because they are loath to destroy their own sense of being nice and above average. Reinforce this by occasionally telling them stories of your other, horrible acquaintances who aren't as nice and trustworthy as they are - this raises their sense of relative status, which is how people usually judge themselves. Rinse and repeat.

Once we understand what our common biases are, we can use some of them successfully to help ourselves and other people too! By the way, I have often wondered if one of the many purposes of gossip is to serve as a tool of the kind of checking Robin mentions. I may not personally be able to check X, but someone on the grapevine may have, so we all spread information for verification purposes?

George, yes, claims are likely to arrive well ahead of real useful tech.

Tom, yes, we'd want standardized tech that is easy to check for tampering.

Bill and Cabal, friends should usually be trustworthy on things you can easily check, but should be tempted to be otherwise when checking is rare or takes a long time. On those topics, it should be hard to tell if they are trustworthy or not.

billswift: "I also know most people are not trustworthy most of the time."

Do you? I find I can trust most of the people I know personally most of the time. If you find you can't, maybe you just hang out with the wrong people.

Cheap obtrusive detector-caps would rock the foundations of dating.

Addendum: That all assumes that the interrogator doesn't just surreptitiously turn the truth hat off when she's wearing it and turn it back on when you are.

even cheap obtrusive detector-caps would change a lot; by refusing to put one on you'd be admitting you expected to lie.

Yes, if the only thing it did was give a yes/no vote on whether you're lying. [1]

But it seems likely that any such device would pick up a lot more information than a yes/no lie verdict. Even if it couldn't pick up any real details, it might pick up broader signals: anxiety, surprise, inattention, that sort of thing. Detecting those is already possible today. And there seems no reason that detecting them is contingent on your answering a question. You'd give info away even if you "took the Fifth."

Your interrogator could promise that the truth hat wouldn't report those things. But how do you know it's not secretly reporting them? You could, I suppose, have your interrogator wear the truth hat and ask. He or she would probably respond with a prepared legalistic formulation that didn't quite promise anything. At the very least, you'd have to work to close all the loopholes. That also leaves you vulnerable to someone who dupes the interrogator, and to any freak interrogator who can fool the truth hat.

[1] ...and you knew this and the other party knew you knew it, and you were prepared to refuse to answer out-of-bounds questions. I think the existence of the truth hat would turn the encounter into something very like a deposition.

People who whine about not being trusted shouldn't be trusted. I know I'm trustworthy, so it doesn't bother me if people I deal with take precautions. I also know most people are not trustworthy most of the time. Even when they're not consciously lying, they are self-deceiving - a lie detector that doesn't detect self-deception is doing only half (or less) of the job. Also, as several commenters have pointed out, who controls the technology? This would be another problem like surveillance technology, and the only potentially stable long-term situation I can see there, despite my dislike of the solution, is Brin's "Transparent Society": make sure no one has anywhere near a monopoly.

The same is the only realistic "solution" I can see to any potentially threatening technology. I do not believe provably "Friendly" AI is possible.

Having recently reread 1984, I am sure that the Party would be delighted.
