16 Comments

Employers would use lie detection tests to weed out embezzlers and other 'undesirables' from their workforces, but venture capitalists and hedge funds could require that corporate management submit to lie detector tests and ask whether they had violated their fiduciary duties to their companies (pursuing mergers that offered poor returns to shareholders in order to increase their compensation, etc.), or had violated the law.

Groups like the Gates Foundation could demand truth-in-science from their grant recipients (as they now require the sharing of data) or, even better, require publication in journals using lie detection on authors and reviewers.

With respect to politicians refusing to be tested: we already see politicians bending over backwards to make statements that are not verifiably false in order to give misleading impressions, evading making statements under oath, etc. However, even if politicians initially refuse to submit to testing, lie detectors would make it much harder to dismiss honest whistleblowers.

I agree that removing political slack can be very dangerous, given the irrationality of voters, but this might be offset by lie detection enabling a much more credible presentation of expert opinion. For example, claims that Intelligent Design has any greater scientific credibility than creationism would become untenable, and claims that trade with China is economically beneficial could avert the suspicion that the speakers were being paid off by business elites.

In general, one of the most important norms of lie detector use would involve questions verifying the speaker's level of knowledge and the effort put into determining the truth on the topic.

Also, it's not like the powerful don't have other ways to oppress the unpowerful--so many other ways that one can argue that it is the vigilance of the unpowerful that will determine how much liberty they retain rather than the availability of liberty-reducing technology.

Given what I've said about privacy and the "Transparent Society", it might not be surprising that I take a very different view than Eliezer. I think more truth and accuracy will be good things. I think if the existence of such detectors is common knowledge, the demand for their application to the "powerful" will be hard for them to resist.

Carl, great points. Bringing up peer review committees is an interesting "who's watching the watchers" problem. Who actually is empirically studying and proposing optimizations of peer review and the role it plays in the production of scientific consensus knowledge?

There seems to be enough conscious deception or selective presentation in science that ferreting out the actual opinions of scientists could be very valuable, especially when combined with inquiries to clarify expertise on a subject. Peer reviewers and tenure committee members could be asked whether they held any personal grudges or other such motivations with respect to key decisions. Researchers could be asked whether they believed their work was more beneficial than an alternative, by a particular metric (e.g. quality-adjusted years of life saved), and about the strength of the evidence in support of that view.

But think of the increased power to enforce bad laws, perform loyalty tests, or do other unpleasant things.

This is probably only so bad if the lie detector requires the testee to submit, thus encouraging its use by centralized powers (police, employers). If everyone had them, and could test everyone they heard say something, it would probably be a good thing.

And self-deception detectors would be of indescribable worth.

Robin writes "Eliezer, economic analysis suggests it is nowhere near as simple as benefiting the powerful at the expense of the powerless."

I agree strongly. In my opinion as a reader, it's actually a bit of a slight to this blog's readership to offer analysis as simplistic as "A lie detector is a tool for increasing the power of the powerful over the powerless." I think we're looking for commentary that is as free of propaganda, and as rich in empiricism and critical analysis, as possible.

Also, it sounds strange to portray police having lie detectors as a dystopic thing. The police already have access to a comprehensive arsenal of techniques they could abuse, from fingerprinting and DNA analysis to house searches and phone taps. Adding lie detectors to the mix would presumably just add one more technique that could only be used with a warrant. And imagine if criminal cases could be settled with a simple, lie detector-checked "are you guilty or innocent" question, saving countless hours of investigative work and an untold number of wrongly sentenced innocents. Obviously there'd need to be very tight measures to make sure the technicians didn't distort the information, like multiple independent groups supervising and carrying out the scans, but it would be worth it if it could be pulled off.

Even if the lie detector were reserved for only serious crimes, as it maybe should be, it sounds more like a utopia than a dystopia to me: bring everybody even remotely suspected of the crime in for a five-minute scan, ask them if they did it or if they know who did it, and if they truthfully answer no to both questions, they're cleared.

Robin, could you elaborate on that economic analysis?

LemmusLemmus, the difference is that current-day lie detectors aren't very reliable, if they're reliable at all. Eliezer was, as I understand it, talking about a hypothetical 100% accurate lie detector.

Eliezer,

there are already lie detectors, and I am not aware of any Western nations other than the US in which police use them. If I'm just ignorant, please let me know.

Nathan, I guess lies show up as some different pattern of brain activity; it need not just be a different overall effort.

mk and Eliezer, I agree lie detectors would induce more self-deception, but I'd still guess they result in more truth overall.

Eliezer, economic analysis suggests it is nowhere near as simple as benefiting the powerful at the expense of the powerless.

Robin, on what basis do you make the claim that we will probably have robust lie detectors within the next 50 years? According to the article,

"Brain-scan lie detection is predicated on the idea that lying requires more cognitive effort, and therefore more oxygenated blood, than truthtelling."

It is probably true that extemporaneous lying requires more cognitive effort and a variety of detectable physiological signs, but recalling a memorized lie is probably little different from recalling a memorized truth--at least for "talented liars" who can control their sympathetic nervous systems.

Eliezer, I'm not sold on most of your analysis, but the last sentence is fantastic. Self-deception detectors would indeed be worth their weight in platinum.

A lie detector is a tool for increasing the power of the powerful over the powerless. Yes, it could be used to scan the brains of politicians up for election, and it could be strictly forbidden for police, judicial, and employer use. But fifty bucks to a donut says it ends up never being used on politicians, and freely used by police and employers. Whereupon society becomes a nightmare, with police states beyond imagining.

And if you did use it on politicians, you'd end up with a bunch of religious fundamentalists who actually believe that crap, rather than the cynical exploiters we have now; this is probably much worse.

I cannot look upon lie detectors as a good thing. People who seek truth for legitimate reasons can already usually find it.

Self-deception detectors would be worth their weight in platinum.

At least we will know how many liars believe their lies...

I dunno, I'm not convinced. I think once you have a "brain scanning" lie detector, fooling the detector will just be a subtler-but-manageable problem. I think people will be able to train themselves to "think a certain way" as they tell a story, and that will make them look on the scan screen pretty much like a truth-teller. As humans, we have a lot more voluntary control over ourselves than we think. The ability to exercise that control will increase as positive incentives to exercise control come about.

Just a conjecture.
