In a recent New Yorker piece, the New Republic editor who was duped by Stephen Glass takes a skeptical look at brain-scan lie detectors. Along the way we learn:
“People hold a stereotype of the liar—as tormented, anxious, and conscience-stricken,” … In fact, many liars experience what deception researchers call “duping delight.” … When people tell complicated lies, they frequently pause longer and more often, and speak more slowly; but if the lie is simple, or highly polished, they tend to do the opposite. Clumsy deceivers are sometimes visibly agitated, but, over all, liars are less likely to blink, to move their hands and feet, or to make elaborate gestures. …
Liars are more likely to tell a story in chronological order, whereas honest people often present accounts in an improvised jumble. Similarly, … subjects who spontaneously corrected themselves, or said that there were details that they couldn’t recall, were more likely to be truthful than those who did not … People who are afraid of being disbelieved, even when they are telling the truth, may well look more nervous than people who are lying. This is bad news for the falsely accused … “Baby-faced, non-weird, and extroverted people are more likely to be judged truthful.”
I can believe that current brain scan lie detectors are over-hyped, but within the next half century we’ll probably have robust devices like this, which could revolutionize our social relations.
Employers would use lie-detection tests to weed out embezzlers and other 'undesirables' from their workforces, while venture capitalists and hedge funds could require that corporate management submit to lie detector tests, asking whether they had violated their fiduciary duties to their companies (say, by pursuing mergers that offered poor returns to shareholders in order to increase their own compensation) or had broken the law.
Groups like the Gates Foundation could demand truth-in-science from their grant recipients (as they now require the sharing of data) or, even better, require publication in journals that apply lie detection to authors and reviewers.
With respect to politicians refusing to be tested: we already see politicians bending over backwards to make statements that are misleading but not verifiably false, evading statements under oath, and so on. Even if politicians initially refused to submit to testing, however, lie detectors would make it much harder to dismiss honest whistleblowers.
I agree that removing political slack can be very dangerous, given the irrationality of voters, but this might be offset by lie detection enabling a much more credible presentation of expert opinion. For example, claims that Intelligent Design has any greater scientific credibility than creationism would become untenable, and claims that trade with China is economically beneficial could be freed from the suspicion that the speakers were being paid off by business elites.
In general, one of the most important norms of lie detector use would involve questions verifying the speaker's level of knowledge and the effort put into determining the truth on the topic.
Also, it's not as though the powerful lack other ways to oppress the unpowerful; indeed, they have so many that one can argue it is the vigilance of the unpowerful, rather than the availability of any particular liberty-reducing technology, that will determine how much liberty they retain.