In this discussion of Allegra Goodman’s novel Intuition, Barry wrote, "brilliant people are at least as capable of being dishonest as ordinary people." The novel is loosely based on some scientific fraud scandals from the 1980s. One of its central characters, a lab director, is portrayed as brilliant and a master of details, but she makes the mistake of brushing aside evidence of fraud by a postdoc in her lab. One might describe the lab director’s behavior as "soft cheating," since, given the context of the novel, she had to have been deluding herself by ignoring the clear evidence of a problem.
Anyway, the question here is: are brilliant scientists at least as likely to cheat? I have no systematic data on this and am not sure how to get it. One approach would be to randomly sample scientists, index them by some objective measure of "brilliance" (even something like asking their colleagues to rate their brilliance on a 1-10 scale and then taking averages would probably work), do a thorough audit of their work to look for fraud, and then regress Pr(fraud) on brilliance. This would work only if the prevalence of cheating were high enough. Another approach would be to do a case-control study of cheaters and non-cheaters, but the selection issues here would seem to be huge, since you’d only be counting the cheaters who got caught. Data might also be available within colleges on the GPAs and SAT scores of students who were punished for cheating; we could compare these to the scores of the general student population. And there might be useful survey data of students, asking questions like "do you cheat" and "what’s your SAT" or whatever. I guess there might even be such a survey of scientists, but it seems harder to imagine they’d admit to cheating.
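Just to make that first approach concrete, here is a minimal sketch in Python of the regression step, assuming we somehow had an audited sample with a colleague-rated brilliance score and a fraud indicator. The data here are simulated and the effect size is invented purely for illustration; no such dataset actually exists.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Hypothetical colleague ratings of "brilliance", averaged onto a 1-10 scale.
brilliance = rng.normal(5, 2, n).clip(1, 10)

# Simulated fraud indicator from a made-up logistic relationship;
# in reality this would come from the (expensive) audit of each scientist's work.
p_fraud = 1 / (1 + np.exp(-(-4 + 0.1 * brilliance)))
fraud = rng.binomial(1, p_fraud)

# Regress Pr(fraud) on brilliance with a logistic regression.
# With a low prevalence of cheating, the estimate will be very noisy,
# which is why the approach needs the cheating rate to be high enough.
X = sm.add_constant(brilliance)
fit = sm.Logit(fraud, X).fit(disp=False)
print(fit.summary())
```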
Arguments that brilliant scientists are more likely to cheat
Goodman makes the argument (through fictional example) in her book that brilliant scientists are more likely to be successful lab directors, thus under more pressure to keep getting grants (many mouths to feed), and thus susceptible to soft cheating, at least. Similarly, the cheating postdoc is described as so smart that he never had to work hard in college; he too is under high expectations and cheats partly to maintain his reputation as the golden boy. On the other side, a more ordinary "worker bee" type will not be expected to come up with a brilliant insight, and so won’t be under that pressure to cheat.
Another argument that brilliant scientists are more likely to cheat comes from some of the standard "overcoming bias" ideas: a brilliant person is more likely to have made daring, correct conjectures in the past, so when he or she comes up with a new conjecture, he or she is more likely to believe in it and then fake the data. (I’m assuming that scientific cheating of the interesting sort is along the lines of twisting the data to support a conclusion that you think is true. If you don’t even think the hypothesis is true, there’s not much point in faking the evidence, since later scientists will overturn you anyway. The motivation for cheating is that you’re sure you’re right, and so you overconfidently discard the cases that don’t support your case.)
Arguments that brilliant scientists are less likely to cheat
I’m half-convinced by the overconfidence argument above, but overall I suspect that brilliant scientists are more likely to be honest than less-brilliant scientists, at least in their own field of research. I say this partly because science is, to some extent, about communication, and transparency is helpful here. Also, as illustrated (fictionally) in Goodman’s book, fraud is often done to cover up unsuccessful research. If you’re brilliant, your research is likely to be successful: even if you don’t achieve your big goals (even brilliant people will, and perhaps should, bite off more than they can chew), you should get some productive spinoffs, and a simple cost-benefit analysis suggests that cheating stands to lose you more than you’d gain.
Conversely, for a more mediocre scientist, cheating may be a roll of the dice: if it succeeds, it can bring you up to a higher plateau, and if it fails, you won’t be much worse off than before, since you don’t have such a big reputation to lose. And if the stakes are low, the cheating might never be discovered: you get the paper, the job, tenure, or whatever; your findings are never replicated; and you move on.
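To spell out the cost-benefit intuition in the two paragraphs above, here is a toy expected-value comparison. All the probabilities and payoffs are invented for illustration and are not estimates of anything; the point is only that the same gamble can have opposite signs for scientists with different reputations at stake.

```python
# Toy expected-value comparison of cheating vs. staying honest.
# All numbers below are invented purely to illustrate the argument above.
def expected_gain_from_cheating(p_caught, gain_if_uncaught, reputation_at_stake):
    """Expected net payoff of cheating, relative to staying honest."""
    return (1 - p_caught) * gain_if_uncaught - p_caught * reputation_at_stake

# A "brilliant" scientist: modest extra gain from cheating, a lot of reputation to lose.
print(expected_gain_from_cheating(p_caught=0.2, gain_if_uncaught=1, reputation_at_stake=10))  # -1.2

# A "mediocre" scientist: a successful cheat is a big relative step up, little to lose.
print(expected_gain_from_cheating(p_caught=0.2, gain_if_uncaught=5, reputation_at_stake=1))   # 3.8
```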
Thinking of honesty as a behavior rather than a character trait
The other thing is that it might make more sense to think of honesty as a behavior rather than a character trait. I’m pretty honest (I think), but that also makes me an unpracticed liar (and, unsurprisingly, a bad liar). So the smart move for me is not to lie–again, more to lose than to gain (in my estimated expected value). But if I worked in a profession where dishonesty–or, to put it more charitably, hiding the truth–was necessary, something involving negotiation or legal maneuvers or whatever, then I’d probably get better at lying, and then maybe I’d start doing more of it in other aspects of life.
Science seems to me like an area where lying isn’t generally very helpful, so I don’t see that the best scientists would be good or practiced liars. The incentives, at least for the very best work, go the other way.
P.S. Thanks to Robin for encouraging me to present arguments on both sides of the question.
My impression is that brilliant scientists are more likely to know the difference between cheating and not cheating. The ones who then become prominent scientists are either very lucky, very socially skilled, or willing to violate their principles and cheat because they have to in order to keep up with their less brilliant colleagues who always cheat because they don't really know better.
Is any contributor willing to go on the record that they're a fairly good and practiced liar? That, for example, they lie to obfuscate privilege and inequities from which they benefit? I think there are probably real problems with non-anonymous blogging and that kind of admission.