Moral Psychology and Innate Lying/Honesty

We have based our society on the assumption that deciding to lie or to tell the truth is within our conscious control. But […] this assumption may be flawed and […] honesty may instead be the result of controlling a desire to lie (a conscious process) or of not feeling the temptation to lie in the first place (an automatic process).

An intriguing idea, and one with far-reaching consequences, especially given that this assumption is what our entire judicial system is based on. Can someone fairly be punished for a genetic trait (innate lying)?

So is the desire to lie (or, conversely, the desire to be honest) innate, and if so, what does this mean?

What the researchers found is that honesty is an automatic process, but only for some people. Comparing scans from tests with and without the opportunity to cheat, the scientists found that for honest subjects, deciding to be honest took no extra brain activity. But for others, the dishonest group, both deciding to lie and deciding to tell the truth required extra activity in the areas of the brain associated with critical thinking and self-control.

One surprising finding from this study reveals the complexity [we] face in trying to dissect moral behavior: The decision to lie for personal gain turns out to be a strikingly unemotional choice. Some moral dilemmas Greene studies, like the trolley problem, trigger emotional processing centers in our brains. In his coin toss experiment, there was no sign at all that emotions factored into a subject’s decision to lie or to tell the truth. “Moral judgment is not a single thing,” Greene concludes, suggesting that although we often lump them together under the heading of “morality,” deciding what’s right or wrong and deciding to tell the truth or to tell a lie may, in some situations, be entirely disconnected processes.

On a related note: the classic Good Samaritan study.
