Some recent research has shown that our conscious minds control less of our interactions than previously thought:
The researchers could predict how around 70% of the students would rate an instructor just by analysing the instructor’s body language in 30 seconds of soundless video. […] The researchers were able to devise an algorithm that could predict whether a call would result in a sale from only a few seconds of data. Successful operators, it turned out, spoke little and listened more. And when they did speak, their voices fluctuated strongly in amplitude and pitch, suggesting interest and responsiveness. […] In an experiment involving a 45-minute mock salary negotiation between students in a business school, [Alex] Pentland says that by combining several display signals from the first 5 minutes of the negotiation, his team could predict who would come out on top with 87% accuracy. […]
As a result of such experiments, the MIT group has identified a handful of common social signals that predict the outcomes of sales pitches, the success of bluffing in poker, even subjective judgements of trust. These signals include the ‘activity level’, effectively the fraction of time the person speaks; their ‘engagement’ or how much a person drives the conversation; and ‘mirroring’, which occurs when one participant subconsciously copies another’s prosody and gesture.
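The excerpt names the signals but not how they are computed. As a purely illustrative sketch (the MIT group's actual feature definitions are not given here), 'activity level' can be read as the fraction of conversation time a participant spends speaking, and 'mirroring' as something like the correlation between two participants' pitch contours. All data and formulas below are assumptions for demonstration:

```python
# Illustrative sketch only: these definitions are assumptions inferred from
# the excerpt, not the researchers' actual feature extraction pipeline.

def activity_level(speech_segments, total_duration):
    """Fraction of the conversation this participant spends speaking.
    speech_segments: list of (start, end) times in seconds."""
    spoken = sum(end - start for start, end in speech_segments)
    return spoken / total_duration

def mirroring_score(pitch_a, pitch_b):
    """Pearson correlation between two equal-length pitch contours,
    used here as a crude proxy for prosodic mirroring."""
    n = len(pitch_a)
    mean_a = sum(pitch_a) / n
    mean_b = sum(pitch_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(pitch_a, pitch_b))
    var_a = sum((a - mean_a) ** 2 for a in pitch_a)
    var_b = sum((b - mean_b) ** 2 for b in pitch_b)
    return cov / (var_a * var_b) ** 0.5

# Hypothetical data: one speaker talks during these (start, end) intervals
segments = [(0.0, 4.0), (10.0, 13.0)]  # 7 s of speech in a 60 s clip
print(activity_level(segments, 60.0))  # 7/60, a fairly quiet participant

# Two pitch contours sampled at matching times (arbitrary units); when one
# speaker tracks the other's prosody, the correlation approaches 1
print(mirroring_score([100, 110, 105, 120], [140, 150, 146, 158]))
```

A real system would extract these features from audio (voice activity detection, pitch tracking) rather than hand-entered segments, but the aggregate statistics fed to the predictive models are of roughly this shape.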
The original Nature article is behind a paywall, hence the link to Overcoming Bias with their larger excerpt.