How do people decide what they decide? Usually the answer you get is something like this: "People who come to believe what we believe are smart; they gather information from the best sources, weigh the evidence, and come to a rational decision." The people who don't believe what we believe, on the other hand, are dim bulbs who rely on simplistic heuristics, like deferring to the pronouncements of authorities or choosing whichever answer is most comforting.
But what if competing groups of partisans in a debate all "get" cognition? In other words, what if they all "engage in higher-level forms of reasoning" yet somehow still come to opposite conclusions? How could that be? It turns out that conflicting decisions about, say, gun laws may arise not because one side is less committed to reason than the other, but because both sides are equally skilled at a particular sort of problem solving: quickly fitting all the factoids science is constantly raining down on us into their personal narratives about how it all works. That, anyway, is our take on "Ideology, Motivated Reasoning, and Cognitive Reflection: An Experimental Study" - an excellent paper through and through.
Our experience with juries tells us that in the course of even a long trial you're not likely to change most people's minds; but you can get jurors to select a different (and hopefully more helpful) narrative into which they'll fit the facts presented at trial - the real art of persuasion, IOHO. And when it comes to bright jurors, attempting to change their minds is a hopeless endeavor. The great physicist Max Planck nailed it when he said, "A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it."