{"id":3084,"date":"2006-04-14T08:40:03","date_gmt":"2006-04-14T13:40:03","guid":{"rendered":"http:\/\/timesandseasons.org\/?p=3084"},"modified":"2006-04-14T08:42:12","modified_gmt":"2006-04-14T13:42:12","slug":"agreeing-to-disagree","status":"publish","type":"post","link":"https:\/\/timesandseasons.org\/index.php\/2006\/04\/agreeing-to-disagree\/","title":{"rendered":"Agreeing to Disagree"},"content":{"rendered":"<p>Robert Aumann,  a winner of the 2005 Nobel Prize for Economics, once published a paper in <span style=\"font-style:italic;\">The Annals of Statistics<\/span> titled &#8220;Agreeing to Disagree.&#8221; The basic idea of the paper is that two rational people should, by sharing their beliefs with each other, come to a common understanding about what is likely to be true.<br \/>\n<!--more--><\/p>\n<p>The argument is given as a mathematical theorem, which I will briefly describe.  (Impatient readers may skip to the next paragraph.) Economists and others who study decision theory generally model beliefs as a probability distribution.  This distribution describes the subjective (&#8220;Bayesian&#8221;) likelihood that I place on something being true or false.  For example, I might believe with 75% probability that it will rain tomorrow, or I might believe with 90% probability that the woman sitting next to me has name that starts with a &#8220;C.&#8221;  When I am confronted with new evidence (I hear a weather forecast, the woman introduces herself as Susan) I update my beliefs, and the new probability distribution is called the &#8220;posterior.&#8221;  The situation is modeled as follows: assume there are two people who have &#8220;common priors.&#8221;  This means, roughly, that given the same total body of evidence they would come to the same conclusions.  Furthermore assume that each is able to observe some private information that is not observable to the other person.  This should cause each to update their beliefs to reflect the new information.  
Now assume that the two people can get together and express their beliefs (&#8220;posteriors&#8221;) to each other.  Then they should each update their beliefs to reflect the other person&#8217;s beliefs, until they have exactly the same beliefs.  Sharing their beliefs is all that is required&#8212;there is no need for them to describe their private evidence.<\/p>\n<p>The idea is easy to understand: my beliefs summarize the weight of all the evidence that I&#8217;ve observed in my life.  If I want to know the truth, I should place weight not just on the evidence I have personally observed, but on evidence that <em>you<\/em> have observed as well.  In fact, if I believe that you are as smart and honest as I am, then I should give your evidence as much weight as my own.  Since your beliefs summarize all your evidence, they allow me to update my beliefs to reflect your private information.  Of course, you should do the same, until we believe the same thing.  (Note that the theory applies only to objective truths, and not to matters of taste&#8211;there is nothing problematic about agreeing that you like chocolate and I like vanilla.)<\/p>\n<p>Why might two truth-seeking people continue to disagree?  One possible reason for disagreement is that they doubt each other&#8217;s honesty or rationality.  I will not believe the evidence from your &#8220;posterior&#8221; if I think you have been faulty in evaluating your private evidence, or if I think you are lying about your beliefs, even to yourself.  Another possible reason is that they don&#8217;t have &#8220;common priors,&#8221; or in other words, that they would not agree even if they saw all the same evidence.  But again, this seems to require a judgment by me that your thinking is somehow less valid than mine.  In the end, it is difficult for good-faith disagreement to persist.  If it persists, at least one of the parties is essentially judging the other to be on some level dishonest, deluded, or dumb.  
In an <a href=\"http:\/\/hanson.gmu.edu\/deceive.pdf\">unpublished paper<\/a>, blogger-economists Tyler Cowen and Robin Hansen consider this question and conclude that &#8220;typical disagreements are dishonest.&#8221;<\/p>\n<p>It is straightforward to apply this idea to religion, especially religion as it is understood by Mormons.  First, we believe that a primary goal of true religion is to find the truth about objective facts: about the will of God, the authority of the church, the right way to get baptized, etc.  Second, we believe that the most important evidences about these truths consist of private information: personal spiritual experiences and feelings.  Other people cannot verify our evidences, and in fact we might not even be able to accurately describe them.  Some maintain that we shouldn&#8217;t even <em>try<\/em> to describe these experiences, because they&#8217;re &#8220;too sacred.&#8221;  Instead we resort to describing our posteriors, or, as we call it, bearing our testimonies.*<\/p>\n<p>If others find us trustworthy, this testimony should be enough to convince them to adopt our beliefs.  Indeed this process is explicitly approved by the Lord in D&#038;C 46:<\/p>\n<blockquote><p>13 To some it is given by the Holy Ghost to know that Jesus Christ is the Son of God, and that he was crucified for the sins of the world.<br \/>\n14 To others it is given to believe on their words, that they also might have eternal life if they continue faithful.<\/p><\/blockquote>\n<p>Of course, this cuts both ways.  We live in a world with many who do not, in the end, &#8220;believe on our words.&#8221;  What are we to make of this?  Many of these others seem just as rational and well adjusted as we do, yet there are many who have no religious beliefs, or, more problematically, strong convictions that are much different than ours.  How should we account for these beliefs?  
Do we need to update our own posteriors?<\/p>\n<p>One popular solution is to decide that these non-believers are simply dishonest or self-deluding, perhaps because they are involved in sin and ensnared by the wiles of the evil one.  Or, we could decide that the non-believers don&#8217;t listen because they erroneously think <em>we&#8217;re<\/em> deluded or stupid.  These solutions resolve the epistemological dilemma, but they also make it hard for us to have good-faith relationships with the non-believers.  Our disagreement about these fundamental matters reveals an unresolved underlying mistrust, no matter how friendly a face we try to put on it.<\/p>\n<p>Another solution that is popular among the more liberal types is to decide that religion isn&#8217;t so much about objective facts, but it&#8217;s more like chocolate and vanilla&#8230;if you find beliefs that work for you, great!  But this seems to deny some core aspects of our faith, and so many find it unworkable.<\/p>\n<p>In the end, I think this theory helps explain why discussions like those in the bloggernacle can so easily become unpleasant.  &#8220;Agreeing to disagree&#8221; sounds nice, but it is ultimately more problematic than we realize.  
Perhaps we&#8217;d be wise to take Cowen and Hanson&#8217;s advice, and focus ourselves,  considering how we can &#8220;become more honest,&#8221; learning to &#8220;look for signs of self deception,&#8221; and being &#8220;a bit more wary of our own judgments when we disagree.&#8221;<\/p>\n<p>* I&#8217;d love to see someone go to the podium on fast Sunday and say &#8220;I&#8217;d like to describe my posterior, that I believe with probability approaching one that the church is true&#8230;.&#8221;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Robert Aumann, a winner of the 2005 Nobel Prize for Economics, once published a paper in The Annals of Statistics titled &#8220;Agreeing to Disagree.&#8221; The basic idea of the paper is that two rational people should, by sharing their beliefs with each other, come to a common understanding about what is likely to be true.<\/p>\n","protected":false},"author":82,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[1],"tags":[],"class_list":["post-3084","post","type-post","status-publish","format-standard","hentry","category-corn"],"jetpack_sharing_enabled":true,"jetpack_featured_media_url":"","_links":{"self":[{"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/posts\/3084","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/users\/82"}],"replies":[{"embeddable":true,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/comments?post=3084"}],"version-history":[{"count":0,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/posts\/3084\/revisions"}],"wp:attachment":[{"href"
:"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/media?parent=3084"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/categories?post=3084"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/tags?post=3084"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}