{"id":48583,"date":"2024-12-28T03:00:22","date_gmt":"2024-12-28T10:00:22","guid":{"rendered":"https:\/\/timesandseasons.org\/?p=48583"},"modified":"2025-05-28T21:20:27","modified_gmt":"2025-05-29T03:20:27","slug":"science-is-approaching-the-soul","status":"publish","type":"post","link":"https:\/\/timesandseasons.org\/index.php\/2024\/12\/science-is-approaching-the-soul\/","title":{"rendered":"Science is Approaching the Soul"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-48585 aligncenter\" src=\"https:\/\/timesandseasons.org\/wp-content\/uploads\/2024\/12\/0ddebc2a-36b3-4e8b-afea-29f8ff2ea52b-800x800.webp\" alt=\"\" width=\"359\" height=\"359\" srcset=\"https:\/\/timesandseasons.org\/wp-content\/uploads\/2024\/12\/0ddebc2a-36b3-4e8b-afea-29f8ff2ea52b-800x800.webp 800w, https:\/\/timesandseasons.org\/wp-content\/uploads\/2024\/12\/0ddebc2a-36b3-4e8b-afea-29f8ff2ea52b-150x150.webp 150w, https:\/\/timesandseasons.org\/wp-content\/uploads\/2024\/12\/0ddebc2a-36b3-4e8b-afea-29f8ff2ea52b-360x360.webp 360w, https:\/\/timesandseasons.org\/wp-content\/uploads\/2024\/12\/0ddebc2a-36b3-4e8b-afea-29f8ff2ea52b-260x260.webp 260w, https:\/\/timesandseasons.org\/wp-content\/uploads\/2024\/12\/0ddebc2a-36b3-4e8b-afea-29f8ff2ea52b-160x160.webp 160w, https:\/\/timesandseasons.org\/wp-content\/uploads\/2024\/12\/0ddebc2a-36b3-4e8b-afea-29f8ff2ea52b.webp 1024w\" sizes=\"auto, (max-width: 359px) 100vw, 359px\" \/><\/p>\n<p><span style=\"font-weight: 400;\">A little while ago OpenAI announced o3, a new (and extremely expensive) LLM. 
There\u2019s a lot to say about its new capacities in a variety of domains, but the one relevant here is its performance on the ARC Challenge, a measure of general intelligence.\u00a0 Without boring you with the technical details, previous LLMs have done quite poorly on such measures of general intelligence, but o3 is now scoring at human level.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">So have we achieved the vaunted \u201cartificial general intelligence\u201d? Not quite: there are other general intelligence tests that it still doesn\u2019t do quite so well on. But the writing is on the wall as we whittle away at those, and even some of the \u201cAI is just glorified autocomplete\u201d folks are squirming a bit at a system that can, for example, solve unique, bespoke math problems that would take a professional mathematician a day to solve, and that scores among the top professional coders in the world.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">We\u2019re closing in on being able to replicate any kind of human-level general-purpose reasoning, creating what is known as <\/span><a href=\"https:\/\/en.wikipedia.org\/wiki\/Philosophical_zombie\"><span style=\"font-weight: 400;\">a philosophical zombie<\/span><\/a><span style=\"font-weight: 400;\">, an entity that can act and respond to different stimuli as a human would, but that doesn\u2019t have an internal sensory experience.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">And this is where things get interesting, because the philosophical zombie has been at the center of the philosophy debate about consciousness and, in a sense, the soul. 
Suddenly all sorts of esoteric philosophy questions will become quite relevant in our day-to-day lives, and I think a religious perspective that postulates a soul as the repository of the self will come out a winner, if not the winner and new orthodoxy, as the mainstream naturalist position is increasingly found wanting.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Why? Consciousness is famously one of the hardest problems in philosophy and science. Among the discussants is a group of prominent thinkers, like Daniel Dennett (RIP) and the Churchlands, who argue that what we label consciousness and internal subjective experience, which according to some takes doesn\u2019t even really exist, is the result of what is essentially a meat computer with enough power and the right software and hardware.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">But as we become increasingly able to replicate the computing part of the human brain, we are arguably getting no closer to the internal sense experience, even if we are close to having chipped away everything that is not consciousness, which remains stubbornly intractable. This isn\u2019t surprising to me. Like many <\/span><a href=\"https:\/\/en.wikipedia.org\/wiki\/David_Chalmers\"><span style=\"font-weight: 400;\">non-naturalists<\/span><\/a><span style=\"font-weight: 400;\">, I think it is a category error to attribute internal experience to the mechanics of atoms bouncing against each other. Unsurprisingly, I don\u2019t think that consciousness can simply arise out of faster supercomputers and larger neural networks, or that it\u2019s an emergent property of raw compute. I don\u2019t think, fundamentally, you can get from bouncing particles and electrical charges to self-awareness, no matter how complex your cognitive gears and pulleys are.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Chat-GPT cannot feel any more pain than an abacus can just because it\u2019s more complex. 
As our electronic calculators become more sophisticated and more powerful than the human brain, with the internal experience nowhere to be found, it is going to be increasingly difficult to argue that adding more RAM is going to make Chat-GPT 20 feel pain or, famously, experience <\/span><a href=\"https:\/\/en.wikipedia.org\/wiki\/Knowledge_argument#Dualist_responses\"><span style=\"font-weight: 400;\">the sensation of the color red<\/span><\/a><span style=\"font-weight: 400;\"> even if it knows everything about the color red. (Although it\u2019s worth noting that the Latter-day Saint teaching that the soul is the body and spirit of man is perhaps more friendly than some traditions to a sort of hybrid perspective on the mind\/body problem.)\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The common argument that internal sense experience (or \u201cqualia\u201d) is a mirage seems patently ridiculous, and it is a testament to the idea that there are some things so stupid you need a PhD to believe them (maybe I should be nicer, but Dennett was similarly flippant about any ideas that didn\u2019t accord with his). If anything, the fact that I feel is the one thing I am sure about (\u201cI think, therefore I am\u201d and all that). But fine: if we\u2019re going to take that perspective at face value, then its proponents need to have the courage of their convictions and, once OpenAI can pass every artificial general intelligence test we throw at it, insist that we give Chat-GPT human rights, since within their framework there is no real difference between it and us at that point.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Of course, that\u2019s ridiculous. Chat-GPT doesn\u2019t <\/span><i><span style=\"font-weight: 400;\">feel<\/span><\/i><span style=\"font-weight: 400;\">, and we do. And it\u2019s interesting to me that there is very little discussion in the naturalist camp of Chat-GPT being a human being, even though we blew past the Turing Test years ago. 
We\u2019re moving into a realm where thought experiments are becoming reality, and it\u2019s getting awkward for people who thought that philosophical zombies would safely remain an abstract idea in a lecture hall.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">There is a chance that, if we understood brain mechanics a little better, we could find consciousness in the biochemistry equations of the brain, but as we get better and better at replicating the brain\u2019s functions, this will ironically lead to a sort of \u201cscience of the gaps\u201d situation, as the gap becomes smaller and we\u2019re relying more on, well, faith that consciousness is somewhere in the shrinking gap. (IMHO there\u2019s a similar situation with fine-tuning in physics and origin-of-life research.) Like the aether (or biological vitalism, ironically), after the umpteenth attempt to reproduce it fails, one has to start to ask whether consciousness is really reducible to synaptic mechanics.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Of course, all of this doesn\u2019t mean that everybody is going to rush out and get baptized in some faith or another, as there are non-\u201creligious\u201d options that allow for something beyond raw atomic mechanics. 
Even famous anti-religionist Sam Harris is sympathetic to some versions of panpsychism, in which the universe has a sort of consciousness (although I\u2019m not sure why that wouldn\u2019t qualify as \u201cGod\u201d).\u00a0 David Chalmers, who believes that consciousness is a fundamental facet of reality and is the main opponent of the Churchlands\/Dennett crowd, does not identify as religious, although with him too it would be hard to look at his thought and not identify it as <\/span><span style=\"font-weight: 400;\">\u201cspiritual.\u201d<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Still, people who see the relationship between scientific discoveries and faith as simply one of science whittling away at phenomena traditionally explained by faith are living in the 19th century. Science is revealing more questions than answers and showing the limits of a mechanistic universe: the original primordial cell was much more, not less, complicated than Darwin believed; the idea that the universe has parameters seemingly precisely tuned for life has found widespread acceptance; and, to top it all off, the universe \u201cspookily\u201d knows when we\u2019re looking at it, according to quantum mechanics. I wouldn\u2019t be surprised if, over the next decade or two, our inability to replicate self-awareness starts to make people take the idea of a soul more seriously and adds another data point tantalizingly suggesting that there is much more underlying this universe, metaphysically and fundamentally, than raw particles bouncing off of each other.\u00a0\u00a0<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>A little while ago OpenAI announced o3, a new (and extremely expensive) LLM. 
There\u2019s a lot to say about its new capacities in a variety of domains, but the one relevant here is its performance on the ARC Challenge, a measure of general intelligence.\u00a0 Without boring you with the technical details, previous LLMs have done quite poorly on such measures of general intelligence, but o3 is now scoring at human level.\u00a0 So have we achieved the vaunted \u201cartificial general intelligence\u201d? Not quite: there are other general intelligence tests that it still doesn\u2019t do quite so well on. But the writing is on the wall as we whittle away at those, and even some of the \u201cAI is just glorified autocomplete\u201d folks are squirming a bit at a system that can, for example, solve unique, bespoke math problems that would take a professional mathematician a day to solve, and that scores among the top professional coders in the world.\u00a0 We\u2019re closing in on being able to replicate any kind of human-level general-purpose reasoning, creating what is known as a philosophical zombie, an entity that can act and respond to different stimuli as a human would, but that doesn\u2019t have an internal sensory 
[&hellip;]<\/p>\n","protected":false},"author":10403,"featured_media":48585,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[43],"tags":[],"class_list":["post-48583","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-science"],"jetpack_sharing_enabled":true,"jetpack_featured_media_url":"https:\/\/timesandseasons.org\/wp-content\/uploads\/2024\/12\/0ddebc2a-36b3-4e8b-afea-29f8ff2ea52b.webp","_links":{"self":[{"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/posts\/48583","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/users\/10403"}],"replies":[{"embeddable":true,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/comments?post=48583"}],"version-history":[{"count":5,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/posts\/48583\/revisions"}],"predecessor-version":[{"id":50301,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/posts\/48583\/revisions\/50301"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/media\/48585"}],"wp:attachment":[{"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/media?parent=48583"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/categories?post=48583"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/timesandseasons.org\/index.php\/wp-json\/wp\/v2\/tags?post=48583"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}