{"id":47763,"date":"2024-08-22T03:12:36","date_gmt":"2024-08-22T09:12:36","guid":{"rendered":"https:\/\/timesandseasons.org\/?p=47763"},"modified":"2025-05-28T20:39:50","modified_gmt":"2025-05-29T02:39:50","slug":"ai-censorship-and-sacred-cows","status":"publish","type":"post","link":"https:\/\/timesandseasons.org\/index.php\/2024\/08\/ai-censorship-and-sacred-cows\/","title":{"rendered":"AI Censorship and Sacred Cows"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-47764 aligncenter\" src=\"https:\/\/timesandseasons.org\/wp-content\/uploads\/2024\/08\/456295708_10234081997370053_5846448429848365157_n.jpg\" alt=\"\" width=\"371\" height=\"281\" \/><\/p>\n<p><span style=\"font-weight: 400;\">In the AI world there is a debate swirling about how much AI providers should censor their image generation. Of course there are plenty of things to mock in past attempts to censor or otherwise put a thumb on the scale of AI to be more socially appropriate. Exhibit A of course were the racially diverse, Black SS stormtroopers created by Google Gemini, but anybody who\u2019s spent a decent amount of time using AI has run into these guardrails, and sometimes they can be annoying. I had a tragicomical experience myself in the early days of Midjourney when they didn\u2019t have the fingers right, and when I tried to create a picture of Adam and Eve it gave Adam multiple genitalia. I tried to regenerate the image specifying \u201cno nudity,\u201d and got a warning that I was using a forbidden term and would be banned if I continued to try to create nude images.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The guardrails around religious topics in particular are so strict that it becomes difficult to do anything religious per se, one has to describe a religious scene without invoking religious vocabulary. 
(I assume the skittishness about depicting religious imagery is really just about depicting Mohammad, but they're trying to be consistent.)

However, in the past week or so the world was exposed to an almost completely uncensored AI tool with the release of Elon Musk's Grok 2 (because of course it's Elon). All of a sudden the Internet was flooded with images of Donald Trump flying an airplane into the Twin Towers, Donald Trump and Kamala Harris making out, and the like. I would be lying if I said that last one did not make me laugh out loud, and there was a treasure trove of meme material.

However, as members of a faith that unapologetically has its own sacred cows, we obviously know where this is going. On the whole, all silliness aside, I think it's probably better for a corporation to put guardrails in place, to say, "hey, that picture you want to generate isn't cool; you're clearly being a jerk to a particular group of people (above and beyond the good-faith jabs we all enjoy), so you can't use our tools to do what you're trying to do." Of course, open source is developing so quickly that, big corporations or not, the censorship debate is largely moot. (In fact Flux, the state-of-the-art engine driving Twitter's image generation, is now technically open source; it just takes more computing power than most people have to run it.) Whether we like it or not, the extreme libertarians are going to get their wish, and virtually every photorealistic scenario (and, in the future, video) can and will be easily generated. And I mean *every one*; there is an ongoing legal and ethical debate about the status of computer-generated child pornography for pedophiles, for example.
Nowadays it's hard to get a law passed on the grounds that "it's just wrong"; you have to show harm. Proponents argue that artificially generated child pornography is "victimless," since no actual child is abused, but whether such pornography leads to more offending hinges on the social-scientific question of whether pornography substitutes for the real thing, reducing demand, or whether it entices behavior, and my understanding is that the science on that question is unsettled.

But back to our sacred cows. If there is a way to disparage what somebody holds sacred, not only will people do it in an attempt to become some kind of edgelord, but they will see it as a moral act (often while sitting in their mother's basement). Of course, this has always been with us, whether with the people smearing garments with dirt at general conference or the adult film producer who shot a scene in a temple locker room.

You would think it goes without saying, but people who engage in such behaviors are objective ******** (I'll let you count the asterisks; TS is PG) by any reasonable ethical framework. Even if I left the Church and hated Joseph Smith with every fiber of my being, I would still think that intentionally mocking a temple ritual (or a Catholic Mass or an Islamic prayer) in which good people find solace makes you an *******, regardless of background.

However, with open-source AI image generation, all sorts of creatively desecratory images of our sacred rituals are on the horizon. (Of course, to paraphrase DeMille, you can't desecrate the temple, you can only desecrate yourself.)
So this is something to be aware of from those who "leave the Church but can't leave it alone" (a category which does not include most ex-members, but enough of them).
uries":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}