Interface

Between Heaven and Earth

Spirituality in the Age of AI, Part IV: Communal AI

Across this series, what I’ve been recounting is more than just a technology story: It’s a creation story. AI is a whole new kind of communication medium, because it talks back, it shapes our attention and even our cognition, and—if we let it—it can shape our spiritual life. We’ve examined AI’s “neo-oracular” role, which is increasingly usurping relationships once reserved for wise elders, scripture, and tradition. Last week we turned to the intimate issue of AI as a spiritual companion and counselor, present (and that’s a loaded term) in the form of an intellect but not present (yet) in body or in spirit, and therefore lacking moral agency.

Today we move from the intimate and personal to the communal; specifically, to the impact of AI on communities of faith: gatherings of people who share similar beliefs, as well as a hierarchical authority, ritual, institutional memory, money, and care. AI touches each of these aspects of community. It is being used to draft liturgies and sermons; to translate prayers; to schedule volunteers; to moderate forums; to “remember” who showed up and who didn’t; to route donations; and even to appear (as an avatar or robot) within sacred spaces. So the question is no longer can AI participate in communal religious life, but:

How is it participating in communal life
and how is it being controlled in order
to enhance the faith community and not harm it?

First, I’ll provide some concrete examples of AI that is in use today in various faith traditions. Second, we’ll discuss the benefits and risks of such uses. Third, we’ll consider some guardrails that have been proposed or in some cases enacted to control AI in this communal context. I’ll close with a case study that is very close to our hearts—our hearts, yours and mine, in this very class. 

First, then, some examples of AI in use today.

  • At the Zen Buddhist temple Kōdai-ji in Kyoto, an android called “Mindar” gives sermons on the Heart Sutra. It is a humanoid representation of the Goddess of Mercy, Kannon, and gives a 25-minute talk about compassion and emptiness. Reactions are reportedly mixed—some see a machine, others see a doorway to the Dharma—but it is an unmistakable instance of a nonhuman monk preaching to human Buddhists.1 
  • In 2023, more than 300 people attended a Protestant service in Germany in which the liturgy and sermon were created primarily with ChatGPT and delivered by avatars (digital humans) on a screen at the church. The avatars preached about leaving the past behind, focusing on the challenges of the present, overcoming fear of death, and never losing trust in Jesus Christ. The congregation was apparently bemused and uneasy.2 
  • In a Protestant “Metaverse Church,” the preacher and the congregation all appear as avatars in a 3D virtual church. Participants wore ski-mask-like head-mounted displays (HMDs). Reportedly, the Metaverse Church attracts people who, for one reason or another, are alienated from brick-and-mortar church.3 
  • At the Grand Mosque in Mecca, an AI-powered robot fields religious inquiries and gives directions to pilgrims. The Prophet’s Mosque in Medina offers a 24/7 toll-free multilingual guidance line. In the UAE, the Council for Fatwa (a fatwa is an Islamic legal opinion or ruling) has an AI program to assist (but not replace) the muftis who issue fatwas.4 
  • In Hindu temples at home and in the Indian diaspora, so-called “temple-tech” platforms have matured into serious infrastructure. A daily prayer app called Sri Mandir has been downloaded tens of millions of times. An app called Sadhana is marketed as a guided companion for mantra, ritual, and daily practice. These tools blend devotion, logistics, and commerce. They certainly raise questions about commodification yet they seem to be meeting the needs of millions of Hindu worshipers.5 

So those are just a few examples of what is actually happening. Remember, some of it may seem laughable (or appalling) today, but you must consider what such applications of AI will look like in a few years, and you must allow for human gullibility. In the meantime, some religious bodies are sufficiently concerned to react by erecting policy guardrails to try to govern such uses of technology.

  • In January this year, the Holy See issued Antiqua et nova (“Old and New”), a doctrinal note restating the Catholic position that AI remains a tool without moral responsibility. It warns against mistaking interactions for relationships and stresses the dignity of people versus AI personae.
  • The Methodist Church in the UK has issued “interim guidance” urging transparency and strict data protection. “Don’t put sensitive pastoral data into consumer AI tools,” it says. The Church of England publishes a blog that flags issues in AI content creation. Episcopal church networks are now training parishes on “limited, ethical use” of AI.6 

Clearly, AI is already shaping communal religious life. Religions and denominations are trying to get control of it, but to me the question is: 

Can AI be governed?
Will people heed the policies and stay behind the guardrails,
or will they jump right over them for fun or rebellion
or just because the grass looks greener on the other side? 

Equally clearly, whether or not AI can be governed, it does pose risks and it does offer benefits to faith communities. The benefits include:

  • AI translation, captioning, and other assistive functions facilitate greater inclusion and access, increasing participation across languages and abilities. From Sri Mandir’s prayers to Umrah pilgrim guidance, AI can enfranchise people who are far from home or are physically limited. Such inclusion seems to me to be more than a technicality. Could we consider it to be a sort of spiritual hospitality?7 
  • AI also facilitates extended reach and continuity through administrative and logistical efficiencies. It can draft agendas, summarize meetings, index sermons, and keep scattered small groups aligned and coordinated. Metaverse sermons can reach non-members; temple-tech platforms let people unable to travel take armchair pilgrimages.8 
  • AI facilitates participation and learning. It can conduct Q&A on a religious text and generate lesson outlines and language-appropriate summaries, helping more people become lay teachers. At the Grand Mosque in Mecca, AI guidance and content systems are explicitly framed as pedagogical and navigational aids during pilgrim flows such as the Hajj and Umrah.9 

Alongside such benefits are risks; for example:

  • Preaching, prayer-writing, and curriculum generation are crafts of ministry. Ministers shape their community. If clergy and lay leaders outsource these tasks to AI, they risk skill atrophy, deskilling, and permanent dependence on AI. Antiqua et nova cautions against this, noting that vigilance against AI dependence is not Luddism but care for the human arts of ministry.10 
  • If we let it (and by “we” I don’t just mean society or government or church leaders—I mean you and me personally), AI-mediated worship or content will drift toward becoming a drug that keeps spiritual seekers “hooked,” rather than a revelation that enlightens and sanctifies us. Catholic guidance is that AI’s truthfulness should not be taken for granted, in part because it bears no responsibility for truth—it merely generates plausible-sounding text.11 
  • AI keeps records of what people tell it and, just like your bank account and medical record, there can be no guarantee that counseling notes or membership lists entered into community chatbots will remain confidential.12 
  • The experiment in Germany highlighted the ambiguity in authorship of AI-generated sermons. Can we trust AI as we trust human authorities on the doctrine, discipline, and discernment discussed in AI-generated sermons?13 
  • Temple-tech mixes devotion, logistics, and commerce. It is in the financial interest of investors in temple-tech to scale access and incentivize people to use pay-to-pray apps. Not only that: biases in the data used to train temple-tech apps can silence minority voices.14 
  • Replacing people with prompts, ushers with bots, and teachers with AI tutors may thin the fabric of mutual care. Catholic teaching stresses embodiment—stresses that care is a human responsibility—but others counter that digital presence can be pastorally real.15 (We’ll be considering an example of this later.)

Can guardrails actually work? Those who build them clearly think so. Here are eight such guardrails or policies. I don’t necessarily agree with all of them and I think many are akin to closing the stable door after the horse has bolted, but perhaps they point in the right direction:

  1. The first policy asserts human authority by not allowing AI to preside over sacraments, ordinations, funerals, or rites; or to issue binding doctrinal or disciplinary decisions.16 
  2. The second policy rules that AI authorship always be declared. AI-generated or AI-assisted elements in worship, study, and governance must be labeled as such. For example: “Portions of this prayer were drafted with AI and edited by Jane Doe.”17 
  3. The third policy is to maintain consent & privacy by never putting pastoral or confessional content or membership data into a communal AI. (The Methodist policy on this is especially extensive.)18 
  4. The fourth policy is to prefer AI that can deepen prayer, study, and service (for instance, translation apps, accessibility apps, and volunteer coordination apps) and to avoid AI that seems geared more to collecting data that benefit the app vendor. Apps used in the community should have the power to form spirituality rather than just to engage people.19
  5. The fifth policy is to prioritize gatherings of flesh-and-blood people—embodied gatherings—over digital or virtual gatherings. Zoom, for example, should be used only as a form of supplementary presence, not a substitute, especially for sacraments.20 
  6. The sixth policy is to guard against AI outputs biased toward the majority and muting minority voices in the community. (It’s not quite clear to me how to do that, except through personal skill and vigilance in crafting prompts.)
  7. The seventh policy is financial integrity. Churches should publicly disclose AI vendor relationships, costs, sponsorships, pilgrimages, subscriptions, and so on. Remember, temple-tech’s scale—potentially millions of users—is a magnet for merchants.21 
  8. The eighth policy is one of accountability. In some institutions, a joint laity–clergy tech committee is empowered to publish AI policy, run quarterly audits, receive concerns, and recommend changes to a governing council.

Arguably, policy guardrails such as these may help nurture and protect the spiritual fruit that can be harvested from communal AI, but the proof of the spiritual fruit pudding is in the eating. If it fills the spiritual stomach with humility, compassion, and service, then it is spiritually nourishing fruit. If spiritual ties across generations and cultures grow stronger from it, then the fruit is good. But if, after replacing people with prompts, a faith community starts to fall apart, then the fruit would seem to be rotten and toxic.22 

Again, no matter how sophisticated, AI as a tool bears no moral responsibility. The Holy See, the Muslim council on fatwas, and Methodist leaders all agree. The implication is as old as ethics: we are answerable for what our tools do to one another.23 The question is, is AI in fact just a tool, or is it something more?

In any case, if you adopt only one rule, I would make it this: use AI where it amplifies love of God and love of your neighbor, and refuse it where it weakens communion. 

A friend recently asked an AI to draft a biblically grounded letter of consolation for a mother grieving a son who died by suicide. It wrote:

The Hope of God’s Everlasting Grace

When a mother loses her son to suicide, the weight of grief can be nearly unbearable. Added to this grief is the fear many Christians have been taught—that suicide places a soul beyond the reach of God’s forgiveness. Yet the Scriptures remind us again and again that salvation does not rest in our ability to live perfectly, but in the boundless mercy and grace of God. We are saved, not by our works, but by the gift of God’s grace in Jesus Christ (Ephesians 2:8–9).

The Bible gives us several stories that illustrate God’s mercy even in the face of self-inflicted death.

King Saul fell on his own sword in battle (1 Samuel 31:4). Suicide ended his story, yet David, instead of condemning him, lifted up Saul’s memory with honor. David wrote a song extolling Saul’s bravery and virtues (2 Samuel 1:17–27). This act shows us that human judgment is not the final word, and that even in tragic endings, God can still see the courage and worth of a person’s life.

Samson, too, took his own life when he brought down the Philistine temple upon himself (Judges 16:30). His death was an act of sacrifice and deliverance for God’s people, yet it still was his own hand that ended his life. Remarkably, in the New Testament, Samson is remembered not for the way he died, but for his faith. In Hebrews 11—the great “Hall of Faith”—Samson is listed among those who trusted God (Hebrews 11:32). This powerful truth shows that God’s mercy and grace cover the whole of a person’s life, even the darkest moments.

Judas Iscariot, who betrayed Jesus, also died by suicide (Matthew 27:5). Yet in the very hour of betrayal, Jesus addressed Judas with the tender word, “Friend” (Matthew 26:50). Even in Judas’ failure, Jesus did not revoke His love or His grace. If Christ could look upon the one who betrayed Him and still call him “friend,” then we must believe that God’s love reaches further than human despair and remains unbroken even by death.

Taken together, these stories remind us of the depth of God’s compassion. Suicide is a tragedy, but it does not place someone outside the reach of divine mercy. Nowhere in Scripture does it say that a person who dies by suicide is beyond salvation. On the contrary, we are reminded that nothing—“neither death, nor life, nor angels, nor demons, neither the present nor the future, nor any powers… nor anything else in all creation—will be able to separate us from the love of God that is in Christ Jesus our Lord” (Romans 8:38–39).

For the mother grieving her son, take comfort in this truth: your son is not eternally lost. His salvation does not depend on his final act but on God’s everlasting grace. Jesus Christ died so that all might live, and His love is stronger than despair, stronger than death itself. The Lord who welcomed Samson into the hall of faith, who honored Saul through David’s song, and who called Judas “friend,” is the same Lord who holds your son today.

In that promise we rest—not in our strength, but in the everlasting arms of God.

This letter, drafted by ChatGPT but prompted by a church elder, is biblically grounded. It speaks directly and verifiably from the Bible of salvation by grace, and of nothing separating us from the love of God. It cites biblical examples that honor a life’s whole story. It seems to me this is an example of communal AI at its best: a draft of comfort that we, her friends in community, can refine if we want to and deliver in person. It honors the rules suggested in this talk: it contains no identifying details, and its moral authority and pastoral voice are vested not in the AI but in the individual who commissioned it. 

I don’t think this letter in any way substituted for the elder’s love of the mother. I think, to the contrary, that the letter gently amplified it. 

What do you think about all this?

Carolyn: I wrote a blog and then asked an AI to make it beautiful—and it was. I could never have written it that well. I try; my heart is there, but I’m not a great writer. This time I wasn’t ashamed or afraid to send it out. I was grateful to have a tool that represented the story and the love, and even carried some of the emotion I hadn’t put into my original draft.

I sent a picture of Les. I also shared that I’d been going through something many of my friends didn’t know about—learned only because of the blog. The AI phrased it so gently. I couldn’t face it all at once, but this helped.

Donald: I find it hard to debate the role of AI right now. I wrote something I shared on Facebook. As with most things that matter to me, I had my wife review the grammar and details, and then I asked AI to redraft what I’d said—to make it sympathetic and grammatically correct. For that I was grateful.

There’s a contrast with what David shared, perhaps, about being human versus using a tool. Important questions remain: Can AI be governed? Does AI have moral responsibility? Those are human questions. This morning we’ve treated AI as a tool that can help us.

For years, when words fail, we’ve gone to Hallmark: we pick a card that says what we wish we could say. I was glad I didn’t have to go to the store. I could write the heart of the card, the essence, and have AI say it more eloquently. I don’t have many friends on Facebook, comparatively, but that post received about 150 reactions and around 50 comments. I don’t know if my own words would have done that. I was grateful to have access to the tool.

David: In this class we’re willing to sit with the questions. But how many people out there are doing what we’re doing—questioning whether AI is “just a tool”? If you don’t ask that question, you may subconsciously decide AI is something it isn’t—Godlike, perhaps, or divine—because you haven’t examined it.

It’s one thing for a small group like ours to get a handle on what’s happening. My greater concern is society as a whole, and what AI might do to communities where people aren’t asking these questions.

Reinhard: I appreciated the presentation. It opened ideas I hadn’t considered and got me thinking about what might come.

I think about people around the world—in different religions—who have never heard about Jesus. Perhaps, in God’s plan, AI can help the gospel reach places missionaries can’t go. If God wants to use a machine or a human, He can. AI might be a channel to strengthen faith where access is limited.

You mentioned community. Maybe there will be “artificial communities,” looser than established churches with clear guidelines and discipline. In such AI-linked communities, moral responsibility still comes from God; our accountability is to Him. AI is only a messenger.

From time to time I rewatch sermons—Mark Finley and others, even outside our denomination—anything that strengthens my faith. AI can be a tool like that: it can mix and prepare the “spiritual food” so it’s ready to serve. As long as we can tell when something strays from truth, AI can do good work because humans are behind it, and the machine organizes and presents it.

We don’t know the future, but AI may bring unexpected benefits, especially for people not involved in traditional communities. Such loose organizations won’t have the same traditions—no ushers, no funerals attended by a congregation—but if AI helps reach a worldwide community, there’s still good in that. Potential benefits may appear that we can’t yet imagine. Meanwhile, we stay grounded in Scripture and what scholars and theologians have long taught. Nothing needs to be added to the gospel; yet AI may have its place in the global community, now and in the future.

Chris: The question was raised: are others talking about this, and do they realize AI is a tool? We shouldn’t limit God. Any human construct can be used for good or for evil. We’re not divine; we don’t understand everything. But I believe God can use what we create.

What’s happened here this morning is evidence: God has used AI to help express thoughts and feelings, to elicit memories—not for harm, but for healing people working through what none of us wants to face. I think AI will build community and that God will use it to accomplish His purposes on Earth. It will sometimes be used for evil, but also for good.

Whether people are talking about it or not, I trust a God who knows all and is all-powerful. He will use people—and AI—to bring about good. So it’s nothing to fear or shun. God has worked through books, radio, TV; He can work through this. We should learn to embrace it, or we’ll miss a big opportunity.

C-J: I have a couple of thoughts about incorporating AI into daily life and our spiritual life. As sentient beings, I think many people fear losing our place in creation—that we’re second to God, and AI could push us down the ladder. Humans usually find ways to regain control, but sometimes things get away from us—atomic energy, tools of war.

I recently read about the energy and water needed to feed data centers—Tucson was mentioned. These systems create jobs and can do good, but the consumption is becoming problematic. Will energy and water be diverted from the public so the machines can run? If AI is to be widely accepted, perhaps it needs to become more self-contained—able to create or recycle the energy and water it needs. If that were possible, more people might say, “Okay, I can live with this.”

We’ve become dependent. Humans take pride in thinking, processing, communicating, adjusting, and using knowledge for the betterment of humankind—at least initially. But there are unintended consequences. Too much fertilizer destroyed soil in the Midwest; there wasn’t enough science or planning to handle the aftermath. Today, changes come so quickly that people can’t juggle all the balls. Just when they adapt to one tool, the systems change, and more human authority gets ceded to software.

I hear this from service representatives in our city: “They changed our system again. I’m trying to learn it; now I can’t do what I used to.” Human initiative used to say, “Sure, I can do that for you.” Now they can’t—it’s outside the boundaries of the system.

I see this in medicine, education—from preschoolers pressing buttons for songs and dances—to voting and trusting leaders’ narratives. How do we protect ourselves when we can’t interpret the flood of information—especially in the spiritual realm, where we believe we’re next to God? Scripture is mediated through human storytelling and text. Technologies change; formats become unreadable. The verbal presence we’re having here—the soul in language—cannot be replaced. Nor can the touch at the bedside, a steady arm, holding sacred space, saying nothing but “I am here.”

If we sidestep that because a machine does it more efficiently—because a doctor is overwhelmed, a staff is short, or we default to Zoom—we lose something essential. At the grocery checkout, in a doctor’s office, people say, sometimes openly, “I feel the same way”—and there are many of them. It’s not only about figuring out how to use AI; it’s the fear of being unable to control it, like the first nuclear blast in the desert: “What have we done?”

What was our intention—to win a war, or to dominate? In religion, is it to be “right,” or to be in communion with humanity and our planet, accountable to each other and our resources? Mother Earth isn’t a platitude; it’s real—like not being able to breathe when Canadian fires send smoke south. We think, “That’s over there—Gaza, and so on”—and we miss our responsibility. A machine won’t fix that. Humans will have to, in relationship with God.

Michael: I’m surprised and saddened by Carolyn’s news. I’m wondering if there’s a particular way she’s approaching her grief and how she’s processing it—and whether we can help in any way.

Carolyn: I’m still in a fog. There is nothing like the human voice and the human touch, and I’ll need that. But I want to express myself with dignity, honesty, and love. If I’m putting something on paper or out into the world, it’s been easier to ask ChatGPT to clean it up and make it right. I’m relieved to have that.

We all know that whatever God creates for good, Satan tries to counterfeit or attack. We need our armor on to face new things. I’m not afraid to speak my mind, and I appreciate that you don’t judge me for my questions. I still struggle with how a person can take their life and why. I don’t want to rely solely on ChatGPT for that; I want to hear human voices. Yet when I ask AI certain kinds of questions, it answers beautifully and in ways that make me comfortable.

My biggest conversation, though, is with the Lord and His Word. I can sing to Him all day, and I know He listens to my praise. I can’t do that with ChatGPT.

C-J: Sometimes “Thank you” is enough. So is “I love you.” You don’t need to dress it up. If someone drops by and all you have is tuna and bread, you ask, “Are you hungry? Would you like coffee?” Receive that. Don’t question it. Distraction can be the enemy—call it what you will.

Carolyn: I’ve decided to make Les’s death something that produces praise and good. I moved here not knowing my neighbors, but I’ve started knocking on doors. I wrote a little note and brought a small gift, and told them I want to meet them, love them, and help if they need something.

This week I’ve reached out to others once a day—only by God’s grace. A memory makes me smile: when we went camping and Les was about four, he’d hop out of the trailer, look at any neighbor sitting outside, and ask, “Who you is? Where’s you from?” It was his way of saying, “I’ve arrived—and aren’t you lucky!”

A child can bring comfort; so can an adult; and the Lord most of all. But there’s nothing better than another human touch that says, “I care.” I have been touched.

Sharon: Carolyn, you’re an inspiration. As a mother, I can’t imagine your loss or the questions you’re facing, but I pray the Lord helps you address them over time. You’re loved. You turn something tragic into an opportunity to share God’s love even while walking through the valley of the shadow of death.

AI’s purpose, at best, is to enhance the human experience, not distract from it. Use it when it helps, but nothing replaces human love and presence. We love you, we’re praying for you, and we’re here for you to share God’s love and to walk with you through this.

David: To conclude: the examples of “communal AI” I gave are crude, early prototypes. In a few years—months in some aspects—AI will be much more powerful, androids more humanlike, and we will have accepted AI’s presence in many areas of life.

Because AI and robotics are accelerating so fast, we must not judge them only by what we see today. We have to imagine what’s coming, consider our likely reactions, and then think through potential impacts on us and our communities. Only then can we decide how to prepare, mitigate, or redirect.

If we care about the future—if not for ourselves, then for our children—we need to predict it, stop reacting only to the present, and become proactive about what’s ahead. Thank you all for listening.

* * * 

1. Wikipedia, ABC News, 朝日新聞 

2. Ars Technica, Premier Christian News

3. WIRED, be.thechurch.digital

4. Arab News, The Times of India, WAM

5. The Times of India, Google Play

6. Vatican Press, The Methodist Church, The Church of England, Episcopal Parishes

7. The Methodist Church, Google Play, The Times of India

8. WIRED, The Times of India

9. Arab News

10. Vatican Press

11. Vatican Press

12. The Methodist Church

13. Ars Technica

14. The Times of India

15. Vatican Press, WIRED

16. Vatican Press, WAM

17. The Church of England

18. The Methodist Church

19. Vatican Press

20. WIRED, Vatican Press

21. The Times of India

22. Vatican Press

23. Vatican Press, WAM, The Methodist Church
