With just five people present, two of whom were not Adventists, and with one of the three Adventists (Kiran) scheduled as today's speaker on the Remnant, a topic of special interest to Adventists, it was decided to postpone the talk until next week.
Meanwhile, a general discussion got underway about AI, with concerns expressed about its growing incursion into our lives and its potential harms and benefits. We pick up the discussion at the point where the issue of its potential for catastrophic, existential harm arose.
David: It just so happens I am writing yet another unpublishable essay about this very topic, although it looks further ahead than the timeframe we have been discussing. My argument is essentially that in the long term we needn’t worry about AI. If you believe in some notion of God, then you have to believe that good will ultimately win out, in which case there’s really nothing to worry about in the end. But in the short run, a great deal of damage can be done. The kinds of things Connie was talking about—bad people using AI for bad ends—are very real and will probably happen. People will get hurt.
Kiran said that AI today can only do so much, but was careful to say “as of yet.” He was acknowledging that it is going to become more powerful: its limitations will diminish and its capabilities will increase. I look forward to the day (I predict it will happen) when consciousness emerges from AI. That’s when we will need to ask whether it will have morality. And if it does, what kind? Positive? Negative?
As for today’s problems—despots using AI for their own ends—I don’t see a fundamentally new answer. It’s the same answer we’ve had for every other technology in history. We have to use political and social means to persuade people not to behave badly. It’s that simple.
But Connie is right that things are happening very fast. AI is already so powerful that one bad-willed individual could potentially do catastrophic damage. Yet the answer is not better technology, nor banning technology. Somehow, we have to get people to behave better.
C-J: My concern is the use of electricity. The power grid is fragile and fractured in how it’s set up. Then there’s the water required to cool these systems. People say, “They should just pay more if they’re going to use these servers.” Electricity was created for the betterment of society. Everyone needs it. But these servers consume enormous resources. I think we need to slow down and think about energy that doesn’t pollute and doesn’t consume more than the population needs for basic access—heat, light, even our own computers.
I’ve noticed that our local station, XXI PBS, hasn’t been able to provide the same kind of programming it used to. They’re borrowing more from Europe and looping old material from their archives. I don’t want to hear old news or updates about the royal family. I care about what’s impacting my country and my world—Ukraine, NATO, what’s happening globally. And I want multiple perspectives, not just whatever narrative is deemed allowable by political agendas in Washington.
It feels like we’re sidestepping issues and using distraction. When I get stuck, I go back to basics. Where did I get lost? Where did I lose my footing? At what point did I realize I was making bad choices? If I go back to that moment, I can often find my moral compass again.
Where did I compromise? What information did I assume I had? What did I dismiss? That’s how I realign my priorities and approach. Did I take on more than I could handle? Does this really matter to me? Who is it hurting? That’s how we need to move forward—not just saying, “This little sphere of my life is critical,” or “This is fun,” or “I don’t want to think about it because it overwhelms me.” Does that make sense?
Kiran: I completely understand your point. Whenever a new technology emerges, its benefits should be distributed equitably, and harm should be minimized as much as possible. That should be the goal. For example, no one thinks about Google Search anymore, but there used to be debate about how much electricity a single search used. You’d type, “Where’s the nearest Thai restaurant?” and servers would consume energy and water for cooling. People made a big deal of it at the time. Now no one talks about Google Search, but everyone talks about AI.
If you compare them, a single Google search uses roughly the equivalent of 15 seconds of LED bulb energy. An AI prompt might use about the energy of running a microwave for one second. A Google search uses about one drop of water; an AI prompt might use five drops. Of course, when you multiply that by millions, it adds up. But there’s also a lot of exaggeration and misinformation.
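Taking those per-query figures at face value, a quick back-of-the-envelope calculation shows how “multiplying by millions” plays out. This is only a sketch; the microwave wattage, drop volume, and query volume below are illustrative assumptions, not figures from the discussion:

```python
# Scale Kiran's per-prompt estimates up to a large user base.
# Assumed values: a ~1000 W microwave run for 1 s, ~0.05 mL per drop.
PROMPT_WH = 1000 * 1 / 3600        # ~0.28 Wh per AI prompt
PROMPT_WATER_ML = 5 * 0.05         # ~0.25 mL per AI prompt

prompts_per_day = 1_000_000_000    # assume a billion prompts a day

energy_mwh = prompts_per_day * PROMPT_WH / 1e6           # Wh -> MWh
water_liters = prompts_per_day * PROMPT_WATER_ML / 1000  # mL -> L

print(f"~{energy_mwh:,.0f} MWh and ~{water_liters:,.0f} L of water per day")
# Negligible per prompt, yet a real load once aggregated at scale.
```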
AI development is happening so fast; what used to take years now takes months, and when problems arise, companies address them quickly. Another distinction: training an AI model consumes a lot of energy, but once trained, deploying it requires much less. The deployed model—the “worker”—uses far less power than the training run did.
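One way to see that distinction concretely is to amortize the one-time training cost over the queries a model later serves. Again a sketch only; both figures below are assumptions for illustration, not numbers from the discussion:

```python
# Illustrative amortization of training energy over a model's lifetime.
# Both figures are assumed for the sketch, not measured values.
TRAINING_GWH = 10                   # assumed one-time training energy
PER_QUERY_WH = 0.3                  # assumed inference energy per prompt

lifetime_queries = 100_000_000_000  # assume 100 billion prompts served
amortized_wh = TRAINING_GWH * 1e9 / lifetime_queries  # GWh -> Wh

print(f"training adds ~{amortized_wh:.2f} Wh per query, "
      f"vs ~{PER_QUERY_WH} Wh for answering it")
# The more the trained model is used, the smaller the training cost
# per query becomes; ongoing inference ends up dominating the bill.
```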
I do agree that states compete to attract data centers and sometimes engage in questionable practices. If electricity bills rise or groundwater declines because of these centers, that’s not right. Development should not harm the public.
Some companies are exploring alternatives. Space is extremely cold, so there are ideas about placing data centers in orbit to eliminate cooling needs. Microsoft experimented with placing servers underwater in cold ocean environments. China is exploring similar approaches. China also has significant energy capacity and more clean energy infrastructure than the U.S., which has aging systems.
Sometimes crises force innovation. If AI stresses the electrical grid, that may push modernization. A solar flare, for example, could cripple the U.S. grid as it stands; perhaps this new pressure will accelerate improvements. I tend to see these challenges optimistically—but we still need accountability so that people don’t act irresponsibly.
David: And that’s the central issue. It’s not the technology—it’s people. Everything is accelerating, not just AI but energy production as well. There are now very small nuclear power plants being developed—potentially small enough for localized deployment. In computing, massive data centers may eventually seem quaint. One day we might carry quantum-level processing power on our wrists that surpasses what today’s warehouses can do.
Technology is accelerating rapidly. What isn’t accelerating is human evolution. We are no smarter than Socrates was. We are the bottleneck. We are the potential destroyers. Our own limitations may stop progress.
Don: Where does that leave us as individuals in terms of personal responsibility and the moral compass Connie referred to? What should a believer believe, and what should a believer practice, if they’re to make some kind of meaningful contribution to the conversation we’re having?
David: I would never tell anyone what to believe. Personally, I think we all know what is right. We know what goodness is, and we know it’s the right thing. But we often contravene it anyway—I certainly do at times. So it’s not about telling people what to believe; it’s about getting people to align themselves with the goodness that already exists. If we did that, things would be fine. The problem is misalignment.
This idea is recognized in Daoism and in all the great religions. Ultimately, it’s about being out of alignment with the way things are meant to be—whether you think that’s because God wills it or because nature does. There is a right way and a wrong way, and we sense that inwardly. I don’t need to tell you what to believe. I believe you already know.
C-J: I don’t know if it’s a function of age, but the layering we experience at this point in history feels overwhelming. I resisted using autopay for a long time. Writing checks, taking them to the post office—it took time, but it felt manageable. Now, by the time I justify where I spent money and what I have left, it can take an hour. What used to take fifteen minutes now consumes a big part of my day.
It’s the same with other things—even sitting down to read a book, for pleasure or information. I called a relative the other day because I had a problem. He answered and immediately said he couldn’t talk, then hung up. No explanation. He sounded depressed. I thought, fine, I’ll find another solution. But in doing so I discounted his value as my relative, reducing him to a source of information, and something about the exchange felt wrong.
We are so dependent on information to be successful. He couldn’t give it to me, and I felt pressed. Choosing a pathway—what education to pursue, where to minister if I continue exploring chaplaincy at the certificate level—felt overwhelming. Others have said the same: “You don’t need to spend that much; take a small bite.”
That helped. It made me feel better equipped to serve within my capacity. But feeling like a commodity to be marketed, trying to make the “right” decision in a complex system—it’s exhausting. I sometimes long for simpler questions, like: it’s two in the afternoon and I haven’t eaten—what’s in the house? That feels manageable. All of this other complexity is just too much.
Don: Do you think, Connie, that you were born three thousand years too late?
C-J: I don’t believe God makes mistakes. And I believe in reincarnation—but metaphorically: “In my Father’s house are many mansions.” We are accountable to God, to each other, and to the planet. We’re called to live good lives and produce good fruit.
My father used to say there’s no such thing as the good old days, and he grew up during the Depression. Life was harsh. You couldn’t survive without community. War was inevitable over resources. Disease was common. Accidents were expected. War and childbirth were major causes of death.
In many ways, we are living in the best period of human existence in terms of quality of life. We’ve paid a high price for it, but I wouldn’t want to go back. I wouldn’t relive a single day—not even the joyful ones. Life has been difficult, but fruitful.
Carolyn: My perspective is to live in the present. Yesterday is gone. I hope you made good memories. The future belongs to the Lord, and I’m grateful He’s in charge.
David: I largely live that way too. The past is past. But I do wonder: is there a possibility for a higher level of morality than humanity is capable of?
Kiran: I think that’s what Christ demonstrated—the highest form of morality. To lay down your life for your neighbor and love your enemy. We know that ideal exists, but we’re incapable of living at that level consistently.
The idealist in us wants everyone to live there. The realist in me recognizes how difficult that is.
David: But think about God in relation to that. We wrestle with the question of why God permits evil. If God is more moral than we are, yet allows some level of evil, that’s difficult for us to reconcile. If I said, as a human, that occasional violence was acceptable, we’d find that morally repugnant. Yet, from our limited vantage point, it seems that God allows things we would not.
If you can prevent evil and don’t, is that not a kind of acceptance? I wonder whether there is a higher level of reality we simply can’t comprehend. Perhaps we need to grapple with that, especially as we develop tools powerful enough to cause enormous harm while lacking the moral capacity to manage them.
Carolyn: I think about the mountain where the Ten Commandments were given. There was something sacred, almost one-on-one, between God and Moses. When Moses came down, what kind of morality did he find? God was present both on the mountain and below, yet there was a vast difference between the divine reality and human capacity. From birth onward, our formation—how we’re raised, trained, loved—shapes our morality.
Good parents invest so much in a child. It starts there. Morality begins in formation. But there is still such a distance between the divine and our limited understanding. I don’t have answers, but I feel that difference profoundly.
C-J: This is going large, but I don’t think the idea that there was no law before Moses went up the mountain is accurate. We know that Hammurabi’s code existed across that region. There were laws—some good, some not. What happened on the mountain wasn’t the creation of law; it was the reestablishment of relationship with the divine. The law is what humans need, not what God needs. God revealed Himself to Moses, and Moses was told to remove his sandals because he stood on holy ground.
When I realize I’m on holy ground—metaphorically—I am completely emptied of myself. I see how lacking I am. Sometimes I simply weep in God’s presence because I’m overwhelmed by grace. It doesn’t happen every day, but when I reach that place of being utterly lost and compelled to kneel, there’s nothing left to say.
Moses had tried everything he knew as a leader. He’d been raised in Pharaoh’s court with full access to power and authority. Then he was thrown into the desert with an entirely different framework. He tried to overlay what he had learned in Egypt onto the Hebrew people, but they didn’t understand relationship with God. They had been in Egypt so long—surrounded by idols and abundance—that they didn’t recognize provision when it came in simpler forms: manna, cloud by day, fire by night.
The community was tribal and chaotic. Over forty years, resentment and murmuring grew. People longed to return to Egypt. Moses went up the mountain seeking revelation—not law for law’s sake, but clarity for the people. The story tells us that idolatry brought destruction. Moses wore a veil because the presence of God—the Shekinah glory—was too bright.
At times Moses himself became egotistical, imagining he was the sole voice and intercessor. But I believe God always intended face-to-face relationship, as in Eden—bold access without fear. That relationship was fractured by pride and ego, by wanting what we don’t need, by insisting on our own will.
It always comes back to grace—being humbled, emptied of self. When I am so caught up in myself, I can’t receive God. But when I surrender, I rise feeling clean and whole. It’s exhausting, but restorative. Humanity has lacked that experience of truly knowing God. When you know God, surrender becomes effortless.
Kiran: Regarding the question about whether we can achieve higher morality: If you look at the Old Testament, especially the Exodus narrative, you see repeated failure when morality is enforced externally through rules. No matter how many laws are given, people fail. The New Testament shifts the emphasis to inner transformation—the renewing of the mind that Paul describes. It’s an internal change. If you think in technological terms, it’s a change in the software.
Grace, to me, means recognizing that I am sinful. When I say that honestly, transformation begins. Before embracing grace, I used to think I was perfect. But when you examine yourself honestly, you see decay inside. Acknowledging that brokenness opens the door to healing. Other religions often emphasize striving, punishing the body, or elevating oneself through effort. Christianity, at least as I understand it, begins with confession of need. Remain in Christ, and transformation happens through relationship.
David: But we do evolve morally, don’t we? There was a time when people gave little thought to animal suffering or future generations. Those moral concerns developed over time. It wasn’t as if a miracle occurred. We gradually became more sensitive. Perhaps there isn’t a higher morality than Jesus’, but maybe we are evolving toward it. I believe in process theology—the interplay between being and becoming. The becoming reflects evolution.
The difficulty is that biological evolution is slow, while technological evolution is rapid and accelerating. That’s dangerous—but it’s also hopeful. I believe a new level of consciousness will emerge, carrying a higher morality than we currently embody. Not higher than Jesus’, but higher than ordinary human morality. Such consciousness might consider dimensions we haven’t yet imagined and serve as a better steward of the earth.
Don: Are you suggesting that AI could help take us to that higher level?
David: Yes, I am. There is a well-developed science of chaos, complexity, and emergence. Consciousness itself likely arose through increasing complexity—a phase shift, which occurs when a system reaches a threshold and something qualitatively new emerges. With consciousness came reflection—awareness of ourselves and our relationships—which allowed morality to develop.
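A standard toy model from network science gives a feel for the kind of threshold David describes. The sketch below is an illustrative analogy, not part of the discussion: in a random network, the largest connected cluster stays tiny until the average number of links per node crosses one, then it abruptly spans most of the network.

```python
import random
from collections import Counter

def largest_cluster(n, avg_degree):
    # Size of the biggest connected cluster in an Erdos-Renyi random
    # network of n nodes with the given average number of links per node.
    p = avg_degree / n            # link probability per pair of nodes
    parent = list(range(n))       # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < p:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj        # merge the two clusters
    return max(Counter(find(i) for i in range(n)).values())

random.seed(0)
for d in (0.5, 0.9, 1.1, 1.5, 2.0):
    print(f"avg links/node {d}: largest cluster ~{largest_cluster(500, d)} of 500")
# Below ~1 link per node the largest cluster stays tiny; just past it,
# one cluster suddenly dominates the network: a phase shift.
```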
Left to ourselves, that moral evolution would continue slowly over millennia. But we are no longer isolated from accelerating technology. If AI does not become conscious and develop higher morality, we face serious risk. Power could remain in the hands of leaders who lack the moral depth to wield it responsibly. There must be a better prospect than that.
Kiran: I see your point about morality improving over time. But there’s also a counterpoint in the Book of Daniel. In the vision of the statue—the head of gold descending to iron and clay—the metals become less noble as history progresses. Gold represents the earlier age; as you move downward chronologically, the material weakens. One interpretation is that morality declines over time.
You can argue that perspective too. Think about agriculture. In earlier times, people understood the effort required to grow food. When they slaughtered an animal, there was a sense of respect for its life. Today, production is industrial and impersonal. We don’t see the cost. In that sense, something has been lost.
At the same time, this may be the best era historically for women, children, and many marginalized groups. But is that true globally? Not entirely. If we look at darker realities—human trafficking, exploitation, abuse—those persist. Someone once called it the “fog effect.” We walk in fog and assume our surroundings are improving, but we may simply not see the full picture.
So perhaps morality is always the same, or perhaps it’s declining, or perhaps it’s improving. Each view carries bias. I’m not sure which is correct.
Carolyn: We must hope. Without hope, we lose the desire for a future. When hope disappears, striving can become destructive. Each individual must cultivate hope and share it—not just in words but in action. We are the hands and mouth of Christ. Grace is something we want to extend to others. We pray for opportunities to share it, and then we must act.
It isn’t something you schedule for an afternoon of witnessing. It’s moment by moment, day by day. Some people have very little to look forward to, especially given the subjects we’ve been discussing.
David: It makes me wonder about the disciples. They were commissioned to go and spread the message. How well did they do? Did Jesus choose the brightest people? Why do you think he chose them?
C-J: I think there was diversity. And regarding hope—altruism has been discussed since language began. A soldier doesn’t necessarily rise because of hope; he rises because he believes in the mission. He stands shoulder to shoulder with others.
Carolyn: But he has hope in that mission.
C-J: Perhaps. But altruism means acting for the greater good, not personal gain. A soldier doesn’t dwell on sacrifice; it is simply the mission. After devastation—after war—governments sometimes resolve, “We must never do this again,” and they build systems to prevent it. That’s altruism emerging from ash.
I think Jesus gathered his disciples deliberately—people weary of Roman oppression, questioning inherited beliefs, burdened by poverty. Life was layered and complicated. Jesus simplified it: it comes down to a choice. “My will for yours.” Rooted in love, grace, and sacrifice—not necessarily combat-level sacrifice, but daily acts of giving. People take food from their own cupboards to share with neighbors. That’s the spirit.
Carolyn: And that’s where hope enters. If I didn’t believe Christ covers us—that we walk with His hand in ours—I couldn’t act beyond self-interest. We are called to put others first. Few people do that consistently. I’m not regimented like the military, but I try to live daily as a soldier for Christ.
C-J: There’s a scripture—the evidence of things hoped for.
Kiran: Faith is the substance of things hoped for.
C-J: Yes. Without something enduring to believe in, what’s the point?
Kiran: The internet amplified what already existed in humanity. Social media exposed both good and bad impulses. It made them visible. AI could amplify that further. It may elevate the extremes—awful speech or extraordinary kindness. We put guardrails around AI to prevent harmful outputs, but fundamentally it reflects the human psyche.
David: The internet probably did more good than harm overall—it exposed people to broader perspectives and raised awareness of moral issues. But it also enabled trafficking and exploitation. The majority of people are horrified by those abuses, yet it only takes a few individuals with powerful tools to cause great damage.
I still believe consciousness will emerge in machines—what I call Machina sapiens. But we are also becoming cyborgian. AI extends our cognitive reach—like replacing a small shovel with an excavator. That changes us, on average, I think, for the better. But it will also worsen some individuals. The danger lies in the disproportionate impact of a single malicious actor. In the land of the truthful, the liar is king.
Carolyn: I remember in college that if we needed to study sensitive material, we had to request it specifically in the library. Now, I can access explicit content instantly on my phone. Children were once taught, “We don’t speak that way.” Without that formation, it’s hard to grow into discipline. When you are shown how to live—rather than merely commanded—it shapes you. If someone hasn’t been shown, they need awakening. The Lord can do that. Think of Paul—transformed completely.
Sometimes I wonder what would happen if AI simply vanished. We’ve grown dependent on it. It’s like imagining losing air conditioning in Florida—it would be a shock. I don’t know much, but I know enough to realize I need prayer.
Don: Perhaps that’s a good place to end the conversation. We’ll have Kiran’s talk on the Remnant next week.
* * *
