Keeping Learning Human
Across Calvin University, faculty and staff are confronting the rise of artificial intelligence with a mix of pragmatism and conviction, asking what this technology means for teaching and learning. These campus leaders describe a moment that demands discernment, clarity, and a renewed commitment to the practices that form thoughtful, ethical, and adaptable graduates.
"The fundamental technologies of AI are a gift of God that we must cultivate with his wisdom." – Ken Arnold, associate professor of computer science
Stewarding AI Through a Reformed Framework
Derek Schuurman, professor and department chair of computer science
The remarkable advances in AI have generated a range of predictions—from a coming utopia to dire warnings of human extinction. As Reformed Christians, we recognize that AI represents part of the latent possibilities in God’s good creation, but a possibility that can be misdirected by sin. While many researchers focus on what AI can do, we need to begin to discern what we ought to do with AI. As Christians, we are called to exercise freedom and responsibility to direct AI in ways that answer God’s call to love our neighbor and to care for the earth and its creatures.
Calvin engineers and computer science students have been taught to use a set of design norms that are derived from Reformed Christian philosophy and can be used to guide responsible engineering and inform AI design decisions. These include considerations like justice, cultural appropriateness, social norms, aesthetics, care, trust, and stewardship. These norms help direct technology, like AI, in ways that can bring shalom nearer.
One of my dreams is to establish a Center for Faith and Technology at Calvin to enable the university to present a public, credible, and Christian voice on AI and other emerging technology issues. Such a center could serve the church and Christian education but also speak into the shaping of public policy with a voice that is both biblical and relevant. I think Calvin has both the technical and the theological and philosophical depth to provide a clear, moral voice amid the ongoing disruptions as AI continues to unfold.
Recovering the Human Dimension of Learning
Katie Day Good, associate professor of communication
AI has complicated college life, but it has also prompted faculty and students to reflect more deeply on the value of a college education. AI can churn out information and imagery in a sophisticated, humanlike manner, but it cannot think, feel, or empathize. Those capacities emerge in human community, and particularly in a Christian, liberal arts community like Calvin.
This perspective comes not only from faculty but from students. In the past year, I’ve been heartened by the number of student-authored op-eds in the Chimes, blog posts, and AI guidance documents that urge a discerning, cautious, and faithful approach to AI. They suggest that students want more than practical skills in AI use; they also want the cultural, historical, theological, and domain knowledge to live and work wisely in a world of sophisticated technology.
Paradoxically, equipping students for wisdom in the AI age requires giving them some reprieve from the digital platforms and devices that have come to dominate daily life. While using AI remains essential in some classes and computers should always be available for students with accommodations, it is also important for students to have the space and time to think, speak, write, and relate to others without AI’s assistance.
For this reason, many of my classes are intentionally low-tech. I place greater importance on oral rhetoric, presentations, and discussions than I did in the past. Students complete most writing in class, by hand, and sometimes over multiple sessions. Far from feeling like a step backward toward antiquated methods, this move has felt instead like a step forward toward reaffirming students’ authentic voices.
A 'Brain-First, AI-Second' Approach
Philip Johnson, associate professor of business
Within the School of Business (CUSB), we’ve adopted a “brain-first, AI-second” approach to ensure that technology supports—not replaces—human learning. Like most people, students naturally gravitate toward efficient answers, so we hold honest conversations with them about when AI is helpful and when it undermines learning. Sometimes the guidance is as simple as, “Don’t use AI on this assignment because you need to learn XX and YY, and AI will prevent that.” Defining purpose is essential.
At the same time, we also create AI-powered agents in some courses to help students study, negotiate, or receive faster feedback. These tools are used heavily across CUSB and, when applied thoughtfully, enhance the learning process.
I believe the kind of learning we cultivate at Calvin—learning that embraces ambiguity, wrestles with uncertainty, and forms whole persons—is increasingly essential in an AI-saturated world. Our mission to equip graduates as Christ’s agents of renewal is not only relevant but urgently needed.
The ethical questions surrounding AI are real. AI can displace or dehumanize work, and we must resist that even as we acknowledge the need for efficiency. AI also consumes significant power and resources, raising environmental concerns that must be weighed alongside its benefits. These tensions require thoughtful, principled engagement.
The theological foundation that shapes my teaching—that we are image-bearers of God called to be like God in relation to others—remains unchanged. What has changed is the context in which we apply this. One of the best parts of my work is wrestling with questions alongside colleagues who bring deep expertise and good-faith engagement.
A Sophisticated Tool for Learning
Emily Bosscher, peer support coordinator and academic counselor
As with so many other aspects of education, ethics plays a crucial role in the use of AI. The more we talk about AI as a tool and help students understand the value of using it appropriately, the better we can resist the temptation to let it “do our work for us.”
For a generation that has mastered the life hack (often in good ways!), the hack should not be having AI give the answers, but figuring out how to use it to hack effective and efficient learning (and I believe this is possible).
In the tutoring center, we use AI to review concepts, simplify difficult ideas, or offer study skill support. For all its capabilities, ChatGPT is not going to be present with students during a test, so it is still crucial that students learn how to apply processes and formulas for themselves.
Writing Should Be Hard
Kristine Johnson, professor of English, university rhetoric director
When I became the Rhetoric Center director, my then third grader thought the center needed a motto. He wrote “writing is hard but fun” on the whiteboard, and we never erased it. The second part of the sentence may feel debatable, but the first part is true: writing is hard. Because writing is hard, it is a way of learning. Through the very act of writing, we discover new ideas, and we understand our ideas in new ways.
Generative AI allows writers to skip over the hard process straight to a final product. I argue that there is value in learning to write the hard way, and this value is at the heart of a liberal arts education. One goal of our core curriculum is that students will “communicate effectively using rhetorical strategies.” Using rhetorical strategies means asking these kinds of questions: Is my argument interesting? Are my ideas in the best order to make my point? Am I considering the culture of my audience? Another core goal is for students to “critically evaluate sources using current information literacy skills.” We want students to use information responsibly, and we want them to value truth.
In the liberal arts curriculum, it is appropriate and beneficial for students to learn without generative AI as a primary tool. Learning the hard way helps students develop a flexible approach to communication and grow in curiosity, reflection, and empathy. These students will know how to adapt AI-generated text, how to fact-check, and, critically, they will know that they must do these things to be ethical, effective communicators.
Preparing Students for Christian Vocation
Courtney Banks-Tatum, director of the Career Center
At the Career Center, we help students understand their work not merely as economic survival, but as participation in God’s redemptive work. This theological grounding equips graduates to approach their careers with humility, courage, and a commitment to the common good—qualities that matter deeply in workplaces facing ethical, social, and technological disruption.
When used well, AI can enhance the student experience as a tool for informed decision-making. It helps students navigate job uncertainty by providing timely labor-market insights and exploratory tools that connect academic interests with emerging opportunities. It can also assist with résumé feedback and interview preparation. Yet AI cannot replace the human work of mentoring students as they discern calling, values, and direction. Its proper role is to complement—not substitute for—relationship and responsibility.
In a labor market where technical skills evolve rapidly and can quickly become outdated, liberal arts learning cultivates durable capacities: critical thinking, clear communication, ethical judgment, and the ability to engage complexity with wisdom. Employers consistently note these strengths in Calvin graduates.
The most employable graduates are not trained for a single role but equipped for lifelong learning—combining transferable skills with vocational discernment rooted in a Christian understanding of calling.
The Real Question Is Not Technological, but Human
Craig Mattson, Arthur DeKruyter Chair in Faith and Communication
When ChatGPT burst onto the scene back in the early 2020s, I kept hoping that somehow we would get beyond it. Maybe it’s all hype, I thought. Or maybe it’s going to be great, and we’ll all become massively productive.
But what I really wanted to know was, how do we get beyond this exhausting toggle of naïve optimism and deep anxiety?
You can’t put the question into a chatbot. Well, you can, but only if you want an infographic or a plastically perfect op-ed. Besides, I’ve come to think of “how do we get beyond this” as the wrong question. In an age when large language models manage to be uncannily personable and scarily efficient, the more important question is Wendell Berry’s: “What are humans for?”
Answering that question doesn’t enable us to escape our technological moment. But it does make us aware of what is larger than the moment.
I’m writing this in a coffee shop. Across from me, a couple is holding hands. At another table, a sixty-something man sits with a slant of sun falling across his face as he listens to his conversation partner. That handclasp and that listening presence are simple realities so much larger than any concept our most powerful language model could use to describe them.
For Christians, getting to what’s beyond our technological moment entails attention to the Incarnation. There, we find God’s determination to go beyond self-interestedness to share in suffering, grief, joy, and uncertainty. However AI enables or disables our species, I hope we can keep finding ways to move in that same way—beyond the self into generosity and care for one another.
Interested in learning more about this topic? You're invited!
Wisdom in the Age of AI Conference
Thursday, Oct. 8 - Saturday, Oct. 10, 2026
Calvin University
Join top Christian thinkers, practitioners, and the broader public to explore how faith can shape the responsible development and use of AI, while advancing human flourishing and contributing to shalom.