Artificial Intelligence and Christian Education

(Republished from the Australian Christian Teachers Journal)

Education has been significantly disrupted by emerging technologies at various points, mirroring the way new technologies have reshaped the patterns and rhythms of society. The impact on education, however, is most pronounced with technologies of communication and information. This leads us to the big question of this cultural moment: Will artificial intelligence—with its exponentially increased proficiency in facilitating communication and information—be the most disruptive technology ever to have impacted education and schooling? This article explores that question and considers how Christian schools seeking to unfold a distinctive Christian education might respond faithfully.

Any discussion of technology—its impact and our Christian response—will need to consider whether technology is merely a neutral tool. The nature of a faithful response pivots on this question.

An Instrumentalist Approach to Technology

Assertions like this are common: “The technology itself is not the problem; the issue is when human sinful nature uses it in ungodly, self-serving ways.” I have heard versions of this expressed by teaching colleagues when reflecting on the possible integration of artificial intelligence into learning and teaching practice. This instrumentalist position under-appreciates the shaping effect that the mere existence of a technology has on society and individuals, even when it is being used wisely and with godly intention. Media commentators and sociologists suggest that this neutral view is naive—even if it is the view we would prefer to be true. As Carr observes, “The idea that we are somehow controlled by our tools is anathema to most people” (46).

Marshall McLuhan, famous for stating that “the medium is the message” (9), also makes the following bold assertion about an instrumentalist position:

Our conventional response to all media, namely that it is how they are used that counts, is the numb stance of the technological idiot. For the “content” of a medium is just the juicy piece of meat carried by the burglar to distract the watchdog of the mind. (18)

New technologies, especially those with widespread societal integration, are not merely additive. We don’t simply get the existing society plus the new technology; rather, they change the whole ecology of a society—in the same way that rabbits, as Challies suggests, were not just an addition to Australia but completely and unalterably changed the entire ecological landscape. He continues:

A technology changes the environment it operates in. It changes the way we perceive the world. It changes the way we understand ourselves. . . . We are often oblivious to this kind of systemic change. The generation that spans these technological transformations may recognise that such changes are happening, but those who are born into them are blinded to them. (40)

Social media, as an example, nudges users (and society) by design toward self-promotion. The design architecture, the language of the interface, and the way it’s promoted all push a narcissistic framing of social interaction. Even using social media in a godly way may well normalize narcissism in ways that are both subtle and unintended, to the point that an ecological shift occurs in which narcissism becomes a character virtue (Twenge and Campbell). The list of possible examples of this non-neutrality, particularly for social media, is almost endless.

The non-neutrality of technology—that is, its shaping influence even when we are using it well—is not uniformly distributed. Two factors affect the shaping of individuals and culture. The first is how widespread the implementation is within society: some technologies hold no interest for a significant proportion of society, and some are not financially within reach for many, so their shaping impact has obvious limits. The second factor, and the most noteworthy when it comes to artificial intelligence, is that the closer a technology comes to replacing or mediating fundamental aspects of our humanness, the greater its non-neutrality and the deeper the discernment needed to protect our humanness and our expression of being made in the image of God.

Artificial intelligence, with its offer to reduce the friction of human functioning and human being, is not only attractive but highly captivating to every human soul. It is also within easy (and affordable) reach of all because of its integration into existing platforms. The non-neutrality of artificial intelligence is unprecedented.

Christian Response: Discernment and Responsibility

The ability that God has given us to invent and innovate, to produce and promote, is a creational blessing. He has woven into this world a rich, latent potential that includes cultural pursuits like technological innovation (Wolters). The Bible doesn’t give us cause for seeing technology as evil or for embracing it with such optimism that it becomes a savior. Schurmann suggests that “a trust in technology, sometimes referred to as technicism, is essentially a form of idolatry” (12).

However, the existence of these dual biblical positions doesn’t lead us to a middle-ground technological neutrality. The full biblical narrative has us understand that the effects of the fall are so widespread that the creation (including its cultural potential) groans under its weight. Therefore, even when technological artifacts are developed free from self-serving, profit-hungry, power-maintaining motives within the designers, there will still be a “pattern of this world” woven into the technology. The taint of the fall goes deep into every nook and cranny of the dynamics of creation. There is a darkness working in concert with a suite of hollow and deceptive philosophies that will take captive the undiscerning. 

The Christian life, among other things, is therefore a call to discernment and wisdom—including in when and how we engage with technology. In God’s economy, wisdom is a rich idea that also includes, as fulfilled in Jesus, a love for neighbor and a desire to see neighborhoods—and by extension society—transformed for human flourishing. When the Christian is involved in technological development or implementation, she ought to be drawn to a vision for the technology motivated by human flourishing (love for neighbor).

The first step in responsible design is to recognize the non-neutrality of technology. Second, we must understand that as soon as we develop any new technology, we uncover a new class of responsibilities (Harris and Raskin, “AI Dilemma”). In the excitement of a new technology, it takes discipline and wisdom to stop and ask questions about its implications and the corresponding responsibilities. Technology commentators have suggested various question templates to help with this process. Postman (“Surrender of Culture”), a pioneer in this field, proposes six critical questions:

  1. What is the problem to which this technology is the solution? 
  2. Whose problem is it? 
  3. Which people and what institutions might be most seriously harmed by a technological solution? 
  4. What new problems might be created because we have solved this problem? 
  5. What sort of people and institutions might acquire special economic and political power because of technological change? 
  6. What changes in language are being enforced by new technologies, and what is being gained and lost by such changes? 

Who is asking these and other pertinent questions in the case of predictive, large language model artificial intelligence?

This is an abridged version of this article. To read more, subscribe to the print or digital edition of Christian Educators Journal.



Chris Parker is the author of the book The Frog and the Fish: Reflections on Work, Technology, Sex, Stuff, Truth, and Happiness, written for young adults and parents (short-listed for Australian Christian Book of the Year). He has a particular interest in the shaping effect of technology on life and faith. Chris and his wife, Coco, live in the Blue Mountains west of Sydney, Australia, where he serves the families of Wycliffe Christian School as Christian Foundations Leader.


Works Cited

Carr, N. The Shallows: What the Internet Is Doing to Our Brains. Norton, 2010.

Challies, T. The Next Story: Life and Faith after the Digital Explosion. Zondervan, 2011.

Harris, T., and A. Raskin, hosts. “The AI Dilemma.” Your Undivided Attention. Center for Humane Technology, 24 March 2023, www.humanetech.com/podcast/the-ai-dilemma.

Harris, T., and A. Raskin, hosts. “Esther Perel on Artificial Intimacy.” Your Undivided Attention. Center for Humane Technology, 17 August 2023, www.humanetech.com/podcast/esther-perel-on-artificial-intimacy.

McLuhan, M. Understanding Media: The Extensions of Man. MIT Press, 1964.

Postman, N. “The Surrender of Culture to Technology.” Lecture, 11 March 1997, College of DuPage, Glen Ellyn, IL, www.youtube.com/watch?v=hlrv7DIHllE&t=3034s.

Schurmann, D. “ChatGPT and the Rise of AI.” Christian Teachers Journal, vol. 31, no. 3, 2023, pp. 10–13.

Twenge, J. M., and W. K. Campbell. The Narcissism Epidemic: Living in the Age of Entitlement. Simon & Schuster, 2009.

Wolters, A. M. Creation Regained: Biblical Basics for a Reformational Worldview. Eerdmans, 2005.