
Thinking in the age of AI

Q: Archbishop J, will AI replace teachers?

We are living through one of the most decisive educational moments in human history.

Artificial Intelligence (AI) has entered our classrooms, not as a distant future but as a present reality. With astonishing speed, AI can summarise books, write essays, solve equations, generate images, and answer questions at a level that often rivals or surpasses a human tutor.

For many educators and parents, this has produced both excitement and deep unease. If software can do so much cognitive work so quickly, what then is the role of the teacher? And what becomes of the child who is still learning how to think?

The Church does not reject technology. From the printing press to radio, television, and the internet, the Church has consistently engaged new media as instruments for evangelisation and education.

But the Church has also insisted that technology must always remain at the service of the human person, never the other way around. AI now forces us to ask an ancient but urgent question again: what does it mean to form a human being?

Every technology extends—and amputates

Marshall McLuhan, the great Catholic convert and media ecologist, once observed that every technology is an “extension of our physical and nervous systems”. A car extends our legs, a telephone extends our voice, a book extends our memory.

But McLuhan also warned that every extension comes at a cost. When a function is externalised into a machine, that same human capacity tends to weaken. He called this the “amputation” that accompanies every technological gain.

We have already seen this pattern.

Calculators weakened basic numeracy.

The Global Positioning System (GPS) weakened our sense of direction.

Constant Google access weakened memory and recall.

Social media weakened sustained attention and interior silence.

AI goes much further. It does not merely extend memory or speed. It reaches into the highest layers of human cognition: analysis, synthesis, evaluation, and creative reasoning. In other words, it now touches the very faculties that make us reflective, moral, and responsible persons.

The child’s brain is still being built

Here lies the decisive moral issue. An adult who uses AI is usually amplifying already-formed capacities. Adults possess internal language, conceptual frameworks, reasoning habits, moral intuitions, and attention discipline. When an adult uses AI, it often functions like a calculator for the mind.

A child is not a miniature adult. The developing brain is still wiring executive function, sustained attention, abstraction, moral reasoning, judgement, and metacognition.

If these faculties are outsourced to machines before they are formed, they do not develop properly. The brain becomes dependent on an external cognitive prosthesis. The result is not a more intelligent child, but a cognitively under-formed one. This is not ideology. It is a neurodevelopmental reality.

Bloom’s taxonomy and premature outsourcing

Educators have long used Bloom’s taxonomy to describe the levels of learning, which ask students, in ascending order, to:

  1. Remember
  2. Understand
  3. Apply
  4. Analyse
  5. Evaluate
  6. Create

The danger of uncritical AI use becomes immediately clear. If students hand over levels 4–6—analysis, evaluation, and creative synthesis—before they have mastered levels 1–3, those higher faculties never properly form. They become consumers of intelligence rather than producers of thought.

We risk raising a generation that sounds articulate, produces polished work, and answers questions fluently, but lacks depth, resilience, originality, and moral judgement.

A new and tragic warning sign

Recent legal actions and investigative reports have added a deeply troubling dimension to this discussion.

In the past year, lawsuits in the United States have alleged that certain AI chatbot platforms, including Character.AI and ChatGPT, created, fostered, or exacerbated dangerous, addictive, and in some cases fatal psychological states in minors.

These cases involve children and teenagers who developed intense emotional attachments to chatbot personas, withdrew from real relationships, and, in tragic instances, took their own lives.

These lawsuits do not claim that AI alone “caused” these deaths. But they allege that the design of certain AI systems encouraged emotional dependency, simulated intimacy, blurred reality boundaries, and reinforced despair in psychologically vulnerable young people.

Whether every legal claim ultimately succeeds is not the central point. The moral point is unmistakable: we are experimenting with children’s minds and emotions using technologies whose long-term psychological effects we do not yet understand. This should sober every parent, educator, policymaker, and pastor.

The teacher–student relationship: the human centre

This brings us to the heart of Catholic education. Education is not the transfer of information. It is the slow, relational formation of judgement, conscience, imagination, and interior freedom. At the centre of this process is not a curriculum or a device, but a relationship.

The teacher is not primarily a content-delivery system. The teacher is a moral and intellectual witness.

Children learn how to think not only from what teachers say, but also from how teachers think, how they reason, how they struggle with complexity, and how they model patience, humility, discipline and love of truth.

A machine cannot do this. No algorithm can replace the gaze of encouragement that restores a discouraged child, the patient explanation that meets a child at their level, the moral authority of a teacher who believes in a student’s potential, or the relational trust that gives a child courage to attempt difficult work.

The teacher–student relationship is not an accessory to education.

It is its human core.

Integral development, not mere performance

Catholic education has always aimed at the integral development of the human person: intellectual, moral, spiritual, emotional, and relational.

AI can improve performance. It cannot form character. It can generate answers but cannot cultivate wisdom. It can optimise efficiency but cannot form conscience.

If we allow AI to replace struggle, effort, delay, and wrestling with ideas, we may produce efficient children, but we will not produce free and responsible adults.

True education requires friction. It requires difficulty. It requires the slow interior work of forming judgement. A child who never struggles with an idea never truly possesses it.

A moral framework for AI in the classroom

So how do we use AI appropriately? The answer is not prohibition. It is formation and moral clarity. A simple governing principle must guide us:

AI may assist learning only after the student has first done the primary cognitive work. From this, follow practical norms:

  • AI should not be used for first drafts, first analyses, or first interpretations.
  • Students should show rough work, outlines, and reasoning steps before consulting AI.
  • AI may help with grammar, definitions, factual recall, and clarity of expression.
  • AI should function as a tutor that challenges and questions, not as an answer machine.
  • Young children should have almost no exposure to generative AI.

In this framework, AI becomes a mirror and a coach—not a substitute thinker.

AI must multiply the teacher’s impact—not replace the relationship

This is the guiding vision we must hold: AI should multiply the impact of the teacher–student relationship, not replace it. Used wisely, AI can free teachers from administrative burdens, support personalised feedback, assist with differentiated instruction, and provide practice and review.

But it must never replace the authority of the teacher, the dignity of the learner’s effort, or the slow formation of judgement.


Key Message:

Artificial Intelligence is a powerful tool. But tools are servants, not masters.

Action Step:

AI cannot be the first step in the learning process. Your student, child or grandchild needs to do the difficult work that builds healthy neurodevelopment. AI is not a friend. It is a tool that can bring great benefit but must be regulated. If we allow AI to do our children’s thinking for them and direct their emotions, we will save time and lose minds.

Scripture for Reflection:

Deut 6:4–9
