We’ve all been there. Whether it’s a freshman in their first week of high school or a senior cramming the night before semester one grades are finalized, we’ve all found ourselves at 11:00 PM, leg shaking, head pounding, eyes blurry, thinking: I wish someone else would do this assignment for me.
In all honesty, I’m there right now, writing this article. And the temptation is stronger than ever, because for the first time in history, someone could do this assignment for me. Not an annoyed friend. Not an overpriced tutor. But something faster, more efficient, and available 24/7: artificial intelligence.
AI has fundamentally changed the way teachers and students approach school, especially in the English department. Weary teachers are trading laptops for pen and paper again, trying to answer the question at the heart of this whole shift: Is AI a powerful educational tool or just a shortcut? Friends Academy has been trying to answer this question as an institution. However, with an issue as complicated and multifaceted as AI, there are bound to be differing opinions, and those opinions will ultimately shape what the future of AI at Friends looks like. I decided to seek out these perspectives, interviewing three extraordinary and influential voices in our community: English Department Chair Brad Wetherell, AP Literature teacher Leanne Loizides, and Head of School Paul Stellato. To my surprise, each offered a different view, and each was rooted in sound values and logic.
Let’s start with what everyone agreed on. It’s undeniable to each person I spoke with that AI is here to stay. The train has left the station, and artificial intelligence is rapidly becoming a part of modern life, especially in education. As Head of School Paul Stellato put it, “This is here to stay, and we ought to wrap our arms around it and turn it to our advantage and our students’ advantage.” He believes Friends Academy must take a proactive approach to AI integration. Brad Wetherell echoed that belief, calling AI “a real emerging technology that isn’t going away, and it will be a part of our students’ lives in the future.” For these educators, the question is no longer whether AI belongs in schools, but how we move forward with it. A key part of that answer is assessment.
For teachers like Wetherell and Loizides, AI has most visibly impacted one area: assessment. In a world where ChatGPT can spit out a five-paragraph essay in seconds, teachers are rethinking what authentic student work looks like. “We recently had students handwrite their Frankenstein essays,” said Ms. Loizides. “And I felt such a wave of relief because I knew it was entirely the student’s work… it felt really authentic.” Still, she acknowledged the importance of revision and reflection, adding, “Process essays are one of the most valuable ways of learning writing… we need to find ways to preserve that.” Wetherell agreed, pointing to a growing shift in education: “Many teachers here and at schools across the country see a shift toward assessing the process, not just the final product… AI can easily generate the finished product.”
Together, their views highlight a shared priority: preserving the critical thinking and learning that come from writing without sacrificing the authenticity or depth of the final product. In this new landscape, the goal isn’t just to avoid AI shortcuts, but to build assignments that foster both a meaningful process and original outcomes.
Here’s where opinions start to diverge: What role should AI play in student thinking? Wetherell envisions a future where AI isn’t the enemy of learning, but a partner in it, so long as it’s used with intention. He sees potential for AI to act as an electronic tutor: something that helps students clarify their ideas, test their arguments, and deepen their understanding of complex texts. But he’s also clear about the danger of misuse. “If AI is used to generate their ideas instead of a tool to help them question the ideas they generate themselves,” he warned, “then they’re sacrificing the very thing they came here to achieve.” For him, the real value lies not in AI doing the work, but in helping students do better work themselves.
Loizides shares many of the same concerns but approaches the issue through the lens of pedagogy and student development. She isn’t anti-AI; in fact, she sees its potential to enhance feedback and provide useful scaffolding. But she’s wary of overestimating students’ ability to self-regulate. “At the moment, I think it’s more of a shortcut,” she admitted. “That puts a lot of responsibility on a young person.” She emphasized how easy it is for students, especially those still developing core writing and thinking skills, to lean too hard on AI before those muscles are fully formed. Her approach reflects a deep respect for the craft of writing and a desire to make sure students don’t skip the hard, meaningful work of learning to think and express themselves clearly.
Meanwhile, Stellato frames AI not as a threat, but as a new form of literacy – one that students must master if they hope to succeed in college and beyond. He doesn’t deny the risks, but believes the school’s job is to prepare students for the real world, not retreat from it. “We just got to get in the game,” he said. “The solution isn’t ‘everyone put your computers away.’ Our students are never going to encounter that again.”
Together, their perspectives reflect a shared understanding: AI’s impact depends on how it’s used. When it enhances student thinking, it can be powerful. When it replaces it, it becomes a shortcut with long-term consequences. The challenge for Friends Academy is not whether to use AI, but how to teach students to use it wisely, intentionally, and responsibly.
That tension between support and substitution naturally leads to one of the most common, and contested, questions in the AI-in-education debate: Is AI just like a calculator for writing? It’s an analogy that’s been floated by many, but when I brought it up in my interviews, each educator pushed back, some more sharply than others. “I strongly disagree with that comparison,” said Wetherell. “We don’t give calculators to students until they’ve mastered multiplication. If you give students unfettered access to AI before they’ve developed critical thinking and writing, they’ll never build those muscles.” “I don’t think it’s analogous,” added Loizides. “I think with a calculator, it’s just a tool and you have to know what to input. Whereas AI can actually do the work for you and take away that—any sort of independent processes.” Stellato offered a broader lens, acknowledging the logic behind the comparison, but emphasizing that the urgency of the moment demands adaptation. “Yes, there are students who use calculators before mastering fundamentals. Similarly, I would love for every student to have mastered the mechanics of writing before they ever hit return on their computer. But we don’t have time to wait. It’s here. We need to adjust now.”
In their own ways, all three agreed: AI is far more complex than a tool for speed or convenience. It’s not just a calculator. It’s something that can fundamentally reshape the thinking process, and that means educators must treat it with more care, and more intention.
For Head of School Paul Stellato, the rise of AI is not just a curriculum shift; it’s a cultural reckoning. While many schools scramble to keep up with the pace of change, Stellato urges Friends Academy to focus less on what AI might become and more on what students need right now. “We need to stop trying to build a solution for what AI might be in a year,” he said. “We need to build for what’s in front of us, and stay flexible.” For him, that means crafting a policy not around fear, but around student growth. He believes that if the school can remain grounded in its mission of developing thoughtful, capable, and ethical learners, then the technology can serve that mission, not threaten it. “If we build a policy that meets the needs of every student, every faculty member will be thrilled.”
But Stellato is also clear: AI must not replace the deeply human relationships and reflection that define a Friends Academy education. In fact, he believes the future lies in a hybrid approach, one where mentorship and machine intelligence can coexist, each strengthening the other. “You get from Mr. Brennan things AI can never give you. And AI will give you things Mr. Brennan can’t,” he said. “The goal is to ensure students get the best of both.”
In other words, the future of AI at Friends Academy isn’t just about tech. It’s about trust, balance, and the belief that innovation and human connection aren’t mutually exclusive; they’re both essential.
As I wrapped up these conversations, one thing became clear: the future of AI at Friends Academy won’t be defined by panic or passivity; it will be shaped by thoughtful people who care deeply about learning. Whether through a handwritten essay, a reimagined assessment model, or a carefully crafted school policy, the goal is the same: to help students become better thinkers, not just faster writers. AI is a tool, yes, but it’s also a test. A test of how we preserve curiosity, challenge complacency, and prepare for a world where machines can generate answers, but only humans can ask the right questions. The road ahead will require courage, creativity, and conversation, and thankfully, Friends Academy has all three.