
 


 


Hamlet has been on my mind recently, and not just because of Taylor Swift’s new album that references Ophelia in the first track. In what is maybe a strange irony, it’s been on my mind because of ChatGPT.

I first encountered the play in a Shakespeare elective in 12th grade, and like many young students, became fascinated by so much: the motif of ears throughout the play, the relationship between words/thoughts and (delayed) action, the so-intense Gertrude closet scene. But perhaps the thing I remember most from that first reading of the play was the concept my teacher emphasized as it related to the play-within-the-play scene: holding the mirror up to nature. To be specific, as Hamlet is trying to “catch the conscience of the King” (II, ii, 600-601), he instructs the players to “hold as ‘twere the mirror up to nature” (III, ii, 22)—to create a performance that reflects real life so much that it makes the audience see the truth. And of course, good theater—or good art in general—often does just this; it gives us a mirror to see ourselves, to see nature or real life.

The word that sticks in my head from our robust class discussions of this scene is “verisimilitude,” which means the appearance of being true or real (veri – true; simil – like). I remember thinking a lot about the idea that while verisimilitude can be helpful (as it is to Hamlet in exposing the king), it is not reality. It is what “seems” (as Hamlet and Gertrude debate in I, ii, 75-76). It is, at best, an approximation that tricks our senses. It reminded me of other interesting words I had recently encountered with similar connotations of mimicry, such as “ersatz” (a substitute, typically an inferior one). 

These days, as I read article after article about AI, I have my high school Hamlet class discussions in mind. As a head of school, I know that it is our responsibility to learn about, teach about, and be part of the very important conversations around the what, how, when, why, and ethics of all aspects of AI. This is especially meaningful and vital in a girls’ school because the form and content that shape the future of AI must have women’s voices and ideas as equitably as possible in the mix. Keeping our heads in the sand is not an option; that would be an especially dangerous posture in a forward-thinking girls’ school. And, to be honest, one of the things I have loved most about learning about AI and all its increasingly mind-boggling capabilities is that it is teaching me (all of us) more about what it means to be human—about the boundaries of humanity.

As a student and teacher of the humanities, this is endlessly interesting to me. And yet, at the same time, of course, this also makes us consider more deeply the potential threats to our human integrity that AI poses and how all of this connects to the classroom environment, to teaching and learning, and to the student-teacher relationship. What happens when, for instance, the teaching and learning of writing must contend with the lure of ChatGPT, a tool whose purpose for writing is, at best, verisimilitude? And what is it about verisimilitude itself that is potentially so harmful to how students learn to write and to the classroom experience in general?

“What is the goal of writing?” is maybe too big a question to grapple with in this post (or in general!), but… just for a second… it seems to me that at its core, teaching writing is teaching young people how to get as close as possible to communicating the thing itself—the thing, the truth, the idea they are trying to get across. While words themselves are symbols/abstractions that can sometimes limit or distance us from the thing itself, we must teach writing because it is one of the most earnest ways we have as humans to connect with each other—that is, to send my concept of the thing itself to you in the hopes that you then come to better understand both my concept of the thing itself and also me, the person. And ideally that is reciprocated. To understand and to be understood. Because words themselves are abstractions and have so many connotations, this “thing to thing” content transfer is never perfect or direct in the most mathematical sense, but it can be pretty darn good for human connection—the stuff of poetry.

 But AI further distances and disrupts this “thing to thing” transfer process because the locus of origin is no longer one human’s mind/heart/soul but rather all the things that were ever written. It is an attempt to create a verisimilitude of both the original source (the writer’s mind) and the content of the thing they are writing about. So now, we have entered into a triangular relationship as writers and readers, not only with the form or symbolic nature of a word (which has always been the case with written communication) but now also with all the writing that ever was, and AI’s ability to create an ersatz version of both the writer and the content. None of this is great for our individual subjectivity, for our human connection; it leaves a kind of lonely and empty feeling, because it favors the imitation of the thing over the thing itself, and therefore the performance of humanity over humanity itself.

And as we get older… or at least, to speak from the “I perspective”… as I get older, the thing itself feels so vital. The pursuit of some truth, saying something that is real, being genuinely understood better for conveying that real thing (and understanding others better for the same reason), feels like maybe all we’ve got ultimately. That AI takes us further away from this feels like an existential regression. There have been zillions of articles on AI over the last few years, but the two I have read that get most meaningfully at these ideas were both in the New York Times and both written by professors of creative writing: “I Teach Creative Writing. This Is What A.I. Is Doing to Students” by Meghan O’Rourke and “I Teach Memoir Writing. Don’t Outsource Your Life Story to A.I.” by Tom McAllister. Both articles discuss the idea that writing connects a human being to aliveness. “I am a writer because I know of no art form or technology more capable than the book of expanding my sense of what it means to be alive” (O’Rourke) and, “This process (of writing memoir) is—and I don’t say this lightly—an act that makes the author more fully alive” (McAllister). Writing connects me to aliveness because when I have ideas bouncing around and collecting up in my mind, writing them down and seeing them physically on a page feels like a relief that is connected to something freer, and that feels alive. And sometimes, it’s more about construction and creation—there was nothing and now there’s something—which is also life-affirming because there it is, and I had something to do with that.

None of this type of aliveness happens when the thing that is alive is not doing the writing.

But to focus back in on the classroom, there is new and important research being done on the impact of AI on the student-teacher relationship with regards to trust on both sides: “Another AI Side Effect: Erosion of Student-Teacher Trust” (The 74). This is one of the most heartbreaking aspects of AI both for educators and for students. Maybe one of the reasons that AI is causing mistrust in the classroom is because the direct bond of communication and exchange of ideas between a teacher and a student—through both the content of writing and the teaching of writing—has now itself entered into the same type of potentially triangular relationship I discussed earlier. With a verisimilitude of good ideas. And when direct student-teacher communication has the potential to get disrupted so easily, that’s a barrier, a distancing—and distancing is inherently something of mistrust. Because authenticity builds trust, directness and human connection build trust; triangulation, distance, and ersatz ideas erode it.

 And maybe this feels especially heavy to teachers of the humanities (at least, that’s what I’ve seen) because they are the ones most explicitly teaching writing, but maybe it’s more than that. Teachers’ upset about AI is often couched as a concern about “cheating”—but the worry and emotions of this seem deeper than that, deeper than a concern about academic dishonesty. There is something more poignant here. Humanities teachers are the ones who have built their interests and careers and areas of study on the human. That AI has the potential to disrupt and triangulate the direct human connections (as well as the prioritization of humanity) in the humanities classroom is an existential issue. I suspect it is the potential turning away from the human that is the most upsetting part of this to those who have built their lives and ways of thinking and being around humanism—both in the content and the form of their teaching.

So, what is there to do about all of this? That’s a question I am often asked as a head of school, and anyone would be disingenuous if they told you they knew the answer (ChatGPT might, but that would be verisimilitude at its finest!). While there is no easy or right answer, and the “answer” is ever-changing and evolving anyway as the technologies rapidly evolve, I know for sure that trust is the cornerstone of any relationship, including that of student and teacher. And so, my best advice here—and the one that I have given to my academic team and teachers as I share articles and discuss these complex topics—is to bring the students (in age-appropriate ways) into all this thinking. Read these good articles with students and discuss them. What do they think? Why do they think learning to write and think for themselves is important? Come up with some shared ideas about them together as a class. Because AI is always going to be here now with the potential to distance, and so we have to be ever more intentional about trust building in the classroom, about human-building in the classroom. Taking the time to have these important conversations, establishing trust, communicating directly, and getting to know each other as human thinkers, learners, and writers makes a real difference in bringing students and teachers to the same side as a starting point for collective learning and growth. And when you are on the same side in a classroom, a mirror or mimic no longer makes sense. Verisimilitude is not invited in.
 


ABOUT THE AUTHOR:

Andrea Kassar is the head of Westridge School. She has more than 20 years’ experience as an administrator and teacher at girls’ schools in New York, and is a graduate of Brearley, a K-12 girls’ school in Manhattan. She holds a B.A. in English and psychology from the University of Chicago, an M.A. in English and comparative literature from Columbia University, and an M.A. in psychology from the New School for Social Research. She is also the parent of three children—Lucy '25, Billy, and Cecily.

 
