Replace your TAs with robots? It sounds like a short story rejected by a science fiction magazine, but a dean at Boston University (BU) seems open to the idea.
In an email to faculty during an ongoing graduate student union strike, Stan Sclaroff, the dean of BU’s college of arts and sciences, wrote that while graduate student assistants were on strike, faculty may “Engage generative AI tools to give feedback or facilitate ‘discussion’ on readings or assignments.” Granted, this was one of several suggestions for continuing normal coursework during the strike, including such sage wisdom as “combine discussion sections,” “assign readings,” and my personal favorite, “consider alternate assignments such as viewing a video/film.”
Beyond the obviously lackluster response to the legitimate concern of how to continue instruction at a university that relies heavily on its TAs, the possible use of artificial intelligence (AI) tools to replace paid, human workers in higher education is concerning.
It speaks to two key issues in the model that many prestigious universities like BU practice: over-reliance on underpaid graduate students and overlooking the importance of the in-classroom experience. If higher education continues down this path, experience, outcomes and enrollment will suffer in the long term.
As the Daily Beast pointed out, the higher end of graduate student stipends at BU comes to around $40,000 per year, a far cry from a living wage in Boston. Meanwhile, the university’s total yearly cost for undergraduates has crossed the $90K mark. If elite universities use costs to justify raising tuition while graduate students are in the classroom educating a sizable proportion of the undergraduates paying that tuition, how can they justify not paying graduate students a living wage? Frankly, if your university runs on graduate students who work crazy hours in labs, run the discussions in your most important introductory courses and still can’t make ends meet, you deserve to fail.
So, where does AI fit into all this?
One of the major promises of the recent AI revolution, driven by large language models (LLMs) like OpenAI’s ChatGPT, has been that these technologies would drive the cost of information synthesis and delivery to near zero. For universities looking to save costs, this seems great. Why pay a faculty member or graduate student to deliver content to students, create assignments or assess learning, when for just $20/month, a ChatGPT subscription can give you the same performance?
This is a false equivalency, of course. Graduate students can be highly effective lecturers, field questions skillfully during discussion sections and provide detailed feedback on student assignments. The notion that their work can be replaced by a technology that continues to suffer from inaccurate “hallucinations,” plagiarism and damaging bias is misguided and insulting.
Fundamentally, if a network of people who are supposed to be full-time students and researchers is the only thing between a functioning university and a glorified YouTube how-to course, your learning model is flawed. Graduate students deserve better, both in pay and working conditions.
Another concerning trend that reliance on AI tools suggests is the downfall of the classroom experience. Perhaps I am biased, since I attend Dickinson, a small liberal-arts college where most of our classes are discussion-based. However, the BU dean’s guidance applied to discussion sections of larger classes as well, so the concern still applies. The whole point of discussion-based classes and sections is to construct a better understanding of a topic collaboratively among a group of students and an instructor. If we outsource that to tools with all the flaws of a technology built under Silicon Valley’s “Move fast and break things” philosophy, we miss the whole point of building collective human comprehension.
The point of this column is not to debate the merits of AI development, nor the capabilities of modern AI tools (I used ChatGPT to help come up with headlines for this piece). And there are ways to do this right – even here at Dickinson, students and professors are working on ways to incorporate AI tools into the language learning experience. But if you are a higher ed administrator and you want to devalue the work of your graduate students and degrade the classroom experience of your undergraduates by adopting under-tested and ineffective technologies, please stop.
Pay your graduate students what they are worth. Don’t try to slap emerging technologies as a catch-all over flawed existing systems. And always, always, always center the student experience when making decisions that are going to fundamentally change the way they learn.
Got it, Stan?