
As a history major, one of the most important skills my professors have always emphasized and required me to develop is the ability to write. More than that, we must learn how to communicate well. We must be efficient, effective and concise in communicating our ideas to an array of audiences, from our professors to the general public. We must be able to comprehend, synthesize and analyze relevant information to support our theses and contribute to our own knowledge of a subject as well as the understanding of others. It is a vital skill in the discipline, and in life overall. Effective communication is not only important in academia but in virtually every other industry, too. In a world where everyone seems obsessed with telling history and humanities majors that we have no chance of getting a well-paying job after graduation, this training gives us the flexibility and skills to enter a wide array of career paths.

However, even writing skills have come under scrutiny lately, diminished and disregarded amid the widespread availability of text-generative AI like ChatGPT. I have heard a few people ask why they should have to spend an extended period working on a paper for class when it can be wholly written for them in minutes or seconds. Though these comments have largely been made in jest, they carry unfortunate implications for the future of writing as a skill, and all that comes with it: comprehension, analysis and nuanced understanding of a text and subject.

Using generative AI for writing in any humanities discipline, for any reason, undermines a major point of our degree. ChatGPT is not human; it does not actually understand the content it is writing about, and it is not guaranteed to produce a logical or even “correct” answer to any prompt it is given. Despite the moniker “artificial intelligence,” programs like ChatGPT are not capable of engaging in intellectual dialogue, though they mimic a human’s writing style and can provide a variety of answers to given prompts. Still, I fail to see how they could be superior to any nuanced human understanding of a subject, particularly when asked to answer, say, an essay prompt from class. If you put the work into the course, you almost certainly understand what your professor is looking for, and which historical events or examples you ought to reference, better than ChatGPT ever could.

(I will pause right now and insert a disclaimer that I have absolutely no idea how artificial intelligence is used in other disciplines, including STEM, where I am sure AI programs may well be integral to the work those scholars do. If anyone would like to publish a reply to this opinion and elaborate on or defend the use of AI in their own discipline, I would be interested to read your perspective.)

Over the past few semesters, many of my professors have begun to include warnings against the use of AI in their syllabi, ranging from sentiments like “it’s okay to use ChatGPT for brainstorming and coming up with ideas” to “any use of AI for any reason for assignments in this course will result in automatic failure and constitutes a complete violation of academic integrity.”

The latter policy is harsh, but I must confess that, at least as far as the humanities are concerned, it is the position I take as well.

I get it. Writing papers is tough. It really sucks to sit down the night before a big paper is due and realize that the doc you started for it has exactly two words typed on the page, and those two words are your name. I’ve been there.

But why on earth would you make the decision to go for a Bachelor of Arts if you’re not actually interested in learning any of the skills that are so integral to the humanities? Seriously. It’s disrespectful to your own educational and intellectual development, to your professors, and to whoever is paying your college tuition (whether that be you or someone else).

Even for brainstorming or generating lists and prompts, there’s no actual guarantee that ChatGPT will give you a viable, factual answer. The chance that it will generate a nonsensical or outright false answer, even if you just ask it to tell you about some historical event or create a Chicago-style bibliography, is very high. If you want a viable assignment or paper, you have to go back and re-check everything ChatGPT generated for you. At that point, why not just write the thing yourself? If you’ve been going to class, you probably have the capacity to answer whatever prompt your professor gave you. If you’ve chosen to attend college, that means you’ve chosen to do the homework, too.

The use of AI in the classroom violates the standards of academic integrity that are vital to the college setting. When you write a paper for a class, the universal expectation is that you effectively express an understanding of a subject based on comprehension from your own research or class readings and discussions. You set forth an argument about the subject based on this understanding and your own convictions. You prove your argument by analyzing specific contextual evidence that you gathered by researching secondary (academic) and primary sources. Then, you carefully cite these sources wherever you use them in your essay. These are the steps we as students must take to ensure our work avoids plagiarism and preserves our academic integrity.

Yes, professors have varying standards about AI use, which the school administration is navigating as the issue evolves, but in my opinion, the use of AI at any step of this process undermines the development of reading comprehension and writing skills and violates the college’s standards of academic integrity and anti-plagiarism.

The Student Handbook states clearly, “The most serious degree of plagiarism involves the wholesale and deceptive borrowing of written material from sources such as published authors, websites, other students, or paper-for-hire services. Students who submit papers or significant sections of papers that they did not write themselves are committing this type of violation.”

Even if you cite the use of ChatGPT in a paper, the problem remains that you did not actually write the paper yourself. It wouldn’t be acceptable to submit a paper your friend wrote, even if the whole paper is otherwise cited correctly and you tell the professor that your friend is the one who wrote it for you. Why is ChatGPT any different?


Comments (1)


Joel Reimer, Feb 8, 2024 at 9:08 pm

So well argued and very well written. I too am (was) a history major in college and learned the importance of good writing skills in all sorts of life situations. I started in journalism, which led to corporate communications, which led to marketing, and so on. It is such an important skill. Thank you for sharing, and for capturing the danger of relying exclusively on AI in the future.
