‘Struggling to Keep Up’: Harvard Students and Faculty Grapple with Impact of Generative AI in Classrooms

Harvard students and faculty are adapting to the school’s policies on generative artificial intelligence tools. By Sophia Salamanca
By Camilla J. Martinez and Tiffani A. Mezitis, Crimson Staff Writers

From evolving syllabi to entirely new course offerings, Harvard’s classrooms are adapting in real time to the widespread impact of ChatGPT and other generative artificial intelligence tools.

Though the Office of Undergraduate Education has not imposed a University-wide policy on AI use, it has provided professors with guidance to help them establish their own rules regarding generative AI in their courses.

The OUE’s guidance offers three different approaches for professors to choose from: a “maximally restrictive” policy, which bans the use of any generative AI tools; a “fully-encouraging” policy, which promotes the use of these tools; and a mixed approach that lies in between.

Nick D.G. Apostolicas ’25 said his classes have taken different stances toward AI depending both on the instructors’ “personal opinion” and on the potential applications of the technology to the course’s assignments.

“Each of my classes has not only explicitly mentioned it within their syllabus, but also has mentioned it verbally within lectures,” Apostolicas said of generative AI.

Some instructors are taking a more hard-line approach. Abril Rodriguez-Diaz ’26 said her class, Government 1780: “International Political Economy,” has banned the use of generative AI and ChatGPT except in the preliminary stages of the writing process.

“My professor, Jeffry Frieden, who was the head of the Gov Department last year, said that we expect that all work you submit for this course will be your own,” she said. “Violation of this policy will be considered academic misconduct.”

Rodriguez-Diaz said she believes strict policies are “definitely warranted.”

“Professors should be more careful than they already are,” she said.

Similarly, Apostolicas said he finds restrictive policies “semi-warranted,” adding that he believes the tools can “stunt your growth if you become reliant on it to do your work.”

Assistant Dean of Science Education Logan S. McCarty ’96, who is teaching an Applied Math course and a Physics course this semester, said he encourages using generative AI in his classes as long as students cite any use and make substantial changes to the initial output.

“Part of what we’re doing, in all of these cases, is trying to model how people in the professional world, in the academic world, are going to be using these tools in a realistic and responsible and ethical way in the future and try to establish norms about that now,” McCarty said.

Dan V. Purizhansky ’24 — who said he supports more restrictive generative AI use guidelines — said he believes Harvard should have a “more clear University-wide policy.”

“What I don’t like really is that the University just kind of bestowed on the classes the responsibility of setting their own policies,” Purizhansky said.

In addition to syllabus changes, new courses have emerged around generative AI topics. Harvard introduced a spring 2024 course called General Education 1188: “Rise of the Machines? Understanding and Using Generative AI” taught by Dean of Science Christopher W. Stubbs and McCarty.

Stubbs said one of the primary objectives of the course is to “teach people to use these tools in a responsible and ethical way.”

“One of the interesting parts of this is that the technology and its capabilities are evolving so fast that it basically outstrips the ability of institutions like Harvard to react,” Stubbs said. “We’re struggling to keep up.”

Many students said the rapidly developing technology can facilitate learning in new ways.

Nathan J. Li ’25 wrote in an email that he has found that ChatGPT can help with “improving your writing skills and providing rough explanations for class concepts.”

Purizhansky said ChatGPT can help “explain a concept to you in simpler terms” than a textbook might.

“It can help you make up for some shortcomings in your knowledge,” he said.

McCarty defended ChatGPT as a valuable tool for understanding difficult concepts, pointing to the ability to “go back and forth with a dialogue” due to the chatbot-style interaction.

“Now, granted, it might make mistakes, but it’s often a much better place to start actually than Wikipedia,” McCarty said.

Despite advancements in AI technology, many said they will continue to rely primarily on traditional means of learning and teaching.

“Going to office hours or section is a better use of my time than just sending in 10 responses to ChatGPT,” Purizhansky said.

McCarty said he is skeptical that current generative AI technology will replace course assistants and teaching fellows.

“I don’t think they could be used as an expert tutor because they’re not reliable enough to actually give you correct answers,” he said.

But all agreed that the Harvard classroom is transforming due to the new technology.

“Welcome to living through a revolution,” Stubbs said.

“It could well be more disruptive than the Internet,” McCarty added.

—Staff writer Camilla J. Martinez can be reached at camilla.martinez@thecrimson.com. Follow her on X @camillajinm.

—Staff writer Tiffani A. Mezitis can be reached at tiffani.mezitis@thecrimson.com.


Tags: College, FAS, Science, Artificial Intelligence