SU professors rethink policies as generative AI tools spread in classrooms
AI tools have become mainstream for college students, threatening academic integrity and independent learning. Professors must now decide whether or not to permit AI use in their classrooms amid its growing influence. Maria Masek | Contributing Illustrator
Professor Adam Peruta doesn’t just encourage the use of artificial intelligence in the classroom — he requires it.
The Newhouse School of Public Communications professor, who directs the Advanced Media Management graduate program, said he sees AI as “transformative.” In his classroom, Peruta said AI can be effective for brainstorming ideas and testing headlines for search engine optimization. His goal is to effectively teach his students how to use AI “responsibly and ethically,” he said.
In Syracuse University’s required syllabus language — first introduced for the fall 2024 semester — professors set explicit expectations on AI by choosing among three categories: banning AI completely, allowing only limited uses with permission or adopting an open policy that permits AI with disclosure.
With the rise of generative AI in higher education threatening academic integrity and independent learning, professors must now decide whether to permit AI use in their classrooms amid its growing presence in the workplace and its influence on the skills graduates need.
Recent surveys suggest AI tool use among college students has become mainstream. A global survey conducted by the Digital Education Council in August 2024 found that about 86% of students report using AI for their classes, with more than half using AI tools at least weekly.
There’s also a gap in student confidence — many feel unprepared or unequipped to use AI tools responsibly or ethically, according to the survey.
Douglas Yung, a teaching professor in biomedical and chemical engineering, leads a teaching series through the Center for Teaching and Learning Excellence about responsible AI use and hosts weekly drop-ins to help faculty incorporate AI into their instruction.
In his classes, Yung categorizes assignments as AI-free, AI-assisted or AI-integrated, giving specific instructions for each assignment. He said he requires students to disclose when and how they use it.
On Monday, SU announced a partnership with Anthropic, an AI company founded in 2021 by former OpenAI employees, to give students and faculty members access to Claude for Education, an AI assistant designed for academic environments.
Claude prompts conversations that guide learning rather than directly answering questions, using Socratic questioning and emphasizing core concepts, according to its website. SU is one of the first universities in the United States to grant campus-wide access to Claude.
Nina Brown, an associate professor who teaches communications law at Newhouse, views AI use as unavoidable and favors more flexibility in setting course AI policies. Comparing the technology to the internet, she said professors should let students experiment as long as they are transparent about it.
Peruta argues that higher education needs to catch up to AI. He frames responsible AI use not as an optional skill, but as a requirement for all students heading into media careers.
“I want students to be able to experiment and just be honest about it and be forthcoming,” Brown said. “(To) include a disclosure when they’re doing it, and it would be an open and evolving conversation.”
Other professors take a more cautious stance.
Dana Spiotta, a novelist and SU English professor, said AI has no place in her fiction workshops. For her, creative writing is about the struggle to find the right words, a process she says AI undermines.
“Writing is thinking,” Spiotta said. “Using AI to help you draft a piece takes away the most important, valuable part of the process.”
In her classroom, she also forbids using AI tools to summarize texts. Close reading, she said, is core to her classes, and using AI robs students of the chance to discover meaning in texts and learn on their own.
Spiotta doubts AI can generate language that captures the originality creative writing demands. Because it draws from existing text, the output falls back on clichés and predictable phrasing, she said.
According to a student survey from Inside Higher Ed, many students believe AI helps with idea generation, studying for exams, clarifying difficult concepts and polishing writing. The survey shows that while fewer students said they have AI write full assignment texts, many still use it to assist in editing and brainstorming.

OpenAI launched “study mode” last month, built to act as a tutor and study guide rather than an answer machine. Chegg, a homework help and textbook resource website, confirmed its 22% workforce layoff in May was driven by the rise of generative AI as a study tool.
Spiotta said the responsibility falls on students not to “cheat themselves out of the opportunity to learn.”
For Brown, the key is to create a classroom where students can admit to using AI without fear while also learning how to ethically apply it. She said this approach better prepares students for workplaces where AI use is becoming routine.
Like Spiotta, Mark Brockway, an assistant teaching professor in political science, said he doesn’t engage deeply with AI in his classes because his teaching style naturally limits its impact.
Outside the classroom, Brockway incorporates workshops, activities and experiences that AI cannot replicate, noting that it “is better for (students) in the long run.”
Claire Bai, an assistant professor in advertising, allows her students to use AI in ADV 509: “Advertising Research and Planning,” a course focused on research techniques. She said the choice reflects how relevant the technology already is to her field.
“AI in general, even for my personal use, is still a good tool,” Bai said. “It can give you ideas, directions, but it’s not 100% accurate.”
Bai supports a more accepting AI policy at SU, arguing that restrictions won’t stop students from using the tools anyway. Instead, she said professors should focus on teaching students how to work with AI responsibly.
Brown said SU’s three-tier framework may be too rigid, leaving little room for faculty to tailor policies to their disciplines.
“I don’t like those three buckets for myself,” Brown said. “I feel like I understand the technology well enough and I understand our academic integrity policy well enough that I could probably craft something that is better for my particular courses.”
Yung said he values SU’s three-option framework, saying it puts “clarity and learning outcomes” first. He said it lets faculty set policy for each assignment and requires students to disclose their use of AI.
Both faculty and students would benefit from hands-on experience with industry-specific AI tools, Bai said, testing their strengths and weaknesses, then sharing those insights in the classroom.
In Yung’s capstone design course, he’s piloting an assignment where student teams treat AI like a member of the group. Each team assigns an AI chatbot a role, defines its decision-making powers and creates guardrails for what it can and can’t do. Students also keep a log of prompts, outputs and edits, he said.
“AI use is outcome-driven,” Yung said. “If it deepens learning, I invite it with disclosure. If it short-circuits core skills, I restrict it.”
Yung said he wants to encourage students to think critically about when AI adds value and when it doesn’t. By documenting both benefits and risks, he said students learn how to evaluate AI the same way they would assess any other tool.
The 2025 AI Index Report from Stanford University highlights that AI adoption in education is increasing, particularly through tools built specifically for learning. With this rise, both Brown and Peruta recognize the need for thoughtful integration.
“It’s really important that professors figure out a way to embrace at least some AI in their class,” Brown said. “This is not something that is a fad, it’s like the internet.”


