
Over the last three years, students have increasingly turned to generative artificial intelligence, raising questions about academic integrity and how much students are actually learning. Teachers everywhere now have to consider changing their classroom policies.
At the School of the Art Institute of Chicago, some professors have tackled this with small measures like switching assignments to paper, while others ban all technology and take-home assignments. A few have leaned into using AI to teach students how to use it as a tool.
Before 2014, AI was used mainly for analysis. Since then, it has increasingly been used to generate content. But as usage has grown, so have criticisms of large AI companies. Data centers produce high carbon emissions, consume enormous amounts of electricity, and take up huge amounts of space. Large AI models like Anthropic’s Claude are trained on data scraped from the internet without permission from artists, authors, and other creators. There is major concern that AI will cause further job displacement in software, business, art, engineering, coding, and plenty of other industries. AI can enable surveillance of users, including military surveillance, and AI systems have repeatedly been found to be biased. Sam Altman, the CEO of OpenAI (which makes ChatGPT), donated millions to Trump’s super PAC, MAGA Inc.
According to a Forbes study of 1,100 U.S. students at two-year colleges, four-year universities, and graduate programs, 90% of college students used AI in 2025. Almost 75% said their usage increased over the course of the year.
Teachers at SAIC are allowed to set their own AI policies for their classes. Each semester, the dean’s office sends professors syllabus guidelines with suggested language. In the 2022-23 school year, the office began recommending syllabus language on AI usage.
The most recent recommended language from the 2026 semester suggests that teachers strictly prohibit AI-generated work without “advance, written permission from the instructor.”
Many professors at SAIC have made a range of small to drastic changes in their teaching methods based on the popularity of AI usage.
Giovanni Aloi, a professor, adjunct in the art history department, initially banned technology in his classes because of the distraction it caused. More recently, in response to the growing use of AI, he has increased the oral components of his courses: each week, his students present, speak, and discuss.
“In a culture increasingly saturated with AI-generated language, a person’s voice will carry a different kind of trust and authority. Our students need to practice that skill,” Aloi said.
Aloi believes that institutions should be more transparent in setting clear directions around technology because AI is intensifying existing problems.
“AI is making questions of authorship, attention, trust, and assessment more difficult and more urgent. More than ever, students need practice not only in producing ideas, but in standing behind them,” Aloi said.
Mikolaj Czerwinski, a lecturer who teaches art history survey courses, changed his submission style to a Google document link so that he can determine whether AI was used in online assignments, and has transitioned from take-home exams to in-class writing on paper. Czerwinski said students used to rely on random internet sources for take-homes. Now, they are incentivized to take and review their notes and readings, which he said has resulted in better essays.
Transitioning to paper has addressed both AI usage itself and the time-consuming difficulty of determining whether AI was used. Czerwinski sees himself moving entirely to in-class assignments on paper in the future.
“I believe students should learn to write – writing is ultimately reasoning and learning to understand a topic – before they employ tools such as AI,” Czerwinski said.
Pamela Barrie, an associate professor, adjunct in the liberal arts department, can tell when she is receiving papers composed by chatbots, but said she has no way of proving it, as she could with ordinary plagiarism. The papers are smoothly written but problematic in some way, with excessive summary or made-up citations. When AI work is turned in, Barrie asks for revisions rather than imposing harsher penalties.
The SAIC policy for academic misconduct in the case of AI usage says that a student can receive a warning, be required to resubmit the assignment, get a failing grade for the assignment, or fail the course entirely.
“I’m a teacher, not a cop, and obsessing about the cheaters takes time away from the attention I’d much rather give to the students who’ve read the books and have something to say,” Barrie said.
Barrie added that as a semi-retired professor, her colleagues’ enthusiasm for AI is making her consider quitting entirely. “I imagine others late in their careers may be having the same thoughts,” Barrie said.
Adjunct professor Eric Fleischauer teaches in the Department of Film, Video, New Media, and Animation. In his sophomore seminar class, “The Digital Dark Age,” he decided to lean into AI for a studio assignment, requiring students to “collaborate” with AI to make an “original” artwork. Rather than have AI independently generate the work, he asked his students to approach AI as “another tool they can use to bolster their studio practice.”
“As a professor of media art, I think it’s important for students to engage with new media and technology in their studio practices as part of their artistic development,” Fleischauer said.
Fleischauer found that students overwhelmingly had a strong aversion to AI in relation to art. He said their views seemed reactionary, based on a narrow understanding of AI, and didn’t consider AI’s relationship to media art histories and genealogies. He wants students to “approach AI through a critical lens that considers potential relationships to conceptual art, technology, and creativity.”
He said many of his students acknowledged that they gained a better understanding of how AI can play a role as an artistic tool and gained knowledge of programs available to them beyond well-known sources like ChatGPT and Midjourney.
“I will confess the majority did not change their minds about using AI in their studio practice,” said Fleischauer.
Lecturer Anya Davidson, who teaches comics courses in the department of Painting and Drawing, recently started teaching an Academic Spine professional practice course for juniors. When Davidson grew suspicious that some responses might be AI-generated, she knew she didn’t want to confront individual students.
“It seemed much quicker and saner to change the assignment format,” Davidson said. Switching the Canvas-based discussion questions about the textbook readings to handwritten notes helped her focus on productive discussions without worrying that some students weren’t learning.
Davidson said that most of her students say they are anti-AI, “because they understand the negative environmental and cultural impacts,” and that in general, “artists know better than any other segment of the population how detrimental it is to the planet and our brains.”
Students have also noticed that AI has kept professors on their toes.
“I can tell professors are trying to keep the humanity of the classroom intact,” Mo Beamon (BFA 2026) said.
Beamon has noticed the crackdown on computer use in class and in syllabi, mainly in art history classes. Although Beamon doesn’t believe professors should promote the use of AI, they want professors to encourage honesty surrounding it. They find that some professors may be too strict (though they understand why), having witnessed AI detectors cause false accusations against students.
“I really wish people would trust their own humanity more, so professors wouldn’t even have to put policies in place,” Beamon said.
Lev Burmeister (BFA 2028) has found that while most students wholeheartedly dismiss AI, some professors seem more accepting by implementing its use within their curriculum. In his sculpture class, he noticed AI images in a lecture, to the dismay of the students, though it prompted a classwide conversation.
“It only really served to create distance between the educator and class,” Burmeister said.
Allison Dobbins (BFA 2028) is most concerned about the lack of awareness around AI’s detrimental effects on the environment and societal effects on marginalized communities and lower-income neighborhoods.
“My hometown, Memphis, is being poisoned by Elon Musk’s AI data centers that he has specifically placed in areas that are more populated by people of color. The use of AI and AI products directly supports the spread of fascism, racism, and transphobia in our country,” Dobbins said.
The Department of Art and Technology/Sound Practices incorporates AI into studio projects in a few of its course curricula.
Doug Rosman teaches an AI-focused class called “Artificial Intelligence” in the AT/SP department as well as “Language Games: Dialoguing with AI” with Professor Gionata Gatto. “Artificial Intelligence” covers AI technology and tools with the intent of giving students a critical understanding of how AI operates culturally. Rosman said he doesn’t teach AI tools because using them is necessary to be a better artist or viable in the job market, but because “our moment requires that we reckon with them.”
“I feel it’s my responsibility to provide a safe space to engage curiously and critically with new technologies, not despite their many harms, but because of them,” said Rosman.







