
AI-generated photo, courtesy of ChatGPT

As artificial intelligence use spreads across campuses everywhere, SAU students are redefining what it means to study, research, and even think.

“I never use it for anything, and I consider that a brag,” says Brenden Argenta, a senior public relations major.

Argenta may be an exception among his peers. For many other college students, AI services like ChatGPT and Google Gemini have become go-to resources for generating project ideas, proofreading, and better understanding course material.

“Most college students use artificial intelligence tools like ChatGPT to get quick answers, not to learn concepts,” according to a recent report from the University of Southern California.

The report found that while AI can support learning, students often use it to shortcut assignments unless professors actively guide them toward deeper engagement. 

How SAU Students Are Using AI

As AI becomes more common, its role in the classroom is growing and raising questions about where students draw the line between academic support and academic dishonesty.

Jessica Vargas, a senior double majoring in digital media production and public relations, considers AI to be a tool, but a limited one at that.

“I mainly use it for outlines or to rephrase a sentence that I just feel could sound better,” Vargas says. “I refrain from using AI for in-class discussions. I believe discussions should challenge your thoughts in order to better form your knowledge.”

Kalea Dostal, a junior elementary education major, says she rarely uses AI, but when she does, it’s often to help better understand difficult material.

“Sometimes I use it for homework with math concepts I don’t understand, or to brainstorm topics, or reword a thesis,” she adds.

Avery Kent, a freshman social work major, describes a similar approach.

“The only time I will use it is to help give me ideas on presentation topics or to help make sure I am correctly citing sources,” she says.

Where Students Draw the Line Between a Tool and Cheating

Many students agree it is acceptable to use AI to improve work they have already completed themselves, but that allowing it to do assignments for them crosses a line.

“I believe it’s cheating once you just copy and paste a whole assignment into the program for all the answers,” Vargas says.

Dostal draws a similar boundary.

“Having it just give you answers to assignments or write any part of a paper for you is cheating,” she adds.

Kent agrees, adding that AI should be used to improve work rather than replace it.
“I would say it’s cheating if you are having it write things for you rather than using it as a tool to improve your own work,” she says.

Even Argenta, who avoids AI entirely, says the issue comes down to how much credit students take for work that isn’t entirely their own.

“It starts to be cheating when students are having it generate papers, or answers, or just doing their work for them in general, but running stuff through AI to check it doesn’t seem as serious to me.”

Vargas adds that in classes like graphic design and media production, using AI would defeat the purpose of taking the course.

“You wouldn’t gain or learn anything from using AI to complete any of that kind of work,” she says.

Faculty Perspective on AI in the Classroom

While students are still negotiating how to use AI responsibly, some professors are working to set clear expectations in their classrooms.

Justin Rice, an associate professor of biology who has taught at St. Ambrose University for four years, says he incorporates AI into coursework but does not rely on it himself. 

Rather than banning AI outright, Rice allows students to use it in specific, guided ways. In his Cell and Molecular Biology class, for example, students use AI tools to generate potential protein shapes as part of their learning process.

He compares AI to familiar online resources.

“Students are going to use AI much the way my generation used Wikipedia and the internet,” Rice says. “The important thing is to help them realize that it is a tool, not a replacement for learning.”

Rice requires students to be transparent when they use AI and to properly cite it in their work. While he permits its use, he emphasizes that understanding the material remains essential.

“They still need to understand the material, just as they would if they were using a textbook,” he says. “AI won’t replace their brain, and the risks involved in using it are significant.”

His perspective largely aligns with the boundaries many students described, though he acknowledges ongoing concerns, particularly about the environmental impact of AI, which influences his decision to avoid using it in his personal life.

A Common Distrust of AI

St. Ambrose students also express caution about relying too heavily on AI for accurate information.

Argenta says, “I don’t trust it much. Even the AI overview that Google gives you has been wrong a couple of times that I’ve seen, so it’s hard to put any trust in it.” 

Vargas also takes AI-generated material with a grain of salt.

“I don’t trust it with a whole assignment,” she says. “There is definitely a limit to what it can do.”

Dostal expresses similar concerns and says she always verifies AI-generated information against multiple other sources.

Normalization

Students have noticed AI usage has become a lot more common over the past few years.

“I think it’s normalized now compared to two years ago. There are definitely more people who use it and are open about it,” Vargas says.

Dostal says the acceptance varies, with some professors opposing AI use entirely and others allowing or even encouraging students to use it as a learning tool.

Kent says AI use feels more normal now because “people are seeing how it can help enhance work rather than just looking at the misuse of it.”

Argenta, however, thinks the normalization has gone too far.

“So many students seem to have it do so much of their work and their thinking for them, it’s such a weird new thing,” he says.

For some students, concerns about AI don’t stop at the classroom.

The Environmental Impact of AI

Beyond academic concerns, students are also thinking about the broader impact of AI.

Argenta says environmental concerns, particularly water usage tied to data centers, are a major reason he avoids using the technology. 

Dostal agrees and says the environmental impact is also “the main reason” she tries to limit her use.

Kent says she does not currently consider environmental effects when using AI, though she acknowledges she probably should.

Even when students do not use AI, they say its presence is influencing their academic experience.

Argenta says, “I do find myself trying to word things in a way that doesn’t make it seem like AI.”

Students questioning whether their use of AI crosses the line between tool and cheating can consult St. Ambrose University’s AI usage policy for guidance.
