Inside Mercer’s aviation classrooms, Assistant Professor of Aviation Flight Technology Deanna Lawson is charting a new course. Rather than resisting the rise of generative Artificial Intelligence (AI), computer systems that simulate human-like intelligence, she is finding ways to incorporate AI into her classroom.
While there are no cats, pandas, or giant creepy clowns in the MCCC library, generative AI would have you believe otherwise. PHOTO ILLUSTRATION | Generative AI, Julia Meriney, and Rikhil Sharma
Professor Lawson says, “Sometimes I feel like I’m the only professor that actually wants to embrace AI.”
When she discovered students using tools such as ChatGPT, an AI language model capable of understanding and generating human-like text, for Blackboard discussion posts, she decided the solution was not to ban AI from instruction outright, but to change instruction so that AI could be used as a tool.
“No more online discussion boards. Now we’re going to take 15-20 minutes out of our classroom time and do in class activities,” Professor Lawson stated.
Professor Lawson says this approach lets students use the technology to work faster. “For the discussion post you can ask Bard [a chatbot developed by Google] for an article on an aviation accident investigation and it’ll show you 100 different articles,” she said.
She also uses AI tools for tasks like rubric creation. “[ChatGPT] spat out a rubric in, I don’t know, 15 seconds, which probably would have taken me an hour,” she continued.
But while Professor Lawson is embracing student use of AI, many faculty members are responding very differently. In a VOICE survey of 16 MCCC professors, more than 87% said they prohibit the use of generative AI in their classes. Additionally, more than 62% said they have not used AI tools to create assignments or content.
Generative AI tools, commonly known as “chatbots,” are built on Large Language Models (LLMs), which give users the ability to hold a conversation with a program that has the knowledge of an encyclopedia and can instantly produce written, visual, and audio content.
Generative AI was popularized after the public release of ChatGPT by OpenAI, an artificial intelligence research organization, in November 2022. According to a UBS Investment Bank report, ChatGPT had an estimated 100 million monthly users by January 2023.
Since its release to the public, MCCC has seen ChatGPT used by students to complete assignments and assist in tests. Matthew Kochis, Professor of English and Journalism and chair of the MCCC Academic Integrity Committee (AIC), says that the College has no official position on student use of generative AI.
Professor Kochis stated, “[The AIC] has advised faculty to articulate what their stances are within their individual syllabi.”
Despite the lack of a college-wide policy, Professor Kochis says he still feels that the use of generative AI will not go unchecked because of the writing style of LLMs.
“AI is pretty consistent in terms of how it writes. It’s also very vague and general. It’s also very bad at quoting…I can see it when it’s not [the student’s] voice. [The AI generated writing] has no tone to it,” he said.
Like Professor Lawson, Professor Kochis recognizes that AI is now a part of the classroom. He said, “If you’re going to use it, do it like you would with any type of source: identify it, quote it, explain it, list, do all of those things.”
Calculators were once controversial, too, but they settled into the role of a tool that students had to understand in order to use effectively. Generative AI poses a new challenge: a complex word problem can be pasted into a chatbot, which returns a step-by-step answer with the work shown.
When asked about the potential roadblocks that generative AI presents to Mercer’s mathematics courses, Betty Peterson, Professor and Chair of the Mathematics Department, said, “It’s time for faculty to figure out how to reteach and how to recreate assignments.”
Because the technology is new, it is unclear what those new mathematics assignments might look like. Professor Peterson says, “It’s an ongoing thought process that’s happening.”
In the VOICE survey, the 16 faculty members were asked whether they had changed their course content because of generative AI. More than 66% indicated that they had, with changes ranging from the omission of homework to the introduction of AI in the classroom.
One way professors are changing their curriculum is to integrate the technology into their program. Michael Chovan-Dalton, Professor of Photography at Mercer, says he plans on using generative AI in his photography assignments.
“Sometimes I’ll ask them to generate things using Adobe’s AI,” Professor Chovan-Dalton stated, adding, “There will be rules about when to use it and when not to use it, so it’s not something that students have to avoid.”
In March of 2023, Adobe introduced its new AI software, Firefly. It is currently in beta testing and is being integrated into multiple programs, such as Photoshop and Illustrator. Its headline feature, Generative Fill, allows the user to create a visual asset from a text prompt.
The technology also raises ethical issues. Professor Chovan-Dalton says, “There’s a bigger issue with deep-fakes and videos. … I’m concerned about the ethical implications if someone is presenting [AI generated content] as factual evidence.”
As schools like Mercer work to stay ahead of the evolving technology, faculty will have to navigate these ethical issues. Professor Chovan-Dalton says, “I think, with what we’re seeing now in deep fake video and AI on social media, is that it is our job to educate people on how easily things can be faked.”
Professor Lawson also believes this can be part of a broader revamp of higher education: “30 years ago the way higher education was taught is the exact same way higher education is taught now, and I think it’s time to change that.”