Faculty members consider the role of AI in the classroom

If Mozart and Stravinsky lived during the same time period and got together for lunch to discuss compositional style, what would their conversation be like?

That was the question one of Colleen Conway’s students asked ChatGPT for an assignment in her Music in Higher Education course that called for the aspiring educators to feed generative artificial intelligence an essay prompt that the students might give in their own future music appreciation classes.

Conway, professor of music education in the School of Music, Theatre & Dance, wanted the students to analyze: How would the GenAI response compare to a real essay? Would they be able to distinguish between the two?

Conway said the generated essays were “alarming.” With believable facts and data, the AI essay created a realistic scenario in a Paris café, detailing what Mozart and Stravinsky ordered and their discussion of compositional style, and conveying the composers’ actual viewpoints with about 80% accuracy.

“(AI) is here before we really know how to figure out how to use it. So it’s exciting — I feel like it’s exciting — as long as we can make sure that we’re having mindful conversations about how to take on the technology,” said Conway, who also is a past chair of U-M’s Faculty Senate.

Kas Kasravi (center), lecturer III in industrial and manufacturing systems at UM-Dearborn, helps students in his prototype-design lab use AI to write code to create microcontrollers. (Photo by Scott C. Soderberg, Michigan Photography)

Understanding a “shifting landscape”

The University of Michigan has been at the forefront of tackling how to integrate generative AI into higher education. In May 2023, Laurie McCauley, provost and executive vice president for academic affairs, and Ravi Pendse, vice president for information technology and chief information officer, sponsored the Generative Artificial Intelligence Advisory Committee.

The committee worked throughout the summer to recommend ways U-M can use and accommodate AI technology, including best-practice standards for privacy protections, expanding upon existing information technology infrastructure, and establishing an initiative to develop methodologies for AI-augmented education and research.

“This is meant to be a catalyst for crystalizing our thinking about how U-M should navigate the shifting landscape of GenAI,” McCauley and Pendse wrote in an introduction to the committee’s report. “Some of you will appreciate the proposed directions laid out by this report. Others might only see this report as a discussion starter. More than anything, we are looking to ignite much-needed conversations.”

Conway and three other faculty members met with McCauley in November to discuss concerns about the report and AI learning. Conway said many faculty members returned to campus in the fall unprepared for the recommendations and did not have the proper guidance to implement the new tools in their teaching.

“We were very supportive of the idea that we really did need to put structures in place, and we were feeling a little like we haven’t quite seen that yet,” Conway said. “We were pushing for a more universitywide, macro-level committee or task force that would really take a look at some of the ethical issues associated with AI, and with some of the concerns about who’s teaching freshmen about critical thinking.”

McCauley addressed the topic during her remarks at a December 2023 Provost’s Seminar on Teaching that considered ways GenAI can be used to augment and enhance teaching.

“While (AI) will likely change our methodology, it will not change our purpose. AI does not change our values. And in instances where it has the potential to collide with our purpose and values, we have to be collaborative and honest about how to approach that situation,” McCauley said.

With the fast-paced, ever-changing nature of AI, Conway said, faculty members need to tackle the change head-on and learn how to use AI to their advantage. Uncertainty among faculty, she said, is causing confusion and frustration for students, who face some course syllabi that threaten failure if they are caught using AI and others that encourage its use to a certain extent.

“I think maybe the biggest threat is that we can’t move fast enough to figure out how to make sure that we are ethical about these things, and we don’t set young people up for a world where they’re not prepared to make hard decisions because they’re depending on the computer to do it for them,” she said.

Viewing AI as a tool

Bob Jones, executive director of support services and emerging technology for Information and Technology Services, has helped advance university AI initiatives — including the tools U-M GPT and U-M Maizey — throughout the past year.

“By taking a position early, you empower the community to respond and the whole tool set was really a response to our leadership saying this is a thing that we care about, and that we want the University of Michigan to embrace it and challenge it,” Jones said.

U-M GPT allows all university employees and students the opportunity to use chat-based GenAI to enhance teaching, learning, research and collaboration. U-M Maizey can help faculty prepare lesson plans, course outlines and other material.

Jones said that while he hopes more people around campus become aware of the platform, it is necessary to acknowledge that AI has its limitations and should be treated as a tool.

“While we built a generative AI platform because we think that it’s here to stay — it has the potential to be transformative, enable people to innovate, enable people to focus on higher value tasks, potentially can be used to solve some of the world’s problems — we should all be skeptics,” Jones said.

While the potential for students using AI to plagiarize has been a primary concern among faculty, Jones said he thinks most students will refrain from cheating. 

“Generally speaking, we believe our students are always going to do the right thing. These are the next generation of policymakers and world leaders,” Jones said. “That’s who comes out of the University of Michigan, and we trust that this is a tool to aid, but this isn’t a tool to complete their work. And I trust the students and everybody else in the community is using them responsibly.”

Andrew DeOrio teaches his fall 2023 web development class. He used U-M Maizey to develop an AI assistant that provides extra help for students. (Photo by Robert J. Scott)

Helping students understand its use

Rebekah Modrak, professor of art and design in the Penny W. Stamps School of Art & Design, said she thinks professors are prepared for students who may use AI to plagiarize art and photography assignments.

“Just as it’s possible for a professor to read a text created by ChatGPT and recognize how it was generated, I think many of us trained in visual literacy can still recognize the distortions of an image created with AI,” she said.

“And, maybe more importantly, if we’re having regular discussions about these tools with students, so that the technology isn’t a dirty secret but just another tool in their kit that they understand how to use critically and thoughtfully, students should be able to produce work using AI, as needed and to talk about this directly.”

As AI has increased in scale, so has its ability to create realistic renderings of art, videos and photographs. Modrak said artists can potentially use this technology as a tool to enhance their own artwork.

“All mediums and tools come with some expectation for how they’ll be used. It’s the job of the artist to extend and challenge this,” Modrak said. “Unique to AI may be the need for visual artists to become better wordsmiths so that they can train AI to generate images with particular meanings. What’s the precise word for the emotion or environment you’re envisioning?”

While AI can prove beneficial for artwork, Modrak said there are several ways generated images can prove dangerous.

“Students need to understand the limitations of AI in interpreting prompts — such as intrinsic race- and gender-based biases or reproducing an existing image rather than compiling a new one — and we need to have conversations about the factors of fair use and copyright and to consider the imperative of creating new meaning,” Modrak said.

Acknowledging and addressing the concerns

Virginia Sheffield, assistant professor of internal medicine in the Medical School, said AI is revolutionizing the medical field. She said programs are being developed to generate AI responses to patient-portal messages, set off sepsis alerts and act as a scribe during interactions with patients.

While AI is proving helpful in reducing administrative burden for clinicians, Sheffield said, generative intelligence can also pose potential dangers for medicine.

“One of the things that I worry about is the overreliance, overtrusting, blindly trusting the AI systems. … (AI) gets information from a lot of sources that we know are biased or based on bias data. So there’s the chance of misinformation, misinterpretation of information, and propagation of biases,” Sheffield said.

Medical students in recent years have started using AI to help study for exams, prepare for clinical rounds, and develop patient cases. Sheffield said the presence of AI in medical education could prove helpful but may also hinder the development of critical-thinking skills.

“To develop adaptive expertise, you have to kind of really struggle and fail and learn through difficult times. And so, if every answer, differential diagnosis, is just gotten without that step of critical thought and being wrong, I worry how that could impact the ability to critically think in the moment,” Sheffield said.

“I think (AI) is a powerful tool that most people don’t understand well enough to leverage effectively. When not leveraged effectively, it could skip the step of learning that critical thought process. When leveraged effectively, I think it has the potential to help expand the critical thought process. So I think it’s a double-edged sword.”

As artificial intelligence grows more powerful and prevalent in society, Jones said, university programs will continue to develop and provide further resources to help faculty and staff understand and implement AI into their work.

“What’s comforting to me, and also what’s terrifying for many people, is the fact that it’s so new we’re all trying to figure it out. Everybody is in the same state,” Jones said.

“It’s OK to be nervous, it’s all right to be skeptical. We want that skepticism because if any place is going to sort through it and find out advantages and disadvantages and even create new things out of generative AI, it’s going to be the University of Michigan.”
