How Higher Education Is Dealing with a New Student: AI

Story by Lilly Behbehani

Photo by Allyson Rabon

Sarah Foster, a senior at UNC-Wilmington, uses artificial intelligence to create study guides and explain specific topics she might not fully understand.

But, the history and international relations double major doesn’t use it for her academic papers.

“It’s a great tool only for smaller scale things,” she said. “If you lack media literacy, then AI becomes dangerous. Right now, I know if I’m asking AI something, I might have faith in the answer and I might use it.

“But I’m also aware that it’s a computer generated response and, for example, if I asked a question about vaccines, that is not going to be the argument I bring to the table,” she explained. 

Steven King, founder and chief innovator at Blue Sky Innovations, imitates the actions of the xArm5, a robotic arm that uses artificial intelligence, on Wednesday, Oct. 11, 2023 in Chapel Hill, N.C.

Caution about the role AI could take in classrooms surfaced after the rise of accessible AI chatbots like ChatGPT, which “put the power in everyone’s hands, making it a gamechanger for the role of AI by making it easy to use in daily life,” said Steven King, associate professor of innovation and emerging technologies at UNC-Chapel Hill. 

Schools have adopted new and emerging technology for years, from handheld calculators to computers, integrating the tools into the classroom to enhance student learning.

Now with AI at the door, faculty at higher education institutions across North Carolina have come to a similar conclusion about how to confront it: understand how to use it.

Through workshops, open dialogue and question-and-answer events, universities are giving professors the tools to comfortably use the new technology while allowing them the academic freedom to choose whether to embrace it.

“With any technology, there’s going to be a learning curve and part of that burden falls on us to help engage faculty on this,” King said. “I expect my students to use AI as a tool using specific guidelines where I say AI helps you learn, but it won’t do all the work for you.”

This past summer, Mustafa Akben, a professor of business at Elon University, conducted a study with two other faculty members to understand AI’s impact on the higher education community. They found that 85% of universities give professors the autonomy to choose whether they want AI tools in the classroom. 

They saw a wide range of reactions, where some believe the tools can increase educational outcomes, and others believe they can reduce creativity. 

Akben has researched AI and its effect on human cognition for the past seven years. When many were uncertain about using AI in classrooms, he saw it as an opportunity to expose students to the new technology in an ethically and socially responsible way. 

“There’s no way we can escape it, you have to think how you can use it for your benefit,” he said. 

He has implemented three chatbots for his students to use: one that draws information from the syllabus to answer questions about the course, another that explains concepts from the course textbook, and a third that engages students in Socratic dialogue about AI through a generative AI version of Socrates.

A few months into the semester, he said he can already see a clear benefit of the use of the technology in his classrooms. 

“Students have increased engagement with the materials,” he said. “Overall, their conceptual understanding is also better than the last cohort that I didn’t utilize generative AI tools in the classroom.”

At Johnson C. Smith University in Charlotte, John Bannister, director of the Center of Innovative Teaching and Learning, said the rise of accessible AI technology reminded him of when computers first became popular. 

“I remember how educators shied away as much as they could until they had no choice,” he said. “I felt like we had to approach AI in a way where the people who were afraid of it wouldn’t be as afraid of it.”

He believes the support network the university has created for professors will help them integrate the technology, and that even those who choose not to use AI in the classroom will come away less fearful of how it works. So far, he hasn’t heard complaints from faculty members about abuses of AI in the classroom or misunderstandings about how to use the technology.

Students experiencing AI in classrooms for the first time see it as both a positive and negative aspect of their education. Lilah Pjetra, a senior at N.C. State University studying business administration, said she knows of students who copy-and-paste homework questions directly into chatbots, without detection by the professor. 

Pjetra hasn’t used ChatGPT or any other chatbots to help with schoolwork but said many of her teachers encourage students to become familiar with the technology. 

She said she thinks the education field is going to change as it gets easier for students to use AI to do all their work instead of using it to help with their thinking. 

“Teachers are going to have all these students that are easily getting by without doing work and without learning,” she said. “It’s a lot harder for teachers to tell whether or not they are thinking of it on their own since there’s no plagiarism test yet.”

Akben, who has integrated AI into many aspects of his courses, said he has faith that students will use such tools for good.

If he notices a student might use AI to do their work, “We have an honest discussion about it. I say, ‘hey, if AI can do your work today, and probably tomorrow too, you need to develop a skill set from today to protect your job in the future.’”

The UNC Blue Sky innovation team meets to discuss different projects involving artificial intelligence at the Reese Lab on Franklin Street in Chapel Hill, N.C. on Wednesday, Oct. 11, 2023.

The digital disruption brings new challenges and a need for new ways of testing students’ knowledge. Instead of relying on written assignments, students might be asked to convey their knowledge through oral assessments to guard against the plagiarism, misinformation and intellectual property issues that chatbots can introduce, both Bannister and King said.

To combat its misuses, the black box of AI technology must be opened, said Shiyan Jiang, an assistant professor of learning design and technology at N.C. State University who researches AI in education.

“Once we know how it works, we’ll get a sense of what kind of bias these models might generate and whether we should trust the result, and why we got the result the way that we did,” Jiang said.

Many teachers not only trust their students to use the technology beneficially, but also believe in their own ability to face the digital disruption head-on and continue to foster a positive learning environment by working with AI, not against it.

“Education is not just about delivering content knowledge,” Jiang said. “It is also about caring, about fostering critical thinking, about the kind of human interaction that we make in a learning environment that AI can’t replace.” 

Lilly Behbehani

Lilly Behbehani is a senior from Chevy Chase, MD, studying journalism with a minor in conflict management. As an aspiring journalist, her area of expertise is in writing with interests in editing and research. After graduation, she hopes to be working in a writing role that exposes her to various fields to gain exposure to a number of different viewpoints.
