When it comes to the rise of artificial intelligence, journalism professor Amanda Sturgill said it’s not all that different from when the graphing calculator was introduced decades ago.
“It was a new technology, it was helpful in doing the things that you needed, but you needed someone with some sort of experience in the field to guide you in how to use that too,” she said. “I feel like professors in college need to be doing the same thing with AI now.”
On Jan. 16, the North Carolina Department of Public Instruction released guidelines on generative AI use. According to the NCDPI, implementing these guidelines and familiarizing students with AI is essential to preparing them to enter the workforce.
However, these guidelines aren’t completely new. They’re adapted from North Carolina’s Digital Learning Plan from August 2022 – months before the release of OpenAI’s ChatGPT that November and the widespread adoption of generative AI.
As a private institution, Elon University is not obligated to follow these guidelines. The university instead created its own set of principles as a framework for AI literacy, presented at the UN’s Internet Governance Forum this past October. Elon put forward six principles, spearheaded by university President Connie Book and scholar-in-residence Lee Rainie.
Elon’s principles focus on human-centered work, overcoming the digital divide, information literacy, responsibility, learning evolving technology, and using AI as a tool, not a replacement.
Rainie is also a member of the Imagining the Digital Future Center, formerly the Imagining the Internet Center, at Elon – an initiative that aims to provide insight into the impact of digital evolution.
“The big challenge for teachers now is that they’re the ones that have to test whether knowledge is transmitted or not,” Rainie said. “How are you going to set up systems to know what’s inside a student’s head as opposed to what they’ve expropriated from some obvious new ways to copy and paste things?”
Academic Integrity
In many cases, outputs from generative AI are lifted from elsewhere. According to a study from Copyleaks – a company specializing in AI detection software – almost 60% of all content from GPT-3.5, the model behind the free version of ChatGPT, contained plagiarized content at some level.
But Kenn Gaither, dean of Elon University’s School of Communications, said he hopes students and teachers will use AI responsibly.
“AI can do a lot of the shortcuts, but at the end of the day, the students, as well as teachers, have to show the command of information,” Gaither said. “We hope when students use it, it shouldn’t do their work; it should complement their work.”
As students learn how to harness generative AI, Sturgill said there are new ways for students to be academically dishonest, but dishonesty itself is not a new issue.
“There's services where you can pay people to write papers for you, you can go to check and download a paper, you know, any of those kinds of things. So I don’t think it’s really that different,” Sturgill said.
Anthony Hatcher, professor at Elon and chair of the Journalism Department, explained that some of his class assignments are designed so that AI can’t be used. Although students in his COM 3330: Religion and Media classes could ask ChatGPT to write reports on the basic structures and histories of the religions studied, which Hatcher said he could probably detect, his more in-depth research assignments are driven by personal analysis and talking to people face-to-face.
Describing one such project, in which students attend a religious service that does not align with their own practices or spirituality, Hatcher said students can’t use ChatGPT because the assignment draws on their personal experience.
Both Sturgill and Hatcher have strict policies against using generative AI to write assignments for their journalism courses, treating such AI use as plagiarism. However, both professors said they understand the benefits of AI and the importance of learning how to use it.
Gaither said it’s important to stay aware of AI moving forward.
“How can we be either on the curve or ahead of the curve? Nobody’s really ahead of the curve in Comm, and it’s just changing too quickly,” Gaither said.
He stressed the importance of teaching core skills that apply both with and without generative AI, such as media literacy. But Gaither also emphasized teaching students how to use AI effectively.
“We’re not saying you shouldn’t or should use AI, we’re saying if you’re going to use it, here’s what you need to know,” he said.
Hatcher said it’s important for journalism students to be aware of AI and how it works in preparation for internships because it's already in active use in the professional field, from generating obituaries to assisting with data-driven sports stories.
“At an internship, I wouldn’t be surprised if they looked at you and said, ‘Do you know how to use AI?’ You may be required to know how,” said Hatcher. “It’s good for you to be aware of it. That way, you’re not caught off guard.”
According to the NCDPI, understanding how to use AI is likely to become essential. The department’s press release cites the World Economic Forum, which reports that 75% of companies plan to implement generative AI by 2027.
“Do I think that AI has a place? Yes. I think it can speed things up. I think it can be a backstop. I think it can be a support system,” Hatcher said. “Should it replace a sports writer? Should it replace a caption writer? Should it replace a business writer? No. I don’t think it’s good enough, and I worry about misinformation.”
Hatcher pointed to AI’s history of fabricating information as evidence of its untrustworthiness. In one widely reported case, lawyers cited aviation injury precedents they had found through ChatGPT, and the judge discovered several of the cases didn’t exist.
“Some call it hallucinations. I call it lying,” Hatcher said.
Sturgill said she knows other professors who try to transform generative AI answers into what they are looking for.
“The amount of time it would take me to do that would be the same as the amount of time for me to do it in the first place,” Sturgill said.
AI After Elon
Hatcher said AI can rapidly output statistics, such as the crime rate of a city within a certain time period.
“What a human does is put that in context,” he said. “How much has the population grown? What kind of crime are we talking about?... It is left, or should be left still, to a human to flesh out a story from the numbers.”
According to Sturgill, data-driven stories are the best candidates to be written with AI or have AI-assisted sections.
“It turns out that things like financial markets stories, a story on a baseball game, … anything that's kind of based on statistics, usually the words around explaining that are kind of written according to a formula, or they can be,” Sturgill said. “And so, what I would hope would be that you can use tools like that to do some of the kind of boring repetitive stuff.”
AI Going Forward
“What I love about the era that we’re in right now is experimentation,” said Haya Ajjan, an associate dean at Elon University’s Love School of Business. “I think that different disciplines are going to find new ways to teach topics in the classroom and then help their students co-create with the technology.”
At the Love School of Business, generative AI use has moved beyond data entry and number-crunching. Ajjan said the future is co-creation — one where students learn and create alongside AI.
“I think we have to think about co-creation and say, ‘Then how am I addressing what learning is, and how am I assessing?’” Ajjan said.
Ajjan is working on a piece for Harvard University about AI and educational roleplay. As an example, she described a scenario where ChatGPT assumes the role of a customer. The student would then have to effectively communicate with the AI to close the sale.
Both Ajjan and Gaither believe that as technology advances, the beating heart of education is personalized learning. However, Gaither also said this goal can be difficult for larger schools to achieve.
“We have small class sizes – 33 students. We get to know our students,” Gaither said. “If you're in a large lecture hall with 200 other students, how can you as a professor get to know your students and assess their knowledge?”
Rainie said that while term papers and take-home exams will be less important in the future, assessments will take the form of oral exams or standardized, proctored tests. He also sees a future in which AI is built into knowledge assessments, with students showing what prompts they used to arrive at an answer.
“You know, the way that math teachers used to say, ‘Show me your work and how you got there,’” Rainie said.
Rainie also said this could signal a change in cognition and how knowledge is transferred.
“Physical labor used to be the thing tools replaced. Now, we’re in this brand new situation for all of humanity where these tools can do the cognitive work. That used to be the special province of humans,” Rainie said. “So our tools now are thinking a lot like us, or in ways that seemingly are a lot like us.”