For the fall semester, Elon launched a new student guide for artificial intelligence use on campus, in partnership with the American Association of Colleges and Universities. Mustafa Akben, Elon’s director of artificial intelligence integration, said Elon already has a policy on AI that will remain the same: AI use in classrooms will vary from professor to professor.

“We are not replacing any human work,” Akben said. “Every class, every faculty and instructor has their autonomy to decide whether they are going to use, ban or partially enable students to use these tools.”

This is also stated in Elon’s honor code, which prohibits the use of any tools not allowed by faculty members.

The student guide includes sections showing students how to best use AI productively and ethically, concerns regarding AI and how AI can be used in one’s career.

In a study released by Resume Templates, managers said they are more likely to prefer a job candidate who has less work experience but more experience with AI. This rise of AI in the workforce is part of why Elon is working to stay ahead of the curve with new technology, Akben said.

This year, Elon has launched other AI tools for students, including an academic advising tool, and will be launching ElonGPT, a software comparable to ChatGPT that does not use people’s data to train AI, first for staff and faculty and then for students, Akben said. It will be free and will help students utilize AI ethically and responsibly, Akben said.

The advising chatbot was piloted this summer and can be used to ask questions related to academic advising and registration. It can also answer other Elon-related questions, such as the best time to bring a parent to campus and where they can stay.

One reason Elon is looking to use its own AI program, Akben said, is a report released by Tyton Partners. The report found that half of students regularly use generative AI, and that 75% of students who already use AI will continue to do so even if professors or schools ban it. Professors will need to be aware of this and teach accordingly, Akben said.

A more productive use of AI is for brainstorming and outline work, Akben said. This is why ElonGPT, when launched for students, will include safeguards so the chatbot will not fully do assignments for students. Akben also said allowing AI to do students’ work for them defeats the purpose of learning and taking a course.

“You want to use your critical thinking,” Akben said. “Make sure that you're learning something through whatever course you're taking, and use AI for these purposes, not just, ‘I'm going to offload my tasks.’”

The current version of ElonGPT, made this semester for faculty and staff, is not oriented toward learning and teaching and won’t include the same safeguards, Akben said. Uses of ElonGPT for staff and faculty include creating end-of-year reports, facilitating conversations and summarizing documents, Akben said.

If there is a problem or unexpected response from ElonGPT, Akben said faculty and staff are encouraged to report it, and he and his team will work to fix any issues that arise. Elon won’t have any way to monitor the use of ElonGPT, as it involves people’s personal data, Akben said, so Elon will rely on survey responses from participants to see if it is effective. As director of AI, Akben also said he is interested in hearing feedback from students on what is helpful to them at Elon and beyond.

“I just really want to learn,” Akben said. “How can I help them to successfully complete their program and be ready for their career?”