(The Center Square) – Sixty-eight percent of 1,057 college and university faculty members sampled nationwide say their schools have not adequately prepared faculty to use generative artificial intelligence for teaching or mentoring.
Fifty-nine percent say their institutions are not well prepared to use GenAI effectively to prepare students for the future, according to Wednesday’s report from Elon University’s Imagining the Digital Future Center and the American Association of Colleges and Universities.
“Faculty views are not uniformly pessimistic,” says Elon University President Connie Book. “Significant numbers acknowledge AI’s potential to improve aspects of teaching and learning, including the customization of instruction, efficiency in course preparation, and the quality of assignments and research support.
“Moreover, 69% of faculty say they now incorporate AI-literacy topics – such as ethics, hallucinations, bias, privacy and transparency – into their courses, demonstrating growing efforts to prepare students for a world in which AI fluency will be essential.”
The report is authored by Eddie Watson, vice president for Digital Innovation at the American Association of Colleges and Universities, and Lee Rainie, director of Elon’s Imagining the Digital Future Center. Sampling was done Oct. 29-Nov. 26.
In the section on challenges to embracing GenAI tools in courses, one faculty member said, “AI tools will be helpful if they are used correctly, to supplement learning and instruction, rather than replace it. Students must be taught to use discretion about what they see in AI and learn how to utilize it effectively.”
Saying “a lot” or “some,” 92% have concerns about diminished student learning outcomes; 90% lack trust in the safety and security of GenAI; and 88% cite poor quality of GenAI tools’ output, including false, misleading or biased information. Seventy percent say the lack of training and support infrastructure to foster broad adoption of GenAI is a challenge.
The survey asked an open-ended question about what human skills schools should teach. Watson and Rainie wrote in the report that “the most dominant theme by far was that critical thinking becomes more important in an AI-saturated world.”
“Respondents,” they wrote, “repeatedly frame AI as increasing the need for skepticism, verification, reasoning, judgment and discernment. Many argue that without these skills, AI accelerates misinformation, intellectual passivity and epistemic collapse.”
Eighty-seven percent of faculty said they had created guidelines or policies for students’ use of generative artificial intelligence.
“This is not a story of simple resistance to change,” said Lynn Pasquerella, president of the American Association of Colleges and Universities. “It is, instead, a portrait of a profession grappling seriously with how to uphold educational values in a rapidly shifting technological landscape.”
She notes higher education has adapted throughout history with such inventions as the printing press, calculators, computers and the internet.
“Yet,” she says, “few innovations have entered our classrooms with the speed, scale, and impact of generative artificial intelligence. ChatGPT, Gemini, Claude and Copilot – once novel tools – have quickly become woven into everyday academic life. The speed of this transition invites not only attention but also candor as we consider how these technologies are shaping teaching, learning and understanding.”