
AI in Higher Ed: Study Shows Educators See AI as Catalyst, Not Threat

  • Faculty attitudes toward AI are shifting, with growing interest in tools that personalize learning, fill knowledge gaps, and improve student success.
  • Anthology’s ethical AI framework keeps data on campus, requires human oversight, and focuses on automating routine tasks to free faculty for deeper student engagement.
  • AI is expanding beyond STEM into arts and humanities, serving as a creative partner and teaching students how to engage with technology responsibly.

In an exclusive interview with Justin Louder, associate vice president of academic innovation at Anthology, groundbreaking insights emerge about the future of artificial intelligence in higher education. As one of the world’s largest edtech companies navigates the delicate balance between technological advancement and human-centered learning, its comprehensive study of 2,500 faculty members reveals a profession cautiously embracing AI’s transformative potential.

The survey results challenge common assumptions about educator resistance to new technology. Twenty-two percent of faculty expressed interest in AI tools designed to guide students and identify knowledge gaps, while 20% showed enthusiasm for AI-driven personalized learning pathways. These findings signal a clear shift toward viewing AI as a catalyst for student retention and success rather than a threat to traditional teaching methods.

“Faculty are excited about some of these AI-powered tools,” Louder explained. “They’re looking for tools that really will help them make sure their students are successful.” This enthusiasm stems from AI’s potential to support individualized instruction and address diverse learning styles – critical factors in improving graduation rates and student persistence.

While STEM faculty predictably showed greater enthusiasm for AI integration – recognizing their graduates will enter AI-transformed workplaces in engineering, manufacturing and technology – Louder advocates for expanding AI literacy across all disciplines. Drawing from his communication background, he demonstrates how AI can enhance learning in unexpected areas.

Anthology’s AI conversation tool within Blackboard, its popular learning management system, exemplifies this cross-disciplinary potential. The system can engage students in Socratic dialogue, helping them refine thesis statements through iterative questioning – simultaneously improving academic skills while demonstrating ethical AI interaction methods.

“You can actually set this AI tool within Blackboard to help a student think about something like their thesis statement,” Louder noted. “It not only helps a student develop their speech and thesis statement, but it also shows them ethical ways to interact with AI.”

In creative fields, AI serves as a collaborative partner rather than a replacement. While AI might assist composers in developing new pieces, the human elements of live performance remain irreplaceable. A concert pianist’s artistry cannot be replicated, but AI tools might enhance the compositional process.

This expanded vision moves beyond the assumption that AI literacy belongs exclusively in STEM curricula, extending into STEAM education to encompass arts and humanities. The goal: demonstrating how AI can serve as a supportive tool across creative and analytical disciplines while maintaining clear ethical boundaries and human-centered learning objectives.

Ethical framework addresses core concerns

Recognizing faculty apprehensions, Anthology has developed a comprehensive ethical AI framework built on three foundational principles that directly address institutional barriers identified in their research.

First, all AI interactions remain within institutional systems, never feeding back to large language models. “Everything that is done with AI through Anthology tools stays within that campus,” Louder emphasized, addressing privacy concerns that often paralyze institutional decision-making.

Second, AI tools are opt-in, allowing institutions to control the pace of adoption. Third, human oversight is mandatory – faculty must review all AI-generated content before implementation. “You can’t just have the AI develop a test question and immediately put it into the LMS – the faculty member has to review it,” he explained.

This framework directly addresses the training and institutional support barriers faculty identified as primary obstacles to AI adoption. “AI cannot replace that subject-matter expert, that faculty member,” Louder stressed. “You need that human element.”

Rather than revolutionizing pedagogy entirely, Anthology’s AI tools target what Louder terms the “transactional nature of teaching” – routine tasks like building rubrics, designing test questions, or creating discussion prompts. By streamlining these activities, AI liberates faculty time for more meaningful student interactions.

“Faculty are seeing opportunities to have more time with their students because of AI,” he observed. “We are seeing a shift in education to where there will always be a need for different types of assessments, but there’s also an opportunity, because of AI tools, to allow students to express what they’re learning in different ways.”

This transformation enables authentic learning experiences – developing podcasts for assignments, creating presentations, or producing historical newspapers – while AI provides creative ideation support. Internal data reveals that faculty accept only 40% to 45% of AI-generated suggestions, a figure Louder views positively as evidence that educators use AI for creative inspiration rather than passive acceptance.

Partnership model for institutional transformation

For institutions hesitant about AI adoption, Anthology positions itself as a partner rather than a vendor. Through its ‘Ideas Exchange’ platform, faculty can propose new AI tools, with user community feedback informing product development. This collaborative approach ensures AI development aligns with actual educational needs.

“The campus has to be ready,” Louder acknowledged, emphasizing the need for stakeholder conversations involving faculty, administrators and students. Much institutional concern stems from job displacement fears – worries he directly addressed by reinforcing AI’s supportive rather than replacement role.

What excites Louder most about AI’s educational future is its unpredictability. Tracing the evolution from Microsoft’s Clippy assistant to today’s sophisticated conversation tools, he sees unlimited potential for supporting teaching and learning environments.

“We went from Clippy to now, tools within Anthology that can help faculty think about new ways to assess, new ways to interact with students,” he reflected. “What excites me is that if we have gone through all of these tools, how can we support what’s next? How can we make sure that the teaching and learning environment on our campus is there to support our students and our faculty and set them up for success?”

As higher education navigates this technological frontier, Anthology’s approach suggests a path forward that embraces advancement while preserving education’s fundamentally human character. Its integrated ‘Video Studio’ tool exemplifies this philosophy – enabling AI-assisted assignment creation, AI-developed rubrics, and real-time faculty evaluation, all within a single environment that includes automated captioning for accessibility.

The research reveals a nuanced landscape where faculty are simultaneously excited and cautious about AI, seeking tools that support student success without compromising educational integrity. By emphasizing responsible implementation, ethical frameworks, and genuine institutional partnership, Anthology demonstrates that AI’s educational transformation lies not in replacing educators, but in empowering them to focus on what they do best: inspire, guide, and connect with students in meaningful ways.

The integration emphasizes teaching students not just how to use AI, but how to engage with it responsibly as they develop critical thinking and creative skills across diverse academic areas. As Louder concluded, the “human element of education is paramount,” a principle that will likely guide higher education’s AI integration for years to come.

Author

  • Aneeka Cheema is the chief knowledge officer of The AI Innovator. She is a Boston-based art educator with international experience in design, arts, and education. Aneeka is working towards a doctorate in educational leadership and policy studies from the University of Massachusetts Dartmouth.
