Balancing optimism with caution in higher ed AI

Higher education leaders are right to be cautious about using artificial intelligence in classrooms or operations, Educause's Kathe Pelletier writes. But there are reasons for optimism, and helpful examples to guide ethical and practical applications.

Artificial intelligence is everywhere, informing who we hire, verifying our identity, influencing politics and guiding our viewing habits, but higher education has been slower to adopt it.

There is a wariness about AI, which may stem from misconceptions about its potential applications and from the technology’s limitations, including accuracy, comprehensiveness and the risk of unintended adverse outcomes. Institutional leaders and IT professionals are right to approach AI with caution, but it’s just as essential to apply a healthy dose of optimism.

There are many possible reasons for the lack of maturity and momentum in AI adoption. According to a recent poll, institutional leaders and IT professionals cite a lack of knowledge about what the technology is and how it’s used across campus, as well as unfamiliarity with AI that often goes unnoticed because it is built directly into applications and tools.

AI in learning tools and AI for analytics occupied two of the top six practices and technologies for the future of teaching and learning in Educause’s annual Horizon Report, based on an expert panel’s votes. However, the proportion of “exemplar” AI projects institutions submitted was much smaller than the submissions for the other four technologies and practices. That could suggest that while there’s a great need for AI in the next decade, there are fewer active implementations today compared to other leading technologies and practices in academia.

Even as adoption lags, integrating the right AI applications has the potential to benefit faculty and students, making it easier to arrive at data-driven decisions, saving faculty members’ time, enhancing student success and improving students’ learning journeys. Higher education AI success stories can inspire and guide leaders as they are considering this technology. At the same time, leaders need to consider the capacities and practices that will aid in the ethical and equitable implementation of AI solutions without triggering unintended consequences that could harm students, faculty, or the campus community.

To avoid becoming too fearful (or too enthusiastic), there are a few general guidelines institutions can use to navigate their AI conversations:

  • Match the tool to the purpose. AI is typically applied in three areas in higher education: institutional operations, student success, and instruction.
  • Invest in data capabilities. Before attempting to adopt AI-powered analytics tools, establish systems of data governance, consider interoperability and data integration needs, and assess overall data literacy on campus. Without these capabilities, campuses may lose out on benefits, or worse, create unintended issues related to ethics or privacy.
  • Couple the potential efficiency of AI with human decision-makers. Remember that today’s AI applications in higher education utilize machine learning, quickly analyzing and “learning” from enormous amounts of data. Human interpretation can guide that power. For example, AI tutoring tools might be used as a supplement when repetition and practice are needed, but not to replace coaching and guidance from a teacher.
  • Uncover the human effort needed to implement, scale, and maintain AI systems. The promise of greater efficiency can blind institutional leaders to the time and skill needed to implement AI. For example, deploying a chatbot requires staff with the time and skill for initial programming and ongoing training of the chatbot, as well as consistent discussion across departments about how to use the technology.
  • Be an informed, critical consumer. Inform yourself of the limitations of AI tools that carry ethical consequences, and push industry partners to address issues of equity and accuracy. Educate campus professionals about issues of equity, privacy, and fairness, especially those who are procuring AI tools, designing policy, or making decisions based on an AI-generated insight.
  • Lean into improvement opportunities. AI analytics and learning tools can surface instructional or student support gaps for campuses to address. Campuses are currently using AI-powered software and analytics to back pedagogical or programmatic improvements.

By thoughtfully defining parameters, identifying the impact on students and faculty, assessing current capabilities and making necessary investments, institutions can select the AI tools that fit their needs. More importantly, leaders can have greater confidence that the tools they select will be applied ethically and adopted in ways that value faculty time and enable student success.

At one time, educators were uncertain about the effectiveness of any technology in learning at all. Today, perceptions are different: 86% of educators say technology should be a core part of education. AI may experience the same turnaround as it becomes clearer how the technology can, and does, bolster learning.

Kathe Pelletier is the director of the teaching and learning program at Educause. Each year, the higher education IT organization convenes experts to review emerging technology and practices for the annual Horizon Report. For more information and research on higher education and AI, explore the resources on EDUCAUSE’s AI Showcase.
