5 tips for maintaining teacher-student trust as AI classroom use grows

A youth technology expert shares best practices and considerations for using artificial intelligence in classrooms.

As the use of artificial intelligence-assisted technology grows in K-12 instruction and learning, many educators and education businesses see potential in the tools — including instruction that can be personalized for individual students and greater efficiency in student- or teacher-led research.

Others, however, hold concerns about the potential for cheating or false accusations of cheating, as well as overuse or inappropriate uses of AI systems, which analyze large amounts of data to make predictions and perform tasks.

“We’ve done this over the decades: when technologies are first introduced, we either say that they are going to be detrimental or they’re going to be lifesaving,” said Shelley Pasnik, senior advisor to the Center for Children and Technology, a nonprofit that researches technology’s influences on teaching and learning.

In fact, the use of technology in classrooms, including AI, can be much more complex because humans are actually guiding the application of the technology, said Pasnik, who is also a senior vice president at Education Development Center, a nonprofit that designs, implements, and evaluates programs to improve education. The Center for Children and Technology is affiliated with the Education Development Center.

AI in education has been used for tutoring support, language translation, checking for plagiarism, verifying student absences, teacher coaching, administering and scoring assessments, and more.

With teachers and students in the driver’s seat of classroom AI use, Pasnik suggests five ways they can maintain trusting relationships as AI platforms grow.

Discussing what’s known and unknown

Conversations about the tech tools in use or under consideration can help teachers and students better understand their shared goals for the tools, the guardrails needed against inappropriate uses, and any apprehensions, anxieties, or excitement about using the technology.

“Ask open-ended questions and find out what students know, what they’re thinking, what teachers know, what they may be thinking,” Pasnik said.

These conversations can also reveal what teachers and students understand well and what they still need to learn about classroom use of artificial intelligence. This understanding of the known and unknown can be helpful as policies are written regarding AI-assisted instruction and learning, Pasnik said.

Having a shared set of expectations

As policies are drafted around using AI in classrooms, schools will need to consider governance and expectations. This may include adding AI-assisted activities to student codes of conduct or to classroom-level teacher expectations for students, Pasnik said.

She added that some teachers make very clear that students will fail if they generate answers to assignments through a large language model, which uses algorithms to produce text.

Pasnik added that expectations should also be paired with consequences when trust is broken.

Allowing for teacher collaborations

Teachers should be given time to consult with each other about their experiences with AI in the classroom, including how AI may be changing their lesson plan development or how it’s influencing pedagogy.

Additionally, schools should reach out to parents to ask if they have questions, worries, or suggestions.

“So often, parents and teachers alike are confronting an environment and a set of conditions that is different, and perhaps even radically different, from their own lived experience,” Pasnik said, because the technology doesn’t mirror their own educational experiences.

Understanding reasons for overuse or improper use

Overuse or improper use of AI by teachers and students should be explored with the goal of better understanding why these behaviors are occurring. A teacher prone to a surveillance mentality, for example, may see AI as a tool for preventing and catching cheating.

Likewise, a struggling student may be more inclined to rely on AI assistance.

For those reasons, it’s important for educators to look at AI in a larger context of instruction and learning and in relation to other supports available to teachers and students, Pasnik said. Teachers and students together can also explore the benefits and limitations of AI assistance in learning by comparing AI-generated and student-generated work and discussing the differences.

Addressing biases

Whether perceived biases come from technology or humans, it’s important that students and teachers feel seen and heard, Pasnik said. That means ensuring all students feel welcomed and included and that their social-emotional needs are being addressed.

“Algorithms are not accurate reflections of the full diversity of humanity,” Pasnik said. “How are students and teachers thinking about their own biases and also the biases of these new tools?”