Teaching Students to Use AI Ethically

Adapted from A Teacher’s Guide to Using AI by Meenoo Rami.

The idea of cheating has set off alarm bells for educators for centuries. Today, AI adds a new layer of concern. In addition to general chatbots, a quick web search for “essay writer” or “do my math homework” reveals a wide array of tools that can be used for cheating. It’s terrifying to imagine students turning to generative AI to do their assignments for them.

You might wonder: Can’t a teacher tell when a student uses AI? Actually, no. Researchers who studied teachers’ ability to spot AI found that “novice and experienced teachers could not identify texts generated by ChatGPT among student-written texts. . . . Furthermore, both groups were overconfident in their judgments” (Fleckenstein et al. 2024). As I write this book, tools such as Humanize AI™ offer to, well, humanize chatbots’ writing, further complicating attempts to tell AI-generated content from human-generated content.

When we become fixated on “catching” kids cheating with AI, things go downhill fast. If we put our faith in alleged AI-detection tools, we’re not only deluding ourselves but also opening the door to wrongly and disproportionately accusing multilingual learners of cheating (Myers 2023). Even if we don’t use those tools and simply resign ourselves to being suspicious of students, reading their work with an eye to identifying cheating, we erode the essential trust between educators and students. It’s time for a different approach.

To help your students use generative AI tools with academic integrity, try the following:

  • Consider who else might be doing this work in your school. Collaborate and share ideas and best practices. Librarians are often on the forefront with AI, and I have seen strong student-centered work with AI come out of partnerships between school librarians and teachers.
  • Design meaningful work. Assignments that students find relevant, that build over time, that invite reflection, or that include some kind of public sharing designed for a specific audience are not only more meaningful for students, they’re also harder to outsource. As one middle-school student explains, “People who are motivated to learn will use AI for good.” Jenn Thompson, a district supervisor of instructional technology, emphasizes project-based learning for exactly this reason: it makes students more invested, and it gives them opportunities to use AI ethically. Instructional technology specialist and author Mary Beth Hertz sees the issue of student cheating (with or without AI) as a symptom of our “transactional culture” of schooling, an endless exchange of completed assignments for grades (Rami 2025a). AI becomes the shortcut when the task itself is disconnected from curiosity, relevance, and ownership.
  • Model ethical use of AI. High school English teacher and author Jen Roberts talks openly with her students about AI, uses it in front of them, and shows them how to use it well. “By presenting it as something I’m excited about,” she says, “I’m able to show them that I’m going to teach them how to use it appropriately” (Rami 2025b). She makes space for questions, helps students set boundaries, and normalizes transparency. Her students know that she’s paying attention. More importantly, they know she’s there to support them, not to punish them.
  • Model understanding, not fear. Jen Roberts says it plainly: when students think you don’t understand AI, they’re more likely to use it in ways that undermine learning. That’s why she’s transparent about how she uses AI and invites students to explore what it can and cannot do.
  • Co-create clear guidance for when and how students can use AI in your class. Working with students to define acceptable AI use empowers them and fosters a sense of ownership and accountability in adhering to the agreed-upon guidelines. By including students in the conversation, you demonstrate that their voices matter and that you value their perspectives on how AI can support their learning.
    • This collaboration also helps surface potential challenges and creative solutions you might not have considered on your own: co-creating AI rules and expectations allows you to tap into your students’ lived experiences and insights. For example, your students may already be using AI tools outside of class and have ideas about what works well and what pitfalls to avoid. By involving them in shaping expectations, you can create a set of rules that feels relevant and practical rather than imposed and disconnected.
    • This process can also spark meaningful discussions about ethical considerations, critical thinking, and the responsible use of AI skills. If your school does not have a cohesive policy, expectations will differ from teacher to teacher. Help students understand that these are shared agreements between you and them and that they may face different expectations in another subject.
  • Teach students to use AI as a partner, not a shortcut. Have them ask themselves, “Am I using this tool to help me learn, or just to get the work done faster and replace the effort required?”
  • If students are using AI tools, encourage them to use tools that support ethical use. This might mean prompting a chatbot to support them with brainstorming ideas for an essay while directly telling the chatbot not to write the essay for them. Some made-for-school, student-facing platforms are designed to support students but not to do the work for them.
  • Encourage students to be honest with you about their work. Once your class has a co-created policy for acceptable use, normalize discussing how you and your students use AI in accordance with the policy. For example, you might ask them about how they’re using it, or you might share examples of your own. Transparency shows students you trust them to use the tools responsibly. STEM lead Aaron Maurer suggests “offering the student a chance to write a reflection piece, explaining how they used AI and what they learned in the process. This can help them demonstrate their understanding and growth. . . . Perhaps it is part of your classroom policy that if students choose to use AI they must explain their process or share their workflow with the tools” (Maurer 2025).
  • Prioritize learning over perfect results. Students are more likely to cheat when they perceive the class to be focused on outcomes rather than on learning. By emphasizing growth and valuing curiosity, you help students to see your class as a place to learn, not a place that demands unrealistic perfection. Teach students to ask themselves as they work, “What am I learning by doing this?”
  • Teach students to ask for help. Remind students that you are there to support them and that you want them to succeed. Let them know that if they find themselves feeling overwhelmed and over-reliant on AI, they can reach out to you. Then, when a student takes you up on this offer, find time and space to ensure that you give them your full attention.

Sources

Fleckenstein, Johanna, Jennifer Meyer, Thorben Jansen, Stefan D. Keller, Olaf Köller, and Jens Möller. 2024. “Do Teachers Spot AI? Evaluating the Detectability of AI-Generated Texts Among Student Essays.” Computers and Education: Artificial Intelligence 6 (June): 100209. https://doi.org/10.1016/j.caeai.2024.100209

Maurer, Aaron. 2025. “Ensuring Student Work Authenticity in an AI-Driven World.” Chaos Navigators, February 20. https://aaronmaurer.substack.com/p/ensuring-student-work-authenticity

Myers, Andrew. 2023. “AI-Detectors Biased Against Non-Native English Writers.” Stanford HAI, May 15. https://hai.stanford.edu/news/ai-detectors-biased-against-non-native-english-writers

Rami, Meenoo. 2025a. “Creating with AI: A Conversation with Mary Beth Hertz.” The AI Conversation for Educators, A Heinemann Podcast, July 21.

Rami, Meenoo. 2025b. “Using AI with Students: A Conversation with Jen Roberts.” The AI Conversation for Educators, A Heinemann Podcast, July 14.


A Teacher's Guide to Using AI includes:

  • practical strategies for using AI in your own work to save time, personalize instruction, communicate with caregivers, and spark creativity
  • ideas for integrating AI into lesson planning, creating and refining assignments, planning curricula, analyzing student data, and providing feedback
  • guidance for teaching students about AI’s capabilities, limitations, ethical considerations, and potential risks as well as how it can supercharge their learning and agency