Technology Essentials in Education Episode 16:
Practical Examples of AI in Education

Host: Monica Burns

Apr 17, 2026

About the Episode

Technology Essentials in Education is your go-to podcast for practical insights on using technology to simplify your school week. Hosted by author and educator Monica Burns, Ed.D., in partnership with Jotform, this series is designed for K-12 educators, administrators, and leaders looking to make a meaningful impact. In this episode, Monica welcomes Caroline Haebig, a K-12 coordinator of teaching and learning and ISTE author, to discuss the practical application of AI in today’s classrooms. Together, they explore how educators can move past the "wow factor" to find sustainable, research-backed ways to integrate AI into daily instruction and student work. They discuss the transition from viewing AI as a "magic wand" to treating it as a sophisticated partner for retrieval practice, lateral reading, and creative brainstorming. Caroline also shares her insights on constructing effective AI prompts through a cognitive science lens, emphasizing why transparency and clear citation norms are the foundation of AI literacy for students.

Hello there. My name is Monica Burns and welcome to Technology Essentials in Education. Today's episode is all about practical examples of AI in education with Caroline Haebig.

I'm chatting with Caroline, who I've known for over a decade. She's the coordinator of teaching and learning for a K-12 district in Wisconsin and an ISTE author. You might know her from her books, The Maker Playbook or the newly updated Learning Supercharged Second Edition.

I've known Caroline for a long time and I always appreciate her perspectives on education technology and how it can really transform learning experiences for students. So today we talk about practical examples of AI in education with plenty of strategies you can try out right away.

This episode is brought to you by Jotform. Jotform provides an all-in-one solution to streamline administrative tasks, enhance community engagement, and foster innovation. Using their no-code drag-and-drop forms and workflows, your teams can securely collect and store data, automate tasks, and collaborate on team resources. Educational institutions are also eligible for a 30% discount on Jotform Enterprise. Head to their website to learn more at jotform.com/enterprise/education.

Welcome to the podcast. I am so excited to chat with you about some practical examples of AI in education. But before we get into all of that, can you share a bit with listeners, what is your role in education? What is your day-to-day look like?

So I'm Caroline Haebig. I am the coordinator of teaching and learning for a K-12 school district in Wisconsin. I have a unique role because I get a variety of projects. In the recent past, some initiatives I've worked to develop and launch have involved design thinking, maker spaces, and an engineering focus, as well as integrating computer science K-6 for every student, which is really exciting. Then all the way to continuous improvement work: looking at how we can maximize our curricula and resources in the best ways possible to meet the needs of teachers and students while meeting standards and developing common assessments. I'm also looking at what's new, like AI, and how we help prepare teachers, students, and families to embrace and maximize changes in practical ways.

I love that idea of maximizing changes because things are moving and changing really quickly, and really making the most of what is out there and available is a big part of this conversation. Today we're talking about AI and education. So what misconceptions about AI do you see most often in K-12 classrooms right now?

I think the biggest misconception is that people overestimate what a tool can do at its present state, or project their hopes onto it. People also don't think about how these technologies operate and what that means for being responsible, whether you're an adult or a student, particularly around data privacy. I also see people saying, oh wow, it's something new, we have to really think about how this works, but not considering the practices, policies, and guidelines we already have in place for technology in general and building on those. So those are the three misconceptions I see mislead people most often.

That's such a great point about the way we interact with some of these tools, even just from a professional standpoint. When we're trying to do some instructional planning or get some support, we hit that wall of, well, couldn't it do this or should it be able to do this? There might be aspects of that bigger task that it can take on, but it's not going to go from A to Z automatically just because you opened up ChatGPT or Gemini and asked it to do something. I love that emphasis too on how we have a set of guidelines already and are thinking about technology use, and while this might not be exactly the same, there are definitely some parallels to lean into.

As you think about what you're seeing in your role or things you're hearing from some of your colleagues, can you share a few examples of how students are using AI in classrooms today? Are you seeing this or something adjacent to it? What have you noticed?

I see a variety of things in professional practice, which is important, thinking about how we connect it to the work we already have to do, especially for students.

I see teachers and students using it to create resources to support studying and retrieval practice within learning: how kids practice information and recall it from memory, which helps build learning and understanding.

I also see it used in experiences that build skills around lateral reading. When we get an output or search for something, AI can generate a summary or lead us to the next step; then we teach what lateral reading is, reading across sources and using critical thinking to compare results and synthesize information from various sources.

I also see features kids can use to get creative. A lot of teachers find that younger students want to make something look really cool, like a poster or slide deck, and sometimes get lost in the features rather than focusing on sharing the learning outcomes. A creative use I've seen is students doing the work and creating a product that shows their learning, then trying enhancements without getting too lost in them, keeping learning at the front. AI gives a little wow factor, but then you spend time showing the learning alongside some of those features.

I've also seen AI support brainstorming and building on ideas. It's been interesting to see the different angles from which these tools can be incorporated.

Lots of that is student-facing and AI-powered. I think about being in a computer lab with my fifth graders trying to get them to wait to do PowerPoint transitions until the end of class. That might be one angle for one group, or maybe some of that brainstorming or idea gathering. Such great examples.

When schools or educators ask about acceptable versus unacceptable AI use, what guidance do you typically offer?

When it comes to acceptable and unacceptable use, it's about creating clarity and starting with educating any user, whether professional or student, about how these tools work and building understanding around that. Knowing how machine learning works, even at a surface level, is important and informs conversations about what is acceptable and responsible use, what information should or should not be input, and what parameters and guidelines already exist in your district.

Creating opportunities to build on current acceptable use policies by providing learning and education about these tools is key. Also, having avenues for teachers or students to ask about tools or apps they are curious about, and having a practice to vet and explore those tools, allows users to give insight into their interests. It's helpful to pick specific features within AI-powered tools that address pain points or increase efficiency.

I always tell people to look at current guidelines, build understanding to make informed decisions, allow opportunities for curiosity and exploration, and focus on what needs to be tackled first based on existing needs. This leans into practical examples and use cases teachers have found to address pain points, then connecting those back to guidelines to see if anything needs revisiting.

Also, tap into resources like Teach AI, Digital Promise, or Google's Guardian's Guide to AI for literacy practices and resources you can implement. Not everything has to be done from scratch.

Just that idea that you might find something you really like and then connect with a school or district that also likes it, so you have common vocabulary to brainstorm. Finding someone with commonalities and using a particular set of norms to bounce ideas off is valuable. I appreciated you mentioning opening conversations for colleagues to share what they're seeing and how they're using AI, supporting curiosity. That also comes back to transparency. What routines or norms have you seen work well for fostering responsible and transparent AI use with students?

With students, determining the role we want AI to play and having clarity around that is really important. For example, when working on Learning Supercharged, Second Edition, I laid out ways AI tools or features can be used: providing students with a guide that says they can use AI to develop thinking, provide feedback, summarize in conjunction with their own reading, or generate new content. For example, using AI to create flashcards or fill-in-the-blank notes, or to teach helpful study strategies.

It's also important to call out inappropriate uses, like passing off AI-created content as your own, accepting revisions without critical analysis, or asking AI to produce homework answers. Being clear and kind about this is helpful.

Also, state that if you're going to use AI, you need to cite it: disclose when AI is used, describe the value it brought to your work, and include metacognitive reflection on how it contributes to academic growth. Being clear about appropriate and inappropriate use, and about how to cite or explain the value AI adds to learning, fosters responsible use.

Jotform lets you build forms in minutes, including student surveys, homework submissions, quizzes, and more. You can start from scratch or use free templates designed for teachers, schools, and districts. Educational institutions can get a 30% discount on Jotform Enterprise at jotform.com/enterprise/education.

Being specific is important. We can't assume someone will share their use cases if we haven't talked about why or how to have that conversation. I’m glad you mentioned citation because it’s great for teacher modeling. Even if second graders don’t use MLA citation, they can put a note below an AI-generated image. I’m working with Colorado educators and we reviewed their guide on citing this type of content. There are resources we can adopt as a school community to maintain transparency about AI use.

I'd love to hear about constructing AI prompts through a cognitive science lens. What does that mean in practice for teachers?

Cognitive science research has shown what is effective for getting information into kids’ heads and more importantly, helping them apply it with understanding for high-impact learning. Three strategies stand out: retrieval practice, spacing, and interleaving. Retrieval practice involves recalling information from memory, not just presenting it. Spacing means practicing information over time. Interleaving mixes topics to help students discriminate between similar concepts. These strategies help students remember and understand content deeply.

Prompting AI gets interesting because outputs depend on inputs. As professionals, we need to bring high-impact practices into prompts. For example, specify the grade level, academic standards, and essential questions in your prompt. Ask AI to create activities incorporating retrieval practice to improve learning within that standard. The more detail and instructional strategies we include, the better the output. For example, teachers can use AI to create fill-in-the-blank notes for explicit teaching or graphic organizers for scaffolding, then provide space for students to reflect on key takeaways. AI can help jumpstart our work by including these details we know are effective but may not have time to create from scratch.

Also list common struggles students may have and include those in prompts. For example, create an activity where grade-level students use sequence reconstruction, a cognitive science strategy involving chronology, to support recall. Exit tickets are another example—prompt AI to create exit tickets with open-ended questions using student-friendly language to help develop thinking. The better the prompt, the more we can bring in cognitive science strategies. Even if you don’t know all the details, AI can help brainstorm and bring those elements together using key vocabulary terms like exit tickets or retrieval practice.
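For readers who want to operationalize the prompt structure described above, here is a minimal sketch of how those elements (grade level, standard, essential question, a cognitive science strategy, and common student struggles) might be assembled into a single detailed prompt. The function and field names are illustrative assumptions for this example, not a template from the episode:

```python
# Sketch: assemble a detailed instructional prompt from the elements
# discussed in this episode. All names here are illustrative.

def build_prompt(grade, standard, essential_question, strategy, struggles):
    """Combine instructional details into one prompt for an AI assistant."""
    struggles_list = "; ".join(struggles)
    return (
        f"You are helping a {grade} teacher address this standard: {standard}. "
        f"Essential question: {essential_question}. "
        f"Create a classroom activity that uses {strategy}, "
        f"a cognitive science strategy, to improve recall and understanding. "
        f"Account for these common student struggles: {struggles_list}. "
        "Use student-friendly language and include space for reflection."
    )

prompt = build_prompt(
    grade="5th-grade",
    standard="explain how events in a historical narrative relate",
    essential_question="Why does the order of events matter?",
    strategy="sequence reconstruction",
    struggles=["confusing cause and effect", "mixing up chronology"],
)
print(prompt)
```

The point of the sketch is simply that the more of these instructional details land in the prompt up front, the more the output reflects high-impact practice, which mirrors the advice above.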

For someone new to AI or frustrated with its outputs, your advice to lean into what you know works best in pedagogy and content is crucial. These are high-impact strategies that have always worked, and now AI can help scale that impact to reach more students. Having context on best practices for content and pedagogy is super practical for rethinking prompting structures.

As we finish, can you tell us about your new book, what motivated you to create it, and some of your past resources for educators?

I'm excited about the new book coming out in March, Learning Supercharged, Second Edition. It covers a variety of topics, from experiential learning, project-based learning, design thinking, and game-based learning to AI. AI is infused in every chapter, showing how it supports that work, also viewed through a cognitive science lens. The book discusses pitfalls in the research and ways to improve practices. Many teachers don't have time to dig into academic articles and translate them to their context, so this book is rooted in research but offers practical actions. For example, the AI chapter includes example prompts people can use and customize to jumpstart their work. It also has a companion site with documents and templates people can copy, modify, or translate into tools they like. I know time is a barrier, so I aimed to provide resources that help overcome that.

For someone revisiting this topic, those additional resources can really help them be successful this school year and beyond.

Where can people connect with you and learn more about your work?

ISTE and other places are great starting points. I’m open to social media conversations on Facebook, Instagram, and email. People often reach out with questions or for direction, and I’m always happy to chat and hear what others are doing that inspires them. Those are some of the pathways to connect.

Perfect. I’ll link out to everything. Thank you so much for your time today and for sharing these great strategies with listeners. It was so much fun chatting with you.

Let’s finish this episode with a few key points to help make ed tech easy. Many misconceptions about AI stem from overestimating what tools can do and underestimating how they actually work. Students are beginning to use AI for studying, summaries, brainstorming, and creative enhancements. Clear, acceptable use guidelines can build on existing tech policies and emphasize responsible, informed use. Transparency about how and why AI is used, including when to cite it, is essential for helping students build healthy habits. Remember, you can connect with Caroline and learn more about Jotform by finding all the info in the show notes below this video.

A big thank you to Jotform, the presenter of today’s episode. To learn more about Jotform and how educational institutions can get a 30% discount on Jotform Enterprise, head to jotform.com/enterprise/education.