AI and Keeping Your Element Real
AI is everywhere, but are we helping kids use it wisely?
Terri speaks with Jeff Riley of Day of AI about AI safety, AI literacy, and how teachers and parents can keep kids' learning authentic in a digital world.
Jeff Riley shares his perspective on the good, the bad, and the possibilities of AI in K-12 education.
TERRI NOVACEK:
Imagine a world where every child can use AI the way an artist uses a brush or a scientist uses a microscope — not to replace their thinking, but to reveal it. A world where AI helps kids stretch into new challenges, follow their curiosity, and make discoveries they might never have made otherwise.
Welcome back to Element Is Everything, the show where we explore how principles of self-determination, purpose, and authentic learning help us operate “in our element,” and help our kids discover theirs.
Today we’re diving into a topic that feels both exciting and unsettling: the rise of artificial intelligence in the lives of our children and in the world of education. AI is suddenly everywhere, and it’s shaping how kids learn, play, explore their interests, and even how they see themselves.
Like most powerful tools, it brings the good, the bad, and a lot of unknowns.
On the good side, AI can expand possibilities. It can personalize learning. It offers creative exploration and tutoring that meets you exactly where you are.
Organizations like DayOfAI.org are helping schools and families build AI literacy so that our kids learn with AI, not just from it, and understand the ethics, risks, and opportunities that come with this new terrain.
But AI also has shadows — bias, surveillance, misinformation — and a very real risk that it can nudge kids away from the autonomy, competence, and connection they need to find and develop their element. When a tool becomes a crutch or replaces curiosity, challenge, or real human relationships, it can pull us further from the work we’re trying to do.
And then there’s the unknown: innovation that is outpacing our understanding. How do we prepare children for a world we cannot yet predict?
I recently spoke with Jeff Riley, who has dedicated decades to educational leadership and innovation. He’s now the Executive Director of Day of AI, a program developed by MIT in 2021 to prepare K–12 students to thrive in an AI-powered world. Since then, Day of AI has expanded to include teacher training, family resources, and school partnerships.
Let’s listen.
TERRI:
Thank you for reaching out, and for recognizing that what you do aligns so closely with what we do. I’m excited to hear more about you, Jeff, and Day of AI. Where should we start?
JEFF RILEY:
A good place to start is my background. I was in public education for 32 years. I started as a teacher in Baltimore City and concluded my career as Commissioner of Education for the state of Massachusetts.
I was getting ready to retire when MIT called and asked if I would help run a nonprofit focusing on AI and education. This was about a year and a half ago. I said, “Can I say what I want?” and that made them nervous. They asked what I meant.
I said, “Fifteen years ago we didn’t regulate social media, and now we have kids with rising rates of anxiety, depression, and suicidal ideation, partly because their self-worth is based on how many likes they get on TikTok.” That’s oversimplified, but the research is not looking positive.
AI — which I truly think can usher in a new revolution in education — also has downsides and problematic aspects that we need to address head-on to keep kids safe.
TERRI:
So MIT RAISE is connected, right?
JEFF:
Yes. MIT RAISE is our sister group. RAISE stands for Responsible AI for Social Empowerment. They’re directly out of the Media Lab and focus more on policy. Day of AI is the group I run. We focus on working in schools — training teachers and administrators on how to get kids to use AI safely and productively. We help districts with policy and build AI literacy curriculum.
We’re in 170 countries and throughout the United States. We’ve trained state education commissioners through the CCSSO network, all the superintendents in Alabama and Maine, and teachers all over the country.
TERRI:
I love that you’re trying to get ahead of this and learn from mistakes with social media. What are lessons we need to make sure we’re not repeating with AI?
JEFF:
I don’t think anyone understood the influence social media would have on students — how much time they’d spend on their screens, what they’d be doing, and how it would take away from academics and social life.
In my last year as Commissioner, we put out a voluntary grant for districts to restrict or ban cell phone use in schools. Many used those Yondr pouches to lock phones up. The first week, half a dozen superintendents called and said, “Jeff, they’re talking to each other in the cafeteria again.”
Kids had literally been sitting together staring at phones, spooning food into their mouths without talking.
With AI, we can already list the problematic issues: voice clones, deepfakes, energy concerns, bias, misinformation, and hallucinations — when AI just makes things up.
A famous one: someone asked, “How do I keep cheese from sliding off my pizza?” The AI said, “Add an eighth of a cup of glue.”
We think it came from a sarcastic Reddit post about creating display pizza. The AI didn’t know the difference.
Another example: someone asked, “Where do cheeseburgers come from?” The AI said, “Cheeseburger trees found in cheeseburger orchards.” It just doubled down.
Then there are AI companions — essentially imaginary friends — that replace real social engagement. They’re sycophantic, they tell you what you want to hear, and there are cases where AI companions have encouraged self-harm.
Common Sense Media recently reported that about 70% of kids have interacted with an AI companion, but only 30% of parents know their kids are using AI at all. That’s alarming.
TERRI:
Do you see any benefit to kids using AI for companionship?
JEFF:
Some. As a former adjustment counselor, I think talking through problems — even with AI — can help. But it never replaces a trained professional or real relationships.
The question is balance. How do we create healthy boundaries and teach kids to use AI as a tool, not a substitute for human connection?
TERRI:
Teachers can get frustrated. AI doesn’t. Kids can ask a question 27 times and the AI won’t respond with, “I already told you.” That low risk of judgment is a huge draw.
JEFF:
Exactly. Kids don’t want to be embarrassed in front of peers. If AI helps them take a first step toward asking questions, that can be positive — as long as it isn’t the only step.
We need to shine a spotlight on both the pros and cons.
And we need AI literacy — not fear, not blind enthusiasm — literacy.
TERRI:
I heard someone say, “You shouldn’t fear AI taking your job — you should fear the person who knows how to use AI taking your job.”
JEFF:
Absolutely. AI won’t replace humans, but humans who use AI will replace those who don’t. We used to talk about the three R’s: reading, writing, and arithmetic. Now we need a fourth: artificial intelligence.
Kids need to understand what AI is, what it’s not, and how to use it.
TERRI:
Where can parents and teachers start?
JEFF:
Go to DayOfAI.org. Put in your email to access:
- Family resources
- Free K–12 AI literacy curriculum
- Toolkits we developed with Common Sense Media
- Virtual teacher training
- Policy support for districts
TERRI:
Our schools participated in the Day of AI introductory workshop. I was impressed — newbies weren’t overwhelmed and AI-savvy teachers weren’t bored. Here are some takeaways:
- Generative AI includes chatbots like ChatGPT.
- Prompts matter: detailed prompts get better results.
- AI breaks words into tokens, predicts patterns, and can hallucinate false answers.
- Bias and misinformation come from training data.
Teachers experimented by entering the same prompt into different AI tools and comparing the results, which were often totally different. That was eye-opening.
TERRI (closing):
Keeping your element real in a world shaped by artificial intelligence starts with remembering one truth:
AI can mimic knowledge, but it cannot live a life.
It can generate answers, but it cannot generate meaning.
Your element — that intersection of strengths, passions, values, and purpose — comes from lived experience, not processed data.
Visit DayOfAI.org. Pick one resource. Start one conversation in your family. Keep asking human questions:
- What do I care about?
- What feels meaningful?
- What impact do I want to have?
AI can help you brainstorm, but only you can know the answer.