World history teacher Andre Wangard walks among the rows of desks in his classroom at Cristo Rey Boston High School, stopping occasionally to look over students’ shoulders as they type away on their laptops.
The 11th graders asked Genghis Khan and Confucius about Asia's economic systems — and through an artificial intelligence chatbot, those long-dead figures answered back.
The AI answers may or may not be correct — one of the big problems with generative AI technology — but Wangard's students are delighted by the ability to have a simulated conversation with historical figures.
“We were able to actually ask a person questions and get a better answer than what the textbooks would give you,” junior Fatima Koumbassa said excitedly.
AI is taking off in classrooms, and the AI-in-education market is expected to reach $30 billion by 2032, according to the firm Global Market Insights. A recent survey by Boston publisher Houghton Mifflin Harcourt, which has begun selling AI programs “to save teachers time,” found that nearly 40% of teachers plan to use AI in their classrooms this year, up from 10% last year.
And those numbers are rising despite debate around the technology’s accuracy and faster than states’ efforts to create guidelines or student privacy protections.
MIT professor Eric Klopfer, who co-directs the RAISE lab at the institute, said schools and students would benefit from more guidance navigating the technology. (RAISE stands for Responsible AI for Social Empowerment in Education.)
The technology offers benefits: for example, it can help students learn a language or catch up after pandemic learning loss. But some AI tools can perpetuate bias because they reflect the biases of their developers, or because they were trained on incomplete or systemically biased data.
“I think it's really important for kids to be aware that these things exist now, because whether it's in school or out of school, they are part of systems where AI is present,” Klopfer said. “Many humans are biased. And so the [AI] systems express those same biases that they've seen online and the data that they've collected from humans.”
A May report by the Department of Education highlighted the benefits of AI in the classroom, saying it could achieve educational priorities “in better ways, at scale, with lower costs.” For example, AI-based tutoring could help address unfinished learning from the pandemic, and customizable AI tools could provide better support to students with disabilities or who are English language-learners. But the report also cited problems with student privacy and security.
“The development and deployment of AI requires access to detailed data. This data goes beyond conventional student records (roster and gradebook information) to detailed information about what students do as they learn with technology and what teachers do as they use technology to teach,” the report said.
On an early Thursday morning after world history class, Wangard said the allure of the technology was hard to ignore. It’s the second year he’s using a tool called Character.AI in his 11th grade world history class at the Dorchester charter school.
He says many AI tools are available to him as a teacher, helping create lesson plans, worksheets, student quizzes or writing prompts.
“Right now, there’s an AI boom,” Wangard said. “But you have to filter through what's presented to you and make sure that it’s actually beneficial and that more importantly, students actually want to use it.”
He learned about classroom AI tools, like Character.AI, through teacher support groups on Facebook and taught himself how to use them.
Wangard said the tools he uses definitely save him time.
“It was pretty hard not to implement it into education because it was everywhere,” he said.
When students use the Character.AI chatbot, a simulated historical figure like Confucius — or modern celebrities like Ariana Grande, videogame characters or generic "helpers" — will answer their questions through simple back-and-forth text messages.
The website for the Palo Alto company that made Character.AI says outright, in answer to the frequently asked question of whether the characters’ responses are trustworthy, that “characters make things up!”
The company also says characters “may provide links to fake evidence to support their claims.”
Character.AI spokesperson Rosa Kim said students and teachers have the choice to use AI technology in whatever way works for them, with the caveat that they need to check their facts.
“We have seen many different use cases, and learning partners are one of them — whether it is a Character helping to study for a quiz, language tutoring, or speaking with historical figures,” she said in an email to GBH. “As always, with anything on the internet, it’s important to check your work and do your own research.”
Another program, Khanmigo, created by the online school Khan Academy, has also shown flaws.
In July, a Washington Post reporter used its AI-powered tutor bot and teaching assistant still in beta to simulate an interview with Harriet Tubman. The writer found that the bot misattributed quotes to Tubman.
Wangard, the Cristo Rey teacher, recognized that Character.AI can be inaccurate when discussing certain topics, particularly modern events.
“When asking specific history questions about their lives and achievements, the characters are usually correct,” he said. “This is why Character.AI should only be used to introduce key figures, not give key AP-like historical information.”
Cristo Rey principal Thomas Ryan said the school has been looking into adding AI-regulation policies, but it can be tricky to create guideposts because AI is continuously and quickly evolving.
“We included AI in academic integrity language. Later in the year, we will do more intentional professional development around the topic,” Ryan said in an email to GBH.
According to the Houghton Mifflin Harcourt survey, educators don’t feel confident in their abilities to use AI tools. Only one in five teachers said they feel equipped to use tools like ChatGPT in their classrooms.
Wangard said he also uses an AI technology called Class Companion to give his students writing prompts. The chatbot's feedback on their writing is instant and personalized, he said.
“It's really good feedback too,” he said. “It’s not very basic like a 'good job' or 'bad job.' It’s very detailed on what they could do better, which saves me a lot of time, but it helps the students a lot as well.”
Back in Wangard's classroom, situated in a red brick schoolhouse constructed more than a century ago, students said the new technology felt fresh.
“I think it's beneficial because it really helps us get a headstart,” said junior Nailea Gonzalez. “I feel like it's better to make this connection now.”
Koumbassa said the chatbot made the lesson more engaging because she was leading the questioning. But she never once thought she was actually talking to the historical figure.
“I wasn’t expecting a kind of human connection,” she said. “I know there’s a human behind that.”
Or maybe a cyborg, she said.