For as long as Jake Price has been a teacher, Wolfram Alpha, a website that solves algebra problems, has threatened to make algebra homework obsolete.
Teachers learned to work around and with it, said Price, assistant professor of mathematics and computer science at the University of Puget Sound. But now, they have a new homework helper to contend with: generative artificial intelligence tools, such as ChatGPT.
Price doesn’t see ChatGPT as a threat, and he’s not alone. Some math professors believe artificial intelligence, when used correctly, could help strengthen math instruction. And it’s arriving at a time when math scores are at a historic low and educators are questioning if math should be taught differently.
The Education Reporting Collaborative, a coalition of eight newsrooms, is documenting the math crisis facing schools and highlighting progress. Members of the Collaborative are AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.
Artificial intelligence can serve as a tutor, giving a student who is struggling with a problem immediate feedback. It can help a teacher plan math lessons, or write math problems geared toward different levels of instruction. It can show new computer programmers sample code, allowing them to skip over the chore of learning to write basic code.
As schools across the country debate banning AI chatbots, some math and computer science teachers are embracing them as just another tool.
“Math has always been evolving as technology evolves,” said Price. A hundred years ago, people were using slide rules and doing all of their multiplication with logarithmic tables. Then, along came calculators.
Price makes sure students have the skills to solve problems on their own. Then, he discusses the limitations of the technologies they might be tempted to use when they get home.
“Computers are really good at doing tedious things,” Price said. “We don’t have to do all the tedious stuff. We can let the computer do it. And then we can interpret the answer and think about what it tells us about the decisions we need to make.”
He wants his students to enjoy looking for patterns, seeing how different methods can give different or the same answers and how to interpret those answers to help make decisions.
Min Sun, a University of Washington education professor, thinks students should use chatbots like personal tutors. If students don’t understand a mathematical operation, they can ask ChatGPT to explain it and give examples.
She wants teachers to use ChatGPT as their own assistant: to plan math lessons, give students feedback and communicate with parents.
Teachers can also ask ChatGPT to recommend math problems at different levels of difficulty, matched to how well students have mastered a concept, Sun said. That is particularly helpful for teachers who are new to the profession or who have students with diverse needs.
“It gives you some initial ideas and possible problem areas for students so I can get myself more prepared before walking into the classroom,” Sun said.
A year ago, if you asked Daniel Zingaro how he assesses his introductory computer science students, he would say: “We ask them to write code.”
But if you ask him today, the answer would be more complex, said Zingaro, an associate professor at the University of Toronto.
Zingaro and Leo Porter, a computer science professor at the University of California San Diego, co-authored the book “Learn AI-Assisted Python Programming with GitHub Copilot and ChatGPT.” They believe artificial intelligence will allow introductory computer science classes to tackle big-picture concepts.
A lot of beginner students get stuck writing simple code, Porter and Zingaro said. They never move on to more advanced questions — and many still can’t write simple code after they complete the course.
“It’s not just uninteresting, it is frustrating,” Porter said. “They are trying to build something and they forgot a semicolon and they’ll lose three hours trying to find that missing semicolon” or some other bit of syntax that prevents their code from running properly.
Chatbots don’t make those mistakes, freeing computer science professors to spend more time teaching higher-level skills.
The professors now ask their students to take a big problem and break it down into smaller questions or tasks the code needs to perform. They also ask students to test and debug the code once it’s written.
“If we think bigger picture about what we want our students to do, we want them to write software that is meaningful to them,” Porter said.
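In practice, that kind of decomposition can be as simple as splitting one task into a few small functions and checking each one on its own. The sketch below is an illustrative example, not material from Porter and Zingaro’s course, written in Python, the language their book focuses on:

    # A toy task: report the average of the readings above a threshold.

    def readings_above(readings, threshold):
        # Keep only the readings greater than the threshold.
        return [r for r in readings if r > threshold]

    def average(values):
        # Return the mean of a non-empty list of numbers.
        return sum(values) / len(values)

    def report(readings, threshold):
        # Combine the small pieces into the full task.
        hot = readings_above(readings, threshold)
        return average(hot) if hot else None

    # Quick checks of each piece, the kind of testing the professors describe.
    assert readings_above([1, 5, 9], 4) == [5, 9]
    assert average([5, 9]) == 7.0
    assert report([68, 75, 81, 90], 70) == 82.0

Each function is small enough to understand and test on its own, whether the first draft came from the student or from a chatbot.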
Magdalena Balazinska, director of the University of Washington’s Paul G. Allen School of Computer Science and Engineering, embraces the progress artificial intelligence has made.
“With the support of AI, human software engineers get to focus on the most interesting part of computer science: answering big software design questions,” Balazinska said. “AI allows humans to focus on the creative work.”
Not all professors in the field think artificial intelligence should be integrated into the curriculum. But Zingaro and Porter argue that reading a lot of code generated by artificial intelligence doesn’t feel like cheating. Rather, it’s how a student is going to learn.
“I think a lot of programmers read a lot of code, just like how I believe the best writers read a lot of writing,” Zingaro said. “I think that is a very powerful way to learn.”