Educators say AI more like a coach than a crutch in classrooms
Local educators are always adapting to the world around them. One of the areas they have had to adapt to most in recent years is the use of artificial intelligence in learning environments.
Much as the calculator changed mathematics, AI is changing how students learn. Local educators are still trying to find the line between acceptable uses of AI and uses that go too far.
Mark DiMauro, assistant professor of digital humanities at the University of Pittsburgh at Johnstown, has worked with and studied AI for the better part of the past six years. He said that when most people refer to AI, the first thing that comes to mind is ChatGPT.
ChatGPT is considered a large language model, which is an AI system designed to understand, generate and manipulate human language by processing large amounts of text data.
“It’s important students know that AI is not just limited to ChatGPT,” DiMauro said. “There are so many offerings out there you could fill a book with them.”
Large language models trace emerging patterns in literature, DiMauro said, so when someone tries to get the program to compose something, the most common sentence becomes the most likely output.
There also is generative AI, a type of artificial intelligence that creates original content like text, images, music, audio and video.
“Generative AI can include Midjourney or Titan or other image-generating software,” DiMauro said. “It can include presentation software like Gamma. It can include animation software like Runway.”
Generative AI presents some copyright dilemmas, DiMauro said, citing a case that started in 2019.
Stephen Thaler created a picture he called “A Recent Entrance to Paradise” through his generative AI system, the Creativity Machine. He applied to register a copyright for the image with the U.S. Copyright Office. The office essentially said “No. You didn’t produce this. AI did.”
“He sued them, basically saying ‘you are breaching my First Amendment rights,’” DiMauro said. “‘This is my art. I want to copyright it.’ The case went to court, ruled in favor of the copyright office and affirmed again, Thaler was not the author. That was the outcome.”
The ruling means any work produced by generative AI has no official human author, and thus no one is liable or accountable if someone creates misinformation or deepfake material, DiMauro said.
DiMauro said AI implementation can vary across colleges, universities and even subjects.
The way he uses it with his students is subject specific. For example, in his composition classes, he has outlawed its use except for research.
“In my other classes, one I teach is called AI culture, which teaches people the best and most efficient ways to use these tools,” DiMauro said. “All my other classes use some form of AI to some extent. I usually give students a warning speech as to how to be careful with your data, how to deploy it and what is ethically happening when you engage with these tools.”
DiMauro also allows students in his humanities classes to use AI for cryptography, while students in his AI culture classes use it to create media content for commercials.
In his classes, DiMauro helps his students explore the practical uses of AI while touching on its limitations. However, the shortcuts students can take with assignments hold some professors back from implementing it.
“I am seeing a cultural shift in terms of acceptance versus when it all first began,” DiMauro said. “There were a lot of people in academia that were thinking this was the end of times and end of education as we know it. My comeback to that is: Color photography didn’t kill painting. The calculator didn’t kill math. AI is a force multiplier.”
Butler Area School District superintendent Brian White said the district is still in the exploration stage when it comes to finding practical uses for AI with students.
White said some teachers are using AI within their classrooms as a process tool, rather than a product tool.
“Some of the more innovative things I have seen are teachers using it as a way to personalize projects,” White said. “I thought that was a pretty cool use of it. It wasn’t doing the project, rather helping frame a project around a serious topic.”
White said the possibility for students to use programs outside of school to take shortcuts with their assignments is forcing the district to rethink how assignments are graded.
“What is going to happen is, I think we are going to start grading the process more than the end product,” White said. “Show how they revised it, take revision notes, cite resources properly and produce a rough draft. The challenge with that is it’s a lot of work. An English teacher could have 150 kids.”
Lisa McKinsey, an eighth-grade social studies teacher at Butler Intermediate High School, said AI is meant to enhance the learning experience for students, with small introductions to programs so they can explore appropriate uses.
McKinsey is trying a new project, built around a program she discovered, that will take place over the course of the year.
“In my eighth-grade social studies class they have the ability to use a platform called Humy.ai,” McKinsey said. “It allows them to have conversations with historical figures we are studying. It’s like a chatbot.”
She has hand-picked about 30 historical figures from the Civil War to World War II, the period of history the class is studying.
Students can ask these historical figures questions about topics or events that happened beyond their lifetimes and get educated guesses as to what their likely answers might be, based on what is known about them.
“One student might have Abraham Lincoln, and they could ask him what he thinks about policies around imperialism during the turn of the century,” she said.
McKinsey said students are used to being told not to use AI, but they are realizing the AI they are told to use is structured with a lot of oversight.
“I think using AI to augment learning is difficult depending on the grade level and subject area,” McKinsey said. “I am using it in a way to supplement research to create more relevant experiences.”
At Moniteau School District, secondary assistant principal Kimberley McBryar said the district has an AI policy that gives teachers some discretion in deciding when and where to implement AI.
“Teachers have to decide how much of an assignment can be AI generated, but they have to tell their students what percentage is permitted,” McBryar said. “We want to encourage the use of it as long as it’s productive in the classroom.”
Moniteau career education and student services teacher Jeremy Borkowski said in some of his math classes he and his students use the program Wolfram Alpha, which can solve math problems and give step-by-step instructions when needed.
“It can give you clues, but it still wants you to figure out the solution,” Borkowski said. “It just supports the educational process.”
Borkowski and McBryar said students using AI programs to cheat on essays, or having the programs do the work for them, isn’t as big a problem as most people might think, at least in their experience.
“Last year I had two or three office referrals for academic integrity that had to do with AI,” McBryar said. “It’s a very small percentage.”
At Seneca Valley School District, AI is being implemented in two main ways, according to superintendent Tracy Vitale.
“First, we paid for the subscription to Magic School AI,” Vitale said. “It was developed by educators for teachers. It has morphed into a way the adults can create a classroom and create assignments as they deem appropriate.”
Vitale said all student personal information is protected, which was a main concern of hers from the beginning.
The other tool being used at Seneca Valley, currently by teachers only, is Microsoft Copilot, a productivity assistant that works across Microsoft apps such as Word, Outlook, Teams and Edge.
It helps with tasks such as drafting documents and emails, summarizing information, analyzing data, brainstorming ideas and creating images.
“I think AI can be more like a coach instead of a crutch,” Vitale said. “Instead of leaning on it to get work done, can it coach me and enrich what I am working on?”
From area colleges to Butler, Moniteau, Seneca Valley and elsewhere, educators are constantly adapting to new programs. They all said staying ahead of the curve is the main priority.
The jobs students will have after graduation will most likely have some form of AI integrated into what they do, educators said.
Rather than jobs going away entirely, as many people assume, this could lead to shifts in what people do at their jobs, DiMauro said.
“I have seen studies that say we will add more jobs,” DiMauro said. “Maybe you won’t be a bank teller anymore. You will be a financial solutions engineer or something like that.”
One way Vitale imagines AI growing at the high school level is through schools developing their own AI programs, used only in house and tailored to specific learning assignments, with limits on what they can and cannot do.
“At BNY Mellon they developed this system called Eliza,” Vitale said. “What they do is, they are working with OpenAI to build a closed system internally for their employees. They would dump documents into this and it’s protected. It was built in house with the help of OpenAI. The CEO requires the employees to use it for research, inquiries about the company.”
Vitale sees Seneca Valley and other schools adopting a system like BNY Mellon’s down the line, similar to Magic School’s program, but one that would be all their own, where they could control everything it does.
“I see this like a gold rush, like the Wild West,” Vitale said. “We need to prepare our students to be in a workforce that has AI embedded in their jobs every day. I hope we all learn to use it for time efficiency and use that extra time for quality type things in our lives.”
This article originally appeared in the November edition of Butler County Business Matters.
