
A Gift and a Curse

The rise of AI can both help and hurt the way students learn.

As artificial intelligence has risen to prominence in the lives of many across the nation, the question of the tech’s place in education has become one of the most hotly contested issues facing parents, teachers and administrators.

In Central Florida schools, both public and private, from elementary to college, educators are handling the topic in myriad ways as the technology rapidly evolves and further integrates with daily life.

Presently, Orange County Public Schools (OCPS) does not allow any student use of generative AI and blocks access to platforms like ChatGPT on its network, keeping students from using them in the classroom.

In addition to the network block, devices owned by the district, like laptops intended for educational use, are monitored by staff for any and all unauthorized programs or sites. If district devices are used inappropriately or not in compliance with district policies, administrators respond accordingly.

Though district policy currently bars AI in the classroom, the technology's influence on nearly every industry has prompted OCPS to consider how to make students AI literate without allowing overuse that replaces critical thinking, creativity and the ability to research and fully absorb what they learn.

District leadership is currently reviewing whether to permit limited educational use of AI at specific grade levels, provided any platform meets all of the information technology department's security requirements. If use is permitted in the future, it would be closely monitored to ensure compliance with the code of student conduct.

Not all schools in the area are as strict on the topic, though, with Windermere Preparatory School (WPS), a private pre-K through 12th grade college preparatory school, allowing the use of AI as a tool for both faculty and students.

Faculty use the tech to create rubrics and curriculum timelines, while student use varies, with some applying it constructively and others counterproductively.

Justin Muenker, a WPS high school social studies teacher and the school's IB CAS and Service Learning Coordinator, explains that while some students use AI for feedback on essays and writing prompts, others use it to do their classwork for them.

“AI can be and often is a slippery slope with students. … There are quite a few students who are utilizing it to take shortcuts and represent their own work. They kind of use it almost interchangeably with search engines, which can lead to some problematic results and some challenges for teachers and for the whole educational process,” Muenker says.

There are systems in place at the school to discourage students from having generative AI do their work for them. These include Turnitin's AI detector; GPTZero, which attempts to flag writing generated by large language models; and a Google Chrome extension that lets teachers compare students' individual keystrokes against copied-and-pasted material.

These guardrails are meant to stop students from plagiarizing or otherwise cheating on assignments, but Muenker explains that they are imperfect solutions and do not eliminate the risk of students using the tech to cheat on homework and classwork.

“All of these are guardrails that exist, but they have workarounds, or they’re kind of imperfect. So it can let a teacher know when something looks like it’s probably AI, but it’s not really definitive,” he says.

For the time being, the most effective prevention is for teachers to read submitted work carefully, looking for the telltale signs of AI use in writing or other coursework.

WPS has also returned to paper and blue books for in-class tests and essays, with laptops and computers put away during these periods of work, to keep students from using AI on exams and other in-class assignments.

“If you’re truly trying to assess what a kid learns absent those tools, the only real, surefire way of making sure that that’s in place is by taking those tools away during those moments,” Muenker says.

Despite the drawbacks and concerns about the novel tech enabling cheating, Muenker also highlights some of the benefits of using AI in the classroom, like changing the style of an article that is above the grade’s reading level to make it easier to understand, or aiding in the brainstorming process for creative assignments.

“It’s great for supplementing, it’s great for brainstorming. It’s great for making complex or challenging work more accessible. But that line that I draw in the sand, at least in my classroom, is in representing the thinking, the writing or the skill that I’m trying to have students work on,” he says.

Fortress Christian School, a private online religious school based at Florida College, has no blanket policy on the tech, instead allowing teachers to tailor its use to the specific subject matter and assignment.

If a student violates a teacher's class policy on AI, the first offense brings a 10% penalty on the assignment grade and a chance to redo the work; a second offense earns a zero on the assignment; and a third offense results in the student being dropped from the school.

“We feel like that’s a policy that gives some grace as well as some flexibility for our teachers, but also it is a policy that we can enforce. We are not opposed to the use of AI, that is the world in which we live. And if students are going to be successful, they’re going to need to be able to navigate it, ultimately, with Christian integrity,” says Alex Hale, M.S.E., head of school for Fortress Christian.

At the college level, the use of artificial intelligence in the classroom is more complex: many industries now rely on the tech in their daily operations, so preparing students for the workforce in those sectors means teaching them the skill of using it.

At the University of Central Florida (UCF), there is no single approach to AI among students and staff. Some classes specifically aim to build students' fluency with AI, while others ban it outright.

Professors are not required to report their specific AI policies to the university, and with hundreds of classes on campus, it is nearly impossible to track exactly how and where the tech is used at UCF.

“The faculty are worried about students becoming so reliant on AI answers that they never develop their own internal foundational knowledge and skills. If they graduate from college and they don’t actually know anything about biology or whatever [subject matter], because they’ve just had AI to give them answers the whole time [it would be a real problem],” says Kevin Yee, Ph.D., UCF’s special assistant provost for artificial intelligence and the director of the Faculty Center for Teaching & Learning.

Much like at WPS, UCF staff have had to create workarounds to prevent AI use on assessments and exams, making online testing more complicated so AI can't simply hand students the answers.

Yee also emphasizes that AI is a shift in the technological and educational landscape unlike anything seen within living memory, disruptive enough that it is more akin to the harnessing of electricity than the introduction of computers or the internet.

“I think we will find a lot of disruption in jobs, both job creation and job deletion. And that makes it all an exciting time for students, but also kind of a worrying time,” says Yee.