By Emily Gessner
As artificial intelligence progresses, some North Carolina school districts are looking to implement different technological approaches to safety. However, with these new options also come different concerns about privacy, accuracy and politics.
Davidson County Schools and New Hanover County Schools were among the first districts in North Carolina to receive state funding to implement AI-based school safety technologies.
The AI technology analyzes camera feeds around schools to detect weapons, dangerous and aggressive behaviors or smoke and fire. The program then alerts officials to the issue.
While Davidson County Schools is piloting the technology, New Hanover County Schools turned it down, citing student privacy concerns and a fear of false positives, in which the model flags something as suspicious that, in actuality, is not a threat.
Tim Merrick, a school board member for New Hanover County Schools, voted against implementing the technology in the schools.
Merrick said many factors influenced his vote. One of the main ones was that the program proposed for Davidson and New Hanover schools, made by Eviden, a French AI company with offices around the world, “had a great propensity for false positives.”
Davidson County has introduced AI to the school district in stages. Eviden’s program has been in place since 2025 in an elementary school, a middle school and a high school.
While Davidson is moving forward with the AI technology, the district is opting out of Eviden’s facial recognition feature and license plate reader to further protect personal privacy.
Eviden did not respond to requests for comment.
Others have expressed hesitation and skepticism toward the pilot because of concerns about data retention. Data recorded by the AI can be held in data centers for an undisclosed amount of time, including information about children, adults or anyone else monitored by the technology.
Tuli Kaazim, a parent of a student in NHCS, said the idea of information being recorded and stored does not really concern her because people are recorded everywhere now.
“If you go anywhere – in the mall, Walmart – you don’t know where that stuff is being stored or anything anyway, so that’s not really a concern for me because you literally can’t even avoid that,” Kaazim said.
Shea Trantham is the parent of a student in Davidson County. The child attends Central Davidson, one of the schools involved in the pilot program.
She said her view of the data collection depends on how long the data is being stored and why.
“I think it just depends again, like what’s the information being used for, how long is it going to be used, is it going to be reported to the police or to the government or to social services, or is it just for internal uses,” Trantham said.
Trantham and Kaazim both said the technology could be useful to help people react quicker to situations of violence, injury or threat.
“I think that it is a good idea,” Trantham said. “You know, having ways to proactively, excuse me, proactively safeguard and prevent problems, I think is a great idea.”
Kaazim said the technology could be useful if an accident happens involving smoke and fire, an injury or a large crowd moving in one direction.
Wendy Harper, a fifth grade teacher at Wallburg Elementary School, one of the schools participating in the AI school safety pilot, said that her first thought when learning that the school would be implementing the AI program was that she hoped it would be safe and help the school react quicker to threats.
“If groups of people start mingling or start running, they can catch that because just with the regular security cameras we have now, someone has to be watching them, and something has to pique their interest if they see someone walking on campus or if people start running, then they have to investigate more,” she said. “Where maybe this program could catch it earlier and catch it quicker.”
Kaazim said that while those things may signal that something is going on in an area, the alerts may not always reflect a real problem, especially given the fear of false positives.
Other schools in the United States have had issues with AI technology from other vendors, not Eviden, producing false positives. Last year, an AI gun-detection system used in a Maryland school system flagged a Doritos bag in a student’s pocket as a potential firearm. The student was apprehended by police officers with their guns drawn. The officers searched the student and, after finding nothing, showed him the picture that had prompted the search.
Because AI is still a developing technology, especially for safety applications, these false positives may lead to students being falsely accused of wrongdoing.
The AI surveillance pilot can be traced back to 2023, when the state legislature allocated $25 million for school safety grants, including $5.2 million for an artificial intelligence pilot in two school districts: New Hanover County Schools received $3.2 million as a directed grant, and Davidson County Schools received $2 million as a directed grant.
In 2025, lawmakers added requirements for those grants, stating, “Funds allocated for the pilot program shall be used for the implementation of a school safety system that integrates AI technology into existing cameras, video management systems, and alerting protocols.”
In 2025, the N.C. legislature also stipulated that the two districts must use the same vendor for the pilot program, and that the vendor must offer certain capabilities, including threatening object detection, intruder detection, person down detection, door open detection, tag and track, facial recognition, forensic face search and a license plate reader.
DCS opted out of using facial recognition and the license plate reader, although Eviden, the vendor, offers both.
New Hanover County Schools’ decision not to implement the program means the state must now decide where to allocate the remaining $3.2 million, a decision that cannot be made until the General Assembly passes the state budget.
“$3.2 million is great for the schools, but it would have had to be used for that one specific thing,” Kaazim said. “I just think there could be some other better uses for that money.”
Though Davidson and New Hanover were chosen to pilot Eviden’s camera-based school safety program, other school districts that have tried using AI surveillance in other ways have faced similar concerns about privacy.
In May 2023, Durham Public Schools stopped the use of Gaggle, an AI surveillance tool, to monitor student activity on district-issued accounts and devices.
The system was originally piloted in 2021 and expanded into all schools in January 2023. Costs were covered by state-awarded federal pandemic relief grants.
Gaggle raised concerns among parents and students, who feared a lack of privacy.
Millicent Rogers, a Durham Public Schools board member, was present for the vote to discontinue Gaggle.
“The discontinuation of the technology was about the surveillance happening with student information, student input that would review the notes and the writings and any emails or any content that students were creating, and send it to this third party,” she said.
Rogers said Gaggle would send the data to a third-party vendor, which scrutinized it and sent the information back to administrators in the district. Administrators would then determine whether police would get involved or how they would handle the concerns raised by the third-party vendor, she said.
“There were some levels where the third party would automatically contact the police in this area and deploy police to respond to a call at a home in the area, and could essentially cause an increase in policing of families that the board did not want to take responsibility for doing,” Rogers said.
Although many parents and school board members have expressed hesitation or distrust of AI, such programs may still be implemented more widely in the future.
Rogers said there is still space for AI to come in and enhance student safety.
“It’s possible. It’s possible. I don’t think we should count anything out in terms of student safety, so long as it aligns with the board and the district’s overall values and vision for where the district is headed,” Rogers said. “But we have to do everything in our power and explore our options to create safe spaces.”