The emergence of artificial intelligence has brought many improvements to our lives, but it also raises serious concerns, especially in education. While AI tools can enhance learning and streamline teaching, they also pose significant risks to academic integrity and student development. Because of these risks, administrators should pay closer attention to the use of AI in school settings, where younger students are still forming critical thinking skills.

AI use in schools is surging. According to a College Board report, 84% of high school students have used AI tools, and 69% report using ChatGPT to help with assignments (College Board, 2025). Similarly, Johns Hopkins researchers found that roughly 60% of K–12 teachers had used some form of AI in the 2024–2025 school year (Effective and Ethical AI Implementation, 2025). This rapid adoption reveals a troubling trend: it is outpacing the development of norms, policies, and safeguards in schools.

The core issue isn’t the existence of AI (new technology often benefits society) but rather how it’s being used. The challenge lies in integrating AI into classrooms in ways that do not compromise skill development. This is especially important in schools that already rely heavily on digital access: as of 2022, 96% of U.S. public schools reported providing students with individual tablets for educational use (National Center for Education Statistics, 2022). With nearly every student now equipped with a device, the pathway to AI is wide open, but guidance is often lacking.

The central tension is that AI’s rapid adoption is outpacing schools’ ability to set policies. As a result, students can use AI freely, often unsupervised, for assignments and learning in ways that put skill development at risk. Educators are left asking: How do we embrace AI while preventing misuse and its negative consequences?

The sudden availability of powerful AI assistants, such as generative chatbots like ChatGPT that can write essays or solve problems, has opened the door to academic shortcuts. Students can have an AI generate answers with minimal effort, delegating entire assignments with little to no engagement, which creates new risks to student growth and critical thinking. Research is beginning to support this concern. In a Wharton experiment with nearly 1,000 high schoolers, students who used GPT-4 during practice scored 17% lower on a follow-up test than students who never had AI help (Basiouny, 2024). The AI helped them solve more practice problems, but they retained far less and struggled on assessments. Using ChatGPT as a crutch “substantially inhibit[s] learning,” the study concluded (Barshay, 2024).

A separate MIT Media Lab study observed that participants who wrote essays with ChatGPT showed the lowest brain activity and became increasingly reliant on copy-paste, underperforming at “neural, linguistic, and behavioral levels” compared to those writing without AI (Chow, 2025). In follow-up tasks, the AI-reliant participants remembered little of what they had written and bypassed deep learning processes. This research suggests that students who rely heavily on AI perform worse on assessments requiring reasoning or original thought and may develop a false sense of confidence in their understanding.

Some may argue that cheaters have always existed and that AI merely makes cheating easier. That’s partly true, but it misses the core issue. Before AI, students who needed help would use Google, ask peers, or turn to resources like Quizlet. While not always ideal, these methods still required some engagement: reading, comprehension, and application. Even copying from peers or old materials demanded effort. Now, with generative AI, a student can paste a prompt into ChatGPT and receive a complete, polished response instantly, without exercising any cognitive effort.

The real concern isn’t extreme cheaters, but the average student who may now be tempted to take shortcuts. Research shows that students who use AI as a primary academic tool often struggle to retain core concepts and lack confidence when tackling similar problems independently (Barshay, 2024). In other words, over-reliance on AI can erode the very skills education is meant to build: critical thinking, problem-solving, and original expression.

There’s also a broader philosophical concern. In The Matrix (1999), the antagonist explains that the simulated world was set in 1999 because it was “the peak of your civilization,” adding, “As soon as we [machines] started thinking for you, it really became our civilization.” The line is a warning: if humans outsource too much cognitive effort to machines, they risk losing agency and intellectual development. We must be careful not to let students become mere marionettes for AI puppeteers. If we allow students to over-rely on AI before they’ve learned to reason, we may raise a generation of passive thinkers who grow ever more dependent on it. Human creativity and critical thinking are learned through struggle, failure, and problem-solving. These can’t be outsourced.

Schools should focus on regulating AI usage. Students are still developing their ethical frameworks, critical thinking, and personal identity. At such a formative stage, they need time and space to explore ideas, wrestle with difficulty, and think independently. Introducing AI tools too early, without the right boundaries, risks short-circuiting that process and stunting cognitive growth.

To respond responsibly to AI’s rise in education, schools should prioritize several key strategies. First, they must actively teach AI literacy and ethics. Students need structured opportunities to discuss what AI is, when to use it, and how to evaluate its outputs critically. Schools can embed these lessons into digital citizenship curricula to create a culture of thoughtful AI use from an early age. Some districts have begun doing exactly this. Montgomery County Public Schools recently outlined an AI literacy plan to help students use “technology to learn new information … responsibly, ethically and cautiously” (Zalaznick, 2023). By incorporating AI topics into the curriculum, they aim to demystify the technology and instill a sense of responsibility in students. Such efforts show that, with guidance, students can learn about AI rather than just learning through AI.

Second, professional development for teachers is critical. As of 2023, only 39% of educators reported receiving formal training on how to manage AI use in classrooms (Policar, 2024). Without proper knowledge and resources, many teachers are unsure how to respond to student AI use: whether to allow it, detect it, or integrate it. Investing in comprehensive training would empower teachers to guide students effectively. Encouragingly, the share of school districts offering AI training for teachers more than doubled, from 23% to 48%, between 2023 and 2024 (Merod, 2025). Still, a national effort is needed to ensure all educators get the support they need. By equipping teachers with AI knowledge, schools can turn wary instructors into confident mentors who help students use AI appropriately. A well-trained teacher will know how to show students the limits of ChatGPT’s answers, or how to create assignments that require human insight.

Third, schools need to rethink assessment design. Many traditional assignments can now be easily completed by generative AI, which means evaluations should shift toward formats that emphasize human thinking. One high school English teacher in California was encouraged to favor “authentic assessments” such as in-class writing, oral presentations, and project-based tasks, formats that AI can’t easily replicate. “I used to give a writing prompt and say, ‘In two weeks, I want a five-paragraph essay.’ These days, I can’t do that. That’s almost begging teenagers to cheat” (The Daily Record, 2025). Now her students do most writing in class, where she can monitor their process. After all, an AI can churn out a five-paragraph essay on The Great Gatsby, but it can’t as easily engage in a nuanced Socratic seminar.

Fourth, clear and consistent policies must be established around AI use. As of late 2024, only 31% of public schools had any policy on student AI use, and even where policies exist, 60% of educators say the guidelines are unclear (Hastings Center for AI and Education, 2024). Districts should involve teachers, students, and parents in developing shared guidelines that define acceptable and unacceptable AI usage and address questions such as: What constitutes AI-assisted plagiarism, and how will it be handled?

AI is not inherently harmful. When used thoughtfully, it can enrich education and open new doors. But left unchecked in the hands of students who are still learning how to learn, it risks doing more harm than good. Schools must lead with intention, establishing boundaries and systems that preserve human curiosity, creativity, and intellectual growth. The future of education should not belong to machines; it should belong to students who know how to think.

Citations

“AI, Student Cheating, and How Schools Are Adapting.” The Daily Record, 18 Sept. 2025, https://thedailyrecord.com/2025/09/18/ai-student-cheating-schools-adapt 

Basiouny, Angie. “Without Guardrails, Generative AI Can Harm Education.” Knowledge at Wharton, 27 Aug. 2024, knowledge.wharton.upenn.edu/article/without-guardrails-generative-ai-can-harm-education

Banerji, Olina. “More Teachers Than Ever Before Are Trained on AI. Are They Ready to Use It?” Education Week, 8 Apr. 2025, www.edweek.org/technology/more-teachers-than-ever-before-are-trained-on-ai-are-they-ready-to-use-it/2025/04

Barshay, Jill. “Kids Who Use ChatGPT for Schoolwork Score Worse on Tests, Study Finds.” The Hechinger Report, 4 Oct. 2023, https://hechingerreport.org/kids-chatgpt-worse-on-tests 

Balingit, Moriah. “Survey: 60% of Teachers Used AI This Year — and Saved up to 6 Hours of Work a Week.” The 74, 22 Apr. 2024, https://www.the74million.org/article/survey-60-of-teachers-used-ai-this-year-and-saved-up-to-6-hours-of-work-a-week/ 

Citizen Portal. “Montgomery County Schools Outline AI Literacy Plan, Preview Tech Use Policy Revisions.” Citizen Portal, 5 Oct. 2025, https://citizenportal.ai/articles/6038243/Montgomery-County-schools-outline-AI-literacy-plan-preview-tech-use-policy-revisions 

College Board. U.S. High School Students’ Use of Generative Artificial Intelligence. College Board Research Brief, 2024.

Common Sense Media. “New Report Shows Students Are Embracing Artificial Intelligence Despite Lack of Parent Awareness and School Guidance.” Common Sense Media (Press Release), 18 Sept. 2024, www.commonsensemedia.org/press-releases/new-report-shows-students-are-embracing-artificial-intelligence-despite-lack-of-parent-awareness-and

Chow, Andrew R. “ChatGPT’s Impact on Our Brains, According to an MIT Study.” Time, 23 June 2025, time.com/7295195/ai-chatgpt-google-learning-school/

Ducharme, Jamie. “What We Lose When Kids Learn Through AI.” TIME, 17 Oct. 2023, https://time.com/7295195/ai-chatgpt-google-learning-school/ 

“Effective and Ethical AI Implementation: What Educators Need to Know.” JHU School of Education, education.jhu.edu/news/effective-and-ethical-ai-implementation-what-educators-need-to-know/. Accessed 19 Dec. 2025.

Gewertz, Catherine. “Fewer Districts Are Providing Home Internet Access. But Students Still Need It.” Education Week, 6 Sept. 2022, https://www.edweek.org/technology/fewer-districts-are-providing-home-internet-access-but-students-still-need-it/2022/09 

Hastings Center for AI and Education. AI in High School Education: Institutional Responses and Teacher Perspectives. Bowdoin College, 2024.

Merod, Anna. “Teacher AI Training Remains Uneven Despite Uptick.” K–12 Dive, 10 Apr. 2025.

National Center for Education Statistics. School Pulse Panel: Use of Technology. U.S. Department of Education, 2023.

Policar. “Teachers Are Changing Their Minds About AI. Find Out How.” Study.com, teachinglicense.study.com/featured-insights/teachers-changing-their-minds-about-ai.html. Accessed 18 Dec. 2025.

Zalaznick, Matt. “A War on ChatGPT Is Raging. But Some K12 Leaders Are Making Peace with AI.” District Administration, 31 Jan. 2023, districtadministration.com/briefing/chatgpt-bans-k12-school-districts-ai-chatbot-artificial-intelligence/
