Opinion: AI has no place in academia or journalism
With Syracuse University’s S.I. Newhouse School of Public Communications ranking among the top 10% of journalism schools in the country, it is crucial for students to recognize the potential problems that the use of artificial intelligence presents to future members of the workforce.
Over the past few years, journalists and students alike have been affected by the increasing abilities and presence of AI. We need to turn away from the use of auto-generated answers and instead work to reclaim our skills to write and read authentically.
Reliance on AI not only erodes the human element of journalism; it also limits progress within the field and in greater society. When we rely on AI to communicate our decisions, thoughts or ideas, we aren’t exercising the critical thinking necessary for basic communication. Without it, society loses the ability to create change and move forward by cultivating powerful, motivating ideas.
Automated responses that generate unoriginal thought and report statistical inaccuracies create a void in reporting and amplify the potential for misinformation. We shouldn’t be using unreliable tools in our journalistic processes. Newhouse’s newly launched fellowship program, which encourages the use of generative AI, raises concerns about the ethical implications of using AI in journalism and journalism education.
Arguably, many people lack the ability to communicate the stories of different cultures and marginalized experiences, or to address social issues, in an original way, because we constantly refer to some form of social media to tell us what to think. Social media and AI work hand in hand in this way, as both train us to rely on other platforms for information rather than to think for ourselves.
When we allow AI to invade our thought process, it becomes more of a challenge to create our own ideas. While 89% of students reported using ChatGPT on a homework assignment, only 5% of journalists reported using AI in the workforce. If professionals in the very fields we aspire to join aren’t using AI, then it should neither be allowed nor encouraged in schools.
A major concern with AI usage in schools is plagiarism. According to Forbes, “48% of surveyed students admitted to using ChatGPT for an at-home test or quiz, 53% had it write an essay and 22% had it write an outline for a paper.” Additionally, over a third of educators believe ChatGPT should be banned in all educational institutions.
For many students, AI has shifted from a valuable resource to a path of least resistance, raising concerns about academic integrity and a lack of motivation.
Some students remain hesitant to use AI. Others may argue that incorporating AI into classrooms is the best approach to advancing our society and a natural part of evolution. Regardless, current SU students must consider how continued usage may impact their future careers.
We have yet to see the long-term effects of AI on academic institutions, but it is already beginning to reshape teaching methods.
While AI does hold potential to serve as an educational tool, its integration into classrooms must be approached with caution. To foster genuine human thought and drive meaningful change, it is crucial to establish boundaries that prevent overreliance on AI tools.
By limiting AI in the workforce and in our academic lives, we can foster stronger writing and critical thinking skills and engage more meaningfully with the world around us.
Autumn Clarke is a freshman majoring in Broadcast and Digital Journalism. She can be reached at auclarke@syr.edu.
Published on September 17, 2024 at 10:58 pm