AI: The Ethical Challenges of Intelligent Machines
By Eli Turner
The title you just read was generated by “My AI,” a feature available to everyday Snapchat users. For those who don’t know, AI (Artificial Intelligence) is technology that performs human-like problem solving. By this definition, it does not sound harmful, but with developments like ChatGPT, some might say AI is a direct threat to human thinking.
ChatGPT is a program that, when prompted, writes nearly anything you ask it to. For example, if I tell ChatGPT, “write a sentence about the benefits of playing sports,” ChatGPT responds with:
“Playing sports can improve physical health, mental well-being, social skills, and personal development for people of all ages and backgrounds.”
At Moeller, we have seen the direct effects of this program. Students have been using ChatGPT since as early as December of 2022, only a month after its release. These students, however, are not using the program for their own amusement or technological curiosity. They have been using it to complete assignments and essays with none of their own knowledge and skills. Fortunately, teachers have caught on to these students “cheating.” However, teachers can’t catch them all.
Looking at this situation with a critical mind, it should be clear that the problem is not just that students are cheating. While they do still have a part in it all, the primary issue is that some students value the grade they receive more than the knowledge and skills they gain. If students feel that earning good grades is what makes them successful, they are going to try their hardest to achieve that, even if it means cheating on an assignment. Conversely, if students adopt the mindset that the skills they develop and the knowledge they gain are what make them successful, they will be more motivated to get everything they can out of class assignments and less concerned with the grade or points they receive in the end. Many famous scientists, such as Stephen Hawking, have warned that AI could one day take over humanity. That outcome may well become inevitable if society no longer values curiosity and knowledge.
Moeller students and teachers should shift their focus from wanting students to earn the highest score on a paper to pushing students to strengthen critical thinking skills and understand the world. Many teachers have started to use AI scanners that attempt to detect whether a student’s work was generated by AI. One teacher I spoke with commented that she has added more in-class time for her students to work on papers and assignments. She feels that if she is present and able to assist her students as they work, they may be less inclined to lean on their AI ‘buddies.’ This is a great example of not simply trying to stop students from using AI through punitive measures, but of trying to help students want to learn and think for themselves.
In conclusion, AI is not the root of the problem of students not wanting to do their work, but a symptom of a larger culture that emphasizes grades and scores, making students more likely to use the technology in a harmful way. So, it is important that Moeller strive, through our actions, to educate students in mind, body, and spirit.