
Despite the ethical drawbacks of cheating, many of us have distinct memories of forearms discreetly covered in formulas, notecards tucked into UGG boots and mid-class restroom trips that didn’t involve using a toilet. These practices were simple and often easy to catch, but the rise of large language models like ChatGPT has led to new challenges where the ethics are not always clear. Students are now able to produce answers without even making the effort to write things on their arm or pay someone else to do the work for them, fueling new conversations about why and how they cheat. Though the method of cheating has evolved, the fundamental question remains: Why would we cheat in our classes when it so clearly leads to us learning less? 

UC Santa Barbara boasts a growing student population but lacks the state resources to keep up: students stand on chairs outside of classrooms that are already full and wait months for counseling appointments. Students are juggling these constraints while responding to tuition increases, political and economic uncertainty and an overwhelming, constant buzz from social media. These factors compound the existing stress college students face, making artificial intelligence (AI) incredibly tempting to use. 

My observations paint a picture of a burnt-out and disillusioned student population willing to use whatever tool they can to make it through their classes. While these admissions seem to affirm some of the most prevalent stereotypes about Generation Z lacking work ethic or the ability to navigate hardship, they also reflect the broader issue of digitally native generations struggling under the pressures of an increasingly competitive and resource-strained environment. 

Students who are using ChatGPT to cheat claim that they don’t do it when they feel “engaged and interested” in their coursework. Some report being tempted by ChatGPT because they don’t feel like they can succeed on their own and need it as a “tool of efficiency for when [they] feel overwhelmed.” Most often, students simply claim to not have the time or energy to complete every single assignment with the same effort, making ChatGPT the perfect tool to ease the burden. 

AI may be the most accessible tool for cheating so far, but it is far from perfect. ChatGPT often produces inaccurate or simplistic answers and rarely produces writing on par with A-level college work. It also tends to replicate bias, producing responses that align with the majority opinions in its dataset rather than always producing the most accurate or impartial response. 

Students who tend to produce high-level work on their own and feel confident in their abilities know that ChatGPT’s work doesn’t measure up to their own skill level, so they don’t use it. However, as one professor pointed out, “those aren’t the [students] you worry about.” The students who will fall through the cracks are the ones who feel ChatGPT is the only way for them to succeed. An article by GovTech points out that as with other instances of inequality, the impacts will be disproportionately felt by students from underserved or marginalized backgrounds.

The harms of heavy reliance on ChatGPT to get through coursework are incalculable. When used improperly, it can think for students rather than with them. It was meant to be a tool to assist humans, not replace their cognitive processes entirely, but without the proper guidance, burnt-out students will let it do the latter. 

The restrictive stance that dominates current classroom AI policy leaves little room for students to engage with ChatGPT as a tool to improve their learning, meaning that some will simply understand it as a replacement for their own work. The risks of producing a generation of workers who lack foundational critical thinking skills are dramatic, and the consequences will be felt for years to come. 

My conversations with students and faculty suggest that cheating with ChatGPT is not as pervasive as faculty seem to think. Still, it is a very real issue that speaks not only to the implications of technological advancements in the classroom but to a fundamental learning disconnect and burnout phenomenon for students. 

Students desire the opportunity to learn and grow as scholars but can only do so when they are in an environment that recognizes the complex and overwhelming realities they’re facing. Rather than trying to stamp out the use of ChatGPT, which some faculty seem bent on doing at this point, it is better to find ways in which it can be used for a more meaningful learning experience. 

The rise of ChatGPT and other large language models offers a crucial turning point to start having deeper conversations stretching across faculty, staff, graduate students and undergraduates to get a better understanding of how to best serve each group. One student who was interviewed wished professors were using ChatGPT to get her to engage with course topics more creatively or to design assignments that were more applicable to diverse learning styles and levels of ability. Echoing this point, another stated that they “feel like we should all collectively kind of come together and agree that it’s a tool, and you use tools just to make your life a little bit easier.” 

The possibilities for generative AI are incredible, but they won’t be realized if it continues to be considered primarily a vehicle for academic dishonesty rather than a potential remedy for it. The age-old adage, “You’re only cheating yourself,” comes to mind, but with the updated consideration that maybe you were already being cheated in the first place.

A version of this article appeared on p. 12 of the April 11, 2024 print edition of the Daily Nexus. 
