The Emory Wheel


Humanities majors still matter in age of AI

With rapid advancements in artificial intelligence (AI), universities will soon be tempted to eliminate humanities programs altogether. Influenced largely by public discourse about which majors pay, students are already choosing STEM majors at higher rates because of their perceived value in the job market. With enrollment in information technology and computer science climbing each year, careerist dogma may be persuading students to forgo their personal interests in favor of purportedly “safer” majors. Axing humanities programs and discouraging students from following these paths is harmful to our intellectual culture and the richness of humanity. In doing so, universities would outsource humanity’s thinking and decision-making entirely and artificially homogenize the kinds of students coming out of higher education.

Many universities across the country have undervalued humanities majors in favor of quantitative disciplines and have readily cut liberal arts programs. On the STEM side, Emory University added an AI minor last year. Conversely, in 2012, Emory axed its journalism program alongside departments in visual arts, educational studies and Russian language studies, much to the chagrin and surprise of students, alumni and educators alike. At the time, then-College Dean Robin Forman defended the cuts as an effort to “reexamine our scope and to set clear priorities” and “to train the leaders of the century to come.”

Broadly, universities justify these cuts by citing declining enrollment, budget shortfalls and, most importantly, competition with STEM degrees. Yet humanities majors provide a variety of perspectives, and those who are well-versed in disciplines like creative writing and political science can attest to the critical thinking skills they have gained from liberal arts pedagogical traditions. Despite AI’s prevalence in popular discourse, there is still a purpose for us writers, musicians and philosophers: to draw on our linguistic, cultural, political, philosophical and anthropological traditions to make sense of the chaotic and ever-changing 21st century.

With the explosion of AI platforms and the uncertainty of the future job market, the choice of a major is increasingly fraught for many incoming freshmen. I even remember flirting with pursuing a Bachelor of Business Administration (BBA) degree during my freshman spring semester. This impulse was not because of any inherent interest in the program but because all of my friends were pre-BBA. I ultimately chose to major in philosophy, politics & law, but only after intense deliberation, weighing the program’s pros and cons against my own skills and future career interests.

Large language models (LLMs), which are AI programs that use deep learning to interpret and generate human language, have taken center stage in the public discussion of AI. Despite these models’ impressive achievements and rapid progress, LLMs like ChatGPT are essentially language-repurposing machines: They do not produce novel insight but rather recombine and synthesize existing information. Because that process flattens unique perspectives into an average, I predict that the thirst for real writers, real artists, real novelists and real musicians, as compared to the plasticky and lifeless quality of AI art, will only grow in the coming years and decades.

The current age of AI involves frequent and gross violations of artists’ rights: The work AI systems rely on is neither credited nor compensated. There are already many examples of real artists and writers seeing their work used without permission or being passed over in favor of AI. Take, for example, the copyright infringement lawsuit The New York Times filed against OpenAI and Microsoft in December 2023. With the case still ongoing, The Times alleges that OpenAI and Microsoft used millions of its articles to train their LLMs without permission. Regardless of the lawsuit’s outcome, this case is groundbreaking and will undoubtedly shape the future legal landscape of AI, as well as the level of protection artists and writers are afforded when their work is used to train chatbots without their consent. A dangerous precedent will be set if companies like OpenAI and Microsoft are let off the hook for blatant plagiarism.

Other cases regarding artists’ rights have cropped up, posing challenging ethical questions. In April 2023, TikTok user Ghostwriter977 used AI to impersonate the voices of Drake and The Weeknd on a track titled “Heart on My Sleeve.” For many internet users, this was the watershed moment that illustrated the sheer power of these tools. Many were shocked to learn that neither Drake nor The Weeknd was consulted in the making of the song, despite both artists’ names, images and likenesses appearing in the final product. The ambiguity surrounding this song and many others like it, namely whether you can use someone’s image, sound or other intrinsic characteristic without their permission, opens a whole host of new ethical and legal concerns for litigators and artists alike.

I am not here to argue that majoring in science, technology, engineering or math is not a smart choice. The truth is far more complicated than that, and for many people, STEM majors are the right way to go. Emory, for example, joined the chorus of excitement around AI by launching its Center for AI Learning and AI minor last year. Additionally, Emory’s AI.Humanity initiative launched its flagship program, the Empathetic AI for Health Institute, in fall 2023 to harness technology toward better public health and diagnostic outcomes. These examples should be lauded as the correct way to utilize AI technologies. However, choosing a STEM major while dismissing the value of a humanities major is an incomplete path forward for the future of our planet and our species.

We artists and students cannot fully outsource our autonomy and intellect to chatbots and AI because, at some point, we would lose a part of our humanity: the part of us that reconciles, reasons and addresses real-world issues through a uniquely human lens. AI platforms are too complex to be reduced to a simple good-evil dichotomy. It is up to our institutions, our government and higher education administrators to decide how best to use these new tools to advance human harmony and democratize access to education.

There is a balance here, but striking it will take compromise between ethical regulation and inevitable technological progress. Ideally, the next generation of students will be well-versed in both quantitative and qualitative skills, able to advance society through writing, mathematics, coding and reading. While much about our future is unclear, one thing is certain: We still need thoughtful humans to comb through the mess we have made for ourselves.

Contact Ari Segal (25C) at asegal7@emory.edu.