GPT Education Roundup: Accessible AI Presents New Potential Challenges for IHEs


Publicly available artificial intelligence (AI) programs and applications (apps), and the tasks they can perform, have been gaining much more attention lately. The average internet user can now access websites and apps that use AI to carry on realistic conversations with "chatbots" or to generate images, from digital art to photo-realistic pictures, based on a few input keywords or phrases. Meanwhile, open-source AI and machine learning tools give beginners and experts alike new opportunities to build their own AI-driven programs for practically any purpose. This is a transformative and rapidly evolving area of technological advancement. Of course, these advancements, and their increased accessibility to private individuals, bring new challenges across myriad industries. For institutions of higher education (IHEs), AI software presents a particular challenge for admissions and academic departments, where the authenticity of an applicant's or student's writing may be called into question.

Recently, Forbes conducted an experiment using a new version of a publicly available natural language model (OpenAI's chatbot ChatGPT) to draft customized essays responding to college entrance essay prompts for a hypothetical applicant.[1] The program crafted these custom essays in mere minutes after a user simply input some background information about the applicant. Although a bit formulaic, the essays produced in the experiment weren't half bad. For admissions officers, the availability of sophisticated AI may mean more frequent reports that applicants submitted essays they did not write themselves. Academic departments, similarly, may see increased reports of current students turning in assignments prepared by AI.

Although using AI to draft essays or other submissions may not yet be rampant in admissions or in college classes, the writing is on the wall that it will likely become increasingly common. IHEs should take the opportunity now to grapple with how potential AI use fits within existing policies on academic integrity and plagiarism. How will an admissions department handle a report from an applicant's high school, peer, or counselor that the applicant misused AI to forge an essay response on their application? Will there be an investigative process? What will it entail? Admissions offices should consider crafting a procedure to follow when they receive such reports so that the IHE handles these matters consistently. Similarly, IHEs should review their codes of conduct and other academic integrity policies to determine how the institution would handle reports of potential AI misuse by current students. IHEs may want to include policy language, for example, that specifically prohibits the use of AI in drafting submissions for class assignments and describes the disciplinary ramifications for students who violate such policies.

These are important questions for IHEs to consider as they update their policies and procedures to provide clear and adequate means of addressing such reports. There is no doubt that technology will continue to transform academia as advancements continue, and IHEs will be better equipped to handle new challenges if they begin thinking now about their response to AI use and misuse. If your IHE has questions regarding your policies and procedures, or about handling potential academic integrity or other misconduct investigations, please contact the authors of this article or any lawyer in Venable's Labor and Employment Group.

[1] Emma Whitford, "A Computer Can Now Write Your College Essay—Maybe Better Than You Can," Forbes (Dec. 9, 2022).