In a new study, researchers developed a series of artificial intelligence tools that can scan through essays in college applications, picking out evidence of key personal traits such as leadership and perseverance.
The team also designed the tools not to show a preference for applicants from particular racial or gender backgrounds, avoiding the "algorithmic bias" that plagues many AI systems, such as some facial recognition software.
The study was led by researchers from CU Boulder and the University of Pennsylvania and was published in the journal "Science Advances."
"Our paper shows that AI doesn't need to be a biased black box as it has been in a lot of other situations," said Benjamin Lira, a doctoral student in psychology at UPenn and first author of the study. "You can actually have AI that advances the aims of holistic admissions, which looks at applicants as a whole and not just their grades or test scores."
These tools should never replace experienced, feeling admissions officers and are not currently in use at any college, said study co-author Sidney D'Mello. But, under the right circumstances, AI could help admissions officers identify promising future students who may have previously gone unnoticed amid thousands of applications.
"We advocate for transparency, where the AI provides explanations and clearly communicates when it is less confident in its decisions," said D'Mello, professor in the Institute of Cognitive Science and Department of Computer Science at CU Boulder. "People can then decide for themselves how much they should trust it."
'I like cheese'
The research drills down on what may be the secret sauce for college admissions: When it comes to personal essays, what are colleges and universities looking for? Even experienced admissions officers don't always agree with each other on that point, Lira said.
"Humans have limitations," he said. "You're not going to read the first essay of the day in the same way you read the last one before lunch."
In their latest research, he and his colleagues set out to see if they could use AI to make that process more reliable. To do that, the team tapped into a mountain of data: more than 300,000 (completely anonymous) applications that prospective students had submitted to colleges in the U.S. in 2008 and 2009. Each included a 150-word essay that applicants wrote about their extracurricular activities or work experiences.
First, the team recruited a cohort of real admissions officers to read a sample of these essays. The professionals scored the essays for evidence of seven traits that colleges might want to see in incoming freshmen. They included intrinsic motivation ("Running track is so much more than a sport to me") and what the researchers call "prosocial purpose," or the willingness to help others ("Helping children realize their hidden talents is one of the most rewarding experiences I have ever had"). The team also trained undergraduate students to identify evidence of those traits in the essays based on existing theories and research on personal qualities.
The researchers fed those insights into a series of AI platforms called large language models to train them to identify evidence of personal qualities, going beyond simple word spotting.
Afterward, when the AI platforms read new essays, their results largely lined up with the judgments of the human readers. The AI also seemed to assign beneficial personal qualities evenly across applicants from all demographic backgrounds, although, echoing previous findings, female writers were slightly more likely to demonstrate prosocial purpose than males.
"If I say 'I donated clothing to a homeless shelter,' the AI will tell me that it has a 99.8% probability of showing prosocial purpose," Lira said. "But if I say something like 'I like cheese,' that drops to less than 1%."
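The kind of scoring Lira describes can be illustrated with a toy sketch. The code below is purely hypothetical and is not the study's model (which used large language models trained on human-rated essays): it simply counts a few hand-picked prosocial cue words and squashes the count into a probability with a logistic function, so a sentence about donating scores high and an unrelated one scores low.

```python
import math

# Hand-picked cue words for this toy example; the study's system learned
# such signals from human-rated essays rather than from a fixed list.
PROSOCIAL_CUES = {"help", "helping", "donated", "volunteer", "community", "others"}

def prosocial_probability(text: str) -> float:
    """Return a rough probability that the text shows prosocial purpose."""
    words = text.lower().replace(".", " ").replace(",", " ").split()
    hits = sum(1 for word in words if word in PROSOCIAL_CUES)
    # Logistic squashing: zero cue words yields a low probability,
    # and each additional cue word pushes the probability toward 1.
    return 1 / (1 + math.exp(-(4.0 * hits - 2.0)))

print(prosocial_probability("I donated clothing to a homeless shelter"))  # high
print(prosocial_probability("I like cheese"))  # low
```

A real system would replace the cue-word count with features learned by a language model, but the final step, turning a score into a calibrated probability, works the same way.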
Surprising discovery
What really surprised the researchers, however, was just how much information the language in even these short essays seemed to carry.
Students whose essays showed evidence of leadership, for example, were more likely to graduate from college within six years than those whose essays didn't, even after the researchers accounted for the applicants' test scores, demographics and a host of other factors. The relationship was small but could still provide colleges with clues to help their students, said study co-author Stephen Hutt, who earned his doctorate in computer science from CU Boulder in 2020.
"We could actually use these college applications to inform the retention models that universities employ to identify at-risk students much sooner, getting them support in their freshman year, rather than waiting until after," said Hutt, now an assistant professor at the University of Denver.
For D'Mello, the study shows how much information is hiding in human language, if you only know where to look.
"I was amazed by how a 150-word open-ended response contained sufficient information on whether a student would graduate college six years later," he said. "Language is really an amazing thing."