2 sophomores aim to eliminate AI discrimination at SU with new SGA bill
Syracuse University's Student Government Association will vote on a clause of a bill aimed at eliminating bias in university AI systems. The two sophomores who proposed the bill hope to see other universities do the same. Lola Jeanne Carpio | Contributing Photographer
After hearing that artificial intelligence-powered testing and resume-screening systems across the country flagged students of color disproportionately more often than white students, Syracuse University sophomore Indrė Espinoza was motivated to take action.
Along with sophomore Chloe Brown Monchamp, Espinoza presented a bill to the Student Government Association last Monday as a first step toward eliminating bias in AI systems at SU.
The bill would require SU to disclose the AI systems it uses, audit them for bias, create a student data rights policy and establish a Student Technology Advisory ad hoc committee.
Espinoza and Monchamp reject the belief that AI is unbiased, they said. AI models learn from text-based sources, which do not guarantee an accurate representation of history, as those sources often favor specific perspectives and discriminate against minority groups, they said.
“The education you are getting is fundamentally flawed,” Espinoza, an SGA representative for the College of Visual and Performing Arts, said. “It is not just fundamentally flawed, it’s hurting students.”
Espinoza compared the rapid growth of AI to the Industrial Revolution, saying the technology has become integrated into every facet of society. She said AI has become "deified" in today's culture, seen as an all-knowing, unbiased technology.
She said it’s “necessary” to combat this misconception through education and advocacy.
Espinoza and Monchamp have four goals: to improve SU's AI transparency, require independent auditing through third-party testing, build a committee to inform faculty and students, and reform systems to stop bias.
“When it’s embedded in models, you are amplifying a lot of these systematic inequalities that have existed for centuries,” Espinoza said.
The two also hope to inform non-marginalized groups about bias in AI systems, using their privilege to speak for disadvantaged communities that face oppression through such systems.
Societal racism and discrimination influence AI systems to reproduce real-world biases, according to UC Berkeley Law. In a 2022 article, Dayo Ajanaku, a J.D. candidate, said it's "irresponsible" to ignore current prejudices that exist in different industries.
While there have been lawsuits surrounding AI bias, Espinoza said there are few policies in place to reform the technology. However, she said she is “confident” that through research, progress can be achieved.
“We could actually make some sort of change,” Espinoza said. “Our generation is going to be the pioneers for something like this.”
Monchamp stressed the importance of ensuring AI systems are ethical and unbiased. Monchamp, who plans to study international humanitarian law, said her interest in human rights inspired her to accept Espinoza's invitation to work on the bill.
“This is something that is impacting human lives,” Monchamp said.
The two also created a petition and are collecting student signatures to raise awareness for the bill. Espinoza and Monchamp hope their bill will inspire similar policies at other universities.
Espinoza ultimately plans to expand the bill into legislation in her home state of Illinois, which has pioneered privacy and data protection laws in the past.
A specific clause of the bill will be voted on at Monday’s SGA meeting.
“It’s all about setting a precedent,” Espinoza said. “My hope is that it can create some sort of change and it will make society be more honest.”