California is building the future, for good and bad. What’s next?

While the task force has not provided an exact figure for compensating the descendants of enslaved people for overpolicing, mass incarceration and housing discrimination, the economists advising it estimate that the losses incurred by the state’s Black residents could run into the hundreds of billions of dollars. Whether compensation will actually be approved remains to be seen.

The conversation about reparations shows that California has a unique ability to reckon with its checkered history. But that thinking doesn’t always extend to the future. Artificial intelligence systems are used to moderate social media content, review college applications, sift through résumés, generate fake photos and artwork, track movement data at the border, and identify suspects in criminal investigations. Language models like ChatGPT, created by the San Francisco-based company OpenAI, have also attracted enormous attention for their potential to disrupt fields like design, law, and education.

But if AI’s success can be measured in billion-dollar valuations and lucrative IPOs, its failures are borne by ordinary people. AI systems are not neutral; they are trained on large datasets that contain, for example, sexually exploitative material or discriminatory police data. As a result, they reproduce and magnify our society’s worst prejudices. Facial recognition software used in police investigations, for example, routinely misidentifies Black and brown people. AI-powered mortgage lenders are more likely to deny home loans to people of color, perpetuating housing inequality.

This is a moment when we can apply historical thinking to technology, so that the injustices that followed past paradigm shifts are not repeated. In April, two lawmakers introduced a bill in the State Assembly that seeks to ban algorithmic bias. The Writers Guild of America, currently on strike, has included restrictions on the use of AI among its demands. Resistance to excess also comes from within the tech industry. Three years ago, Timnit Gebru, co-lead of the Ethical AI team at Google, was fired after raising the alarm about the dangers of language models such as GPT-3. Now even tech executives have become wary: in his Senate testimony, Sam Altman, the CEO of OpenAI, conceded that AI systems need to be regulated.

Ultimately, the question we face with both reparations and AI isn’t all that different from the one that arose when a Franciscan friar set out on the Camino Real in 1769. It’s not so much “What will the future look like?” — though that is an exciting question — but “Who is entitled to the future? Who stands to be served by social repair or new technology, and who stands to be harmed?” The answer could very well be decided in California.


Laila Lalami is the author of four novels, including ‘The Other Americans’. Her most recent book is a non-fiction work, ‘Conditional Citizens’. She lives in Los Angeles. Benjamin Mara is an illustrator, a cartoonist and an art director. His artwork for Numero Group’s “Wayfaring Strangers: Acid Nightmares” was nominated for a Grammy.
