Ruha Benjamin Reimagines Technology, Shares Ideas for … – Bowdoin College

November 29, 2023

“Technology is not creating the problems. It is reflecting and amplifying—and often hiding—preexisting forms of inequality and hierarchy,” she said during her lecture.

“As we talk about reimagining technology and society, we always want to open up the frame and not just look at the impact of technology—that is only half the story—but the inputs, the prior configurations of history and society, which is part of the training of these newfangled tools,” she added.

Benjamin’s visit to campus was sponsored by the Harold and Iris Chandler Lectureship Fund, which brings scholars to Bowdoin who are looking at the impacts of technology on the humanities, social sciences, and our society. The event was also part of the College’s celebration of the tenth anniversary of its digital and computational studies program (DCS).
Mohammad Irfan, Bowdoin’s John F. and Dorothy H. Magee Associate Professor of Digital and Computational Studies and Computer Science, introduced Benjamin, describing her as an extraordinary teacher and scholar who writes books that are “at the same time powerful, hopeful, and even practical.”

Benjamin is the Alexander Stewart 1886 Professor of African American Studies at Princeton University. She is also the founding director of the Ida B. Wells Just Data Lab and author of the award-winning book Race After Technology: Abolitionist Tools for the New Jim Code, among many other publications.

In her Bowdoin lecture, “Race to the Future? Reimagining the Default Settings of Technology & Society,” Benjamin emphasized the importance of education and academic programs like DCS for “radically expanding who and what is shaping our collective future. The issues are too important to leave it to those who only have the technological know-how, when so many other forms of knowledge are necessary as part of the process.”

Benjamin also argued that we need to widen our frame on technology to include the people laboring to support our consumer desires, such as Amazon warehouse workers, or even those doing the critical but invisible task of labeling data.
Part of the problem with our current conceptions of technology, she said, is that we remove ourselves as active agents, repeating two basic stories that cast technology as the driver of either dystopia or utopia. “While these sound like opposing narratives, and they have different endings for sure, they actually share an underlying logic,” she said. “We might call it techno-deterministic logic, the idea that technology is determining us. But the humans behind the screen are missing from both scripts.”
She put some of the onus of constructing a just technological future on scientists and people of learning. “I want to emphasize this because a lot of times when we think about racism and other forms of domination, we associate it with people who are ignorant, people in the backwoods, the hillbillies in the trailer park. When it is actually those who’ve been in the citadels of learning who have created these stories, ideologies, and falsehoods and diffused them to the rest of us,” she said.

“The reason why this is so important,” she added, “is because if science and scientists have been so essential in erecting this architecture, then that means science and scientists, researchers, and people of learning are responsible for dismantling it.”
She spoke about how we could avoid the pitfalls of “discriminatory design”—when the knowledge and design choices that teach and construct AI are themselves harmful and biased—by embracing, for instance, “liberatory technologies.”
As an example, she described Breonna’s Garden, “a digital public sphere” and immersive virtual reality experience created in memory of Breonna Taylor, a twenty-six-year-old Black woman killed by police in her apartment. The garden, which can be downloaded from the App Store, allows people to share their grief and hopes. The project was created by developers who worked with Taylor’s family to transform their pain into purpose, Benjamin said.
“Developers listened to the family and checked in with them during every stage of building the platform,” she continued. 
In Breonna’s Garden, “they created a really beautiful, immersive space that we can put into the category of a liberatory digital space.” When building these products, she said, designers should ask: “Who are you consulting? Who are you involving? Are the people most affected by the problem your technology is supposed to solve part of the process?”
Other positive developments she urged audience members to explore include President Biden’s AI Executive Order and the European Union’s AI Act. (She cautioned students, however, to watch for any “loopholes” that might exclude or even harm some communities.)
As for “bottom-up organizing, community-building efforts,” she mentioned The Data Nutrition Project, which takes the model of food labels that tell us what is in our food and applies it to data sets. In this way, developers can see what’s in data sets before using them to train a computer. “So you see the biases and omissions that might be in it,” Benjamin said.
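
To make the idea concrete, here is a minimal, hypothetical sketch of what such a dataset “nutrition label” might look like in code. The field names and example values are invented for illustration and are not The Data Nutrition Project’s actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical dataset "nutrition label" in the spirit of The Data Nutrition
# Project. All fields and values below are illustrative assumptions, not the
# project's real format.
@dataclass
class DatasetLabel:
    name: str
    source: str                  # where the data came from
    collection_period: str       # when it was gathered
    intended_uses: list[str] = field(default_factory=list)
    known_omissions: list[str] = field(default_factory=list)  # who or what is missing
    known_biases: list[str] = field(default_factory=list)     # sampling or measurement skews

    def summary(self) -> str:
        """Render the label so a developer can review it before training a model."""
        return "\n".join([
            f"Dataset: {self.name} (source: {self.source}, collected: {self.collection_period})",
            "Intended uses: " + ", ".join(self.intended_uses or ["unspecified"]),
            "Known omissions: " + ", ".join(self.known_omissions or ["none documented"]),
            "Known biases: " + ", ".join(self.known_biases or ["none documented"]),
        ])

# Example usage with made-up values.
label = DatasetLabel(
    name="city-service-requests",
    source="municipal 311 logs",
    collection_period="2018-2022",
    intended_uses=["forecasting request volume"],
    known_omissions=["neighborhoods with low smartphone access"],
    known_biases=["reports skew toward residents who already use city services"],
)
print(label.summary())
```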
Another hopeful example is a movement called “consentful technology,” which applies the FRIES framework of sexual consent to technology and the way data is developed, stored, accessed, and interacted with. FRIES stands for freely given, reversible, informed, enthusiastic, and specific.

Benjamin purposefully ended her talk on an optimistic note: “I promised to be a little hopeful!” she said. “If inequity is woven into the very structure of our society—in policing, education, health care, and work—we can feel overwhelmed. But in my view, all of those become fronts for change, for places where we can reimagine the status quo.”
