Serena Oduro - AI Needs Black Feminism: Principles and Questions to Guide AI Development and Policy
Black Feminism & AI: A Guiding Light In Tech Equity Policy
In recent years, dialogue around artificial intelligence (AI) has grown, especially around its effects on fairness, justice, and bias. Today we look at a perspective that combines Black feminism with AI to inform a more equitable technology policy framework, drawing on a talk given by Serena Oduro, Technology Equity Policy Fellow at the Greenlining Institute, at the recent Women Tech Global Conference.
About Serena and the Greenlining Institute
Serena Oduro is not only a passionate technology policy fellow but also an enthusiastic writer. Her work at the Greenlining Institute centers on broadband issues and algorithmic equity. The institute envisions a nation where people of color thrive and race is never a barrier to economic opportunity. Its policy focus includes economic opportunity and its intersection with technology policy.
Racial Discrimination and AI
Serena points out how AI models, heavily reliant on past data, can inadvertently reinforce patterns of racism embedded in that historical data, exacerbating unfair outcomes for communities of color. A commonly cited instance is Amazon's experimental hiring algorithm, which turned out to be biased against women.
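To make the mechanism concrete: a model trained on historical decisions can reproduce discrimination even when no protected attribute is used as a feature, because correlated proxies (such as the ZIP codes redlining targeted) carry the bias forward. Here is a minimal sketch with synthetic data; the ZIP codes, income thresholds, and the toy "model" are illustrative assumptions, not any real system.

```python
import random

random.seed(0)

# Synthetic history: loan decisions shaped by redlining.
# ZIP "94601" stands in for a redlined area whose applicants were
# held to a much higher bar; "94610" for a "greenlined" one.
def make_history(n=5000):
    rows = []
    for _ in range(n):
        zip_code = random.choice(["94601", "94610"])
        income = random.gauss(60, 15)  # income in $1000s
        if zip_code == "94601":
            approved = income > 75     # discriminatory historical bar
        else:
            approved = income > 50
        rows.append({"zip": zip_code, "income": income, "approved": approved})
    return rows

history = make_history()

# A minimal "model": memorize the historical approval rate per
# (zip, income-bucket) and approve whenever that rate exceeds 0.5.
# Note that race is never a feature: the ZIP code carries the bias.
def train(rows):
    counts = {}
    for r in rows:
        key = (r["zip"], int(r["income"] // 10))
        approved, total = counts.get(key, (0, 0))
        counts[key] = (approved + r["approved"], total + 1)
    return {k: a / t for k, (a, t) in counts.items()}

model = train(history)

def predict(zip_code, income):
    return model.get((zip_code, int(income // 10)), 0.0) > 0.5

# Two identical applicants, different ZIP codes:
print(predict("94610", 60))  # True  -- approved
print(predict("94601", 60))  # False -- the historical bias is reproduced
```

The point of the sketch is that "removing race from the data" does nothing here: the pattern the model learns is the redlining itself.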
It's evident that AI needs regulation and a concrete framework. Existing anti-discrimination laws, which focus on human intent, struggle to address discrimination that emerges from how AI systems actually work.
Black Feminism in the realm of AI
Serena advocates integrating the principles of Black feminism into the field of AI. She suggests that Black feminism provides an essential standpoint for fighting interlocking systems of oppression: race, gender, class, and more.
Lending a unique perspective on how these oppressions intersect, Black feminism can examine, for instance, how Black women in particular experience discrimination on the internet. This lens can serve as a robust tool for creating more inclusive and representative AI systems and for raising higher demands for justice.
Questions for Reflection
In her poetic guide, Serena lays out four questions that can serve as a much-needed reality check for AI:
- Does this algorithm increase individual, personal, and communal well-being for Black women?
- Is this algorithm less biased than a Black feminist?
- Should Black feminists be hyped?
- Why is this AI solution for Black women?
These questions aim to counter the prevalent 'race-neutral' standpoint in technology, prompting a shift towards a more inclusive perspective.
Do We Need AI?
In conclusion, Serena argues that the liberation Black feminism demands should lead the way, with technology in a supporting role. "As long as technology is the guide, minoritized people will tend to fall behind," she asserts.
Her work in this space reiterates that AI should be a tool for progress, not another avenue for perpetuating oppression. If you're interested in staying updated on Serena's policy and advocacy work, you can find her on Twitter and LinkedIn.
A pivotal takeaway from Serena's talk is the need for a new perspective in addressing AI's inherent bias. Black feminism offers a promising lens for examining technology's impact on marginalized communities and for developing more equitable AI systems.
Video Transcription
OK. So I'm so excited to be presenting today. I'm Serena Oduro, the Technology Equity Policy Fellow at the Greenlining Institute, which I'll talk about more in a second, and I'm also a writer. So part of what you're going to see today is a mixture of my tech policy work at Greenlining, where I mainly work on broadband issues and also algorithmic equity, and my writing. This talk is specifically about AI, because that has been the bread and butter of most of my work. And I'm just very excited to be at the Women Tech Global Conference and to be presenting on an issue that I think is very important and that I'm very passionate about. This talk is called "AI Needs Black Feminism: Principles and Questions to Guide AI Development and Policy." These questions were developed after months of working in the field of AI, looking at issues of fairness and bias, looking for solutions to promote justice for people of color with the use of AI, and looking at how AI is often not used for just reasons.
And I realized that, for myself, Black feminism was a framework that was very useful, and I think it can be useful for you all too, even if you're not in the field of AI. I'll define what I mean by AI, because that's a very broad topic. But I hope that even if you're not in this field, even if you're just in tech, or coming from a different part of technology, or not in technology at all, these questions can be useful for you, especially if you're trying to take a racial justice approach and really ground your work in a specific standpoint.
And I think Black feminism is a great lens, so I'm excited and I'm going to dive in. First, to explain the Greenlining Institute: Greenlining is a policy and research institute based in Oakland, California. We envision a nation where people of color thrive and race is never a barrier to economic opportunity. We have two main policy areas, economic opportunity and climate, and within economic opportunity we also look at the role of technology, which is my focus in technology policy: how the lack of access to broadband and the impacts of AI affect communities of color's economic opportunities. That frames some of the work we do, because we work a bit less on things like facial recognition technologies and more on questions like: when banks deploy AI, how does that impact communities of color and their economic opportunities?
So even within AI, there's a lot of room for advocacy. Greenlining's name opposes redlining, which was the legal practice of denying services to communities of color. You can see it in the map behind me, a map of Oakland, where Greenlining is based, from the 1930s. The red areas were predominantly Black communities, and they were denied financial services and banking institutions, meaning no access to mortgage loans and the other things needed to buy a house and build wealth.
The yellow, blue, and especially the green areas were more white, suburban areas that had access to the financial instruments necessary to thrive economically. Greenlining is trying to oppose redlining through policy: much of the discriminatory machinery of redlining was enacted through policy, so we try to use policy to counteract its effects. And you can still see the harms of redlining today: for every dollar of wealth the median white family has, the median Black family has eight cents. The redlining that began in the 1930s fed into and sustained the racial wealth gap. That history is essential to our work, and when I get into AI, I'll show how this connects, even though it can seem separate. Greenlining is the affirmative and proactive practice of providing economic opportunities to communities of color. Where redlining was used to extract from, bar, and oppress communities of color in order to direct wealth toward white communities, greenlining does the exact opposite: it provides economic opportunities to communities of color.
So how does this connect to AI? Well, an algorithm uses past historical data to make predictions. And the interesting thing you see, from the map of 1930s redlined Oakland to the racial wealth gap today, is that all those data points that came from ZIP codes being barred from access to financial instruments, and from other forms of discrimination, are the same types of data used to make predictions about the future.
And so what we're seeing, and people have seen this a lot in the past few years, though it's been an issue for longer, is that the data used for AI is biased: it has a history of racism, and it's not race neutral. When we say, let's just have an algorithm make a decision about policing, or the economy, or finances, that's not neutral, because all an algorithm does is process massive amounts of data, find patterns within that data, and then make predictions about the future from those patterns.
What it can actually do is see a history of racism, learn those patterns of racism, and then perpetuate and exacerbate racism. And not only is that a problem because racism is a problem, but AI can also be harder to fight against; I'll talk about the legal difficulties of challenging AI. So you have a technology that can exacerbate racial inequality, and other forms of inequality, on a massive scale, due to the nature of what an AI model does and the racist history we exist in. A term for this is algorithmic bias. One example is when Amazon built a hiring algorithm as a recruiting tool and it turned out to be biased against women: because Amazon's workforce was mostly men, the tool learned to treat characteristics associated with women as undesirable. Racial bias also arises in medical algorithms, which again use past data from a healthcare industry that tends to be racist.
That bias can then be exacerbated and multiplied through the algorithm. Then you have cases like Michigan, where an unemployment agency made 20,000 false fraud accusations through an AI system that basically didn't work well. That was less about bias and more about the gap between the promises of AI and what it can actually deliver: sometimes we're so excited by a future technology that the company that built the algorithm never really proved to the agency that it worked. So you have the issues of AI that isn't made well, and AI that exacerbates and multiplies racism and other forms of discrimination. And I think people have realized that calling on tech companies to just regulate themselves, or to just do good science, isn't good enough; policy is needed. That's where my work at Greenlining comes in. Policy is a combination of many things: ethical standards, research, legal standards, industry practices, and political barriers, which are pretty huge. That's not an order of importance. But something you see within AI is that legal standards, for example the norms of anti-discrimination law in the US, don't match well with the way discrimination tends to arise in AI, because our anti-discrimination laws are very focused on the intent of a human, and with AI it's not really about intention.
So there's an issue there, and you also have research and industry practices that don't match well with policy. Data scientists, computer engineers, and other folks have come together to ask: how can we regulate this well? But I think you do have to have a framework, and that's why I'm going to talk about why Black feminism, to me, provides a framework for moving not just toward AI that doesn't harm people, but toward AI that's truly beneficial. Here I talk about technological realities, limitations, and norms. What I want to highlight is that, in response to a lot of the oppression that has happened through AI and the racism it has exacerbated, people in the AI community have tried to develop different statistical approaches to fairness: can we measure fairness, and calibrate an AI so that it creates fair outcomes and isn't racist?
But one thing this shows, on the right side of the screen, is that there are a lot of incompatibilities between legal standards and machine-learning fairness, and, from a legal perspective, some of the bias-correction methods may themselves be discriminatory. Oftentimes these issues are really social: it's not just a mathematical problem, it's a problem of justice, of what fairness means in a history of racism. That's why I think Black feminism is important, because if you try to find a statistical approach without an approach to justice, how will communities of color actually benefit from the use of AI, or be protected from harm? So, Black feminism and technology. I wrote "Do We Need AI? We Need Black Feminism: A Poetic Guide" in response to a call for proposals from Meatspace Press about AI hype. Basically, they were talking about how there have been a lot of instances in which AI was promised to be the grand solution, and it's not that it never works, but oftentimes it doesn't. So how do we identify what is AI hype?
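The statistical fairness approaches mentioned here, and their incompatibilities, can be made concrete. Below is a small sketch using made-up confusion-matrix counts (purely illustrative, not from any real system) that computes two common group-fairness metrics on the same predictions. The numbers are chosen so that demographic parity is satisfied while the equal-opportunity (true-positive-rate) gap is large, which illustrates why choosing a metric is a justice question rather than a purely mathematical one.

```python
# Hypothetical confusion-matrix counts per group, as
# (true positives, false positives, false negatives, true negatives).
# These numbers are invented for illustration.
groups = {
    "A": (40, 10, 10, 40),
    "B": (25, 25, 25, 25),
}

def selection_rate(tp, fp, fn, tn):
    # Fraction of the group the model selects (predicts positive).
    return (tp + fp) / (tp + fp + fn + tn)

def true_positive_rate(tp, fp, fn, tn):
    # Fraction of truly qualified people the model selects.
    return tp / (tp + fn)

sr = {g: selection_rate(*c) for g, c in groups.items()}
tpr = {g: true_positive_rate(*c) for g, c in groups.items()}

# Demographic parity difference: gap in selection rates.
print(abs(sr["A"] - sr["B"]))    # 0.0   -> demographic parity holds
# Equal-opportunity gap: gap in true-positive rates.
print(abs(tpr["A"] - tpr["B"]))  # ~0.3  -> qualified people in B lose out
```

Both groups are selected at the same 50% rate, so the system looks "fair" under demographic parity, yet qualified members of group B are selected far less often than qualified members of group A. Which of the two numbers matters is not something the math decides.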
And I believe we need a lens that really carries the history of racism and also sexism, and that's why I think Black feminism is so well suited for it. So I wrote this poetic guide, which proposes four questions, in addition to some more conventional essay writing that discusses the question: do we need AI, or do we need Black feminism? I'll get to my conclusion on that at the end. But first, to give a bit of background on Black feminism, at a high level, this is the first paragraph of my piece, on the left: Black feminism is rooted in the knowledge that every system and technology has race, gender, and class implications.
Black feminism grows from Black women's experience of living at the intersections of race, gender, class, and many other axes of oppression. So Black feminism is really about the Black female experience. What I think is really powerful about it, and why as a Black woman I was drawn to Black feminism myself, is that it explains so much. White women, even though they experience the discrimination of being women, can appeal to white systems of power. Black men are Black and experience anti-Blackness, but they can appeal to systems of patriarchy. So oftentimes when people are fighting for liberation, it's about womanhood or about Blackness, but not about those who experience both, which is a unique form of oppression. Black feminism, and the Black woman's standpoint, is about searching for liberation and fighting against all of those systems of oppression: race, gender, class, and many other axes, because I can't just choose one. If I just choose my Blackness, that's not going to help me when I'm also experiencing sexism, and sexism as a Black woman, which is different even from what a white woman experiences.
And I think this is important for technology because oftentimes, even in discussions about justice and technology, we talk about race separately and gender separately. But what about when those come together? In Algorithms of Oppression, Dr. Safiya Umoja Noble talked about how Google's search engine reinforced racism and sexism against Black women by bringing up pornographic results whenever "black girls" was searched. That's a good example of the specific type of oppression Black women face on the internet and within technology, which is sadly very understudied. If we want to fight against the algorithmic oppression that has arisen over the years, I do think Black feminism provides, and requires, more liberatory demands and higher demands for justice than fairness that isn't race aware, or than fighting racism without looking at the feminist intersection that the Black woman's standpoint presents.
So, the questions I pose in my poetic guide: Does this algorithm increase individual, personal, and communal well-being for Black women? Is this algorithm less biased than a Black feminist? Should Black feminists be hyped? And why is this AI solution for Black women? I wanted to pose these questions because the benefits, and sometimes even the drawbacks, of AI are often presented from a very race-neutral, gender-neutral standpoint: this AI solution is beneficial, full stop. But something a lot of critical race scholars within the AI field have pointed out is that there is no such race-neutral standpoint in a history of racism such as America's. If you're saying it's beneficial for everyone, that often just means it's beneficial for the most empowered and most privileged, who tend to be white men and folks close to that level of privilege. So I want to ask: why is this AI solution for Black women? How does it increase individual and personal well-being for Black women? Because in the history of US racial oppression, when something is said to be beneficial for everyone, that often really means it's not for Black women or for other minoritized peoples.
So I think this is an important question to pose, whether you're developing AI or crafting a policy. These questions hold me to a higher standard, because they make me fight against the idea that policies, or the development of technology, can be race neutral. This is a question you can ask yourself: if you're at a company and they say, we're going to create this marketing plan and everyone will love it, ask: would Black women love it? Who's benefiting from this? Who is "everyone"? Because often when we say everyone, that's not actually true. And in this poetic guide I also say: if Black women aren't mentioned, it means Black people will be harmed.
But also, if any historically marginalized group is not mentioned, that means that group will be harmed. What I like about this poetic guide is that you can ask these questions about different groups. You can ask: does this algorithm increase individual, personal, and communal well-being for folks with disabilities? For those who don't have US citizenship? There are many different axes of oppression along which you could ask these questions. And I think it's important that we normalize this, especially within technology, which can be seen as so neutral but is actually so loaded; technology carries and forms our cultural and social norms, as social media has shown by changing so much. These questions can't be asked from a neutral standpoint. That's not good enough anymore, especially if we want to follow through on the just use of technology that I think more people are pushing for.
And so, in conclusion, and I know I have three minutes left and I'd like to see if there are any questions, I wanted to show some of the stanzas from the guide. This isn't the whole poetic guide; it will come out from Meatspace Press, hopefully in July, and I'll post it on Twitter and LinkedIn so folks can read it, especially if you want to implement this in your work. The guide has stanzas alongside blocks of text. The one on my right says: "Do we need AI? We need Black feminism. Liberation should lead. Technology should support. There's no mechanical solution to sin. There's only the purposeful striving towards justice." I think that encapsulates the point of the poetic guide well: do we need AI, or do we need Black feminism? It's not saying we don't need AI. But as long as technology leads, Black women will fall behind, minoritized people will fall behind. Liberation needs to lead for technology to actually be useful, or else it will just become another tool for oppression, as we've seen in other cases of algorithmic bias.
And that's why I think Black feminism is an important tool: it blatantly calls out the need for justice in a way that other frameworks can ignore, because it makes many people uncomfortable. This piece was written on my own time; like I said, I'm at Greenlining, and I'm also a writer myself, but it interweaves with my day job very well, which I appreciate. If you want to keep up with Greenlining's work specifically, my colleague Vinhcent Le, who also works on technology equity and whom you should follow on Twitter, published "Algorithmic Bias Explained," which explains how automated decision-making becomes automated discrimination.
I also have a forthcoming report on algorithmic governance with seven policy recommendations for addressing algorithmic oppression, and you can follow more of our policy work there. I know I have a minute left; I tried to get through as much as I could. Again, to keep up with my policy and advocacy work, you can follow me on Twitter and LinkedIn, where quite a few things are going to be published soon, in addition to the poetic guide. So please feel free to follow me if you'd like to see that as well. Thank you; I love seeing the comments. Thank you to those who attended, and I hope you enjoy the rest of the Women Tech Global Conference. I'm excited to see the other speakers. Thank you.