
Raw Transcription


Raw transcription by Brigette L. Domingo
Duration: 35 minutes
[00:00:00.282] - Sam Charrington
All right everyone. I am on the line with Abeba Birhane. Abeba is a Ph.D. student at University
College Dublin. Abeba, welcome to the TWIML AI podcast.

[00:00:12.552] - Abeba Birhane


Thank you so much for having me, Sam.

[00:00:15.522] - Sam Charrington


I'm really excited about this conversation. We had an opportunity to meet in person, after a long
while interacting on Twitter, at the most recent NeurIPS conference, in particular the Black in AI
Workshop, where you not only presented your paper Algorithmic Injustices: Toward Relational
Ethics, but you won best paper there. And so I'm looking forward to digging into that and some
other topics. But before we do that, I would love to hear you share a little bit about your
background. And I will mention, for folks that are hearing the sirens in the background: while I
mentioned that you are from University College Dublin, you happen to be in New York now at
the AIES conference, held in association with AAAI. And as folks might know, it's hard to avoid
sirens and construction in New York City. So just consider that our mood ambience, our
background sounds. So, your background?

[00:01:24.982] - Abeba Birhane


Yes, yes.

[00:01:25.942] - Sam Charrington


How did you get started working in AI Ethics?

[00:01:28.482] - Abeba Birhane


So my background is in cognitive science, and particularly a part of cognitive science called
Embodied Cognitive Science, which has its roots, you know, in cybernetics, in systems
thinking. The idea is to focus on the social, the cultural, the historical, and kind of to view
cognition in continuity with the world, with historical backgrounds and all that, as opposed to,
you know, your traditional approach to cognition, which treats cognition as something located
in the brain, something formalizable, something that can be computed. So, yes, that is my
background. Even during my masters, I leaned towards the AI side of cognitive science. The
more I delve into it, the more I am attracted to the ethics side, to, you know, injustices, to the
social issues. And so the more the Ph.D. goes on, the more I find myself on the ethics side.

[00:02:49.942] - Sam Charrington


Was there a particular point at which you realized that you were really excited about the ethics
part in particular, or did it just evolve for you?

[00:02:59.212] - Abeba Birhane


I think it just evolved. So when I started out, at the end of my Masters and at the start of the
Ph.D., my idea was that, you know, we have this relatively new school of thinking, which
is Embodied CogSci, which I like very much because it emphasizes, you know, ambiguities
and messiness and contingencies, as opposed to, you know, drawing clean boundaries. And so,
yes, I like the idea of redefining cognition as something relational, something inherently
social, and something that is continually impacted, influenced, by other people and the
technologies we use. So the technology aspect, the technology end, was my interest. So initially
the idea was, yes, technology constitutes an aspect of our cognition. You have the
famous 1998 thesis by Andy Clark and Dave Chalmers, the extended mind, where they claimed,
you know, the iPhone is an extension of your mind. And so you can think of it that way. And I
was kind of advancing the same line of thought. But the more I delved into it, the more I saw:
yes, digital technology, whether it's, you know, ubiquitous computing, such as face recognition
systems on the street, or your phone, whatever, it does impact and continually shape and
reshape our cognition and what it means to exist in the world. But what became more
and more clear to me is that not everybody is impacted equally. And the more privileged you
are, the more control you have over, you know, what can influence you and what
you can avoid. So that's where I became more and more involved with the ethics of
computation and its impact on cognition.

[00:05:23.422] - Sam Charrington


The notion of privilege is something that flows throughout the work that you presented at Black
in AI, the Algorithmic Injustices paper, and this idea, this construct, of relational ethics. What is
relational ethics, and what are you getting at with it?

[00:05:45.202] - Abeba Birhane


Yeah, so relational ethics is actually not a new thing. A lot of people have theorized about it and
have written about it. But the way I'm approaching it, the way I'm using it, I guess it
kind of springs from this frustration that, for many folks who talk about ethics or fairness or
justice, most of it comes down to, you know, constructing this neat formulation of fairness, or a
mathematical calculation of who should be included and who should be excluded, what kind of
data do we need, that sort of stuff. So for me, relational ethics is kind of: let's leave that for a
little bit, and let's zoom out and see the bigger picture. Instead of using technology to solve
the problems that emerged from technology itself, which means centering
technology, let's instead center the people, especially the people that are
disproportionately impacted by, you know, the limitations or the problems that arise with the
development and implementation of technology. So there is robust research showing you can
quantify fairness or algorithmic injustice. And the pattern is that the more you are at the
bottom of the intersectional level, that means the farther away you are from the
stereotypical white cisgendered domain, the bigger the negative impacts on you,
whether it's classification or categorization, or whether it's being scanned and culled by
hiring algorithms, or looking for housing, or anything like that. The more you move away from
that stereotypical category, you know, the status quo, the heavier the impact is on you.
So the idea of relationality is kind of to think from that perspective, to take that as a
starting point. So these are the groups, or these are the.
