Week 8 - Algorithm Accountability & Responsibility


Accountability & Responsibility

NEW MEDIA LITERACY

WEEK 8
How algorithms silently structure our lives: Module recap

• Algorithms provide us with more efficient and convenient ways to access relevant content in exchange for our data

From Do social media algorithms erode our ability to make decisions freely? (Mitchell & Bagrow, 2020)

• Digital and social media platforms are driven by algorithms that rank and recommend content based on our data
• Every activity we perform online provides data that is then used to make predictions about us
• This data determines our search results and the content we see in our social media news feeds

• Users are being profiled


• Determines the ads we see based on our online activity
• Your psychology can be profiled from your Facebook likes – e.g., Cambridge Analytica (Halpern, 2018)

• Users are being predicted


• Algorithms sift through datasets to identify trends and make predictions (Martin, 2018)
• Predicts if we will make a purchase, who we will vote for, or what policies we will support
• Researchers found that it’s possible to build “shadow profiles” of people who are not on Facebook, based on their contacts who are.
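The kind of inference behind like-based profiling can be sketched in a few lines. A minimal illustration, assuming an invented mapping from liked pages to personality-trait weights (the page names and numbers below are made up for demonstration and come from no real platform):

```python
# Toy "profile from likes" predictor - illustrative only, not any
# platform's actual system. Trait weights here are invented.

# Hypothetical mapping from liked pages to trait scores.
TRAIT_WEIGHTS = {
    "hiking_page": {"openness": 0.6, "extraversion": 0.2},
    "poetry_page": {"openness": 0.8, "extraversion": -0.1},
    "party_planning_page": {"extraversion": 0.9, "openness": 0.1},
}

def predict_traits(likes):
    """Sum per-page weights to produce a crude trait profile."""
    profile = {}
    for page in likes:
        for trait, weight in TRAIT_WEIGHTS.get(page, {}).items():
            profile[trait] = profile.get(trait, 0.0) + weight
    return profile

print(predict_traits(["hiking_page", "poetry_page"]))
```

The real systems described in the reading work at vastly larger scale, but the principle is the same: each online action contributes a small weighted signal to a cumulative prediction about the user.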
Can better algorithms solve our problem?

• In your assigned reading (Heilweil, 2021), a few solutions were presented to
improve/modify/change how social media algorithms work:
• Building “open” recommendation algorithms
• Providing better user choice in the platform design
• More oversight in regulating tech giants’ algorithms

• What about user agency and choice?

• Reflection: Do filtered media worlds cause the online segregation we see, or do people
construct self-reinforcing filters because they already have divergent beliefs?

Bruns (2019) warns that blaming algorithms alone can “misdirect scholarly, journalistic, and regulatory attention to the technological rather than social and societal factors underpinning these problems.”

From Facebook Newsroom

Reflection: What do you think of these changes? Will they help FB users be more exposed to diverse news and opinions?
In Google we trust…or should we?
• “Our search and recommendation systems reflect what people search for, the number
of videos available, and the videos people choose to watch on YouTube. That’s not a
bias towards any particular candidate; that is a reflection of viewer interest.” – Google
Spokesperson (Lewis, 2018)
• Google has expanded its pool of human moderators, removed offensive content identified by
journalists, and demonetized the channels that created it
• From a design perspective (Cho, Ahmed, Hilbert, Liu, & Luu, 2020):
• Algorithmic search recommendations personalized to each user have the potential to
solidify personal convictions and encourage polarized opinions
• Offering a range of search terms that goes beyond users’ political beliefs can expose
users to more diverse political content

• This suggests that treating algorithms as neutral reflections of users’ interests
does nothing to solve the problems inherent in their design

Can better algorithms solve our problem?

• Better algorithms to the rescue: A team of researchers in Finland and Denmark developed a new algorithm that
increases the diversity of exposure on social networks. The group’s algorithm provides a feed for social media users
that is at least three times more diverse

• It is easy to say “we need more diverse information” but how do you translate that into code?

• Issue: What is the algorithmic definition of diversity?


• What is the scope of diversity - Different political beliefs? Info from different countries? Fringe views?

• There are multiple ways to approach diversity, but we don’t yet have an ecology of custom filters that allows users
to choose what content diversity means to each of them

• Machine learning algorithms have mastered recommending what’s likely to be most engaging. Breaking the
feedback loop might mean mimicking the ways by which humans discover items of interest offline: through friends
and family, expert advisers, happy accidents or serendipitous chance (Sadagopan, 2019)

• An uncritical faith in technology? Are we expecting technology to save us from ourselves?


Let the blame game begin…
From your assigned reading this week (Martin, 2018)
• Who should be accountable for algorithmic decisions?
• Algorithms as a neutral blank slate, mirroring back to society what is accurate and efficient
• Minimal responsibility for developers who design the algorithms
• Minimal responsibility for companies who determine the algorithms
• Responsibility is shifted to the user
• Algorithms as value-laden autonomous black-boxes
• Minimal responsibility for users in how algorithms make decisions
• Negates users’ agency and their role in shaping the technology
• Responsibility is shifted to developers and companies who designed the algorithms
• The argument that machine learning algorithms are so complex and autonomous that even their developers cannot
predict outcomes/impact does not preclude developers from taking responsibility

Let the blame game begin…
• Exactly what should developers & programmers be accountable for?

• Design of algorithms
• Identifying what goes into the decision-making process (e.g., principles & norms)
• Considering the moral consequences of their decision making
• Making decisions that favor human dignity and rights

• The call for more transparency may not always be feasible or the best outcome
• Too much transparency allows some groups of people to “game” the system, which may create a new disparity between
those who can and those who cannot “game” the system
• Companies may take advantage of transparency to evade fraud detection or regulation

• Increased scrutiny, oversight and governing body


• Algorithmic decisions should have greater oversight and be treated like professions that affect public welfare (e.g., civil
engineering, law, medicine)
• Big Data review boards to oversee algorithmic decisions
• Professional certification to ensure technical training and knowledge of ethical implications

Where is the USER in all of this?
From Radical ideas spread through social media. Are the algorithms to blame? (Wu, 2019)

• The underlying reason for the reinforcing tendency of algorithms is that they are designed to work with natural human
tendencies
• One of the human tendencies (relevant to New Media literacy): People generally feel more comfortable with message
consistency and confirmation of their beliefs (Cho, Ahmed, Hilbert, Liu, & Luu, 2020)

• Impact is not predictable or clear-cut (challenges techno-deterministic perspective)


• Directional motivation: Individuals are motivated to seek information that they believe is consistent with their existing
attitudes
• Accuracy motivation: Individuals are motivated to seek information that they expect to be of high informational value.
• The interplay of these two motivations means algorithmic filtering does not affect users in a clear and predictable pattern

• Users influence algorithms as much as algorithms influence users


• The extent of exposure to diverse content and perspectives is also related to users’ social groups and online activity
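The reinforcing loop between user motivations and algorithmic ranking can be simulated in miniature. A toy sketch, with all probabilities and weights invented for illustration: the recommender up-weights whatever gets clicked, while the simulated user clicks belief-consistent content more often (directional motivation):

```python
# Toy simulation of the user-algorithm feedback loop: the recommender
# reinforces whatever the user clicks, and the user clicks belief-consistent
# content more often. All numbers are invented for illustration.
import random

random.seed(42)
CATEGORIES = ["agrees_with_user", "neutral", "disagrees_with_user"]
CLICK_PROB = {"agrees_with_user": 0.8, "neutral": 0.4, "disagrees_with_user": 0.1}

weights = {c: 1.0 for c in CATEGORIES}  # recommender starts out balanced

def recommend():
    """Sample one category, proportional to the recommender's weights."""
    total = sum(weights.values())
    return random.choices(CATEGORIES, [weights[c] / total for c in CATEGORIES])[0]

for _ in range(500):
    shown = recommend()
    if random.random() < CLICK_PROB[shown]:   # user's directional motivation
        weights[shown] += 0.5                 # recommender learns from the click

print(weights)  # belief-consistent content ends up with the largest weight
```

Even though the recommender starts out balanced, the click asymmetry alone is enough to tilt the feed toward belief-consistent content - a small illustration of the point above that users shape algorithms as much as algorithms shape users.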

Where is the USER in all of this?
• Increased social awareness
• Awareness of the limits of digital media when it comes to acquiring knowledge and information
• Awareness that everyone is vulnerable to disinformation campaigns that take advantage of algorithms
• Public awareness of the different ways user data and ownership can be stored and managed
• Awareness of public services that provide necessary tools and structures for storing and managing user data

• Open discussion and education on algorithmic decisions


• Create independent review boards or committees made up of the users of an algorithm to weigh in on how
it can best be put into practice

• Rethink the environments that create these algorithms


• Include perspectives from experts in fields outside of computational science, such as public policy, sociology,
psychology, communication, and media studies
• Design algorithms in accordance with changing values of users and society at large

Watch: The Social Dilemma
Questions to guide you while you are watching:
•Because advertisers are the ones who pay for social media platforms, they
are the customers, which makes social media users the ____________
•The classic saying is "if you are not paying for the product, then
you ____________
•What is the "business model" of social media companies like Facebook,
Snapchat, Twitter, Instagram, or YouTube?
•In order to be successful in business, the company needs to have good
predictions. And good predictions begin with ONE imperative, which is
_________________
•What does Facebook do with the data it collects?
•In the documentary, social media is portrayed as something that isn't and
cannot be addicting. Is that a true or false statement?
•According to Cathy O'Neil, algorithms are _____________ embedded in
code
•AI has the capability to know truth. Is this a true or false statement?
•When performing a Google search, users will see the same auto-complete
results no matter where they live. Is that a true or false statement?
Forum Discussion
• We discussed data capitalism and commodification last week. Thinking back to the times when you were shown a
sponsored post or display ad while browsing online or interacting on social media:
• Briefly discuss how brands and companies are collecting data on and/or tracking you online.
• Drawing from last week’s lecture, did you try to protect your privacy and conceal your digital footprint? Explain your
answer in the context of the convenience you would have to give up to protect your data – i.e., how much convenience
are you willing to give up to protect your privacy?

• This week, we discussed accountability and responsibility in algorithmic design and watched Netflix’s The Social Dilemma.
Imagine that you have just been hired by the Canadian government to address the impact of algorithms on the access and
consumption of information online (social media, search engines, etc.).
• Drawing from this week’s lecture and readings, what solution would you propose? And why?

• Pick a key argument/insight/point from the documentary (or Katie Couric’s interview if you don’t have access to The Social
Dilemma). Explain what it is and why you picked it.
• Briefly discuss whether the documentary Social Dilemma (or Katie Couric’s interview) changed your views about social
media companies. Why or why not?
