Challenging algorithmic profiling: The limits of data protection and anti-discrimination in responding to emergent discrimination
journal contribution
Posted on 2019-01-01, authored by Monique Mann and T. Matzner.

© The Author(s) 2019. The potential for biases being built into algorithms has been known for some time (e.g., Friedman and Nissenbaum, 1996), yet the literature has only recently demonstrated the ways algorithmic profiling can result in social sorting and harm marginalised groups (e.g., Browne, 2015; Eubanks, 2018; Noble, 2018). We contend that with increased algorithmic complexity, biases will become more sophisticated and difficult to identify, control for, or contest. Our argument has four steps: first, we show how harnessing algorithms means that data gathered at a particular place and time, relating to specific persons, can be used to build group models applied in different contexts to different persons. Thus, privacy and data protection rights, with their focus on individuals (Coll, 2014; Parsons, 2015), do not protect against the discriminatory potential of algorithmic profiling. Second, we explore the idea that anti-discrimination regulation may be more promising, but acknowledge its limitations. Third, we argue that for anti-discrimination regulation to be harnessed, it must confront emergent forms of discrimination or risk creating new invisibilities, including invisibility from existing safeguards. Finally, we outline suggestions for addressing emergent forms of discrimination and exclusionary invisibilities via intersectional and post-colonial analysis.
History

Journal: Big Data & Society
Article number: July - December
Pagination: 1 - 11
Publisher: Sage
Location: London, Eng.
eISSN: 2053-9517
Language: eng
Publication classification: C1 Refereed article in a scholarly journal