Signed, sealed – deciphered? Holding algorithms accountable to protect fundamental rights – WS 09 2016
Please use your own words to describe this session. You may use external references, websites or publications as a source of information or inspiration, if you decide to quote them, please clearly specify the source.

Session description

An increasing share of our social interactions is mediated by algorithmic decision-making (ADM) processes or by algorithmic decision-supporting processes. ADM and data-driven models automate many of the factors affecting how news and information are produced and distributed, and therefore shape public discourse. [1] In the US they are used for risk assessments before deciding who can be set free, at every stage of the criminal justice system, from assigning bond amounts to even more fundamental decisions about defendants’ freedom. [2] In medical centers they are used as decision-supporting tools in diagnostics. Individuals’ credit scores and the performance of teachers and students are assessed partially or fully by algorithms. It is uncontested that ADM holds enormous promise: it may contribute to less subjective, fairer processes and reduce the risk of careless mistakes. At the same time, it carries the enormous danger of delegating discrimination to subtle automated processes that are too opaque to be noticed (a sketch after the references below illustrates this). We need to discuss several questions relating to these technologies:

  • What kind of scrutiny should ADM be subject to?
  • What objectives are meaningful, necessary and sufficient?
  • Do we need to look for intelligibility, transparency, accountability?
  • Can we expect any kind of control in light of self-learning systems?
  • If not, what should the consequence be - a ban on ADM in cases where fundamental rights are affected?
  • Would such a ban be enforceable?
  • And last but not least: who is responsible for the outcomes of ADM - the designers of the systems, the coders, the entities implementing them, or the users?

[1] http://www.datasociety.net/pubs/ap/QuestionsAssumptions_background-primer_2016.pdf

[2] https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
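
To make the discrimination risk concrete, here is a minimal, hypothetical sketch in Python. All names, weights and postal codes are invented for illustration and describe no real system; the point is that a score can disadvantage a group without ever using a protected attribute, because a feature learned from biased historical data - here a postal code - acts as a proxy for it.

  # Hypothetical toy model: invented weights and postal codes,
  # not any real lender's scoring rule.
  HIGH_RISK_AREAS = {"12049", "13357"}  # assumed to stem from biased historical data

  def credit_score(applicant: dict) -> float:
      """Toy linear scoring rule used only to illustrate proxy discrimination."""
      score = 500.0
      score += 0.3 * applicant["income"]           # plausible, job-related signal
      score -= 2.0 * applicant["missed_payments"]  # plausible, behaviour-related signal
      # The model never sees a protected attribute, yet the postal-code
      # penalty can encode exactly such an attribute by correlation,
      # invisible unless inputs and weights are disclosed.
      if applicant["postal_code"] in HIGH_RISK_AREAS:
          score -= 80.0
      return score

  alice = {"income": 2400, "missed_payments": 0, "postal_code": "12049"}
  bob = {"income": 2400, "missed_payments": 0, "postal_code": "10115"}

  print(credit_score(alice))  # 1140.0, penalised only for where she lives
  print(credit_score(bob))    # 1220.0

Nothing in this toy model mentions ethnicity or gender, which is precisely why scrutiny has to target the inputs and criteria actually used, not only the data collected.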

Keywords

Algorithms, Big Data, algorithmic accountability, data protection, human rights, innovation

Format

Please try out new interactive formats. EuroDIG is about dialogue, not about statements.

Further reading

People

Focal Point: Matthias Spielkamp, Algorithm Watch

Key participants

  • Achim Klabunde, Head of Sector IT Policy at the European Data Protection Supervisor
  • Daniel Drewer, Head of the Data Protection Office, Europol (tbc)
  • Hans-Peter Dittler, ISOC Board of Trustees
  • Irina Vasiliu, Data Protection Unit, DG Justice, European Commission (tbc)
  • Matthias Spielkamp, Algorithm Watch

Moderator: Matthias Spielkamp, Algorithm Watch

Resource speakers:

  • Cornelia Kutterer, Microsoft
  • Elvana Thaci, Council of Europe

Remote moderator: Ayden Férdeline, New Media Summer School

Org team: Matthias Spielkamp, Algorithm Watch

Reporter: Lorena Jaume-Palasí, EuroDIG

Current discussion

See the discussion tab on the upper left side of this page.

Conference call. Schedules and minutes

  • dates for virtual meetings or coordination calls
  • short summary of calls or email exchange
  • be as open and transparent as possible in order to allow others to get involved and contact you
  • use the wiki not only as the place to publish results but also to summarize and publish the discussion process

Mailing list

Video record

See the video record on our YouTube channel: https://youtu.be/zxMXlPqzqqA

Transcript

Transcript: Signed, sealed - deciphered? Holding algorithms accountable to protect fundamental rights

Messages

  • Regulators should focus on the social and economic aspects affected by algorithms.
  • There is a need for transparency with regard to how algorithms are used, rather than only transparency on how data is being processed.
  • There is value in laws enabling users to request information on how algorithmic decision (supporting) processes are made, including the inputs and discriminatory criteria used, the relevance of outputs, and their purpose and function (a sketch of such a record follows this list).
  • In daily interactions, humans use criteria that still cannot be emulated by machines.
  • Just as individuals are held accountable and supervised by others professionally and socially, algorithms should be subject to democratic control.
  • As societies we have defined issues of responsibility and liability in a long process; when it comes to algorithmic decision making, we are only starting that process.
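
As a thought experiment on the third message, the following Python sketch shows the kind of machine-readable record an ADM operator could be required to produce on request. Every field name here is an assumption made for illustration, not an existing legal or technical standard.

  import json
  from datetime import datetime, timezone

  # Hypothetical record format: field names are assumptions for
  # illustration, not an existing standard.
  def decision_record(inputs, weights, output, purpose):
      """Bundle one automated decision with the facts needed to audit it."""
      return {
          "timestamp": datetime.now(timezone.utc).isoformat(),
          "purpose": purpose,           # why the system is operated at all
          "inputs": inputs,             # which data points were used
          "criteria_weights": weights,  # how each input influenced the result
          "output": output,             # the score or decision produced
          "human_review": False,        # whether a person was in the loop
      }

  record = decision_record(
      inputs={"income": 2400, "missed_payments": 0, "postal_code": "12049"},
      weights={"income": 0.3, "missed_payments": -2.0, "postal_code": -80.0},
      output={"score": 1140.0, "decision": "rejected"},
      purpose="consumer credit pre-screening",
  )
  print(json.dumps(record, indent=2))

Such a record would give users and supervisors the inputs, criteria, outputs, purpose and function in one auditable unit, which is what the proposed information right amounts to in practice.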

Session twitter hashtag

Hashtag: #eurodig16 #alacc