GUEST EDITORS

Marcello D’Agostino (University of Milan, Italy) and Massimo Durante (University of Turin, Italy)

INTRODUCTION
In our information societies, we increasingly delegate tasks and decisions to automated systems, devices and agents that mediate human relationships by taking decisions and acting on the basis of algorithms. Their growing intelligence, autonomous behavior and connectivity are crucially changing the life conditions of human beings, as well as altering traditional concepts and ways of understanding reality. Algorithms are directed at solving problems whose relevance and timeliness are not always detectable, and they are meant to solve those problems through procedures that are not always visible and assessable in their own right. In addition, technologies based on algorithmic procedures increasingly infer personal information from aggregated data, thus profiling human beings and anticipating their expectations, views and behaviors. This may have normative, if not discriminatory, consequences. While algorithmic procedures and applications are meant to serve human needs, they risk creating an environment in which human beings tend to develop adaptive strategies, conforming their behavior to the expected output of the procedures, with serious distortive effects. Against this backdrop, little room is often left for a process of rational argumentation able to challenge the results of algorithmic procedures by putting into question some of their hidden assumptions or by taking into account neglected aspects of the problems under consideration. At the same time, it is widely recognized that scientific and social advances crucially depend on such open and free critical discussion.

TOPICS
The aim of this special issue of Philosophy & Technology is to explore questions about the governance of algorithms in light of the technological dependence of our information societies. We ask how to face theoretical and practical challenges in order to ensure that technological innovation goes hand in hand with human needs, beliefs and expectations. We solicit the submission of papers from different disciplines (law, ethics, economics, computer science, social studies, epistemology and philosophy of science) to address questions such as:

  • How to deal with the “knowledge problem” (as Frank Pasquale put it), i.e., with the openness, transparency and fairness of algorithmic procedures and applications?

  • How to govern those algorithmic procedures and applications once we delegate to them the accomplishment of tasks or the solution of problems?

  • By which standards should the relevance and timeliness of problems, as well as the efficiency and legitimacy of solutions, be measured?

  • Is the extensive functioning of automated systems, devices and agents based on algorithms liable to impair human freedom and autonomy, free critical discussion and reflexivity?

TIMETABLE

  • December 19, 2016: Deadline for paper submissions
  • February 13, 2017: Deadline for paper reviews
  • March 13, 2017: Deadline for revised papers
  • April, 2017: Publication of accepted papers

SUBMISSION DETAILS
To submit a paper for this special issue, authors should go to the journal’s Editorial Manager at http://www.editorialmanager.com/phte/
The author (or a corresponding author, in the case of co-authored papers) must register in Editorial Manager (EM).

The author must then select the special article type “The Governance of Algorithms” from the selection provided in the submission process. This is needed in order to assign submissions to the Guest Editors.
Submissions will then be assessed according to the following procedure:
New Submission => Journal Editorial Office => Guest Editor(s) => Reviewers => Reviewers’ Recommendations => Guest Editor(s)’ Recommendation => Editor-in-Chief’s Final Decision => Author Notification of the Decision.
The process will be reiterated in case of requests for revisions.

For any further information please contact:

Marcello D’Agostino: marcello.dagostino@unimi.it

Massimo Durante: massimo.durante@unito.it