Error and Accountability in Digitised Public Decision Making (ERRATUM). Public authorities increasingly invest in automated decision making to streamline their resources. Though promising in principle, such systems are often problematic in practice due to a fundamental failure to understand the interaction between human and machine decision making.
More about the research
Key to understanding this interaction between human and machine decision making is the concept of an error. To a machine, an error is any measurable difference between the output of an algorithm and the correct answer; the process that led to the error is ignored, as are explainability and interpretability. In legal decision-making, by contrast, errors cannot be understood apart from the process that produced them, and they are not mathematically measurable. This project aims to develop algorithmic tools that model the legal understanding of error in decision-making, and to explore the extent to which the legal understanding of an error must adapt to the machine-learning concept of an error.
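As a minimal illustration of the machine-learning notion of error described above (a sketch with hypothetical names, not code from the project): error is simply the measurable gap between an algorithm's outputs and the correct answers, with no reference to the process that produced each decision.

```python
def error_rate(predictions, correct_answers):
    """Fraction of outputs that differ from the correct answer (zero-one loss)."""
    mismatches = sum(p != c for p, c in zip(predictions, correct_answers))
    return mismatches / len(correct_answers)

# Two hypothetical decision procedures: one applies a careful legal rule,
# the other guesses. If their outputs coincide, this metric cannot tell
# them apart -- the reasoning behind each decision is invisible to it.
decisions_by_rule  = ["grant", "deny", "grant", "deny"]
decisions_by_guess = ["grant", "deny", "grant", "deny"]
ground_truth       = ["grant", "deny", "deny", "deny"]

print(error_rate(decisions_by_rule, ground_truth))   # 0.25
print(error_rate(decisions_by_guess, ground_truth))  # 0.25
```

Both procedures score the same error rate, which is exactly the gap the project highlights: the legal understanding of error depends on the process behind a decision, while the machine-learning metric discards it.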