About the project

Project duration: 01.11.2020 – 31.12.2025

Funded by the Norwegian Research Council, project number: 311680

Artificial intelligence (AI) applications can increase the efficiency and quality of processes across sectors, including public services. However, their inner workings can be incomprehensible, making it hard to explain how they transform data inputs into outputs. This is known as the “black box” problem, which can impede the involvement of humans in shaping, operating and monitoring the use of AI in service delivery.

The Norwegian government promotes the responsible use of AI in public administration, aiming to lead the way in developing human-friendly and trustworthy solutions. Responsible use of AI entails ensuring intelligibility and accountability. Intelligibility means that AI applications are understandable with respect to their state, the information they possess, and their capabilities and limitations. Accountability means that it is possible to trace their results and identify responsibility for them. Together, intelligibility and accountability can help users understand, appropriately trust, and effectively manage the emerging generation of AI applications.

The AI4Users project addressed the “black box” problem, contributing to the responsible use of AI for the digitalisation of public services. The novelty of AI4Users is that it specifically targeted non-experts, extending the reach of research beyond data scientists. The project took a human-centred perspective, addressing the needs of different groups including citizens, case handlers at the operational level, middle managers and policy makers. Specifically, the project included a) the design, prototyping and assessment of tools enabling different categories of non-experts to maintain insight into AI applications, b) the formalisation of the generated design knowledge into actionable design principles, and c) capacity building through collaborations between academia and the public sector, nationally and internationally.

The research question addressed by the project was: how can AI intelligibility and accountability tools be designed for non-experts to facilitate meaningful human control? The methodological approach employed was Action Design Research (ADR). ADR specifies an iterative and adaptive research process closely linked to practice; it emphasises collaboration and is evaluation driven. The core of the project was performed through Building, Intervention and Evaluation (BIE) activities structured as a series of sprints. The project collaborated with the Norwegian Labour and Welfare Administration (NAV).

Researchers performed observations and interviews with individuals from the different target groups to gain insights into their needs and experiences. Based on the information collected, a series of “design challenges” was organised. The design challenges fed into the project’s design activities and helped raise awareness. Further design activities included prototyping workshops and cooperative evaluation activities. A selection of processed project data can be made available upon request. Due to the contextual character of the collected material, it is not possible to share unprocessed data. If interested, contact: Polyxeni Vassilakopoulou.