
Artificial Intelligence and decision making – experts discuss results of Citizens’ Juries to inform national guidance

1 Jul


Is it necessary to give reasons for decisions made using artificial intelligence (AI) software, even if doing so makes the results less accurate? Leading academics and experts believe that, as the development of AI advances, guidance is needed on how it can be used in decision-making.

The National Institute for Health Research Greater Manchester Patient Safety Translational Research Centre (NIHR Greater Manchester PSTRC) and the Information Commissioner’s Office (ICO) commissioned two citizens’ juries to investigate AI and decision-making. In healthcare, the juries prioritised accuracy over explanation, but in other scenarios they reached different conclusions; clearly, context matters.

The results were presented and discussed recently at a workshop attended by NHS National Data Guardian, Dame Fiona Caldicott, and chaired by Professor Stephen Campbell, Director of the Greater Manchester PSTRC. More than 50 people from across the country participated, including members of the two juries, NHS executives, academics and researchers working in a range of specialisms such as philosophy, computer science and the arts.

Professor Niels Peek, Theme Lead for Safety Informatics at the Greater Manchester PSTRC, said: “Before there is widespread adoption of AI across the NHS it is important to develop guidelines to make sure patient safety is assured. We were keen to work with the ICO on this to help inform the guidance it is writing. We chose to commission two citizens’ juries to find out what the public think about this complex issue.

“It was valuable to reflect on the findings of the juries with a large group of experts during the workshop. We presented the research and invited questions before taking part in table discussions about AI and decision making which helped to give extra weight to our existing findings.”

Kayshani Gibbon, from the RSA (Royal Society of Arts), also spoke about research the organisation is due to publish on the ethical use of AI, giving the attendees additional background information.

Dame Fiona Caldicott, NHS National Data Guardian, said: “I welcome this project and the way it has involved a cross-section of the public in these important considerations on AI that will have a major impact on our lives. We have also used the citizens’ jury approach, and have found it’s a valuable way to learn in more depth what members of the public think.”

Ben Bridgewater, CEO at Health Innovation Manchester, attended the event and stressed the importance of establishing a position on AI and decision-making in the NHS. Ben said: “In Manchester we’re at the forefront of innovation in healthcare so being involved in this research and having the opportunity to comment on the results has been valuable.”

Simon McDougall, Executive Director for Technology and Innovation at the ICO, spoke about the wider significance of the research: “Better understanding the public’s views on explaining AI decisions is vital for the ICO and The Alan Turing Institute, who are working with us on this, as we are developing guidance for organisations in this area.

“The citizens’ juries gave us the opportunity for an in-depth exploration and discussion about the issues arising from AI decision-making and equipped us with a unique and informed public opinion on this complex issue.

“Our research findings, informed by the juries, have recently been published in our interim report.”

The citizens’ juries were run by Dr Malcolm Oswald, Director of Citizens Juries c.i.c. He said: “Bringing together experts from across the country and varying specialisms to discuss AI and explainability was a great opportunity to open up the debate. These questions on AI will become ever more crucial as it increasingly affects our daily lives. It’s a complicated topic, and these citizens’ juries gave us five days to bring expert evidence and the time for people to work together to reach reasoned recommendations that will inform national policy.”

To find out more about the NIHR Greater Manchester PSTRC, visit http://www.patientsafety.manchester.ac.uk/ and you can learn more about the breadth of projects Citizens Juries c.i.c. are working on at https://citizensjuries.org/

Should Artificial Intelligence give reasons for decisions even if it affects accuracy – Citizens’ Juries deliberate

27 Feb


Is it necessary for artificial intelligence (AI) to give reasons for its decisions even if it means the results aren’t as accurate? When and why are explanations of AI decisions most important? A leading research team investigated these questions in Coventry and Manchester and the results could affect future national policy.

The National Institute for Health Research (NIHR) Greater Manchester Patient Safety Translational Research Centre (Greater Manchester PSTRC) and the Information Commissioner’s Office (ICO) commissioned Citizens Juries c.i.c. to find out what the general public thinks. The findings of the research will inform guidance under development by the ICO and the Alan Turing Institute to help organisations explain decisions made by AI to the individuals affected. One question the two juries considered: if a computer gives a diagnosis, is it better to be given an explanation of how the computer reached that diagnosis, even if that means the diagnosis is likely to be less accurate?

Each jury was made up of 18 people drawn from a cross-section of the public. The first jury met in Coventry; the following week, the process was repeated in Manchester. The jurors came together for five days to hear expert evidence before making their recommendations.

Professor Niels Peek, Research Lead for Safety Informatics at the NIHR Greater Manchester PSTRC based at The University of Manchester, and Principal Investigator for this research, said: “AI is fast becoming extremely useful in healthcare diagnosis and, in some cases, can be more accurate than a doctor. The most advanced AI systems are now so complex that some aren’t able to give a reason for a diagnosis. If people are given a diagnosis or decision by a computer but aren’t able to ask for a reason, does that affect how much they trust it? Or, are they prepared to forgo an explanation if that means greater accuracy?

“We need to find out what the general public thinks about this and that’s why we’re conducting this research and asking for their feedback to ensure they have the opportunity to give their opinion on something that will affect patient safety.”

The two juries considered the importance of explanations and the trade-off between accuracy and explanations for decisions made by AI in four different scenarios:

  • Healthcare: diagnosis of acute stroke
  • Healthcare: finding matches between kidney transplant donors and recipients
  • Criminal Justice: deciding which offenders should be referred to a rehabilitation programme
  • Recruitment: screening job applications and making shortlisting decisions

Dr Malcolm Oswald, Director of Citizens Juries c.i.c., said: “These are important questions now, and they will become ever more crucial as artificial intelligence increasingly affects our daily lives. It’s a complicated topic, and these citizens’ juries give us five days to bring expert evidence and the time for people to work together to reach reasoned recommendations that will inform national policy.”

To find out more about the NIHR Greater Manchester PSTRC visit: http://www.patientsafety.manchester.ac.uk and you can learn more about the breadth of projects Citizens Juries c.i.c. are working on at https://citizensjuries.org/.