Dr Federica Lucivero, Associate Professor at the Ethox Centre at Oxford Population Health, has been awarded a Bridging Responsible AI Divides (BRAID) Fellowship to explore the creation of ethical tools for responsible artificial intelligence (AI) governance.
Dr Lucivero is one of 17 researchers across the UK to have been awarded a fellowship to apply research expertise from the arts and humanities, including data ethics, copyright law, digital design, and qualitative analysis, to develop solutions for pressing questions around the responsible use of AI.
Each fellow will partner with an organisation from the public, private or third sector to unite expertise for tackling existing, new or emerging AI challenges. Dr Lucivero will partner with the Ada Lovelace Institute to develop and pilot tools that support policymakers in anticipating the broader societal implications of emerging AI technologies.
Dr Lucivero said ‘Policymakers are currently expected to proactively anticipate and govern impacts of technological innovation. This is particularly needed in the fast-advancing field of AI where new applications are often pushed into the market with consequences for entire sectors. Technology foresight is more than a predictive exercise and requires us to engage in grounded yet exploratory thinking around societal and ethical implications.’
In collaboration with the Ada Lovelace Institute, the Nuffield Council on Bioethics, humanities scholars, artists, and creatives, this project will use a co-design approach to develop tools to foster policymakers’ ‘moral imagination’ and exploratory thinking around future AI technologies.
Professor Christopher Smith, AHRC Executive Chair, said ‘The impact of AI can already be felt in many areas of our lives. It will transform our jobs and livelihoods, and impact on areas as diverse as education, policing and the creative industries. It is vital that we ensure its responsible development and use. The BRAID fellowships announced today will play an invaluable role in informing the practice and tools crucial to ensuring this transformative technology is used responsibly to provide benefits for all of society.’
BRAID Co-director Professor Ewa Luger, Chair in Human-Data Interaction at Edinburgh College of Art, said ‘The 17 Fellowships offer opportunities for deeper relationships and joint impact, moving towards a genuine embedding of arts and humanities knowledge within how we think about, develop, and deploy AI in practice and in the world. It is our hope that with these connections, and working towards common challenges across sectors and diverse communities, we will take substantial strides towards a more responsible AI ecosystem.’
BRAID Co-director Professor Shannon Vallor, Baillie Gifford Chair in the Ethics of Data and Artificial Intelligence at the Edinburgh Futures Institute (EFI), said ‘We are reaching a critical point in society where businesses and the public sector recognise that deploying AI systems safely and responsibly requires new kinds of knowledge and expertise, which can be challenging to access. The BRAID fellowships aim to bring together researchers with industry and the public sector to help bridge that divide between technical capability and the knowledge of how to use it wisely and well, to ensure that the benefits of AI are realised for the good of us all.’
The Fellowships are part of the BRAID programme, which is led by the University of Edinburgh in partnership with the Ada Lovelace Institute and the BBC. The £15.9 million, six-year programme is funded by the Arts and Humanities Research Council (AHRC), part of UK Research and Innovation.