On Explainability with Beatrice Fazi

NGI Forward Salon Foundationals

29th March 2021

PRESS RELEASE

 

Dr Beatrice Fazi opened with a presentation of her article "Beyond Human: Deep Learning, Explainability and Representation", published in Theory, Culture & Society, focusing on the political and philosophical dimensions of the digital transition. From a political standpoint, she noted, the public is demanding transparency in media and governance. Explainability is also of philosophical significance, since the internal workings of AI's black boxes remain concealed. Many philosophical questions about machines will arise: can they think? What is the definition of thought? What is algorithmic thought? According to Dr Fazi, incommensurability, a key concept from the philosophy of science, is central here: AI poses a representational and communicational problem in the relationship between humans and algorithms, and its multiple levels of representation are changing the epistemic terms on which explainability can be pursued. Dr Fazi believes we should rethink the alliance between humans and machines and acknowledge its future prospects. Everyone talks about fair, trustworthy, accountable, transparent and human-centric AI, but this, she argued, is itself a human cognitive representation, one that has nevertheless become necessary.
In the roundtable that followed, Ms Loretta Anania discussed transparency, explainability and the abstract character of thought itself. Representing NGI, she stressed that explainability is important but must be approached in an interdisciplinary way. Her remarks centred on the human values and ethics that AI should follow, since the public's main concern is the impact of AI and the interconnection of the real and virtual worlds. Dr Fazi agreed and, in conversation with Ms Gaelle Le Gars, added that there are different types of opacity: a system can be opaque, for example, because of a lack of literacy or expertise on the part of its users.
At the end of the roundtable, many questions from the audience were answered.
The main point was that comparative epistemology plays a major role in shaping a common human understanding of these issues. If we need trust, we need to trust machines, because they do things in a different way and yet produce the same results. In closing, everyone agreed that when you understand how something works, you are more willing to accept it. This is why explainability is such an important matter.

Meet our guest speaker!
Dr M. Beatrice Fazi is Lecturer in Digital Humanities in the School of Media, Arts and Humanities at the University of Sussex, United Kingdom. Her research focuses on the ontologies and epistemologies produced by contemporary technoscience, particularly in relation to issues in artificial intelligence and computation and to their impact on culture and society.

Roundtable Speakers

Loretta Anania @LorettaAnania
Scientific officer at DG Connect, Next Generation Internet
Gaelle Le Gars
Independent Policy Analyst specialising in the intersection of digital and urban policy
Rob van Kranenburg
Founder of #IoT Council @robvank

The event was organised by ELONTech, NGI Forward and the IoT Council, and was web-streamed on the ELONTech site by the MADE group.