This invited paper reviews a framework that assists in mitigating societal risks posed by software. Its purpose is to promote effective human oversight, a central requirement enforced by the European Union's upcoming AI Act [29]. The paper advertises fragments of an upcoming journal publication [12] and, as such, is itself low in genuine originality; yet it offers a specific perspective on that original work. Extending earlier work on software doping, we report on combining established techniques for runtime monitoring and probabilistic falsification into a black-box analysis technique for identifying undesired effects of software. We describe its application to high-risk systems that evaluate humans in a possibly unfair or discriminatory way. The approach can assist humans-in-the-loop in making better-informed and more responsible decisions. Our technical contribution is complemented by juridically, philosophically, and psychologically informed perspectives on the potential problems caused by AI systems.
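To make the reviewed technique concrete, the following is a minimal sketch, in Python, of how probabilistic falsification can probe a black-box scoring system for individual-fairness violations. Everything here is a hypothetical illustration: the score function, the feature names, the Lipschitz-style similarity metric, and the plain Monte-Carlo search merely stand in for the actual systems, metrics, and guided falsification tooling of [12].

    import random

    # Hypothetical stand-in for the black-box high-risk system under analysis;
    # the technique only queries it and never inspects its internals.
    def score(applicant):
        # Toy credit-style score with a hidden dependence on zip_code, the kind
        # of undesired effect the falsification search should expose.
        base = 0.5 * applicant["income"] + 0.3 * applicant["repayment_rate"]
        return base + (0.2 if applicant["zip_code"] in {"A", "B"} else 0.0)

    # Dwork-style individual fairness: similar applicants must receive similar
    # scores, up to a Lipschitz factor times their distance in relevant features.
    LIPSCHITZ = 1.0

    def input_distance(a, b):
        return (abs(a["income"] - b["income"])
                + abs(a["repayment_rate"] - b["repayment_rate"]))

    def violation(a, b):
        # Positive value = the pair (a, b) witnesses a fairness violation.
        return abs(score(a) - score(b)) - LIPSCHITZ * input_distance(a, b)

    # Probabilistic falsification, reduced here to plain Monte-Carlo search:
    # sample random pairs of near-identical applicants and keep the worst
    # violation. Real falsifiers use guided optimization instead.
    def falsify(trials=10_000, seed=42):
        rng = random.Random(seed)
        worst_value, worst_pair = float("-inf"), None
        for _ in range(trials):
            a = {"income": rng.random(),
                 "repayment_rate": rng.random(),
                 "zip_code": rng.choice("ABCD")}
            b = dict(a, zip_code=rng.choice("ABCD"))  # identical except zip code
            v = violation(a, b)
            if v > worst_value:
                worst_value, worst_pair = v, (a, b)
        return worst_value, worst_pair

    if __name__ == "__main__":
        v, (a, b) = falsify()
        if v > 0:
            print(f"Individual fairness violated by {v:.2f}:\n  {a}\n  {b}")
        else:
            print("No violation found (this is evidence, not a guarantee).")

A runtime monitor for the same property would evaluate violation(a, b) on pairs of observed executions rather than sampled ones, so the falsifier and the monitor can share one fairness specification.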
Taming the AI Monster: Monitoring of Individual Fairness for Effective Human Oversight
Lecture Notes in Computer Science
International Symposium on Model Checking Software (SPIN 2024), Luxembourg City, Luxembourg, April 10-11, 2024
2024-10-13
23 pages
Article/Chapter (Book)
Electronic Resource
English
artificial intelligence, algorithmic fairness, probabilistic falsification, adequate trust, human oversight
Engineering, Control, Robotics, Mechatronics, Software Engineering/Programming and Operating Systems, Artificial Intelligence, Computer Systems Organization and Communication Networks, Computer Science
Traffic logistics: the taming of the traffic monster
British Library Conference Proceedings | 2000
Power Monster - Ducati Monster Prototype
Automotive engineering | 2013
Monster Blast - Ducati Monster 821
Automotive engineering | 2014
Resharpened Monster - Ducati Monster 1200 R
Automotive engineering | 2015
Monster Plus: Vehicle Test of the Ducati Monster S4Rs
Automotive engineering | 2006