BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//IORA - Institute of Operations Research and Analytics - ECPv6.15.11//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:IORA - Institute of Operations Research and Analytics
X-ORIGINAL-URL:https://iora.nus.edu.sg
X-WR-CALDESC:Events for IORA - Institute of Operations Research and Analytics
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Asia/Singapore
BEGIN:STANDARD
TZOFFSETFROM:+0800
TZOFFSETTO:+0800
TZNAME:+08
DTSTART:20250101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Asia/Singapore:20260206T100000
DTEND;TZID=Asia/Singapore:20260206T113000
DTSTAMP:20260508T122322Z
CREATED:20260128T061604Z
LAST-MODIFIED:20260203T030133Z
UID:27373-1770372000-1770377400@iora.nus.edu.sg
SUMMARY:DAO-ISEM-IORA Seminar Series: Francis de Véricourt
DESCRIPTION:Name of Speaker: Francis de Véricourt\n\nSchedule: 6 Feb 2026\, 10am – 11.30am (60 min talk + 30 min Q&A)\n\nVenue: HSS 4-2\n\nLink to register (via Zoom): https://nus-sg.zoom.us/meeting/register/KcwXsVRZSI2rLe4DvkXNFQ\n\nTitle: Beyond the Black Box: Unraveling the Role of Explainability in Human-AI Collaboration\n\nAbstract: Explainable Artificial Intelligence (AI) models have been proposed to mitigate overreliance and underreliance on AI\, which reduce the effectiveness of human-AI collaborative tools. Yet\, empirical evidence is mixed\, and the impact of explainable AI on a decision-maker (DM)’s cognitive load and fatigue is often ignored. This paper offers a theoretical perspective on these issues. We develop an analytical model that incorporates the defining features of human and machine intelligence\, capturing the limited but flexible nature of human cognition alongside imperfect machine recommendations. Crucially\, we represent how AI-based explanations influence the DM’s belief in the algorithm’s predictive quality. Our results indicate that explainable AI has varying effects depending on the level of explainability it provides. While low explainability levels have no impact on decision accuracy and reliance levels\, they lessen the cognitive burden of the DM. In contrast\, higher explainability levels enhance accuracy by reducing overreliance\, but at the expense of increased underreliance. Further\, the relative impact of explainability (cf. a black-box system) is higher when the DM is more cognitively constrained\, when the decision task is sufficiently complex\, or when the stakes are lower. Importantly\, higher explainability levels can escalate the DM’s cognitive burden\, and hence overall processing time and fatigue\, precisely when explanations are most needed\, i.e. when the DM is pressed for time to complete a complex task and doubts the machine’s quality. Our study elicits the comprehensive effects of explainability on decision outcomes and cognitive effort\, enhancing our understanding of how to design effective human-AI systems in diverse decision-making environments.\n\nAbout the Speaker: Francis de Véricourt is Professor of Management Science and the founding Academic Director of the Institute for Deep Tech Innovation (DEEP) at ESMT Berlin. He also holds the Joachim Faber Chair in Business and Technology\, and is the co-author of Framers\, a Penguin Random House book listed among the Financial Times’ Best Books. He has lived and worked in France\, the USA\, Germany\, and Singapore.\n\nFrancis was the first Associate Dean of Research and holder of the President’s Chair at ESMT Berlin. He held faculty positions at Duke University and INSEAD\, where he was the Paul Dubrule Chaired Professor in Sustainable Development\, and was a post-doctoral researcher at the Massachusetts Institute of Technology (MIT). His general research interests lie in decision science\, analytics\, and operations\, with applications in health care\, sustainability\, and human-AI interaction. He is the author of numerous academic articles in prominent management\, analytics\, and economics journals such as Management Science\, Operations Research\, and the American Economic Review. He has received several outstanding-research awards and is currently an Area Editor at Operations Research.\n\nFrancis has received many teaching awards for delivering classes to MBA and Executive MBA students at ESMT and INSEAD. He has extensive experience in executive education and corporate learning solutions\, and is a regular speaker at academic and industry forums.
URL:https://iora.nus.edu.sg/events/dao-isem-iora-seminar-series-francis-de-vericourt/
CATEGORIES:IORA Seminar Series
END:VEVENT
END:VCALENDAR