DEFAULTS Dataset
Creators: Dana Kulic, Leimin Tian, Mor Vered, Sanjeev Nahulanthran
Publisher: Monash University
Type: Dataset
Language: English

Licence & Rights:

CC-BY-4.0

Full description

Facial expression recognition (FER) has emerged as a promising approach to developing emotion-aware intelligent agents and systems. However, key challenges remain in deploying FER in real-world contexts, including ensuring user understanding and establishing an appropriate level of user trust. We developed a novel explanation method that uses Facial Action Units (FAUs) to explain the output of a FER model through both textual and visual modalities. We conducted an empirical user study evaluating user understanding and trust, comparing our approach to state-of-the-art eXplainable AI (XAI) methods. Our results indicate that both combined visual-and-textual and textual-only FAU-based explanations led to better user understanding of the FER model. We also show that all modalities of FAU-based explanation improved users' appropriate trust in the FER model.
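
As an illustration of the FAU-based textual explanation described above, the following minimal Python sketch shows how detected Action Unit intensities might be mapped to a plain-language explanation of a FER prediction. The AU subset, threshold, and function names are illustrative assumptions, not the authors' implementation; any AU detector producing per-AU intensities (e.g. OpenFace-style output) could supply the inputs.

    # Illustrative sketch only: maps Facial Action Unit (FAU) intensities
    # to a textual explanation of a FER prediction. The AU subset,
    # threshold, and names are assumptions for this example.

    # Plain-language glosses for a few FACS Action Units.
    AU_DESCRIPTIONS = {
        "AU01": "inner brow raiser",
        "AU04": "brow lowerer (frown)",
        "AU06": "cheek raiser",
        "AU12": "lip corner puller (smile)",
    }

    def explain_prediction(emotion, au_intensities, threshold=0.5):
        """Build a textual explanation from the AUs active in the input face."""
        # Keep AUs above the activation threshold, strongest first.
        active = [AU_DESCRIPTIONS[au]
                  for au, value in sorted(au_intensities.items(),
                                          key=lambda kv: -kv[1])
                  if value >= threshold and au in AU_DESCRIPTIONS]
        if not active:
            return f"The model predicted '{emotion}', but no Action Units were strongly active."
        return (f"The model predicted '{emotion}' because the face shows "
                + ", ".join(active) + ".")

    # Example with made-up detector outputs:
    print(explain_prediction("happiness", {"AU06": 0.8, "AU12": 0.9, "AU04": 0.1}))

A visual variant of the same idea could highlight the facial regions associated with the active AUs, which corresponds to the visual modality evaluated in the study.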

Issued: 2025-02-19

Created: 2025-02-20

This dataset is part of a larger collection

Subjects

Facial Expression Recognition; Transparency; Trust; Explainability; Facial Action Unit; Affective computing; Fairness, accountability, transparency, trust and ethics of computer systems; Human-computer interaction

Identifiers

DOI: https://doi.org/10.26180/28443197.v2