Guidotti / Schmid / Longo | Explainable Artificial Intelligence | Book | 978-3-032-08332-6

Book, English, 414 pages, format (W × H): 155 mm × 235 mm, weight: 657 g

Series: Communications in Computer and Information Science

Guidotti / Schmid / Longo

Explainable Artificial Intelligence

Third World Conference, xAI 2025, Istanbul, Turkey, July 9-11, 2025, Proceedings, Part V
Publication year: 2025
ISBN: 978-3-032-08332-6
Publisher: Springer



This open access five-volume set constitutes the refereed proceedings of the Third World Conference on Explainable Artificial Intelligence, xAI 2025, held in Istanbul, Turkey, during July 9-11, 2025.


The 96 revised full papers presented in these proceedings were carefully reviewed and selected from 224 submissions. The papers are organized in the following topical sections:

Volume I:

Concept-based explainable AI; human-centered explainability; explainability, privacy, and fairness in trustworthy AI; and XAI in healthcare.

Volume II:

Rule-based XAI systems & actionable explainable AI; feature importance-based XAI; novel post-hoc & ante-hoc XAI approaches; and XAI for scientific discovery.

Volume III:

Generative AI meets explainable AI; intrinsically interpretable explainable AI; benchmarking and XAI evaluation measures; and XAI for representational alignment.

Volume IV:

XAI in computer vision; counterfactuals in XAI; explainable sequential decision making; and explainable AI in finance & legal frameworks for XAI technologies.

Volume V:

Applications of XAI; human-centered XAI & argumentation; explainable and interactive hybrid decision making; and uncertainty in explainable AI.


Target audience


Research

Further information & material


Applications of XAI:
- Global Explanations of Expected Goal Models in Football
- Comprehensive Explanations Using Natural Language Queries
- A Human-in-the-Loop Approach to Learning Social Norms as Defeasible Logical Constraints
- A Cautionary Tale About "Neutrally" Informative AI Tools Ahead of the 2025 Federal Elections in Germany

Human-Centered XAI & Argumentation:
- Evaluating Argumentation Graphs as Global Explainable Surrogate Models for Dense Neural Networks and their Comparison with Decision Trees
- Mind the XAI Gap: A Human-Centered LLM Framework for Democratizing Explainable AI
- Explanations for Medical Diagnosis Predictions Based on Argumentation Schemes
- Spectral Occlusion - Attribution Beyond Spatial Relevance Heatmaps
- Non-experts' Trust in XAI is Unreasonably High

Explainable and Interactive Hybrid Decision Making:
- Exploring Annotator Disagreement in Sexism Detection: Insights from Explainable AI
- Can You Regulate Your Emotions? An Empirical Investigation of the Influence of AI Explanations and Emotion Regulation on Human Decision-Making Factors
- When Bias Backfires: The Modulatory Role of Counterfactual Explanations on the Adoption of Algorithmic Bias in XAI-Supported Human Decision-Making
- Understanding Disagreement Between Humans and Machines in XAI: Robustness, Fidelity, and Region-Based Explanations in Automatic Neonatal Pain Assessment
- On Combining Embeddings, Ontology and LLM to Retrieve Semantically Similar Quranic Verses and Generate their Explanations

Uncertainty in Explainable AI:
- Improving Counterfactual Truthfulness for Molecular Property Prediction through Uncertainty Quantification
- Fast Calibrated Explanations: Efficient and Uncertainty-Aware Explanations for Machine Learning Models
- Explaining Low Perception Model Competency with High-Competency Counterfactuals
- Uncertainty Propagation in XAI: A Comparison of Analytical and Empirical Estimators


