A case base of eXplainable Artificial Intelligence of the Things (XAIoT) systems

Citation Author(s):
Juan A.
Universidad Complutense de Madrid
Submitted by:
Juan Recio-Garcia
Last updated:
Fri, 04/21/2023 - 16:20
Data Format:
CSV


The increasing complexity of intelligent systems in the Internet of Things (IoT) domain makes it essential to explain their behavior and decision-making processes to users. However, selecting an appropriate explanation method for a particular intelligent system in this domain can be challenging, given the diverse range of available XAI (eXplainable Artificial Intelligence) methods and the heterogeneity of IoT applications. This dataset is a case base elicited from an exhaustive literature review on existing explanation solutions for AIoT (Artificial Intelligence of the Things) systems.


Standard CSV file.
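The case base can be loaded with Python's standard csv module. A minimal sketch, using an inline two-row sample that mimics the case base schema (the column headers and values below are assumptions for illustration; check the header row of the actual download):

```python
import csv
import io

# Illustrative in-memory sample; the real file would be opened with
# open("xaiot_case_base.csv") instead (file name is hypothetical).
sample = io.StringIO(
    "Domain,AI Model,AI Task,AI Problem,XAI method\n"
    "Healthcare,Neural Network (NN),Risk prediction,classification,SHAP\n"
    "Energy Management,Tree-Based Model (TB),Predictive maintenance,regression,LIME\n"
)

# DictReader maps each row to a dict keyed by the header columns.
reader = csv.DictReader(sample)
cases = list(reader)

for case in cases:
    print(case["Domain"], "->", case["XAI method"])
```

Each row is one case: a description of an AIoT system paired with the XAI method applied to it in the literature.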

Domains: Aviation, Energy Management, Environment, Healthcare, Industry, Security, Smart Agriculture, and Transportation

AI Model: Case-Based Reasoning (CBR), Ensemble Model (EM), Fuzzy Model (FM), Neuro-Fuzzy Model (NFM), Neural Network (NN), Nearest Neighbours Model (NNN), Tree-Based Model (TB), Unsorted Model (UM)

AI Task: Anomaly detection, Assistance, Automated maneuvering, Autonomous processes and robotics, Business Management, Cyber attack detection, Decision support, Facial recognition, Image processing, Internet of Behaviour, Intrusion detection, Modelling, Predictive maintenance, Process quality improvement, Recommendation, and Risk prediction

AI Problem: classification or regression

XAI method Concurrentness: ante-hoc or post-hoc

XAI method Scope: local or global

XAI method Portability: model-specific or model-agnostic

XAI method: ANFIS, FDE, LIME, SHAP, t-SNE, Integrated Gradients, OC-Tree, Ada-WHIPS, ALMMo-0*, ApparentFlow-net, CGP, CTree, Grad-CAM, KSL, LORE, Ontological Perturbation, RBIA, RetainVis, RPART, xDNN, Concept Attribution, Encoder-Decoder, HihO, SIDU, CIT2FS, iNNvestigate, SALIENCY (XAI-CBIR), CART, Feature Importance, FFT, J48, Prescience, DIFFI, GSX, Saliency Map, Decision Tree, Deep-SHAP, ELI5, XAI360*, CAM, TRUST, QMC, RuleFit, PDP, Attention Maps, RISE, XRAI.

XAI technique: Activation Clusters, Architecture Modification, Composite, Data-driven, Feature Relevance, Filter, Knowledge Extraction, Optimisation Based, Probabilistic, Simplification, and Statistics
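As a case base, the intended use is attribute-based retrieval: describe a new AIoT system as a query over the attributes above, find the most similar recorded cases, and reuse their XAI method. A minimal sketch with a toy case base and equal-weight exact matching (the case values and the similarity measure are illustrative assumptions, not the dataset's own retrieval logic):

```python
# Toy cases using the dataset's attribute vocabulary (values are illustrative).
cases = [
    {"Domain": "Healthcare", "AI Model": "NN", "AI Task": "Risk prediction",
     "AI Problem": "classification", "Scope": "local", "XAI method": "SHAP"},
    {"Domain": "Industry", "AI Model": "TB", "AI Task": "Predictive maintenance",
     "AI Problem": "regression", "Scope": "global", "XAI method": "PDP"},
    {"Domain": "Security", "AI Model": "EM", "AI Task": "Intrusion detection",
     "AI Problem": "classification", "Scope": "local", "XAI method": "LIME"},
]

def similarity(query, case):
    """Fraction of query attributes the case matches exactly."""
    matched = sum(case.get(k) == v for k, v in query.items())
    return matched / len(query)

def retrieve(query, cases, k=1):
    """Return the k cases most similar to the query."""
    return sorted(cases, key=lambda c: similarity(query, c), reverse=True)[:k]

query = {"Domain": "Healthcare", "AI Problem": "classification", "Scope": "local"}
best = retrieve(query, cases, k=1)[0]
print(best["XAI method"])  # → SHAP
```

A production CBR system would weight attributes and use graded (not just exact) similarity between values; this sketch only shows the retrieval shape the schema supports.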


Funding Agency: 
Grant Number: CHIST-ERA-19-XAI-008, PCI2020-120720-2

