These tutorials show how Argilla can be used in combination with SHAP.
SHAP (SHapley Additive exPlanations) is a game-theoretic approach to explaining the output of any machine learning model.
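The game-theoretic idea behind SHAP is the Shapley value: each feature's attribution is its average marginal contribution to the prediction over all possible feature coalitions. As a minimal, library-free sketch (a toy value function standing in for a real model; the feature names and payoffs are invented for illustration):

```python
from itertools import combinations
from math import factorial

def shapley_values(features, value_fn):
    """Exact Shapley values: each feature's weighted average marginal
    contribution over all coalitions of the remaining features."""
    n = len(features)
    phi = {}
    for i in features:
        others = [f for f in features if f != i]
        total = 0.0
        for k in range(n):
            for subset in combinations(others, k):
                # Shapley weight for a coalition of size k
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (value_fn(set(subset) | {i}) - value_fn(set(subset)))
        phi[i] = total
    return phi

# Hypothetical "model" output as a function of which features are present.
def value_fn(coalition):
    score = 0.0
    if "len" in coalition:
        score += 1.0
    if "sentiment" in coalition:
        score += 2.0
    if {"len", "sentiment"} <= coalition:
        score += 0.5  # interaction effect, shared between both features
    return score

phi = shapley_values(["len", "sentiment"], value_fn)
print(phi)  # attributions sum to value_fn(all) - value_fn(empty) = 3.5
```

The `shap` library approximates exactly this computation efficiently for real models; the tutorials below show how to log the resulting attributions to Argilla for inspection.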

๐Ÿ•ต๏ธโ€โ™€๏ธ Analyzing predictions with model explainability methods

MLOps Steps: Monitoring
NLP Tasks: TextClassification
Libraries: shap, transformers-interpret
Techniques: Explainability