An invitation to the Julia-XAI ecosystem
2024-07-11
What kind of explanations?
Post-hoc, local input-space explanations of black-box models:
“Which part of the input is responsible for the model’s output?”
Related work: Taija ecosystem by Patrick Altmeyer et al.
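To make this concrete, here is a minimal sketch (toy model and input, plain Flux; not from the original slides) of an input-space attribution: the gradient of the selected output with respect to the input scores each input feature's contribution.

using Flux

model = Chain(Dense(10 => 32, relu), Dense(32 => 5))  # toy "black-box" classifier
x = rand(Float32, 10)                                 # toy input

y = model(x)
c = argmax(y)                                         # index of the predicted class

# "Which part of the input is responsible for the model's output?"
# One simple answer: the gradient of the class score with respect to the input.
saliency = gradient(xi -> model(xi)[c], x)[1]         # same shape as x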
using Flux, Metalhead

# Load pretrained VGG-19 (downloads the vgg19-IMAGENET1K_V1 artifact on first use)
model = VGG(19; pretrain=true).layers
testmode!(model)                   # inference mode: disable dropout etc.

prediction = model(input)          # `input`: preprocessed input image (preprocessing not shown)
class = argmax(prediction)         # CartesianIndex(484, 1) -> class 484: "castle"
Pretrained model predicts class “castle”. But why?
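A sketch of the next step using the Julia-XAI packages; the analyzer name and the analyze/heatmap calls follow my reading of the ExplainableAI.jl and VisionHeatmaps.jl APIs, so consult the package docs for the exact signatures.

using ExplainableAI
using VisionHeatmaps

analyzer = InputTimesGradient(model)   # one of several gradient-based analyzers
expl = analyze(input, analyzer)        # explanation for the top predicted class
heatmap(expl)                          # render the attribution over the input image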

Visualize explanations on language models
Interface definition:
Explanation methods subtype AbstractXAIMethod.
The analyze function applies a method to an input and returns an Explanation.
Benefits: any method that implements this interface composes with the rest of the ecosystem, e.g. its heatmap visualization.
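Roughly, the contract looks like this from the user's side (function and field names as I recall them from XAIBase.jl; treat them as assumptions and check the XAIBase.jl documentation):

using ExplainableAI               # provides methods built on the XAIBase.jl interface

method = Gradient(model)          # method isa AbstractXAIMethod
expl = analyze(input, method)     # the same call works for every method
expl.val                          # attribution values, same shape as `input`
expl.output                       # the model output that was explained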
Adrian Hill acknowledges support by the Federal Ministry of Education and Research (BMBF) for the Berlin Institute for the Foundations of Learning and Data (BIFOLD) (01IS18037A).