
DnX and FastDnX

This is the repository for Distill n' Explain: explaining graph neural networks using simple surrogates (AISTATS 2023). It contains the source code for using DnX and FastDnX to explain node predictions.

Authors: T. A. Pereira, E. J. Nascimento, L. E. Resck, D. Mesquita, A. H. Souza

Requirements

We built this implementation using:

  • PyTorch
  • PyTorch Geometric

Folder description:

  • trained_gcn: contains the trained models to be explained
  • trained_distiller: contains the distilled surrogate models
  • explanation: contains the explanations generated by our methods

File description:

  • model.py: the surrogate model used for distillation
  • utils.py: functions to load, save, and evaluate the datasets
  • main_distiller.py: functions to distill the models to be explained
  • main_syn.ipynb: notebook that generates explanations for the synthetic datasets
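The core idea of the distillation step in main_distiller.py is to fit a simple surrogate to mimic a trained GCN's node predictions. A minimal, hypothetical sketch of that idea is below; the `Surrogate` class, the hop count, the KL objective, and the random stand-in data are illustrative assumptions, not the repository's exact code (which loads real models from trained_gcn).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-in data: in the real pipeline, X and A_hat come from a dataset and
# teacher_logits from the trained GCN in trained_gcn/.
torch.manual_seed(0)
num_nodes, num_feats, num_classes = 100, 16, 4
X = torch.randn(num_nodes, num_feats)
A_hat = torch.eye(num_nodes)          # placeholder for the normalized adjacency
teacher_logits = torch.randn(num_nodes, num_classes)

class Surrogate(nn.Module):
    """Simple SGC-style surrogate: feature propagation + one linear layer."""
    def __init__(self, in_dim, out_dim, hops=2):
        super().__init__()
        self.hops = hops
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, a):
        for _ in range(self.hops):
            x = a @ x                  # propagate features along edges
        return self.lin(x)

model = Surrogate(num_feats, num_classes)
opt = torch.optim.Adam(model.parameters(), lr=0.01)
losses = []
for _ in range(200):
    opt.zero_grad()
    log_p = F.log_softmax(model(X, A_hat), dim=1)
    # Match the surrogate's distribution to the teacher's soft predictions.
    loss = F.kl_div(log_p, F.softmax(teacher_logits, dim=1),
                    reduction="batchmean")
    loss.backward()
    opt.step()
    losses.append(loss.item())
```

Once distilled, the surrogate's simple linear structure is what makes explanations cheap to extract.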

Datasets:

The shorthand nomenclature (syn1, syn2, ..., syn6) is used to simplify the experiments:

  • BA-HouseShapes (syn1),
  • BA-Community (syn2),
  • BA-Grids (syn3),
  • Tree-Cycles (syn4),
  • Tree-Grids (syn5),
  • BA-Bottle-Shaped (syn6)
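For reference, the mapping above can be kept in a small dictionary. This is a hypothetical helper for convenience; the repository may encode the shorthand differently.

```python
# Shorthand names used in the experiments mapped to their datasets.
SYN_DATASETS = {
    "syn1": "BA-HouseShapes",
    "syn2": "BA-Community",
    "syn3": "BA-Grids",
    "syn4": "Tree-Cycles",
    "syn5": "Tree-Grids",
    "syn6": "BA-Bottle-Shaped",
}

def dataset_name(shorthand: str) -> str:
    """Resolve a synI shorthand to its full dataset name."""
    return SYN_DATASETS[shorthand]
```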

To run the DnX and FastDnX explainers on the synthetic datasets, simply run the notebook main_syn.ipynb.

Cite

@inproceedings{distillnexplain2023,
  title={Distill n' Explain: explaining graph neural networks using simple surrogates},
  author={T. A. Pereira and E. J. Nascimento and L. E. Resck and D. Mesquita and A. H. Souza},
  booktitle={Artificial Intelligence and Statistics (AISTATS)},
  year={2023}
}
