This project aims to implement the LRP rules for any TensorFlow 1.4 graph consisting of simple components such as linear layers, convolutions, LSTMs, and simple pooling layers. See Status for how far we have come. Currently supported operations:
- Linear layers
- Convolutions
- Max pooling
- Nonlinearities
- LSTM
- Concatenation
- Split
- Tile
- Reshaping (Reshape, Expand_dims, Squeeze)
- Sparse matrix multiplication
- Sparse reshape
A simple example of using the framework:

```python
import tensorflow as tf

from lrp import lrp
# Also import LRPConfiguration, AlphaBetaConfiguration and LAYER from the
# framework's configuration module.

with g.as_default():  # g is the tf.Graph that contains your model
    inp = ...   # input tensor of the graph
    pred = ...  # output (prediction) tensor of the graph

    # Set the propagation rule for, e.g., linear layers
    config = LRPConfiguration()
    config.set(LAYER.LINEAR, AlphaBetaConfiguration(alpha=2, beta=-1))

    # Calculate the relevance scores using lrp
    expl = lrp.lrp(inp, pred, config)

with tf.Session(graph=g) as sess:
    # Compute prediction and explanation
    prediction, explanation = sess.run([pred, expl], feed_dict={inp: ...})
```
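For a fuller picture, below is a minimal end-to-end sketch that builds a toy TensorFlow 1.x graph out of components from the list above (convolution, nonlinearity, max pooling, reshape, linear layer) and runs LRP on it. The toy model, the input shapes, and the import path `lrp.configuration` are assumptions made for illustration; only `lrp.lrp`, `LRPConfiguration`, `AlphaBetaConfiguration`, and `LAYER.LINEAR` come from the example above.

```python
import numpy as np
import tensorflow as tf

from lrp import lrp
# Assumed import path; adjust to wherever the configuration classes live.
from lrp.configuration import LRPConfiguration, AlphaBetaConfiguration, LAYER

g = tf.Graph()
with g.as_default():
    # Toy model: 28x28 grayscale images -> conv -> ReLU -> max pool -> linear
    inp = tf.placeholder(tf.float32, shape=(None, 28, 28, 1), name='input')
    conv = tf.layers.conv2d(inp, filters=8, kernel_size=3, padding='same',
                            activation=tf.nn.relu)
    pool = tf.layers.max_pooling2d(conv, pool_size=2, strides=2)
    flat = tf.reshape(pool, (-1, 14 * 14 * 8))
    pred = tf.layers.dense(flat, units=10)

    # Alpha-beta rule for the linear layer, as in the example above
    config = LRPConfiguration()
    config.set(LAYER.LINEAR, AlphaBetaConfiguration(alpha=2, beta=-1))

    # Relevance scores for the input
    expl = lrp.lrp(inp, pred, config)

with tf.Session(graph=g) as sess:
    sess.run(tf.global_variables_initializer())
    batch = np.random.rand(4, 28, 28, 1).astype(np.float32)
    prediction, explanation = sess.run([pred, expl], feed_dict={inp: batch})
    print(prediction.shape, explanation.shape)
```

Because `lrp.lrp` returns a tensor in the same graph, the prediction and the explanation can be evaluated in a single `sess.run` call, as in the example above.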