Title: HyDRA: Hypergradient Data Relevance Analysis for Interpreting Deep Neural Networks
Authors: Chen, Yuanyuan
Li, Boyang
Yu, Han
Wu, Pengcheng
Miao, Chunyan
Keywords: Engineering::Computer science and engineering
Issue Date: 2020
Source: Chen, Y., Li, B., Yu, H., Wu, P. & Miao, C. (2020). HyDRA: hypergradient data relevance analysis for interpreting deep neural networks. 35th AAAI Conference on Artificial Intelligence (AAAI 2021).
Project: Alibaba-NTU-AIR2019B1
Abstract: The behaviors of deep neural networks (DNNs) are notoriously resistant to human interpretations. In this paper, we propose Hypergradient Data Relevance Analysis, or HYDRA, which interprets the predictions made by DNNs as effects of their training data. Existing approaches generally estimate data contributions around the final model parameters and ignore how the training data shape the optimization trajectory. By unrolling the hypergradient of test loss w.r.t. the weights of training data, HYDRA assesses the contribution of training data toward test data points throughout the training trajectory. In order to accelerate computation, we remove the Hessian from the calculation and prove that, under moderate conditions, the approximation error is bounded. Corroborating this theoretical claim, empirical results indicate the error is indeed small. In addition, we quantitatively demonstrate that HYDRA outperforms influence functions in accurately estimating data contribution and detecting noisy data labels. The source code is available at
Rights: © 2021 Association for the Advancement of Artificial Intelligence. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Conference Papers

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.