Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/147652
Title: HyDRA: hypergradient data relevance analysis for interpreting deep neural networks.
Authors: Chen, Yuanyuan
Li, Boyang
Yu, Han
Wu, Pengcheng
Miao, Chunyan
Issue Date: 2020
Source: Chen, Y., Li, B., Yu, H., Wu, P. & Miao, C. (2020). HyDRA: hypergradient data relevance analysis for interpreting deep neural networks. 35th AAAI Conference on Artificial Intelligence (AAAI 2021).
Project: Alibaba-NTU-AIR2019B1
NSC-2019-011
AISG-GC-2019-003
A20G8b0102
NRFI05-2019-0002
Conference: 35th AAAI Conference on Artificial Intelligence (AAAI 2021)
Abstract: The behaviors of deep neural networks (DNNs) are notoriously resistant to human interpretation. In this paper, we propose Hypergradient Data Relevance Analysis, or HYDRA, which interprets the predictions made by DNNs as effects of their training data. Existing approaches generally estimate data contributions around the final model parameters and ignore how the training data shape the optimization trajectory. By unrolling the hypergradient of the test loss w.r.t. the weights of the training data, HYDRA assesses the contribution of training data toward test data points throughout the training trajectory. To accelerate computation, we remove the Hessian from the calculation and prove that, under moderate conditions, the approximation error is bounded. Corroborating this theoretical claim, empirical results indicate the error is indeed small. In addition, we quantitatively demonstrate that HYDRA outperforms influence functions in accurately estimating data contribution and detecting noisy data labels. The source code is available at https://github.com/cyyever/aaaihydra8686.
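The core idea in the abstract — scoring each training example by how its weight influences the test loss across the whole SGD trajectory, with Hessian terms dropped — can be illustrated with a toy first-order sketch. This is not the authors' implementation (see their repository above); all names, data, and the logistic-regression setup below are illustrative assumptions. With the Hessian removed, each example's contribution reduces to an accumulated inner product between its training gradient and the test-loss gradient at every step it is used:

```python
import numpy as np

# Toy logistic-regression setup; data and dimensions are illustrative.
rng = np.random.default_rng(0)
n, d = 20, 5
X = rng.normal(size=(n, d))
y = (X @ rng.normal(size=d) > 0).astype(float)
x_test = rng.normal(size=d)
y_test = 1.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad_loss(w, x, t):
    # Gradient of the binary cross-entropy loss for a single example.
    return (sigmoid(x @ w) - t) * x

w = np.zeros(d)
lr = 0.1
contrib = np.zeros(n)  # per-example relevance score

for epoch in range(30):
    for i in range(n):
        g_i = grad_loss(w, X[i], y[i])
        # Hessian-free (first-order) relevance: how much this SGD step
        # changes the test loss, accumulated over the trajectory.
        g_test = grad_loss(w, x_test, y_test)
        contrib[i] += lr * (g_test @ g_i)
        w -= lr * g_i

# A large positive contrib[i] suggests example i helped reduce the test
# loss; strongly negative scores flag harmful (e.g. mislabeled) examples.
```

In this simplified view the score coincides with trajectory-based gradient-similarity methods; the paper's full hypergradient derivation additionally accounts for how earlier steps propagate through later ones, and bounds the error incurred by dropping the Hessian.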
URI: https://hdl.handle.net/10356/147652
Schools: School of Computer Science and Engineering 
Research Centres: Alibaba-NTU Joint Research Institute
Rights: © 2021 Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Conference Papers
