Standard
GradAuto: Energy-oriented Attack on Dynamic Neural Networks. / Pan, Jianhong; Zheng, Qichen; Fan, Zhipeng et al.
European Conference on Computer Vision (ECCV). ed. / Shai Avidan; Gabriel Brostow; Moustapha Cissé; Giovanni Maria Farinella; Tal Hassner. Springer, 2022. p. 637-653 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 13664).
Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review
Harvard
Pan, J, Zheng, Q, Fan, Z, Rahmani, H, Ke, Q & Liu, J 2022, GradAuto: Energy-oriented Attack on Dynamic Neural Networks. in S Avidan, G Brostow, M Cissé, GM Farinella & T Hassner (eds), European Conference on Computer Vision (ECCV). Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 13664, Springer, pp. 637-653, 17th European Conference on Computer Vision, ECCV 2022, Tel Aviv, Israel, 23/10/22.
https://doi.org/10.1007/978-3-031-19772-7_37
APA
Pan, J., Zheng, Q., Fan, Z., Rahmani, H., Ke, Q., & Liu, J. (2022). GradAuto: Energy-oriented Attack on Dynamic Neural Networks. In S. Avidan, G. Brostow, M. Cissé, G. M. Farinella, & T. Hassner (Eds.), European Conference on Computer Vision (ECCV) (pp. 637-653). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 13664). Springer.
https://doi.org/10.1007/978-3-031-19772-7_37
Vancouver
Pan J, Zheng Q, Fan Z, Rahmani H, Ke Q, Liu J. GradAuto: Energy-oriented Attack on Dynamic Neural Networks. In Avidan S, Brostow G, Cissé M, Farinella GM, Hassner T, editors, European Conference on Computer Vision (ECCV). Springer. 2022. p. 637-653. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). doi: 10.1007/978-3-031-19772-7_37
Author
Pan, Jianhong ; Zheng, Qichen ; Fan, Zhipeng et al. / GradAuto : Energy-oriented Attack on Dynamic Neural Networks. European Conference on Computer Vision (ECCV). editor / Shai Avidan ; Gabriel Brostow ; Moustapha Cissé ; Giovanni Maria Farinella ; Tal Hassner. Springer, 2022. pp. 637-653 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)).
Bibtex
@inproceedings{56a2359fffb3449abc6b263cb7d827a0,
title = "GradAuto: Energy-oriented Attack on Dynamic Neural Networks",
abstract = "Dynamic neural networks can adapt their structures or parameters based on different inputs. By reducing the computation redundancy for certain samples, they can greatly improve computational efficiency without compromising accuracy. In this paper, we investigate the robustness of dynamic neural networks against energy-oriented attacks. We present a novel algorithm, named GradAuto, to attack both dynamic depth and dynamic width models, where dynamic depth networks reduce redundant computation by skipping some intermediate layers while dynamic width networks adaptively activate a subset of neurons in each layer. Our GradAuto carefully adjusts the direction and the magnitude of the gradients to efficiently find an almost imperceptible perturbation for each input, which will activate more computation units during inference. In this way, GradAuto effectively boosts the computational cost of models with dynamic architectures. Compared to previous energy-oriented attack techniques, GradAuto obtains state-of-the-art results and recovers 100% of the FLOPs reduced by the dynamic network, on average, for both dynamic depth and dynamic width models. Furthermore, we demonstrate that GradAuto offers great control over the attacking process and could serve as one of the keys to unlocking the potential of energy-oriented attacks. Please visit https://github.com/JianhongPan/GradAuto for code.",
author = "Jianhong Pan and Qichen Zheng and Zhipeng Fan and Hossein Rahmani and Qiuhong Ke and Jun Liu",
year = "2022",
month = oct,
day = "28",
doi = "10.1007/978-3-031-19772-7_37",
language = "English",
isbn = "9783031197710",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer",
pages = "637--653",
editor = "Avidan, Shai and Brostow, Gabriel and Ciss{\'e}, Moustapha and Farinella, {Giovanni Maria} and Hassner, Tal",
booktitle = "European Conference on Computer Vision (ECCV)",
note = "17th European Conference on Computer Vision, ECCV 2022 ; Conference date: 23-10-2022 Through 27-10-2022",
}
RIS
TY - GEN
T1 - GradAuto: Energy-oriented Attack on Dynamic Neural Networks
T2 - 17th European Conference on Computer Vision, ECCV 2022
AU - Pan, Jianhong
AU - Zheng, Qichen
AU - Fan, Zhipeng
AU - Rahmani, Hossein
AU - Ke, Qiuhong
AU - Liu, Jun
PY - 2022/10/28
Y1 - 2022/10/28
N2 - Dynamic neural networks can adapt their structures or parameters based on different inputs. By reducing the computation redundancy for certain samples, they can greatly improve computational efficiency without compromising accuracy. In this paper, we investigate the robustness of dynamic neural networks against energy-oriented attacks. We present a novel algorithm, named GradAuto, to attack both dynamic depth and dynamic width models, where dynamic depth networks reduce redundant computation by skipping some intermediate layers while dynamic width networks adaptively activate a subset of neurons in each layer. Our GradAuto carefully adjusts the direction and the magnitude of the gradients to efficiently find an almost imperceptible perturbation for each input, which will activate more computation units during inference. In this way, GradAuto effectively boosts the computational cost of models with dynamic architectures. Compared to previous energy-oriented attack techniques, GradAuto obtains state-of-the-art results and recovers 100% of the FLOPs reduced by the dynamic network, on average, for both dynamic depth and dynamic width models. Furthermore, we demonstrate that GradAuto offers great control over the attacking process and could serve as one of the keys to unlocking the potential of energy-oriented attacks. Please visit https://github.com/JianhongPan/GradAuto for code.
AB - Dynamic neural networks can adapt their structures or parameters based on different inputs. By reducing the computation redundancy for certain samples, they can greatly improve computational efficiency without compromising accuracy. In this paper, we investigate the robustness of dynamic neural networks against energy-oriented attacks. We present a novel algorithm, named GradAuto, to attack both dynamic depth and dynamic width models, where dynamic depth networks reduce redundant computation by skipping some intermediate layers while dynamic width networks adaptively activate a subset of neurons in each layer. Our GradAuto carefully adjusts the direction and the magnitude of the gradients to efficiently find an almost imperceptible perturbation for each input, which will activate more computation units during inference. In this way, GradAuto effectively boosts the computational cost of models with dynamic architectures. Compared to previous energy-oriented attack techniques, GradAuto obtains state-of-the-art results and recovers 100% of the FLOPs reduced by the dynamic network, on average, for both dynamic depth and dynamic width models. Furthermore, we demonstrate that GradAuto offers great control over the attacking process and could serve as one of the keys to unlocking the potential of energy-oriented attacks. Please visit https://github.com/JianhongPan/GradAuto for code.
U2 - 10.1007/978-3-031-19772-7_37
DO - 10.1007/978-3-031-19772-7_37
M3 - Conference contribution/Paper
SN - 9783031197710
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 637
EP - 653
BT - European Conference on Computer Vision (ECCV)
A2 - Avidan, Shai
A2 - Brostow, Gabriel
A2 - Cissé, Moustapha
A2 - Farinella, Giovanni Maria
A2 - Hassner, Tal
PB - Springer
Y2 - 23 October 2022 through 27 October 2022
ER -
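
To make the energy-oriented attack described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch. It assumes a dynamic network whose forward pass exposes per-layer gating activations; the (logits, gates) interface and the epsilon, alpha, and steps values are illustrative assumptions, not the actual GradAuto API. The perturbation is optimized PGD-style to push the gates open and thereby increase the FLOPs spent at inference. The real GradAuto additionally rectifies the direction and magnitude of the gradients, which this sketch does not reproduce; see https://github.com/JianhongPan/GradAuto for the authors' implementation.

import torch

def energy_oriented_attack(model, x, epsilon=8 / 255, alpha=1 / 255, steps=50):
    """Hypothetical PGD-style energy attack on a dynamic network.

    Assumes model(x) returns (logits, gates), where gates is a list of
    per-layer gating activations in [0, 1] (this interface is an assumption,
    not the GradAuto code). The perturbation maximizes total gate activation,
    i.e. tries to switch on as many computation units as possible, while
    staying inside an L-infinity ball of radius epsilon around x.
    """
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        _, gates = model(x_adv)
        # More open gates => more executed layers/channels => more FLOPs.
        energy = sum(g.mean() for g in gates)
        grad = torch.autograd.grad(energy, x_adv)[0]
        with torch.no_grad():
            # Gradient ascent on the energy objective, then projection back
            # into the epsilon-ball and the valid pixel range.
            x_adv = x_adv + alpha * grad.sign()
            x_adv = torch.min(torch.max(x_adv, x - epsilon), x + epsilon)
            x_adv = x_adv.clamp(0.0, 1.0)
        x_adv = x_adv.detach()
    return x_adv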