Welcome to the official website of the Journal of Air Force Engineering University!

A Method of Enhancing Adversarial Example Transferability Based on NadaMax Update and Dynamic Regularization
CLC number: TP391.4

Fund project: National Natural Science Foundation of China (62402521)



    Abstract:

    To address the insufficient transferability of adversarial examples and the inadequate black-box attack capability against deep learning models, this study designs an iterative fast gradient method based on the NadaMax optimizer (NM-FGSM). The method integrates the advantages of Nesterov Accelerated Gradient and the Adamax optimizer, improving the accuracy of gradient updates through adaptive learning rates and lookahead momentum vectors. In addition, dynamic regularization is introduced to enhance the convexity of the problem, improving the stability and specificity of the algorithm. Experimental results demonstrate that NM-FGSM outperforms existing methods under various attack strategies, with attack success rates in advanced defense scenarios improved by 4% to 8%. The dynamically regularized loss function enhances the cross-model transferability of adversarial examples, further improving black-box attack effectiveness. Finally, future directions for optimizing the NM-FGSM algorithm and designing defense measures are discussed, offering new insights into the security of deep learning models.
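The abstract describes NM-FGSM as an iterative signed-gradient attack whose direction is shaped by a Nesterov lookahead point and an Adamax-style (infinity-norm) moment estimate. As a rough illustration only, the following is a minimal NumPy sketch of such an update. The exact update rule and the dynamic regularization term are defined in the paper itself and are not reproduced here, so every name and hyperparameter below (the `nm_fgsm` function, `beta1`, `beta2`, the epsilon-ball projection) is a hypothetical reconstruction, not the authors' implementation.

```python
import numpy as np

def nm_fgsm(x, grad_fn, eps=0.1, steps=10, beta1=0.9, beta2=0.999):
    """Hypothetical sketch of an NM-FGSM-style iterative attack:
    Nesterov lookahead combined with an Adamax-like moment update.

    x       : clean input (NumPy array)
    grad_fn : returns the gradient of the attack loss w.r.t. the input
    eps     : L-infinity perturbation budget
    """
    alpha = eps / steps                  # per-step budget
    x_adv = x.astype(float).copy()
    m = np.zeros_like(x_adv)             # first moment (momentum)
    u = np.zeros_like(x_adv)             # infinity-norm second moment (Adamax)
    for t in range(1, steps + 1):
        x_nes = x_adv + alpha * beta1 * m          # Nesterov lookahead point
        g = grad_fn(x_nes)                         # gradient at lookahead point
        m = beta1 * m + (1 - beta1) * g            # momentum accumulation
        u = np.maximum(beta2 * u, np.abs(g))       # Adamax max-based update
        m_hat = m / (1 - beta1 ** t)               # bias correction
        step = m_hat / (u + 1e-12)                 # adaptively scaled direction
        x_adv = x_adv + alpha * np.sign(step)      # FGSM-style signed step
        x_adv = x + np.clip(x_adv - x, -eps, eps)  # project back to eps-ball
    return x_adv
```

For example, with a toy linear loss whose gradient is a constant vector `w`, the sketch moves each coordinate toward `sign(w)` while keeping the perturbation inside the epsilon-ball:

```python
w = np.array([1.0, -2.0, 3.0])
x_adv = nm_fgsm(np.zeros(3), lambda z: w, eps=0.1, steps=5)
```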

Cite this article

宋亚飞, 仇文博, 王艺菲, 冯存前. A Method of Enhancing Adversarial Example Transferability Based on NadaMax Update and Dynamic Regularization [J]. Journal of Air Force Engineering University, 2025, 26(3): 119-127.

History
  • Online publication date: 2025-06-04