At this week's Journal Club session, Na Helian will talk about "Attention in Neural Networks".
Based on the papers below, she will introduce different types of attention techniques for natural language and image processing applications.
Papers:
- A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. Gomez, Ł. Kaiser, I. Polosukhin, "Attention Is All You Need", 2017, Advances in Neural Information Processing Systems, 30.
- A. Dosovitskiy, L. Beyer, A. Kolesnikov, D. Weissenborn, X. Zhai, T. Unterthiner, M. Dehghani, M. Minderer, G. Heigold, S. Gelly, J. Uszkoreit, N. Houlsby, "An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale", 2021, International Conference on Learning Representations.
- Q. Wang, B. Wu, P. Zhu, P. Li, W. Zuo, Q. Hu, "ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks", 2020, IEEE/CVF Conference on Computer Vision and Pattern Recognition.
- S. Woo, J. Park, J. Lee, I. Kweon, "CBAM: Convolutional Block Attention Module", 2018, European Conference on Computer Vision.
Date: 2024/04/12
Time: 14:00
Location: C258 & online