Tag: Attention

Attention is all you need

Journal/Conference: NIPS
Year (published): 2017
Authors: Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
Subject: NLP, Attention

What is the Transformer?

The Transformer is the model introduced in the paper "Attention is all you need", published by Google. The passage below is an excerpt from the paper's abstract.

The dominant sequence transduction models are based on complex recurrent or convolutional neural networks that include an encoder and a decoder.