Attention Mechanisms


Sequence-to-sequence (Seq2Seq) learning is all about models that take a sequence as input and produce a sequence as output. It has many applications, but here I will focus on one in particular: machine translation, for example from English to Hindi.

There are many approaches you can follow to solve this task, but the state-of-the-art technique is the Encoder-Decoder approach (more on this later).
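To give a feel for the pattern before diving into details, here is a minimal, purely illustrative sketch of the encoder-decoder idea: the encoder compresses the variable-length input sentence into a fixed representation, and the decoder generates the output sequence from that representation. Everything below (the tiny vocabulary, the lookup-table "decoder") is made up for illustration; a real system uses learned RNN or Transformer weights, not lookups.

```python
# Hypothetical toy vocab and translation table, purely for illustration.
ENCODER_VOCAB = {"i": 0, "love": 1, "you": 2}
DECODER_TABLE = {(0, 1, 2): ["main", "tumse", "pyar", "karta", "hoon"]}

def encode(sentence):
    # Encoder: map the variable-length English input to a fixed representation.
    # Here that is just a tuple of token ids; a real encoder produces a dense
    # context vector (or a sequence of hidden states).
    return tuple(ENCODER_VOCAB[w] for w in sentence.lower().split())

def decode(context):
    # Decoder: generate the target-language sequence conditioned on the context.
    # Here a dictionary lookup stands in for token-by-token generation.
    return DECODER_TABLE.get(context, ["<unk>"])

print(" ".join(decode(encode("I love you"))))
```

The key point this sketch captures is the interface, not the mechanics: the decoder never sees the source words directly, only whatever the encoder handed it, which is exactly the bottleneck that attention was later introduced to relieve.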
