[NLP] Intro
This section, under the NLP category, walks through the historical progression of the major models and techniques.
The timeline is as follows:
- Recurrent Neural Network (RNN) - Rumelhart et al. - 1986,
- Long Short-Term Memory (LSTM) - Hochreiter, Schmidhuber - 1997,
- Sequence to Sequence (Seq2Seq) - Sutskever, Vinyals, Le - 2014,
- Attention pt. 1 & Attention pt. 2 - Google - 2017,
- GPT-1, GPT-2, GPT-3 - OpenAI - 2018 to 2020,
- Bidirectional Encoder Representations from Transformers (BERT) - Google - 2018.
In addition, some other interesting models will be reviewed and explained in simpler terms.