# Neural NLP

## **CONVOLUTIONAL NEURAL NETS (CNN)**

1. [**CNN for text**](https://medium.com/@TalPerry/convolutional-methods-for-text-d5260fd5675f) **- Tal Perry**
2. [**1D CNN using KERAS**](https://blog.goodaudience.com/introduction-to-1d-convolutional-neural-networks-in-keras-for-time-sequences-3a7ff801a2cf)
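The core idea behind 1D CNNs for text is a filter sliding over consecutive token embeddings, followed by pooling. A minimal NumPy sketch (random embeddings and a single filter of width 3, all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

seq_len, embed_dim, kernel_size = 7, 4, 3
embeddings = rng.normal(size=(seq_len, embed_dim))  # one embedded sentence
kernel = rng.normal(size=(kernel_size, embed_dim))  # one convolutional filter

# Slide the filter over every window of 3 consecutive token embeddings;
# each window produces one scalar feature.
feature_map = np.array([
    np.sum(embeddings[i:i + kernel_size] * kernel)
    for i in range(seq_len - kernel_size + 1)
])

# Global max pooling reduces the feature map to one scalar per filter,
# regardless of the input sequence length.
pooled = feature_map.max()
print(feature_map.shape)  # (5,)
```

In a real Keras model this corresponds to `Conv1D` followed by `GlobalMaxPooling1D`, with many filters instead of one.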

## **SEQ2SEQ SEQUENCE TO SEQUENCE**

1. [**Keras blog**](https://blog.keras.io/a-ten-minute-introduction-to-sequence-to-sequence-learning-in-keras.html) **- char-level and token-level (with an embedding layer), teacher forcing**
2. [**Teacher forcing explained**](https://towardsdatascience.com/what-is-teacher-forcing-3da6217fed1c)
3. [**Same as keras but with token-level**](https://towardsdatascience.com/machine-translation-with-the-seq2seq-model-different-approaches-f078081aaa37)
4. [**Medium on char, word, byte-level**](https://medium.com/@petepeeradejtanruangporn/experimenting-with-neural-machine-translation-for-thai-1681fd2b375a)
5. [**Mastery on enc-dec using the keras method**](https://machinelearningmastery.com/develop-encoder-decoder-model-sequence-sequence-prediction-keras/)**, and on** [**neural translation**](https://machinelearningmastery.com/define-encoder-decoder-sequence-sequence-model-neural-machine-translation-keras/)
6. [**Machine translation git from eng to jap**](https://github.com/samurainote/seq2seq_translate_slackbot/blob/master/seq2seq_translate.py)**,** [**another**](https://github.com/samurainote/seq2seq_translate_slackbot)**, and its** [**medium**](https://towardsdatascience.com/how-to-implement-seq2seq-lstm-model-in-keras-shortcutnlp-6f355f3e5639)
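The teacher-forcing setup used in the Keras tutorials above comes down to how the decoder's inputs and targets are prepared: the decoder input is the target sequence shifted right by one, prefixed with a start token. A minimal sketch, assuming integer token ids with `0` = start and `1` = end (ids are illustrative):

```python
# Target sentence as token ids, ending with the <end> token (id 1).
target = [5, 8, 3, 1]

# Decoder input: <start> (id 0) followed by the target shifted right by one.
decoder_input = [0] + target[:-1]

# Decoder output: the model is trained to predict the target itself,
# one step ahead of its input.
decoder_output = target

# At each step t the decoder is fed the ground-truth token from step t-1,
# not its own previous prediction - that is teacher forcing.
print(decoder_input)   # [0, 5, 8, 3]
print(decoder_output)  # [5, 8, 3, 1]
```

At inference time there is no ground truth to feed, so the decoder instead consumes its own predictions step by step, which is why the tutorials build a separate inference model.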

![](https://lh6.googleusercontent.com/bcrIRzPLlcnQBl1zWR2s0_tB-NNEQxd8ZNQK8oK2NJsc29Fv6RdfKynfjHeNsSvl5d0SqK55k8xN1NAIrvEcnFEtpfZCfOHZzCSFLKmxeBWXn903VOJKiKTMV4Ynm_HL6Sgls2BN)

1. [**Incorporating Copying Mechanism in Sequence-to-Sequence Learning**](https://arxiv.org/abs/1603.06393) **- incorporates copying into neural network-based seq2seq learning with a new encoder-decoder model called CopyNet. CopyNet integrates the regular word-generation path of the decoder with a copying mechanism that can select sub-sequences from the input sequence and place them at the proper positions in the output sequence.**
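The essence of the copy mechanism is mixing two distributions: the decoder's usual generation distribution over the vocabulary, and a copy distribution obtained by scattering attention weights over the source tokens. A minimal NumPy sketch (vocabulary, attention weights, and mixing coefficient are all made up for illustration, not taken from the paper's exact parameterization):

```python
import numpy as np

vocab = ["the", "cat", "sat", "Tokyo"]
source_tokens = ["Tokyo", "cat"]        # input sequence available for copying

p_gen = np.array([0.4, 0.3, 0.2, 0.1])  # decoder's generation distribution
copy_attn = np.array([0.9, 0.1])        # attention over source positions
mix = 0.5                               # weight of the copy mode vs generate mode

# Scatter the copy probabilities onto the vocabulary entries that
# correspond to tokens present in the source sequence.
p_copy = np.zeros(len(vocab))
for tok, attn in zip(source_tokens, copy_attn):
    p_copy[vocab.index(tok)] += attn

# Final output distribution: a mixture of generating and copying.
p_final = (1 - mix) * p_gen + mix * p_copy
print(p_final.sum())  # 1.0
```

Note how "Tokyo", a word the generation distribution ranks lowest, ends up most probable once copying is mixed in; this is exactly why copy mechanisms help with rare words and named entities.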
