Neural NLP

CONVOLUTIONAL NEURAL NETS (CNN)

SEQ2SEQ (SEQUENCE TO SEQUENCE)

  1. Keras blog - character-level seq2seq, token-level seq2seq using an embedding layer, and teacher forcing (see the sketch after this list).

  2. Incorporating Copying Mechanism in Sequence-to-Sequence Learning - incorporates copying into neural network-based seq2seq learning and proposes CopyNet, an encoder-decoder model. CopyNet integrates the decoder's regular word generation with a copying mechanism that can select sub-sequences of the input sequence and place them at the proper positions in the output sequence (see the copy-mixing sketch after this list).
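
A minimal sketch of the character-level setup from the Keras blog post, using Keras's functional API. The sizes (`num_encoder_tokens`, `num_decoder_tokens`, `latent_dim`) are illustrative placeholders, not values from the post:

```python
# Sketch of a char-level seq2seq trained with teacher forcing,
# in the style of the Keras blog example. Sizes are assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

num_encoder_tokens = 70   # assumed source charset size
num_decoder_tokens = 90   # assumed target charset size
latent_dim = 256          # assumed LSTM width

# Encoder: consume the one-hot source chars, keep only the final LSTM states.
encoder_inputs = keras.Input(shape=(None, num_encoder_tokens))
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(encoder_inputs)

# Decoder with teacher forcing: at train time it is fed the *gold* target
# sequence shifted right by one step, not its own previous predictions.
decoder_inputs = keras.Input(shape=(None, num_decoder_tokens))
decoder_lstm = layers.LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_seq, _, _ = decoder_lstm(decoder_inputs, initial_state=[state_h, state_c])
decoder_outputs = layers.Dense(num_decoder_tokens, activation="softmax")(decoder_seq)

model = keras.Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer="rmsprop", loss="categorical_crossentropy")
# model.fit([encoder_one_hot, decoder_one_hot_in], decoder_one_hot_out, ...)
```

For the token-level variant, the one-hot inputs are replaced by integer token ids passed through an `Embedding` layer before each LSTM.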
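The core of CopyNet's mixing step can be illustrated with a toy NumPy sketch: generation scores over the vocabulary and copy scores over source positions share one softmax, and the copy mass is scattered onto the vocabulary ids of the source tokens. `mix_generate_and_copy` and all shapes are hypothetical, and this omits the paper's full scoring and state-update machinery:

```python
# Toy sketch of the CopyNet mixing idea, not the paper's exact model.
import numpy as np

def mix_generate_and_copy(gen_logits, copy_logits, src_ids, vocab_size):
    """gen_logits:  (V,) scores for generating each vocab word.
    copy_logits: (T,) scores for copying each source position.
    src_ids:     (T,) vocab id of each source token.
    Both score sets are normalized by one shared softmax, so generating
    a word and copying it compete for the same probability mass."""
    scores = np.concatenate([gen_logits, copy_logits])
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                     # joint softmax over both modes
    p_gen, p_copy = probs[:vocab_size], probs[vocab_size:]

    p_final = p_gen.copy()
    for pos, tok in enumerate(src_ids):      # scatter-add copy mass onto the
        p_final[tok] += p_copy[pos]          # vocab ids present in the source
    return p_final

p = mix_generate_and_copy(np.zeros(10), np.array([2.0, 2.0]), np.array([3, 7]), 10)
print(p[3], p[7])  # tokens appearing in the source get extra copy probability
```

This is what lets the model emit rare or out-of-vocabulary words: a token that appears in the input can receive probability through the copy mode even if the generation softmax assigns it almost none.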
