Tiny implementation of deep learning models for NLP with Sony's NNabla.
Requirements:
- Python 3.7.2
- NNabla v1.0.11
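All models follow the usual NNabla workflow: build a static computation graph from `nn.Variable`s, attach a solver, and repeat forward/backward/update on minibatches. The toy classifier below is only a sketch of that workflow with made-up sizes and random data; it is not code from this repository.

```python
import numpy as np
import nnabla as nn
import nnabla.functions as F
import nnabla.parametric_functions as PF
import nnabla.solvers as S

batch_size, n_features, n_classes = 32, 100, 5

# Build the computation graph once; data is fed into x.d / t.d each step.
x = nn.Variable((batch_size, n_features))
t = nn.Variable((batch_size, 1))
h = F.relu(PF.affine(x, 64, name="fc1"))
y = PF.affine(h, n_classes, name="fc2")
loss = F.mean(F.softmax_cross_entropy(y, t))

solver = S.Adam()
solver.set_parameters(nn.get_parameters())

for step in range(100):
    # Toy random data; replace with a real minibatch iterator.
    x.d = np.random.randn(batch_size, n_features)
    t.d = np.random.randint(0, n_classes, size=(batch_size, 1))
    loss.forward()
    solver.zero_grad()
    loss.backward()
    solver.update()
```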
Language models:
- A vanilla recurrent neural network language model (language-models/rnnlm/)
- LSTM language model (language-models/lstmlm/)
- Character-level convolutional LSTM language model (language-models/char-cnn-lstmlm/)
- Continuous Bag-of-Words (CBOW) model (language-models/cbow/); a minimal CBOW sketch follows this list
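As an illustration of the simplest model in this group, CBOW predicts a target word from the average of its context-word embeddings. The sketch below uses invented sizes and dummy data and is not the code under language-models/cbow/.

```python
import numpy as np
import nnabla as nn
import nnabla.functions as F
import nnabla.parametric_functions as PF
import nnabla.solvers as S

vocab_size, embed_dim, context_size, batch_size = 10000, 100, 4, 64

contexts = nn.Variable((batch_size, context_size))  # surrounding word ids
target = nn.Variable((batch_size, 1))               # center word id

# Embed each context word, average them, and score every vocabulary word.
e = PF.embed(contexts, vocab_size, embed_dim, name="embed")   # (B, C, D)
h = F.mean(e, axis=1)                                         # (B, D)
y = PF.affine(h, vocab_size, name="output")                   # (B, V)
loss = F.mean(F.softmax_cross_entropy(y, target))

solver = S.Adam()
solver.set_parameters(nn.get_parameters())

# One training step with dummy ids; a real run feeds minibatches from a corpus.
contexts.d = np.random.randint(0, vocab_size, size=contexts.shape)
target.d = np.random.randint(0, vocab_size, size=target.shape)
loss.forward()
solver.zero_grad()
loss.backward()
solver.update()
```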
Word embeddings:
- GloVe model (word-embeddings/glove/); a sketch of the GloVe loss follows this list
- Poincaré embeddings (word-embeddings/poincare-embeddings/)
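For reference, GloVe fits word and context vectors so that their dot product plus two biases matches the log co-occurrence count, weighted by the clipping function f(X). The sketch below is only an illustration of that loss in NNabla; the names and sizes are invented, the weights f(X) are precomputed in NumPy, and it is not the code under word-embeddings/glove/.

```python
import numpy as np
import nnabla as nn
import nnabla.functions as F
import nnabla.parametric_functions as PF
import nnabla.solvers as S

vocab_size, embed_dim, batch_size = 10000, 100, 512
x_max, alpha = 100.0, 0.75

wi = nn.Variable((batch_size, 1))     # word ids
wj = nn.Variable((batch_size, 1))     # context word ids
log_x = nn.Variable((batch_size, 1))  # log co-occurrence counts
fx = nn.Variable((batch_size, 1))     # precomputed weights f(X_ij)

# Separate embedding tables for word/context vectors and their biases.
vi = F.reshape(PF.embed(wi, vocab_size, embed_dim, name="w_vec"), (batch_size, embed_dim))
vj = F.reshape(PF.embed(wj, vocab_size, embed_dim, name="c_vec"), (batch_size, embed_dim))
bi = F.reshape(PF.embed(wi, vocab_size, 1, name="w_bias"), (batch_size, 1))
bj = F.reshape(PF.embed(wj, vocab_size, 1, name="c_bias"), (batch_size, 1))

dot = F.sum(vi * vj, axis=1, keepdims=True)  # (B, 1)
diff = dot + bi + bj - log_x
loss = F.mean(fx * diff * diff)              # weighted squared error

solver = S.Adam()
solver.set_parameters(nn.get_parameters())

# f(X) = (X / x_max)^alpha clipped at 1, computed outside the graph.
counts = np.random.randint(1, 1000, size=(batch_size, 1)).astype(np.float32)
fx.d = np.minimum((counts / x_max) ** alpha, 1.0)
log_x.d = np.log(counts)
wi.d = np.random.randint(0, vocab_size, size=(batch_size, 1))
wj.d = np.random.randint(0, vocab_size, size=(batch_size, 1))
loss.forward()
solver.zero_grad()
loss.backward()
solver.update()
```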
Seq2seq models:
- Encoder-decoder (seq2seq/encoder-decoder/)
- Encoder-decoder + global attention (seq2seq/encoder-decoder-with-attention/); a sketch of the attention step follows this list
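In global (Luong-style) attention, each decoder state is scored against every encoder state, the scores are softmax-normalized over time, and the resulting context vector is combined with the decoder state. The minimal sketch below takes the encoder/decoder states as given (random here) and uses illustrative names and sizes; it is not the code under seq2seq/encoder-decoder-with-attention/.

```python
import numpy as np
import nnabla as nn
import nnabla.functions as F
import nnabla.parametric_functions as PF

batch_size, src_len, hidden = 16, 20, 128

enc_hs = nn.Variable((batch_size, src_len, hidden))  # encoder hidden states
dec_h = nn.Variable((batch_size, hidden))            # one decoder hidden state

# Dot-product scores of the decoder state against every encoder position.
scores = F.batch_matmul(enc_hs, F.reshape(dec_h, (batch_size, hidden, 1)))  # (B, T, 1)
attn = F.softmax(scores, axis=1)                                            # weights over T

# Context vector: attention-weighted sum of encoder states.
context = F.reshape(F.batch_matmul(attn, enc_hs, transpose_a=True),
                    (batch_size, hidden))                                    # (B, H)

# Combine context and decoder state before predicting the next token.
combined = F.tanh(PF.affine(F.concatenate(context, dec_h, axis=1), hidden, name="attn_out"))

enc_hs.d = np.random.randn(*enc_hs.shape)
dec_h.d = np.random.randn(*dec_h.shape)
combined.forward()
```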
Text classification:
- fastText (text-classifications/fasttext/)
- Self-attention (text-classifications/self_attention/); a sketch of the self-attention layer follows this list
- LSTM classifier (text-classifications/lstm-classifier/)
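As a rough picture of the self-attention classifier, a structured self-attention layer computes several attention distributions over the token representations, pools them into a fixed-size sentence matrix, and classifies from that. To stay short, the sketch below applies the attention directly to word embeddings rather than to recurrent hidden states, and every name and size is a placeholder; it is not the code under text-classifications/self_attention/.

```python
import numpy as np
import nnabla as nn
import nnabla.functions as F
import nnabla.parametric_functions as PF

vocab_size, embed_dim, max_len, batch_size = 10000, 128, 30, 32
attn_units, attn_hops, n_classes = 64, 4, 2

tokens = nn.Variable((batch_size, max_len))
labels = nn.Variable((batch_size, 1))

h = PF.embed(tokens, vocab_size, embed_dim, name="embed")                  # (B, T, D)

# A = softmax(W2 tanh(W1 H)): one distribution over T per attention hop.
u = F.tanh(PF.affine(h, attn_units, base_axis=2, name="attn1"))            # (B, T, U)
a = F.softmax(PF.affine(u, attn_hops, base_axis=2, name="attn2"), axis=1)  # (B, T, R)

# Sentence matrix M = A^T H, flattened and fed to the classifier.
m = F.batch_matmul(a, h, transpose_a=True)                                 # (B, R, D)
m = F.reshape(m, (batch_size, attn_hops * embed_dim))
y = PF.affine(m, n_classes, name="classifier")
loss = F.mean(F.softmax_cross_entropy(y, labels))

tokens.d = np.random.randint(0, vocab_size, size=tokens.shape)
labels.d = np.random.randint(0, n_classes, size=labels.shape)
loss.forward()
```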
Future models (not yet implemented):
- Skip-gram model
- Peephole LSTM
- GRU
- Transformer
- ELMo
- etc.