# Textclassifiers

A collection of text classification models for PyTorch: document classification, sentence classification, and sentiment analysis.
Install dependencies:

```bash
pip3 install -r requirements.txt
```

Train a model:

```bash
python3 run.py --mode train --config configs/fasttext_config.yaml
```
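The exact YAML schema consumed by `run.py` is not documented here; purely as an illustration, a `configs/fasttext_config.yaml` might carry fields like the following (every key and value below is hypothetical, not the repo's actual schema):

```yaml
# Hypothetical config sketch -- key names are illustrative only,
# not the actual schema expected by run.py.
model: fasttext            # which architecture to build
data:
  train_path: data/train.tsv
  test_path: data/test.tsv
  max_seq_len: 64
training:
  epochs: 10
  batch_size: 64
  learning_rate: 0.001
```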
Overall model performance on the test sets:
**Note:** The model parameter configurations used for these results are saved in `./examples/`.
Model | Query Well-formedness Accuracy | Query Well-formedness F1 Score | AG News Accuracy | AG News F1 Score
---|---|---|---|---
FastText [1] | 66.33% | 66.20% | __.__% | __.__%
TextRNN | 69.35% | 68.98% | __.__% | __.__%
TextCNN [2] | 68.08% | 67.72% | __.__% | __.__%
RCNN [3] | 68.00% | 67.72% | __.__% | __.__%
LSTM + Attention [4] | 67.27% | 66.70% | __.__% | __.__%
Transformer [5] | 68.31% | 67.78% | __.__% | __.__%
BERT [6] | __.__% | __.__% | __.__% | __.__%
HAN [7] | __.__% | __.__% | __.__% | __.__%
DMN (Dynamic Memory Network) | __.__% | __.__% | __.__% | __.__%
- **FastText**, released with the paper *Bag of Tricks for Efficient Text Classification* by Joulin et al. [1] (a minimal PyTorch sketch follows this list)
- **TextRNN**
- **TextCNN**, released with the paper *Convolutional Neural Networks for Sentence Classification* by Kim [2]
- **RCNN**, released with the paper *Recurrent Convolutional Neural Networks for Text Classification* by Lai et al. [3]
- **LSTM + Attention**, released with the paper *Text Classification Research with Attention-Based Recurrent Neural Networks* by Du and Huang [4]
- **Transformer**, released with the paper *Attention Is All You Need* by Vaswani et al. [5]
- **BERT**, released with the paper *BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding* by Devlin et al. [6]
- **Hierarchical Attention Network (HAN)**, released with the paper *Hierarchical Attention Networks for Document Classification* by Yang et al. [7]
- **Dynamic Memory Network (DMN)**
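To make the FastText entry above concrete, here is a minimal PyTorch sketch of the idea in [1]: average the word (or n-gram) embeddings of a sentence and feed the result to a single linear classifier. This is an independent illustration, not the implementation in this repo; the class name and hyperparameters below are our own.

```python
import torch
import torch.nn as nn

class FastTextClassifier(nn.Module):
    """Minimal FastText-style classifier (Joulin et al., 2017):
    mean-pool token embeddings, then apply one linear layer."""

    def __init__(self, vocab_size: int, embed_dim: int,
                 num_classes: int, pad_idx: int = 0):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=pad_idx)
        self.fc = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) of word/n-gram indices
        embedded = self.embedding(token_ids)          # (batch, seq_len, embed_dim)
        # Mean-pool over the sequence, ignoring padding positions
        mask = (token_ids != self.embedding.padding_idx).unsqueeze(-1).float()
        summed = (embedded * mask).sum(dim=1)
        lengths = mask.sum(dim=1).clamp(min=1.0)
        pooled = summed / lengths                     # (batch, embed_dim)
        return self.fc(pooled)                        # (batch, num_classes) logits

# Usage sketch: batch of 2 sequences, 10k vocab, 4 classes (as in AG News)
model = FastTextClassifier(vocab_size=10_000, embed_dim=100, num_classes=4)
logits = model(torch.randint(1, 10_000, (2, 16)))
print(logits.shape)  # torch.Size([2, 4])
```

The same skeleton extends to the other models by swapping the pooling step for a convolution over n-gram windows (TextCNN), a recurrent encoder (TextRNN/RCNN), or an attention mechanism.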
[1] Joulin, Armand, Edouard Grave, Piotr Bojanowski, and Tomas Mikolov. "Bag of Tricks for Efficient Text Classification." EACL 2017 (2017): 427.
[2] Kim, Yoon. "Convolutional Neural Networks for Sentence Classification." Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2014.
[3] Lai, Siwei, et al. "Recurrent convolutional neural networks for text classification." Proceedings of the AAAI Conference on Artificial Intelligence. 2015.
[4] Du, Changshun, and Lei Huang. "Text classification research with attention-based recurrent neural networks." International Journal of Computers Communications & Control 13.1 (2018): 50-61.
[5] Vaswani, Ashish, et al. "Attention is all you need." Proceedings of the 31st International Conference on Neural Information Processing Systems. Curran Associates Inc., 2017.
[6] Devlin, Jacob, et al. "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding." Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). 2019.
[7] Yang, Zichao, et al. "Hierarchical attention networks for document classification." Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2016.