BERT for Joint Intent Classification and Slot Filling. Chen, Q., Zhuo, Z., & Wang, W. (2019). arXiv:1902.10909.
Intent classification and slot filling are two essential tasks for natural language understanding. They often suffer from small-scale human-labeled training data, resulting in poor generalization capability, especially for rare words. Recently a new language representation model, BERT (Bidirectional Encoder Representations from Transformers), facilitates pre-training deep bidirectional representations on large-scale unlabeled corpora, and has created state-of-the-art models for a wide variety of natural language processing tasks after simple fine-tuning. However, there has not been much effort on exploring BERT for natural language understanding. In this work, we propose a joint intent classification and slot filling model based on BERT. Experimental results demonstrate that our proposed model achieves significant improvement on intent classification accuracy, slot filling F1, and sentence-level semantic frame accuracy on several public benchmark datasets, compared to the attention-based recurrent neural network models and slot-gated models.
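The joint model described in the abstract predicts the intent from BERT's [CLS] representation and the slot tags from the per-token hidden states, fine-tuning both heads together. The following is a minimal sketch of that idea, not the authors' released code; it assumes the HuggingFace transformers library, and the label counts and model name are placeholders.

import torch.nn as nn
from transformers import BertModel

class JointBert(nn.Module):
    def __init__(self, num_intents: int, num_slot_labels: int, dropout: float = 0.1):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size
        self.dropout = nn.Dropout(dropout)
        # Intent is predicted from the pooled [CLS] vector,
        # slot tags from the per-token hidden states.
        self.intent_head = nn.Linear(hidden, num_intents)
        self.slot_head = nn.Linear(hidden, num_slot_labels)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        intent_logits = self.intent_head(self.dropout(out.pooler_output))
        slot_logits = self.slot_head(self.dropout(out.last_hidden_state))
        return intent_logits, slot_logits

# Joint fine-tuning sums the intent cross-entropy loss and the
# per-token slot cross-entropy loss, as in the paper's joint objective.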
@article{chenBERTJointIntent2019,
  archivePrefix = {arXiv},
  eprinttype = {arxiv},
  eprint = {1902.10909},
  primaryClass = {cs},
  title = {{{BERT}} for {{Joint Intent Classification}} and {{Slot Filling}}},
  url = {http://arxiv.org/abs/1902.10909},
  abstract = {Intent classification and slot filling are two essential tasks for natural language understanding. They often suffer from small-scale human-labeled training data, resulting in poor generalization capability, especially for rare words. Recently a new language representation model, BERT (Bidirectional Encoder Representations from Transformers), facilitates pre-training deep bidirectional representations on large-scale unlabeled corpora, and has created state-of-the-art models for a wide variety of natural language processing tasks after simple fine-tuning. However, there has not been much effort on exploring BERT for natural language understanding. In this work, we propose a joint intent classification and slot filling model based on BERT. Experimental results demonstrate that our proposed model achieves significant improvement on intent classification accuracy, slot filling F1, and sentence-level semantic frame accuracy on several public benchmark datasets, compared to the attention-based recurrent neural network models and slot-gated models.},
  urldate = {2019-03-01},
  date = {2019-02-28},
  keywords = {Computer Science - Computation and Language},
  author = {Chen, Qian and Zhuo, Zhu and Wang, Wen}
}
