
# Intent Detection and Slot Filling

Intent Detection and Slot Filling is the task of interpreting user commands or queries by classifying the overall intent and extracting the relevant slots (the arguments of the query).

Example (from ATIS):

```
Query:  What flights are available from pittsburgh to baltimore on thursday morning
Intent: flight info
Slots:
    - from_city:   pittsburgh
    - to_city:     baltimore
    - depart_date: thursday
    - depart_time: morning
```

## ATIS

ATIS (Air Travel Information System) (Hemphill et al.) is a dataset distributed with Microsoft CNTK and available from its GitHub page. The slots are labeled in the BIO (Beginning, Inside, Outside) format, similar to NER. The dataset contains only air-travel-related commands. Most of the ATIS results are based on the work here.
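For illustration, here is how the example query above could be labeled in the BIO scheme. This is a minimal sketch: the slot names are the informal ones used in the example, not the exact ATIS label inventory.

```python
# Illustrative BIO labeling of the example query (slot names are informal,
# not the exact ATIS label set).
tokens = ["What", "flights", "are", "available", "from", "pittsburgh",
          "to", "baltimore", "on", "thursday", "morning"]
tags   = ["O", "O", "O", "O", "O", "B-from_city",
          "O", "B-to_city", "O", "B-depart_date", "B-depart_time"]
intent = "flight info"

# Each non-O tag marks the Beginning (B-) or Inside (I-) of a slot span;
# a multi-word value such as "new york" would be tagged B-from_city I-from_city.
assert len(tokens) == len(tags)
```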

| Model | Slot F1 Score | Intent Accuracy | Paper / Source | Code |
| ----- | ------------- | --------------- | -------------- | ---- |
| Bi-model with decoder | 96.89 | 98.99 | A Bi-model based RNN Semantic Frame Parsing Model for Intent Detection and Slot Filling | |
| Stack-Propagation + BERT | 96.10 | 97.50 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| Stack-Propagation | 95.90 | 96.90 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| Attention Encoder-Decoder NN | 95.87 | 98.43 | Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling | |
| SF-ID (BLSTM) network | 95.80 | 97.76 | A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling | Official |
| Capsule-NLU | 95.20 | 95.00 | Joint Slot Filling and Intent Detection via Capsule Neural Networks | Official |
| Joint GRU model (W) | 95.49 | 98.10 | A Joint Model of Intent Determination and Slot Filling for Spoken Language Understanding | |
| Slot-Gated BLSTM with Attention | 95.20 | 94.10 | Slot-Gated Modeling for Joint Slot Filling and Intent Prediction | Official |
| Joint model with recurrent slot label context | 94.64 | 98.40 | Joint Online Spoken Language Understanding and Language Modeling with Recurrent Neural Networks | Official |
| Recursive NN | 93.96 | 95.40 | Joint Semantic Utterance Classification and Slot Filling with Recursive Neural Networks | |
| Encoder-labeler Deep LSTM | 95.66 | NA | Leveraging Sentence-level Information with Encoder LSTM for Natural Language Understanding | |
| RNN with Label Sampling | 94.89 | NA | Recurrent Neural Network Structured Output Prediction for Spoken Language Understanding | |
| Hybrid RNN | 95.06 | NA | Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding | |
| RNN-EM | 95.25 | NA | Recurrent Neural Networks with External Memory for Language Understanding | |
| CNN-CRF | 94.35 | NA | Convolutional Neural Network Based Triangular CRF for Joint Intent Detection and Slot Filling | |
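The two leaderboard columns are typically computed as sketched below. This assumes the common convention of span-level slot F1 (here via the `seqeval` package, a CoNLL-style scorer) and utterance-level intent accuracy; it is not the exact evaluation script of any of the papers above.

```python
# Minimal sketch of the two leaderboard metrics (assumes seqeval is installed:
# pip install seqeval). Slot F1 is computed over whole slot spans, not tokens.
from seqeval.metrics import f1_score

def intent_accuracy(gold_intents, pred_intents):
    """Fraction of utterances whose intent label is predicted exactly."""
    correct = sum(g == p for g, p in zip(gold_intents, pred_intents))
    return correct / len(gold_intents)

# One toy utterance: gold vs. predicted BIO tag sequences (lists of lists).
gold_tags = [["O", "B-from_city", "O", "B-to_city"]]
pred_tags = [["O", "B-from_city", "O", "O"]]

print("Slot F1:", f1_score(gold_tags, pred_tags))            # span-level F1
print("Intent accuracy:", intent_accuracy(["flight"], ["flight"]))
```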

## SNIPS

SNIPS is a dataset by Snips.ai for benchmarking Intent Detection and Slot Filling, available from the GitHub page. It contains several day-to-day user command categories (e.g. play a song, book a restaurant).

| Model | Slot F1 Score | Intent Accuracy | Paper / Source | Code |
| ----- | ------------- | --------------- | -------------- | ---- |
| Stack-Propagation + BERT | 97.00 | 99.00 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| Stack-Propagation | 94.20 | 98.00 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| SF-ID (BLSTM) network | 92.23 | 97.43 | A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling | Official |
| Capsule-NLU | 91.80 | 97.70 | Joint Slot Filling and Intent Detection via Capsule Neural Networks | Official |
| Slot-Gated BLSTM with Attention | 88.80 | 97.00 | Slot-Gated Modeling for Joint Slot Filling and Intent Prediction | Official |