GitHub repo for the original TF code: https://github.com/prdwb/attentive_history_selection
Paper we are implementing in PyTorch: https://arxiv.org/abs/1908.09456
Rewritten scripts:
- pt_cqa_supports.py
- pt_cqa_gen_batches.py
- bert_model.py
- cqa_model.py
- bert_rep (updated) [READY]
- bert_segment_rep [No need to re-implement; this function is not used]
- cqa_model [READY]
- aux_cqa_model [No need to re-implement; this function is identical to cqa_model]
- yesno_model [READY]
- followup_model [No need to re-implement; this function is identical to yesno_model]
- history_attention_net [READY] (see the sketch after this list)
- disable_history_attention_net [READY]
- fine_grained_history_attention_net [READY] (see the token-level sketch after the run command below)
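
For orientation, here is a minimal PyTorch sketch of the sequence-level history attention from the paper: each history variation's pooled BERT representation is scored by a learned linear layer, the scores are softmaxed over the history axis, and the variations are merged by a weighted sum. This is a sketch of the idea only; the class and argument names (HistoryAttentionNet, pooled_reps) are illustrative, not the actual names in cqa_model.py.

```python
import torch
import torch.nn as nn


class HistoryAttentionNet(nn.Module):
    """Sequence-level history attention (sketch of history_attention_net).

    Scores each history variation's pooled representation, softmaxes the
    scores over the history axis, and returns the weighted combination.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        # One scalar attention logit per history variation.
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, pooled_reps: torch.Tensor) -> torch.Tensor:
        # pooled_reps: (batch, num_history, hidden_size)
        logits = self.scorer(pooled_reps).squeeze(-1)  # (batch, num_history)
        weights = torch.softmax(logits, dim=-1)        # attention over history
        # Weighted sum over the history dimension.
        merged = (weights.unsqueeze(-1) * pooled_reps).sum(dim=1)
        return merged                                  # (batch, hidden_size)


# Example: batch of 24, 11 history variations, BERT-base hidden size.
net = HistoryAttentionNet(hidden_size=768)
out = net(torch.randn(24, 11, 768))  # -> (24, 768)
```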
To run the main script from within the 'script' directory (as of 15.05.2020):
python main_qa_script.py --quac_data_dir ../data/ --cache_dir ../data/final_cache_dir --output_dir ../exps_dir --batch_size 24 --do_train --fine_grained_attention
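
The --fine_grained_attention flag selects fine_grained_history_attention_net. As a rough sketch of the token-level idea (as opposed to the sequence-level net above): attention weights are computed per token position across the history variations, so each token gets its own mixture of histories. All names below are ours for illustration, not the identifiers used in the repo.

```python
import torch
import torch.nn as nn


class FineGrainedHistoryAttentionNet(nn.Module):
    """Token-level history attention (sketch of fine_grained_history_attention_net).

    Instead of one weight per history variation, every token position gets
    its own softmax over the history axis.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, token_reps: torch.Tensor) -> torch.Tensor:
        # token_reps: (batch, num_history, seq_len, hidden_size)
        logits = self.scorer(token_reps).squeeze(-1)  # (batch, num_history, seq_len)
        weights = torch.softmax(logits, dim=1)        # per-token softmax over history
        merged = (weights.unsqueeze(-1) * token_reps).sum(dim=1)
        return merged                                 # (batch, seq_len, hidden_size)
```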