# Quantum Natural Language Processing with lambeq - Quantinuum Womanium Quantum Hackathon 2022

Quantum Natural Language Processing with lambeq Challenge by Quantinuum
Group: QLinguists

Members:
- Amirali Malekani Nezhad (Email: [email protected], GitHub: https://github.com/ACE07-Sev, Discord: A.C.E07#8672)
- Yousra Bouakba (Email: [email protected], GitHub: https://github.com/yousrabou, Discord: youyaQ#8253)
- Vishal Mandal (Email: [email protected], GitHub: https://github.com/Vishal-Mandal, Discord: Vishal Mandal#7391)
- Mostafa Shabani (Email: [email protected], GitHub: https://github.com/Spintronic6889, Discord: Mostafa#5738)
The models provided were chosen based on experiments run with a wide array of readers, preprocessing functions, rewrite rules and ansatz parameters, as well as the ansatz classes themselves (including subclassing existing ansätze and writing an original ansatz class). The document linked below summarises all the experiments, which were run by including and excluding features such as UNK tokenization, sentence-level preprocessing, and syntax rewriting at the diagram level. We provide two models, distinguished by whether they take word order (sequence dependency) into account.
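For context, the sketch below shows the kind of lambeq pipeline these experiments vary: a reader turns a sentence into a string diagram, which an ansatz then converts into a parameterised circuit. The two readers are the ones named in the results below; the `IQPAnsatz`, the one-qubit wire mapping, the single IQP layer and the example sentence are illustrative assumptions rather than the tuned configuration, and the diagram-level rewriting and UNK tokenization steps discussed in the linked document are omitted here.

```python
# Sketch only: reader -> string diagram -> parameterised circuit.
# The readers are those compared in the experiments; the ansatz settings and
# the example sentence are placeholders, not the tuned configuration.
from lambeq import AtomicType, IQPAnsatz, cups_reader, spiders_reader

sentence = "woman prepares sauce"  # placeholder sentence

# Sequence-independent (bag-of-words) vs sequence-dependent readings
spiders_diagram = spiders_reader.sentence2diagram(sentence)
cups_diagram = cups_reader.sentence2diagram(sentence)

# One qubit per noun/sentence wire and a single IQP layer (illustrative values)
ansatz = IQPAnsatz({AtomicType.NOUN: 1, AtomicType.SENTENCE: 1}, n_layers=1)
circuit = ansatz(cups_diagram)  # the spiders diagram is parameterised analogously
circuit.draw()
```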
Current accuracies for the two models are:
- Sequence-independent spiders_reader model: train acc = 100%, validation acc = 93%, test acc = 84%
- Sequence-dependent cups_reader model: train acc = 87%, validation acc = 67%, test acc = 55%
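The sketch below indicates how accuracies of this kind can be computed with lambeq's quantum trainer. The toy sentences, labels, optimiser hyperparameters, loss and accuracy functions are all illustrative assumptions modelled on the standard lambeq training tutorial, not the exact setup behind the figures above; the linked document describes the actual experiments.

```python
# Hedged sketch of training and evaluating a reader + ansatz model in lambeq.
# Dataset, hyperparameters and metrics are illustrative, not the tuned setup.
import numpy as np
from lambeq import (AtomicType, Dataset, IQPAnsatz, NumpyModel,
                    QuantumTrainer, SPSAOptimizer, cups_reader)

# Toy binary-classification data with one-hot labels (placeholder sentences)
train_data = [("woman prepares sauce", [1, 0]), ("man cooks dinner", [1, 0]),
              ("programmer debugs code", [0, 1]), ("woman writes software", [0, 1])]
val_data = [("man prepares dinner", [1, 0]), ("man debugs software", [0, 1])]

# Reader + ansatz pipeline (same shape as the sketch above)
ansatz = IQPAnsatz({AtomicType.NOUN: 1, AtomicType.SENTENCE: 1}, n_layers=1)
to_circuits = lambda data: [ansatz(cups_reader.sentence2diagram(s)) for s, _ in data]
to_labels = lambda data: [label for _, label in data]

train_circuits, val_circuits = to_circuits(train_data), to_circuits(val_data)

model = NumpyModel.from_diagrams(train_circuits + val_circuits)
loss = lambda y_hat, y: -np.sum(y * np.log(y_hat)) / len(y)          # cross entropy
acc = lambda y_hat, y: np.sum(np.round(y_hat) == y) / (2 * len(y))   # label accuracy

trainer = QuantumTrainer(model, loss_function=loss, epochs=60,
                         optimizer=SPSAOptimizer,
                         optim_hyperparams={'a': 0.05, 'c': 0.06, 'A': 0.01 * 60},
                         evaluate_functions={'acc': acc},
                         evaluate_on_train=True, seed=0)

trainer.fit(Dataset(train_circuits, to_labels(train_data), batch_size=2),
            Dataset(val_circuits, to_labels(val_data), shuffle=False))
print('final validation accuracy:', trainer.val_results['acc'][-1])
```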
Reference document with the full set of experiments: https://docs.google.com/document/d/1A_Wk0rcOOAFr3w6K7tsWd8YB4VOL9dXteOaea0Bgyy4/edit?usp=sharing