Author: Tyler Osborne
Title: Cognitive States: Belief State Inference via Deep Learning
Abstract: The Cognitive States project is an ongoing investigation of how well a pre-trained deep learning model, fine-tuned on various corpora of annotated text, can infer belief states. The overall goal of this project is to make incremental progress toward more advanced belief-state and sentiment detection capabilities. Previous research focused on achieving state-of-the-art F1 results on classification tasks, as well as on end-to-end generative tasks, defined on two corpora annotated for belief, sentiment, or both (FactBank and MPQA), using two models, BERT and Flan-T5 (Murzaku et al.). We use the same models to define similar tasks on the Language Understanding (LU) corpus in order to corroborate insights gained from previous work. Furthermore, we present a novel database representation for fine-tuning data that unifies FactBank, MPQA, LU, and additional annotation-based belief/sentiment corpora into a single dataset for seamless use in multi-task learning contexts; this unification requires data transformations such as converting unigram head words to n-gram spans. Our results on LU's majority class align with those of Murzaku et al. on all tasks, whereas our approaches perform less well on minority classes. Plans to improve minority-class performance include leveraging a few-shot approach or generating synthetic data by swapping out words in existing examples for close synonyms.
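The abstract mentions converting unigram head-word annotations into n-gram spans when unifying the corpora. A minimal sketch of one way such a transformation could look is shown below; the function name, data shapes, and the assumption that modifier indices are available from preprocessing (e.g. a dependency parse) are all illustrative, not the project's actual schema.

```python
def head_to_span(tokens, head_index, modifier_indices):
    """Expand a single head-word annotation to a contiguous token span.

    tokens: list of tokens in the sentence.
    head_index: index of the annotated head word (unigram annotation).
    modifier_indices: indices of tokens attached to the head, assumed to
    come from some preprocessing step such as a dependency parse.
    Returns a (start, end) half-open span plus the covered tokens.
    """
    # Collect the head and its modifiers, then take the smallest
    # contiguous span that covers all of them.
    indices = sorted(set(modifier_indices) | {head_index})
    start, end = indices[0], indices[-1] + 1
    return (start, end), tokens[start:end]

tokens = ["The", "new", "report", "was", "released"]
span, text = head_to_span(tokens, head_index=2, modifier_indices=[0, 1])
# span == (0, 3); text == ["The", "new", "report"]
```

In this sketch the head word "report" (a unigram annotation) is widened to the trigram span "The new report", which is the kind of normalization that would let span-annotated and head-word-annotated corpora share one representation.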