[2210.06781] Closed-book Question Generation via Contrastive Learning

source link: https://arxiv.org/abs/2210.06781

[Submitted on 13 Oct 2022]

Question Generation (QG) is a fundamental NLP task for many downstream applications. Recent studies on open-book QG, where supportive question-context pairs are provided to models, have achieved promising progress. However, generating natural questions in the more practical closed-book setting, which lacks these supporting documents, remains a challenge. In this work, to learn better representations from the semantic information hidden in question-answer pairs under the closed-book setting, we propose a new QG model empowered by a contrastive learning module and an answer reconstruction module. We present a new closed-book QA dataset, WikiCQA, containing abstractive long answers collected from a wiki-style website. In the experiments, we validate the proposed QG model on both public datasets and the new WikiCQA dataset. Empirical results show that the proposed QG model outperforms baselines in both automatic and human evaluation. In addition, we show how to leverage the proposed model to improve existing closed-book QA systems: by pre-training a closed-book QA model on our generated synthetic QA pairs, significant QA improvements can be achieved on both seen and unseen datasets, which further demonstrates the effectiveness of our QG model for enhancing unsupervised and semi-supervised QA.
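To make the contrastive learning module concrete, below is a minimal, dependency-free sketch of an InfoNCE-style contrastive objective of the kind such a module typically optimizes: the representation of an answer (anchor) is pulled toward its paired question (positive) and pushed away from questions from other pairs (negatives). The pairing scheme, temperature value, and function names here are illustrative assumptions, not the paper's exact formulation.

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_loss(anchor, positive, negatives, temperature=0.1):
    # InfoNCE-style loss: softmax cross-entropy over similarities,
    # where the matching pair occupies the "correct class" slot.
    sims = [cosine(anchor, positive)] + [cosine(anchor, n) for n in negatives]
    logits = [s / temperature for s in sims]
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    return -math.log(exps[0] / sum(exps))
```

A well-matched pair yields a low loss, while a mismatched pair among similar negatives yields a high loss, which is the gradient signal that shapes the shared representation space.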

Subjects: Computation and Language (cs.CL)
Cite as: arXiv:2210.06781 [cs.CL]
  (or arXiv:2210.06781v1 [cs.CL] for this version)
  https://doi.org/10.48550/arXiv.2210.06781
