Multi-hop Question Generation without Supporting Fact Information

Bibliographic Details
Main Authors: John Emerson, Yllias Chali
Format: Article
Language: English
Published: LibraryPress@UF, 2023-05-01
Series: Proceedings of the International Florida Artificial Intelligence Research Society Conference
Online Access: https://journals.flvc.org/FLAIRS/article/view/133320
Description
Summary: Question generation is the parallel task of question answering: given an input context and, optionally, an answer, the goal is to generate a relevant and fluent natural-language question. Although recent work on question generation has achieved success using sequence-to-sequence models, question generation models must handle increasingly complex input contexts in order to produce increasingly elaborate questions. Multi-hop question generation is a more challenging task that aims to generate questions by connecting multiple facts from multiple input contexts. In this work, we apply a transformer model to the task of multi-hop question generation without utilizing any sentence-level supporting-fact information. We utilize concepts that have proven effective in single-hop question generation, including a copy mechanism and placeholder tokens. We evaluate our model's performance on the HotpotQA dataset using automated evaluation metrics and human evaluation, and show an improvement over previous work.
ISSN: 2334-0754, 2334-0762
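
Illustration: the abstract mentions replacing entity mentions with placeholder tokens before generation. Below is a minimal, self-contained Python sketch of that idea, not the authors' implementation; the [ENT0]-style token format, the function names, and the hand-specified entity spans are illustrative assumptions (in practice the spans would come from an NER tagger). The example context and question echo a well-known HotpotQA example.

# Minimal sketch (assumed, not the paper's code): mask entity mentions in the
# context with indexed placeholder tokens, keep the mapping, and restore the
# surface forms in the generated question afterwards.
from typing import Dict, List, Tuple

def insert_placeholders(context: str,
                        entity_spans: List[Tuple[int, int]]) -> Tuple[str, Dict[str, str]]:
    """Replace each (start, end) character span with an indexed placeholder."""
    mapping: Dict[str, str] = {}
    pieces: List[str] = []
    prev = 0
    for i, (start, end) in enumerate(sorted(entity_spans)):
        token = f"[ENT{i}]"
        mapping[token] = context[start:end]
        pieces.append(context[prev:start])
        pieces.append(token)
        prev = end
    pieces.append(context[prev:])
    return "".join(pieces), mapping

def restore_placeholders(question: str, mapping: Dict[str, str]) -> str:
    """Map placeholder tokens in a generated question back to entity strings."""
    for token, surface in mapping.items():
        question = question.replace(token, surface)
    return question

if __name__ == "__main__":
    ctx = "Scott Derrickson is an American director. Ed Wood was also American."
    spans = [(0, 16), (42, 49)]  # "Scott Derrickson", "Ed Wood" (hypothetical NER output)
    masked, mapping = insert_placeholders(ctx, spans)
    print(masked)  # [ENT0] is an American director. [ENT1] was also American.
    print(restore_placeholders("Were [ENT0] and [ENT1] of the same nationality?", mapping))

The appeal of this preprocessing is that rare entity names never need to appear in the model's output vocabulary; combined with a copy mechanism, the generator only has to emit or point at placeholder tokens rather than spell out arbitrary names.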