Paper to appear in EMNLP 2020.
We propose encoder-centric stepwise models for extractive summarization using a structured transformer architecture, Extended Transformer Construction (ETC; Ainslie et al., 2020). We enable stepwise summarization by injecting the previously generated summary into the structured transformer as an auxiliary substructure. Our models are not only efficient at modeling the structure of long inputs, but they also do not rely on task-specific redundancy-aware modeling, making them general-purpose extractive content planners for different tasks.
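As a rough illustration of the stepwise idea, the sketch below re-encodes the document together with the summary-so-far at each step and greedily selects the next sentence. It is a minimal sketch only: a plain PyTorch `nn.TransformerEncoder` stands in for ETC, and all names (`StepwiseExtractor`, `extract_summary`, the segment-embedding injection) are illustrative assumptions, not the released code's API.

```python
import torch
import torch.nn as nn


class StepwiseExtractor(nn.Module):
    """Illustrative stepwise extractive summarizer (not the released model).

    A plain nn.TransformerEncoder stands in for the structured ETC
    encoder. The summary-so-far is injected as an auxiliary substructure
    via a segment embedding added to the already-selected sentences.
    """

    def __init__(self, d_model=128, nhead=4, num_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        # Segment embedding: 0 = ordinary document sentence,
        # 1 = sentence already selected into the summary.
        self.segment = nn.Embedding(2, d_model)
        self.scorer = nn.Linear(d_model, 1)

    def forward(self, sent_embs, selected_mask):
        # sent_embs: (batch, n_sents, d_model) precomputed sentence vectors.
        # selected_mask: (batch, n_sents) bool; True marks sentences that
        # are already in the summary (the auxiliary substructure).
        x = sent_embs + self.segment(selected_mask.long())
        h = self.encoder(x)
        scores = self.scorer(h).squeeze(-1)
        # Forbid re-selecting sentences already in the summary.
        return scores.masked_fill(selected_mask, float('-inf'))


def extract_summary(model, sent_embs, num_steps=3):
    """Greedy stepwise extraction: pick one sentence per step,
    re-encoding with the updated summary-so-far each time."""
    batch, n_sents, _ = sent_embs.shape
    selected = torch.zeros(batch, n_sents, dtype=torch.bool)
    order = []
    for _ in range(num_steps):
        scores = model(sent_embs, selected)
        pick = scores.argmax(dim=-1)               # (batch,)
        selected[torch.arange(batch), pick] = True
        order.append(pick)
    return torch.stack(order, dim=-1)              # selection order


if __name__ == "__main__":
    model = StepwiseExtractor().eval()
    doc = torch.randn(2, 10, 128)  # 2 documents, 10 sentences each
    with torch.no_grad():
        print(extract_summary(model, doc, num_steps=3))
```

Because the summary-so-far is part of the encoder input, redundancy with already-selected content can be learned by the encoder itself, which is why no task-specific redundancy-aware component is needed at decoding time.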
Code release in progress.