Tim Van de Cruys
In recent years, transformer-based neural network architectures have improved performance on a wide range of NLP tasks. In this work, we examine the possibility of generating poetry from the content of news stories using transformer architectures. The task is framed as a constraint-based sequence-to-sequence summarization task: a news article is encoded with a bidirectional encoder, and a summary of the article is generated with a conditional generative decoder. By enforcing constraints on the decoder's output probability distribution, the resulting summary is cast into poetic form. The model is trained on standard summarization datasets, and it is the constraints that ensure a poetic rendering of the content expressed in the news article.
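The core mechanism of constraining the decoder's output distribution can be illustrated with a minimal sketch. The vocabulary, toy probabilities, and crude suffix-based rhyme check below are all illustrative assumptions, not the paper's actual method: at each generation step, tokens that violate a formal constraint (here, rhyming with a target word at line end) are masked out, and the remaining probability mass is renormalized before sampling.

```python
# Hypothetical sketch of constrained decoding for poetic form.
# VOCAB, the toy distribution, and the suffix-based rhyme test are
# illustrative assumptions, not the trained model's actual components.
VOCAB = ["night", "light", "day", "bright", "sun", "delight"]

def rhymes_with(word, target, suffix_len=4):
    # Crude rhyme check via shared suffix (an assumption for illustration).
    return word != target and word[-suffix_len:] == target[-suffix_len:]

def constrain(probs, allowed):
    # Zero out disallowed tokens and renormalize the distribution.
    masked = [p if tok in allowed else 0.0 for tok, p in zip(VOCAB, probs)]
    total = sum(masked)
    if total == 0.0:
        return probs  # no candidate satisfies the constraint; back off
    return [p / total for p in masked]

# Toy decoder distribution (in practice, the trained decoder's softmax output).
probs = [0.3, 0.2, 0.2, 0.1, 0.1, 0.1]
allowed = {w for w in VOCAB if rhymes_with(w, "night")}
constrained = constrain(probs, allowed)
best = VOCAB[constrained.index(max(constrained))]
```

Because the constraint is applied to the output distribution rather than baked into the model, the summarizer can be trained on ordinary summarization data, with the poetic form imposed only at decoding time.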