Neural Data-to-Text Generation with Dynamic Content Planning. Chen, K., Li, F., Hu, B., Peng, W., Chen, Q., & Yu, H. arXiv:2004.07426 [cs], April 2020.

Abstract: Neural data-to-text generation models have achieved significant advances in recent years. However, these models have two shortcomings: the generated texts tend to miss vital information, and they often produce descriptions that are inconsistent with the structured input data. To alleviate these problems, we propose a Neural data-to-text generation model with Dynamic content Planning, abbreviated as NDP. NDP uses the previously generated text to dynamically select the appropriate entry from the given structured data. We further design a reconstruction mechanism with a novel objective function that sequentially reconstructs the whole entry of the used data from the decoder's hidden states, which improves the accuracy of the generated text. Empirical results show that NDP outperforms the state of the art on the ROTOWIRE dataset in terms of relation generation (RG), content selection (CS), content ordering (CO), and BLEU metrics. Human evaluation shows that texts generated by NDP are better than the corresponding ones generated by NCP in most cases, and the proposed reconstruction mechanism further improves the fidelity of the generated text significantly.
@article{chen_neural_2020-1,
title = {Neural {Data}-to-{Text} {Generation} with {Dynamic} {Content} {Planning}},
url = {http://arxiv.org/abs/2004.07426},
abstract = {Neural data-to-text generation models have achieved significant advancement in recent years. However, these models have two shortcomings: the generated texts tend to miss some vital information, and they often generate descriptions that are not consistent with the structured input data. To alleviate these problems, we propose a Neural data-to-text generation model with Dynamic content Planning, named NDP for abbreviation. The NDP can utilize the previously generated text to dynamically select the appropriate entry from the given structured data. We further design a reconstruction mechanism with a novel objective function that can reconstruct the whole entry of the used data sequentially from the hidden states of the decoder, which aids the accuracy of the generated text. Empirical results show that the NDP achieves superior performance over the state-of-the-art on ROTOWIRE dataset, in terms of relation generation (RG), content selection (CS), content ordering (CO) and BLEU metrics. The human evaluation result shows that the texts generated by the proposed NDP are better than the corresponding ones generated by NCP in most of time. And using the proposed reconstruction mechanism, the fidelity of the generated text can be further improved significantly.},
urldate = {2020-12-29},
journal = {arXiv:2004.07426 [cs]},
author = {Chen, Kai and Li, Fayuan and Hu, Baotian and Peng, Weihua and Chen, Qingcai and Yu, Hong},
month = apr,
year = {2020},
note = {arXiv: 2004.07426},
keywords = {Computer Science - Computation and Language},
}