
Although length embeddings can control where to stop decoding, they do not decide which information
should be included in the summary within the length constraint (Saito et al., 2020). This is because length
embeddings only add length information on the decoder side. Consequently, they may miss important
information, since it is difficult to take into account which content should be included in the summary
under a given length constraint. This research focuses on developing an arbitrary-length-controllable
abstractive text summarization model that enables automatic text summarization based on a desired length
constraint or output-area constraint.
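The decoder-side length embedding discussed above can be illustrated with a minimal sketch. This is not the published implementation; the toy vocabulary, embedding tables, and dimensions below are illustrative assumptions, and real models would learn these tables during training.

```python
import random

random.seed(0)

VOCAB_SIZE = 100   # hypothetical toy vocabulary size
EMB_DIM = 8        # hypothetical embedding dimension
MAX_LEN = 50       # maximum supported remaining-length value

# Toy lookup tables: one row per token id and one row per
# remaining-length value (randomly initialised here; learned in practice).
token_emb = [[random.gauss(0, 1) for _ in range(EMB_DIM)] for _ in range(VOCAB_SIZE)]
length_emb = [[random.gauss(0, 1) for _ in range(EMB_DIM)] for _ in range(MAX_LEN + 1)]

def decoder_input(token_id, remaining_length):
    """Decoder-side input vector: the token embedding plus an embedding of
    the remaining length budget, in the spirit of the length-embedding
    approach described above."""
    remaining_length = max(0, min(remaining_length, MAX_LEN))  # clamp to table
    return [t + l for t, l in zip(token_emb[token_id], length_emb[remaining_length])]

# At each decoding step the remaining budget shrinks by one, so the
# decoder is always conditioned on how many tokens it may still emit.
budget = 10
step_inputs = [decoder_input(tok, budget - step)
               for step, tok in enumerate([5, 17, 42])]
```

Because the length signal enters only through this decoder-side addition, the encoder never sees the budget, which is precisely why content selection under the constraint is difficult.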

        2.  Related Work

   Kikuchi et al. (2016) were the first to propose length embeddings for length-controlled abstractive
summarization. Fan et al. (2018) also used length embeddings, at the beginning of the decoder module, for length
control. They presented a neural summarization model with a simple but effective mechanism that enables users to
specify these high-level attributes in order to control the shape of the final summaries to better suit their needs.
Liu et al. (2018) proposed a CNN-based length-controllable summarization model that uses the desired length
as an input to the initial state of the decoder. Takase and Okazaki (2019) introduced a positional encoding that
represents the remaining length at each decoder step of a Transformer-based encoder-decoder model. Saito et
al. (2020) used extractive-and-abstractive summarization, which incorporates an extractive model into an
abstractive encoder-decoder model.
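The remaining-length positional encoding of Takase and Okazaki (2019) can be sketched as follows. This is a simplified illustration, not their exact formulation: it reuses the standard sinusoidal Transformer encoding but feeds it the length difference (target length minus current position) instead of the position itself; `d_model` and the base 10000 are the usual Transformer defaults.

```python
import math

def remaining_length_encoding(position, target_length, d_model=16):
    """Sinusoidal encoding of the REMAINING length (target_length - position).
    The decoder sees how many tokens are left in the budget at every step,
    rather than how many it has already produced."""
    remaining = target_length - position
    enc = []
    for i in range(0, d_model, 2):
        angle = remaining / (10000 ** (i / d_model))
        enc.append(math.sin(angle))  # even dimensions
        enc.append(math.cos(angle))  # odd dimensions
    return enc

# The same decoder step receives different encodings under different
# length budgets, which is how the output length can be steered.
a = remaining_length_encoding(position=3, target_length=10)
b = remaining_length_encoding(position=3, target_length=30)
```

Note that the encoding depends only on the difference, so step 5 of a 10-token budget and step 25 of a 30-token budget look identical to the decoder.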

        2.1. Critical Analysis of Previous Work on Length Control

   As can be seen in Table 1, some of the previous works analyzed have achieved summary length control,
either pre-defined (Fan et al., 2017; Kikuchi et al., 2016; Yizhu et al., 2018) or arbitrary (Takase & Okazaki,
2019; Makino et al., 2019; Saito et al., 2020). Despite improving length-constrained summarization
quality, all of these models require a specific length to be provided before a summary is generated. In Saito et al.
(2020), the specific length of the prototype text must be given before it is input to their encoder-decoder model
for summary generation. Likewise, in Takase and Okazaki (2019), the remaining length must be defined at each
decoder step of the Transformer-based encoder-decoder model.

Table 1: Literature on abstractive text summarization

 Author                  Title                                                                       Technique                                   Length Control   Area Control
 Fan et al., 2017        Controllable Abstractive Summarization                                      Convolutional Seq2Seq Model                 Yes              No
 Yizhu et al., 2018      Controlling Length in AS Using a CNN                                        CNN Seq2Seq Model                           Yes              No
 Yao et al., 2019        Multi-Task Learning Framework for AS                                        Long Short-Term Memory (LSTM)               No               No
 Yong et al., 2019       AS with a Convolutional Seq2Seq                                             Convolutional Seq2Seq Model                 No               No
 Petr et al., 2019       AS: A Low Resource Challenge                                                Transformer Model                           No               No
 Makino et al., 2019     Global Optimization under Length Constraint for Neural Text Summarization   CNN-based encoder-decoders                  Yes              No
 Takase & Okazaki, 2019  Positional Encoding to Control Output Sequence Length                       Neural encoder-decoder model, Transformer   Yes              No
 Saito et al., 2020      Length-controllable AS by Guiding with Summary Prototype                    Pointer-Generator, Prototype Extraction     Yes              No
   In Figure 2.1, for example, given a long newspaper story that needs to be summarized to fit a portion of the
newspaper cover, the previous works would not be able to provide a summary, since they are all based on a
specified number of summary words rather than on the available output area.
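One simple way to bridge this gap is to convert an output-area constraint into a word budget that a length-controllable model could then be conditioned on. The sketch below is a hypothetical illustration, not part of any cited system; the monospace-layout model and the average word length of five characters (plus one trailing space) are assumptions.

```python
def area_to_word_budget(width_chars, height_lines, avg_word_len=5.0):
    """Convert an output-area constraint (a box measured in character
    columns and text lines) into an approximate word budget.

    Assumes a monospace layout and an average word of `avg_word_len`
    characters followed by one space; both are illustrative assumptions."""
    chars_available = width_chars * height_lines
    return int(chars_available // (avg_word_len + 1))

# A 60-column, 10-line slot on the cover yields roughly a 100-word budget,
# which could then be passed to a length-controllable summarizer.
budget = area_to_word_budget(width_chars=60, height_lines=10)
```

In practice the conversion would also need to account for proportional fonts and line breaks, but even this crude mapping shows how an area constraint reduces to the length constraints the surveyed models already accept.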

        E- Proceedings of The 5th International Multi-Conference on Artificial Intelligence Technology (MCAIT 2021)   [163]
        Artificial Intelligence in the 4th Industrial Revolution