
5.  Conclusion
The proposed model is expected to compute the area of a specified portion within a given image and to produce an abstractive summary whose length is derived from that computed area, so that the summary fits the portion exactly. The proposed model is expected to outperform other approaches and to be applicable to text summarization more broadly. Future work will extend other abstractive summarization techniques with the area constraint, such as pointer-generator networks, reinforcement learning, and convolutional sequence-to-sequence models, to determine which yields the best results.
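The area-to-length mapping described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-character width, line height, and the truncation step are all assumptions standing in for a length-controllable decoder, which would receive the computed budget as a decoding constraint instead.

```python
def token_budget(width_px, height_px, char_w=7, line_h=14):
    """Estimate how many characters fit in a region of the given pixel size.

    char_w and line_h are assumed average glyph metrics, not values
    taken from the proposed model.
    """
    chars_per_line = max(width_px // char_w, 1)
    lines = max(height_px // line_h, 1)
    return chars_per_line * lines


def fit_summary(summary, width_px, height_px):
    """Trim a candidate summary to the character budget of the region."""
    budget = token_budget(width_px, height_px)
    if len(summary) <= budget:
        return summary
    # In the proposed model this budget would steer a length-controlled
    # decoder during generation; word-boundary truncation is only a
    # placeholder for that mechanism.
    return summary[:budget].rsplit(" ", 1)[0] + "…"
```

In a full system, `token_budget` would be fed to the decoder (e.g., as a maximum-length constraint) rather than applied after generation, which is what distinguishes length-controllable summarization from simple post-hoc truncation.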











        E- Proceedings of The 5th International Multi-Conference on Artificial Intelligence Technology (MCAIT 2021)   [166]
        Artificial Intelligence in the 4th Industrial Revolution