
      Reading comprehension based question answering system in Bangla language with transformer-based learning


          Abstract

A question answering (QA) system in any language is a collection of mechanisms for obtaining answers to user questions from various data compositions. Reading comprehension (RC) is one such composition, and its popularity in the Natural Language Processing (NLP) research area is increasing day by day. Some work has been done in several languages, mainly in English; in Bangla, however, no RC dataset was available and no prior work had been done. In this research work, we develop a question answering system based on RC. To do so, we construct a dataset containing 3636 reading comprehension passages along with questions and answers. We apply a transformer-based deep neural network model to obtain appropriate answers to questions about the reading comprehension passages precisely and swiftly. We train several deep neural network architectures on our dataset, including LSTM (Long Short-Term Memory), Bi-LSTM (Bidirectional LSTM) with attention, RNN (Recurrent Neural Network), ELECTRA, and BERT (Bidirectional Encoder Representations from Transformers). The transformer-based pre-trained language architectures BERT and ELECTRA perform more prominently than the others. Finally, the trained BERT model yields a satisfactory outcome with 99% training accuracy and 87.78% testing accuracy, while ELECTRA provides training and testing accuracy of 93% and 82.5%, respectively.
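The abstract does not reproduce the authors' training code. As a rough, hypothetical sketch of transformer-based extractive QA of the kind described, the following uses the Hugging Face transformers library with a multilingual BERT checkpoint; the checkpoint name, example passage, and decoding step are illustrative assumptions, not the authors' dataset, model, or configuration.

    # Minimal sketch of transformer-based extractive QA, in the spirit of
    # the approach described above. Checkpoint, example text, and decoding
    # are assumptions -- NOT the authors' setup.
    import torch
    from transformers import AutoTokenizer, AutoModelForQuestionAnswering

    # A multilingual BERT checkpoint is assumed; the paper fine-tunes
    # BERT/ELECTRA on its own 3636-passage Bangla RC dataset.
    MODEL = "bert-base-multilingual-cased"
    tokenizer = AutoTokenizer.from_pretrained(MODEL)
    model = AutoModelForQuestionAnswering.from_pretrained(MODEL)

    context = "ঢাকা বাংলাদেশের রাজধানী।"   # "Dhaka is the capital of Bangladesh."
    question = "বাংলাদেশের রাজধানী কী?"      # "What is the capital of Bangladesh?"

    inputs = tokenizer(question, context, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Pick the most probable answer span from the start/end logits.
    start = torch.argmax(outputs.start_logits)
    end = torch.argmax(outputs.end_logits) + 1
    answer = tokenizer.decode(inputs["input_ids"][0][start:end])
    print(answer)  # the QA head is untrained here, so expect noise
                   # until the model is fine-tuned on QA pairs

The key design point this illustrates is span extraction: rather than generating an answer, the model scores every token as a possible start and end of the answer inside the passage, which is why an RC dataset of passage/question/answer-span triples is needed for fine-tuning.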

Keywords

          Bangla question answering; Transformer-based learning; Reading comprehension; Bangla language; Bangla reading comprehension


                Author and article information

Journal: Heliyon (Elsevier)
ISSN: 2405-8440
Published online: 12 October 2022
Volume 8, Issue 10, e11052
                Affiliations
                [a ]Department of Computer Science and Engineering, Jahangirnagar University, Savar, Dhaka, Bangladesh
                [b ]Department of Computer Science and Engineering, International University of Business Agriculture and Technology, Bangladesh
                [c ]Central Queensland University, Melbourne, Australia
[d ]Brain Station 23 Ltd, Dhaka, Bangladesh
                [e ]JU Data Mining Research Lab, Dhaka, Bangladesh
                Author notes
[* ]Corresponding author at: Department of Computer Science and Engineering, International University of Business Agriculture and Technology, Bangladesh. taurpa.cse@iubat.edu
Article
PII: S2405-8440(22)02340-4 (e11052)
DOI: 10.1016/j.heliyon.2022.e11052
PMCID: PMC9568857
PMID: 36254291
                © 2022 The Authors

                This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

History
Received: 5 May 2022
Revised: 18 August 2022
Accepted: 7 October 2022
                Categories
                Research Article

