
      Automated detection of pain levels using deep feature extraction from shutter blinds-based dynamic-sized horizontal patches with facial images

      research-article


          Abstract

          Pain intensity classification using facial images is a challenging problem in computer vision research. This work proposes a patch- and transfer learning-based model to classify various pain intensities from facial images. The input facial images were segmented into dynamic-sized horizontal patches, or “shutter blinds”. A lightweight deep network, DarkNet19, pre-trained on ImageNet1K, was used to generate deep features from the shutter blinds and from the undivided, resized facial image. The most discriminative of these deep features were selected using iterative neighborhood component analysis and then fed to a standard shallow fine k-nearest neighbor classifier, evaluated with tenfold cross-validation. The proposed shutter blinds-based model was trained and tested on datasets derived from two public databases, the University of Northern British Columbia-McMaster Shoulder Pain Expression Archive Database and the Denver Intensity of Spontaneous Facial Action Database, each comprising four pain intensity classes labeled by human experts using validated facial action coding system methodology. Our shutter blinds-based classification model attained overall accuracy above 95% on both datasets. This excellent performance suggests that the automated pain intensity classification model can be deployed to assist doctors in the non-verbal detection of pain from facial images in various situations (e.g., non-communicative patients or during surgery), facilitating timely detection and management of pain.
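          The pipeline described in the abstract (horizontal “shutter blind” patches plus the whole face, pretrained-CNN feature extraction, neighborhood component analysis-based feature selection, and a kNN classifier under tenfold cross-validation) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: a generic ImageNet-pretrained ResNet-18 stands in for DarkNet19, scikit-learn's NeighborhoodComponentsAnalysis stands in for the paper's iterative NCA selector, a fixed number of equal-height blinds replaces the dynamic patch sizing, and the input and label variables (X_img, y) are hypothetical.

```python
# Hedged sketch of a shutter-blinds-style pain-intensity pipeline.
# Stand-ins (not from the paper): ResNet-18 instead of DarkNet19,
# plain NCA instead of iterative NCA, equal-height blinds.
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.neighbors import KNeighborsClassifier, NeighborhoodComponentsAnalysis
from sklearn.model_selection import cross_val_score

device = "cuda" if torch.cuda.is_available() else "cpu"
backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
backbone.fc = torch.nn.Identity()  # keep the 512-d penultimate features
backbone.eval().to(device)

preprocess = T.Compose([
    T.ToPILImage(),
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def shutter_blind_features(face, n_blinds=4):
    """Split a face image (H x W x 3 uint8 array) into horizontal patches
    ("shutter blinds"), add the whole face, and concatenate CNN features."""
    h = face.shape[0]
    bounds = np.linspace(0, h, n_blinds + 1, dtype=int)
    crops = [face[bounds[i]:bounds[i + 1]] for i in range(n_blinds)] + [face]
    feats = []
    with torch.no_grad():
        for crop in crops:
            x = preprocess(crop).unsqueeze(0).to(device)
            feats.append(backbone(x).squeeze(0).cpu().numpy())
    return np.concatenate(feats)

# Hypothetical usage: X_img is a list of cropped face images, y the pain labels.
# X = np.stack([shutter_blind_features(img) for img in X_img])
# nca = NeighborhoodComponentsAnalysis(n_components=64, random_state=0)
# X_sel = nca.fit_transform(X, y)            # stand-in for iterative NCA selection
# knn = KNeighborsClassifier(n_neighbors=1)  # "fine kNN" roughly corresponds to k=1
# print(cross_val_score(knn, X_sel, y, cv=10).mean())
```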


                Author and article information

                Contributors
                mokhzainiazizan@usim.edu.my
                Journal
                Sci Rep (Scientific Reports)
                Nature Publishing Group UK (London)
                ISSN: 2045-2322
                Published: 14 October 2022
                Volume: 12
                Article number: 17297
                Affiliations
                [1] School of Business (Information System), University of Southern Queensland, Toowoomba, QLD 4350, Australia (GRID grid.1048.d, ISNI 0000 0004 0473 0844)
                [2] Faculty of Engineering and Information Technology, University of Technology Sydney, Sydney, NSW 2007, Australia (GRID grid.117476.2, ISNI 0000 0004 1936 7611)
                [3] Department of Computer Engineering, College of Engineering, Kafkas University, Kars, Turkey (GRID grid.16487.3c, ISNI 0000 0000 9216 0511)
                [4] Department of Digital Forensics Engineering, College of Technology, Firat University, Elazig, Turkey (GRID grid.411320.5, ISNI 0000 0004 0574 1529)
                [5] Department of Computer Engineering, College of Engineering, Ardahan University, Ardahan, Turkey (GRID grid.449062.d, ISNI 0000 0004 0399 2738)
                [6] Rathinam College of Engineering, Coimbatore, India
                [7] Faculty of Information Technology, HUTECH University of Technology, Ho Chi Minh City, Viet Nam
                [8] Andalusian Research Institute in Data Science and Computational Intelligence, University of Granada, Granada, Spain (GRID grid.4489.1, ISNI 0000000121678994)
                [9] Regional Research Center, Iwate Prefectural University, Iwate, Japan (GRID grid.443998.b, ISNI 0000 0001 2172 3919)
                [10] Department of Cardiology, National Heart Centre Singapore, Singapore, Singapore (GRID grid.419385.2, ISNI 0000 0004 0620 9905)
                [11] Duke-NUS Medical School, Singapore, Singapore (GRID grid.428397.3, ISNI 0000 0004 0385 0924)
                [12] Centre of Clinical Genetics, Sydney Children’s Hospitals Network, Randwick, 2031, Australia (GRID grid.430417.5, ISNI 0000 0004 0640 6474)
                [13] School of Women’s and Children’s Health, University of New South Wales, Randwick, 2031, Australia (GRID grid.1005.4, ISNI 0000 0004 4902 0432)
                [14] Department of Electrical and Electronic Engineering, Faculty of Engineering and Built Environment, Universiti Sains Islam Malaysia (USIM), Nilai, Malaysia (GRID grid.462995.5, ISNI 0000 0001 2218 9236)
                [15] Department of Biomedical Engineering, Faculty of Engineering, University Malaya, 50603 Kuala Lumpur, Malaysia (GRID grid.10347.31, ISNI 0000 0001 2308 5949)
                [16] Department of Electronics and Computer Engineering, Ngee Ann Polytechnic, Singapore 599489, Singapore (GRID grid.462630.5, ISNI 0000 0000 9158 4937)
                [17] Department of Biomedical Engineering, School of Science and Technology, SUSS University, Singapore, Singapore (GRID grid.443365.3, ISNI 0000 0004 0388 6484)
                [18] Department of Biomedical Informatics and Medical Engineering, Asia University, Taichung, Taiwan (GRID grid.252470.6, ISNI 0000 0000 9263 9645)
                Article
                DOI: 10.1038/s41598-022-21380-4
                PMCID: PMC9568538
                PMID: 36241674
                © The Author(s) 2022

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

                History
                Received: 26 April 2022
                Accepted: 27 September 2022
                Categories
                Article

                Keywords
                health care, medical research
