
      Autism Behavior Inventory: A Novel Tool for Assessing Core and Associated Symptoms of Autism Spectrum Disorder

      research-article


          Abstract

Objective: The Autism Behavior Inventory (ABI) is a new measure for assessing changes in core and associated symptoms of autism spectrum disorder (ASD) in participants (ages 3 years to adulthood) diagnosed with ASD. It is a web-based tool with five domains (two ASD core domains: social communication, and restrictive and repetitive behaviors; three associated domains: mental health, self-regulation, and challenging behavior). This study describes the design, development, and initial psychometric properties of the ABI.

Methods: ABI items were generated following a review of existing measures and input from expert clinicians. The initial ABI scale contained 161 items, which were reduced to fit a factor analytic model, retaining items of adequate reliability. Two versions of the scale, the ABI full version (ABI-F; 93 items) and the ABI short version (ABI-S; 36 items), were developed and evaluated for psychometric properties, including validity comparisons with commonly used measures. Both scales were administered to parents and to healthcare professionals (HCPs) involved with the study participants.

Results: Test–retest reliability for parent ratings on the ABI (intraclass correlation coefficient [ICC] = 0.79) was robust and compared favorably with existing scales. Test–retest correlations for HCP ratings were generally lower than those for parent ratings. ABI core domains correlated strongly with comparison measures (r ≥ 0.70), demonstrating good concurrent validity.

Conclusions: Overall, the ABI shows promise as a tool for measuring change in core symptoms of autism in ASD clinical studies, although further validation is required.


Most cited references (19)


          Child/adolescent behavioral and emotional problems: implications of cross-informant correlations for situational specificity.


            The Repetitive Behavior Scale-Revised: independent validation in individuals with autism spectrum disorders.

            A key feature of autism is restricted repetitive behavior (RRB). Despite the significance of RRBs, little is known about their phenomenology, assessment, and treatment. The Repetitive Behavior Scale-Revised (RBS-R) is a recently-developed questionnaire that captures the breadth of RRB in autism. To validate the RBS-R in an independent sample, we conducted a survey within the South Carolina Autism Society. A total of 320 caregivers (32%) responded. Factor analysis produced a five-factor solution that was clinically meaningful and statistically sound. The factors were labeled "Ritualistic/Sameness Behavior," "Stereotypic Behavior," "Self-injurious Behavior," "Compulsive Behavior," and "Restricted Interests." Measures of internal consistency were high for this solution, and interrater reliability data suggested that the RBS-R performs well in outpatient settings.

              Equivalence of electronic and paper-and-pencil administration of patient-reported outcome measures: a meta-analytic review.

Patient-reported outcomes (PROs; self-report assessments) are increasingly important in evaluating medical care and treatment efficacy. Electronic administration of PROs via computer is becoming widespread. This article reviews the literature addressing whether computer-administered tests are equivalent to their paper-and-pencil forms. Meta-analysis was used to synthesize 65 studies that directly assessed the equivalence of computer versus paper versions of PROs used in clinical trials. A total of 46 unique studies, evaluating 278 scales, provided sufficient detail to allow quantitative analysis. Among 233 direct comparisons, the mean difference between modes averaged 0.2% of the scale range (e.g., 0.02 points on a 10-point scale), and 93% were within ±5% of the scale range. Among 207 correlation coefficients between paper and computer instruments (typically intraclass correlation coefficients), the average weighted correlation was 0.90; 94% of correlations were at least 0.75. Because the cross-mode correlation (paper vs. computer) is also a test-retest correlation, with potential variation because of retest, we compared it to the within-mode (paper vs. paper) test-retest correlation. In four comparisons that evaluated both, the average cross-mode paper-to-computer correlation was almost identical to the within-mode correlation for readministration of a paper measure (0.88 vs. 0.91). Extensive evidence indicates that paper- and computer-administered PROs are equivalent.

                Author and article information

Journal
Journal of Child and Adolescent Psychopharmacology (J Child Adolesc Psychopharmacol)
Mary Ann Liebert, Inc. (140 Huguenot Street, 3rd Floor, New Rochelle, NY 10801, USA)
ISSN: 1044-5463
eISSN: 1557-8992
Published: 01 November 2017
Volume: 27
Issue: 9
Pages: 814-822
                Affiliations
[1] Janssen Research & Development, LLC, Titusville, New Jersey.
[2] The Nisonger Center University Center for Excellence in Developmental Disabilities (UCEDD), Ohio State University, Columbus, Ohio.
[3] Division of Developmental and Behavioral Pediatrics, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio.
[4] 312E Robinson Hall, Department of Health Sciences, Bouvé College of Health Sciences, Northeastern University, Boston, Massachusetts.
[5] Duke Center for Autism and Brain Development, Duke University, Durham, North Carolina.
[6] Department of Psychiatry, University of California, San Francisco, California.
[7] UCSF Benioff Children's Hospital, San Francisco, California.
[8] Nathan Kline Institute, Orangeburg, New York.
[9] ProPhase, LLC, NYU School of Medicine, Columbia University Medical Center, New York, New York.
                Author notes
Address correspondence to: Abi Bangerter, MA, Janssen Research & Development, LLC, Titusville, NJ 08534; E-mail: abangert@its.jnj.com; autismbehaviorinventory@its.jnj.com
Article
DOI: 10.1089/cap.2017.0018
PMCID: PMC5689117
PMID: 28498053
                © Abi Bangerter et al. 2017; Published by Mary Ann Liebert, Inc.

                This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

                Page count
                Figures: 1, Tables: 6, References: 29, Pages: 9
                Categories
                Original Articles

autism spectrum disorder, rating scale, software, assessment, outcome measures
