      Researcher degrees of freedom in phonetic research


Abstract

The results of published research critically depend on methodological decisions made during data analysis. These so-called ‘researcher degrees of freedom’ (Simmons, Nelson, & Simonsohn, 2011) can affect the results and the conclusions researchers draw from them. It is argued that phonetic research faces a large number of researcher degrees of freedom due to its scientific object—speech—being inherently multidimensional and exhibiting complex interactions between multiple covariates. A Type-I error simulation is presented that demonstrates the severe inflation of false positives when exploring researcher degrees of freedom. It is argued that, combined with common cognitive fallacies, exploitation of researcher degrees of freedom introduces strong bias and poses a serious challenge to quantitative phonetics as an empirical science. This paper discusses potential remedies for this problem, including adjusting the threshold for significance; drawing a clear line between confirmatory and exploratory analyses via preregistration; open, honest, and transparent practices in communicating data-analytical decisions; and direct replications.
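The Type-I error inflation the abstract refers to can be illustrated with a toy Monte Carlo sketch. This is an illustrative reconstruction, not the paper's actual simulation: the sample size, the number of dependent variables, their independence, and the "report whichever test comes out significant" rule are all assumptions.

```python
import numpy as np

def false_positive_rate(n_sims=4000, n_per_group=50, n_dvs=1, seed=0):
    """Monte Carlo estimate of the false-positive rate when a null effect
    is tested on n_dvs outcome variables and *any* significant result is
    reported -- a simple researcher-degrees-of-freedom strategy."""
    rng = np.random.default_rng(seed)
    crit = 1.96  # two-sided criterion for alpha = .05 (normal approximation)
    hits = 0
    for _ in range(n_sims):
        # Both groups are drawn from the same population: H0 is true.
        a = rng.standard_normal((n_per_group, n_dvs))
        b = rng.standard_normal((n_per_group, n_dvs))
        # Two-sample z statistic for each dependent variable.
        z = (a.mean(0) - b.mean(0)) / np.sqrt(
            a.var(0, ddof=1) / n_per_group + b.var(0, ddof=1) / n_per_group)
        if np.any(np.abs(z) > crit):  # report "the" significant test
            hits += 1
    return hits / n_sims

single = false_positive_rate(n_dvs=1)    # one preregistered test: ~.05
flexible = false_positive_rate(n_dvs=4)  # best of four DVs: ~1 - .95**4
print(f"one planned test: {single:.3f}")
print(f"best of four DVs: {flexible:.3f}")
```

With four independent outcome variables, the rate of at least one false positive approaches 1 − 0.95⁴ ≈ 0.19, nearly four times the nominal 5% level; correlated measures inflate it less, but still well above α.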


Most cited references (70)


          The Natural Selection of Bad Science

Poor research design and data analysis encourage false-positive findings. Such poor methods persist despite perennial calls for improvement, suggesting that they result from something more than just misunderstanding. The persistence of poor methods results partly from incentives that favor them, leading to the natural selection of bad science. This dynamic requires no conscious strategizing (no deliberate cheating or loafing) by scientists, only that publication is a principal factor for career advancement. Some normative methods of analysis have almost certainly been selected to further publication instead of discovery. In order to improve the culture of science, a shift must be made away from correcting misunderstandings and towards rewarding understanding.

We support this argument with empirical evidence and computational modeling. We first present a 60-year meta-analysis of statistical power in the behavioral sciences and show that power has not improved despite repeated demonstrations of the necessity of increasing power. To demonstrate the logical consequences of structural incentives, we then present a dynamic model of scientific communities in which competing laboratories investigate novel or previously published hypotheses using culturally transmitted research methods. As in the real world, successful labs produce more "progeny", such that their methods are more often copied and their students are more likely to start labs of their own. Selection for high output leads to poorer methods and increasingly high false discovery rates. We additionally show that replication slows but does not stop the process of methodological deterioration. Improving the quality of research requires change at the institutional level.
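The selection dynamic this abstract describes can be caricatured in a few lines. The sketch below is a toy model under assumed parameters (the prior on true hypotheses, the α level, and the effort/power trade-off are all made up), not the authors' actual simulation: labs that spend less effort per study run more studies, harvest more publishable positives, and are preferentially copied.

```python
import numpy as np

rng = np.random.default_rng(1)

PRIOR = 0.1    # assumed fraction of tested hypotheses that are true
ALPHA = 0.05   # false-positive rate per individual test
BUDGET = 100.0 # per-lab resources per generation

def payoff_and_fdr(effort):
    """Expected publication count and false discovery rate for a lab.
    Higher effort buys power per study but costs study slots."""
    power = 0.2 + 0.6 * effort                 # rigor increases power
    n_studies = BUDGET / (1.0 + 9.0 * effort)  # rigor reduces output
    true_pos = n_studies * PRIOR * power
    false_pos = n_studies * (1 - PRIOR) * ALPHA
    pubs = true_pos + false_pos                # only positives publish
    fdr = false_pos / (true_pos + false_pos)
    return pubs, fdr

efforts = rng.uniform(0.2, 1.0, size=100)  # 100 labs, varied rigor
history = []
for gen in range(60):
    pubs, fdr = payoff_and_fdr(efforts)
    history.append((efforts.mean(), fdr.mean()))
    # Selection: the most productive half founds the next generation,
    # copying its methods (effort level) with small mutations.
    top = np.argsort(pubs)[-50:]
    efforts = np.clip(np.repeat(efforts[top], 2)
                      + rng.normal(0, 0.02, 100), 0.0, 1.0)

print(f"mean effort: {history[0][0]:.2f} -> {history[-1][0]:.2f}")
print(f"mean FDR:    {history[0][1]:.2f} -> {history[-1][1]:.2f}")
```

Because expected publications decrease monotonically with effort in this parameterization, mean effort collapses over generations and the community-wide false discovery rate climbs, mirroring the qualitative result the abstract reports.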

            Scientific Utopia: II. Restructuring incentives and practices to promote truth over publishability

            An academic scientist's professional success depends on publishing. Publishing norms emphasize novel, positive results. As such, disciplinary incentives encourage design, analysis, and reporting decisions that elicit positive results and ignore negative results. Prior reports demonstrate how these incentives inflate the rate of false effects in published science. When incentives favor novelty over replication, false results persist in the literature unchallenged, reducing efficiency in knowledge accumulation. Previous suggestions to address this problem are unlikely to be effective. For example, a journal of negative results publishes otherwise unpublishable reports. This enshrines the low status of the journal and its content. The persistence of false findings can be meliorated with strategies that make the fundamental but abstract accuracy motive - getting it right - competitive with the more tangible and concrete incentive - getting it published. We develop strategies for improving scientific practices and knowledge accumulation that account for ordinary human motivations and self-serving biases.

              Theory and data interactions of the scientific mind: evidence from the molecular and the cognitive laboratory.

              A number of researchers and scholars have stressed the importance of disconfirmation in the quest for the development of scientific knowledge (e.g., Popper, 1959). Paradoxically, studies examining human reasoning in the laboratory have typically found that people display a confirmation bias in that they are more likely to seek out and attend to data consistent rather than data inconsistent with their initial theory (Wason, 1968). We examine the strategies that scientists and students use to evaluate data that are either consistent or inconsistent with their expectations. First, we present findings from scientists reasoning "live" in their laboratory meetings. We show that scientists often show an initial reluctance to consider inconsistent data as "real." However, this initial reluctance is often overcome with repeated observations of the inconsistent data such that they modify their theories to account for the new data. We further examine these issues in a controlled scientific causal thinking simulation specifically developed to examine the reasoning strategies we observed in the natural scientific environment. Like the scientists, we found that participants in our simulation initially displayed a propensity to discount data inconsistent with a theory provided. However, with repeated observations of the inconsistent data, the students, like the scientists, began to see the once anomalous data as "real" and the initial bias to discount that data was significantly diminished.

Author and article information

Journal
Laboratory Phonology: Journal of the Association for Laboratory Phonology
Ubiquity Press
ISSN: 1868-6354
04 January 2019
Volume 10, Issue 1

Affiliations
[1] Department of Linguistics, Northwestern University, Evanston, IL, US

Article
DOI: 10.5334/labphon.147
Copyright: © 2019 The Author(s)

                This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC-BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. See http://creativecommons.org/licenses/by/4.0/.

                Categories
                Journal article
