
      Scaling up behavioral science interventions in online education



Significance

Low persistence in educational programs is a major obstacle to social mobility. Scientists have proposed many scalable interventions to support students learning online. In one of the largest international field experiments in education, we iteratively tested established behavioral science interventions and found small benefits depending on individual and contextual characteristics. Forecasting intervention efficacy using state-of-the-art methods yields limited improvements. Online education provides unprecedented access to learning opportunities, as evidenced by its role during the 2020 coronavirus pandemic, but adequately supporting diverse students will require more than a light-touch intervention. Our findings encourage funding agencies and researchers conducting large-scale field trials to consider dynamic investigations to uncover and design for contextual heterogeneity to complement static investigations of overall effects.


Abstract

Online education is rapidly expanding in response to rising demand for higher and continuing education, but many online students struggle to achieve their educational goals. Several behavioral science interventions have shown promise in raising student persistence and completion rates in a handful of courses, but evidence of their effectiveness across diverse educational contexts is limited. In this study, we test a set of established interventions over 2.5 y, with one-quarter million students, from nearly every country, across 247 online courses offered by Harvard, the Massachusetts Institute of Technology, and Stanford. We hypothesized that the interventions would produce medium-to-large effects as in prior studies, but this is not supported by our results. Instead, using an iterative scientific process of cyclically preregistering new hypotheses in between waves of data collection, we identified individual, contextual, and temporal conditions under which the interventions benefit students. Self-regulation interventions raised student engagement in the first few weeks but not final completion rates. Value-relevance interventions raised completion rates in developing countries to close the global achievement gap, but only in courses with a global gap. We found minimal evidence that state-of-the-art machine learning methods can forecast the occurrence of a global gap or learn effective individualized intervention policies. Scaling behavioral science interventions across various online learning contexts can reduce their average effectiveness by an order of magnitude. However, iterative scientific investigations can uncover what works, where, for whom.


                Author and article information

Proc Natl Acad Sci U S A
Proceedings of the National Academy of Sciences of the United States of America
National Academy of Sciences
Published online 15 June 2020; issue date 30 June 2020
Volume 117, Issue 26, Pages 14900–14905
a Department of Information Science, Cornell University, Ithaca, NY 14850;
b Comparative Media Studies/Writing, Massachusetts Institute of Technology, Cambridge, MA 02139;
c Harvard Business School, Harvard University, Cambridge, MA 02138;
d Machine Learning Department, Carnegie Mellon University, Pittsburgh, PA 15213;
e Computer Science Department, Stanford University, Stanford, CA 94305;
f Office of the Vice Provost for Advances in Learning, Harvard University, Cambridge, MA 02138;
g School of Computer Science, Queensland University of Technology, Brisbane City, QLD 4000, Australia;
h Department of Computer Science, University of Toronto, Toronto, ON M5S 1A1, Canada;
i Department of Government, Harvard University, Cambridge, MA 02138
                Author notes
2 To whom correspondence may be addressed. Email: kizilcec@cornell.edu, jreich@mit.edu, or myeomans@hbs.edu.

                Edited by Susan T. Fiske, Princeton University, Princeton, NJ, and approved May 12, 2020 (received for review December 5, 2019)

                Author contributions: R.F.K., J.R., M.Y., C.D., E.B., S.T., J.J.W., and D.T. designed research; R.F.K., J.R., M.Y., C.D., and G.L. performed research; R.F.K., J.R., M.Y., C.D., and G.L. analyzed data; and R.F.K., J.R., M.Y., and C.D. wrote the paper.

1 R.F.K., J.R., and M.Y. contributed equally to this work.

                Copyright © 2020 the Author(s). Published by PNAS.

                This open access article is distributed under Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND).

                Page count
                Pages: 6
Funded by: National Science Foundation (NSF), funder ID 100000001; Award ID: 1646978; Award Recipients: Justin Reich, Michael Yeomans, Dustin Tingley
Funded by: Stanford University (SU), funder ID 100005492; Award Recipient: René F. Kizilcec
Funded by: Microsoft, funder ID 100004318; Award Recipients: Christoph Dann, Emma Brunskill
                Social Sciences
                Psychological and Cognitive Sciences

Keywords: behavioral interventions, scale, online learning

