
      Effects of a learning analytics‐based real‐time feedback approach on knowledge elaboration, knowledge convergence, interactive relationships and group performance in CSCL

      British Journal of Educational Technology
      Wiley


Most cited references (71)


          Calculating and reporting effect sizes to facilitate cumulative science: a practical primer for t-tests and ANOVAs

Effect sizes are the most important outcome of empirical studies. Most articles on effect sizes highlight their importance for communicating the practical significance of results. For scientists themselves, effect sizes are most useful because they facilitate cumulative science. Effect sizes can be used to determine the sample size for follow-up studies, or to examine effects across studies. This article aims to provide a practical primer on how to calculate and report effect sizes for t-tests and ANOVAs such that effect sizes can be used in a priori power analyses and meta-analyses. Whereas many articles about effect sizes focus on between-subjects designs and address within-subjects designs only briefly, I provide a detailed overview of the similarities and differences between within- and between-subjects designs. I suggest that some research questions in experimental psychology examine inherently intra-individual effects, which makes effect sizes that incorporate the correlation between measures the best summary of the results. Finally, a supplementary spreadsheet is provided to make it as easy as possible for researchers to incorporate effect size calculations into their workflow.
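This reference is a statistics primer cited by the article rather than part of its method, but as a concrete illustration of the two designs it contrasts, a minimal Python sketch of Cohen's d (pooled SD, between-subjects) and Cohen's d_z (difference scores, within-subjects) might look like the following; the function names and simulated data are illustrative and are not taken from the paper's supplementary spreadsheet.

```python
# Minimal sketch (illustrative, not the paper's spreadsheet): two common
# effect sizes for t-tests, one between-subjects and one within-subjects.
import numpy as np
from scipy import stats

def cohens_d_independent(group_a, group_b):
    """Cohen's d with pooled SD for a between-subjects comparison."""
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    n_a, n_b = len(a), len(b)
    pooled_sd = np.sqrt(((n_a - 1) * a.var(ddof=1) + (n_b - 1) * b.var(ddof=1))
                        / (n_a + n_b - 2))
    return (a.mean() - b.mean()) / pooled_sd

def cohens_dz_paired(pre, post):
    """Cohen's d_z for a within-subjects comparison: mean difference / SD of differences."""
    diff = np.asarray(post, float) - np.asarray(pre, float)
    return diff.mean() / diff.std(ddof=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    treatment = rng.normal(0.5, 1.0, 30)   # simulated scores, treatment group
    control = rng.normal(0.0, 1.0, 30)     # simulated scores, control group
    t, p = stats.ttest_ind(treatment, control)
    print(f"t = {t:.2f}, p = {p:.3f}, d = {cohens_d_independent(treatment, control):.2f}")
```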

            BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

            We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers. As a result, the pre-trained BERT model can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks, such as question answering and language inference, without substantial task-specific architecture modifications. BERT is conceptually simple and empirically powerful. It obtains new state-of-the-art results on eleven natural language processing tasks, including pushing the GLUE score to 80.5% (7.7% point absolute improvement), MultiNLI accuracy to 86.7% (4.6% absolute improvement), SQuAD v1.1 question answering Test F1 to 93.2 (1.5 point absolute improvement) and SQuAD v2.0 Test F1 to 83.1 (5.1 point absolute improvement).
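Neither this reference nor the article above ships code here, so the following is only a minimal sketch of the idea the abstract describes: a pre-trained bidirectional encoder fine-tuned with one added output layer for a classification task. It assumes the Hugging Face Transformers implementation and toy data; the original BERT release was separate TensorFlow code from Google.

```python
# Minimal sketch (assumed Hugging Face Transformers API, toy data): fine-tune a
# pre-trained BERT encoder with a single added classification layer.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

texts = ["the feedback helped the group converge", "the discussion went nowhere"]
labels = torch.tensor([1, 0])  # toy binary labels

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few gradient steps on the toy batch
    outputs = model(**batch, labels=labels)  # loss from the added output layer
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

model.eval()
with torch.no_grad():
    preds = model(**batch).logits.argmax(dim=-1)
print(preds.tolist())
```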

              Identifying the pitfalls for social interaction in computer-supported collaborative learning environments: a review of the research


                Author and article information

Journal
British Journal of Educational Technology (Brit J Educational Tech)
Publisher: Wiley
ISSN: 0007-1013 (print); 1467-8535 (online)
Published online: August 13 2021
Issue date: January 2022
Volume: 53
Issue: 1
Pages: 130-149
Affiliations
[1] School of Educational Technology, Faculty of Education, Beijing Normal University, Beijing, China
Article
DOI: 10.1111/bjet.13156
© 2022
License: http://onlinelibrary.wiley.com/termsAndConditions#vor
Text and data mining license: http://doi.wiley.com/10.1002/tdm_license_1.1
