      Is Open Access

      Development and Validation of a Dynamically Updated Prediction Model for Attrition From Marine Recruit Training

      research-article


          Abstract

          Dijksma, I, Hof, MHP, Lucas, C, and Stuiver, MM. Development and validation of a dynamically updated prediction model for attrition from Marine recruit training. J Strength Cond Res 36(9): 2523–2529, 2022—Whether fresh Marine recruits thrive and complete military training programs, or fail to complete them, depends on numerous interwoven variables. This study aimed to derive a prediction model for dynamically updated estimation of conditional dropout probabilities during Marine recruit training. We undertook a landmarking analysis in a Cox proportional hazards model using longitudinal data from 744 recruits from existing databases of the Marine Training Center in the Netherlands. The model provides personalized estimates of dropout from Marine recruit training given a recruit's baseline characteristics and time-varying mental and physical health status, using 21 predictors. We defined nonoverlapping landmarks at each week and developed a supermodel by stacking the landmark data sets. The final supermodel contained all but one of the a priori selected baseline variables, together with time-varying health status, to predict the hazard of attrition from Marine recruit training at each landmark as comprehensively as possible. The discriminative ability (c-index) of the prediction model was 0.78, 0.75, and 0.73 in week 1, week 4, and week 12, respectively. We used 10-fold cross-validation to train and evaluate the model. We conclude that this prediction model may help identify recruits at increased risk of attrition throughout Marine recruit training, and it warrants further validation and updating for other military settings.
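The landmarking step described in the abstract can be sketched as a data-expansion routine: at each landmark week, keep only the recruits still in training and administratively censor follow-up at the landmark plus a prediction horizon, then stack the per-landmark data sets. This is a minimal illustration only, not the authors' code; the function name, the choice of horizon, and the toy data are assumptions, and a real analysis would fit a Cox supermodel (with landmark interactions) on the stacked rows.

```python
import numpy as np

def build_landmark_dataset(times, events, covariates, landmarks, horizon):
    """Stack per-landmark risk sets: at each landmark s, keep only
    recruits still in training (time > s) and administratively censor
    follow-up at s + horizon."""
    rows = []
    for s in landmarks:
        for i in np.where(times > s)[0]:
            t_stop = min(times[i], s + horizon)
            event = int(events[i] == 1 and times[i] <= s + horizon)
            rows.append({"landmark": s, "id": int(i),
                         "time_from_landmark": float(t_stop - s),
                         "event": event, "x": float(covariates[i])})
    return rows

# toy example: 4 recruits, weeks to dropout/censoring, event flags,
# and one (here static) covariate
times = np.array([3.0, 10.0, 12.0, 12.0])
events = np.array([1, 1, 0, 1])
x = np.array([0.2, 0.8, 0.5, 0.1])
stacked = build_landmark_dataset(times, events, x,
                                 landmarks=[0, 4, 8], horizon=4)
```

The stacked data set deliberately lets one recruit contribute rows at several landmarks; standard errors in the downstream Cox fit therefore need a robust (sandwich) estimator.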

          Related collections

          Most cited references (20)


          Calculating the sample size required for developing a clinical prediction model


            Survival model predictive accuracy and ROC curves.

            The predictive accuracy of a survival model can be summarized using extensions of the proportion of variation explained by the model, or R2, commonly used for continuous response models, or using extensions of sensitivity and specificity, which are commonly used for binary response models. In this article we propose new time-dependent accuracy summaries based on time-specific versions of sensitivity and specificity calculated over risk sets. We connect the accuracy summaries to a previously proposed global concordance measure, which is a variant of Kendall's tau. In addition, we show how standard Cox regression output can be used to obtain estimates of time-dependent sensitivity and specificity, and time-dependent receiver operating characteristic (ROC) curves. Semiparametric estimation methods appropriate for both proportional and nonproportional hazards data are introduced, evaluated in simulations, and illustrated using two familiar survival data sets.
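The time-specific sensitivity and specificity idea above can be illustrated with a cumulative/dynamic AUC at a horizon t: "cases" are subjects with an event by t, "controls" are those still event-free after t. This sketch is mine, not the paper's estimator, and it deliberately omits the inverse-probability-of-censoring weighting a proper estimator requires.

```python
import numpy as np

def cumulative_dynamic_auc(times, events, scores, t):
    """Cumulative/dynamic AUC at horizon t, ignoring censoring weights
    for simplicity: cases had the event by t, controls survived past t.
    Score ties count as half-concordant."""
    cases = (times <= t) & (events == 1)
    controls = times > t
    if not cases.any() or not controls.any():
        return float("nan")
    s_case = scores[cases][:, None]
    s_ctrl = scores[controls][None, :]
    return float((s_case > s_ctrl).mean() + 0.5 * (s_case == s_ctrl).mean())

# toy check: higher score = higher predicted risk, and the two early
# events carry the highest scores
times = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
events = np.array([1, 1, 0, 0, 0])
scores = np.array([0.9, 0.8, 0.3, 0.2, 0.1])
auc = cumulative_dynamic_auc(times, events, scores, t=2.5)  # perfect ranking
```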

              Towards better clinical prediction models: seven steps for development and an ABCD for validation.

              Clinical prediction models provide risk estimates for the presence of disease (diagnosis) or an event in the future course of disease (prognosis) for individual patients. Although publications that present and evaluate such models are becoming more frequent, the methodology is often suboptimal. We propose that seven steps should be considered in developing prediction models: (i) consideration of the research question and initial data inspection; (ii) coding of predictors; (iii) model specification; (iv) model estimation; (v) evaluation of model performance; (vi) internal validation; and (vii) model presentation. The validity of a prediction model is ideally assessed in fully independent data, where we propose four key measures to evaluate model performance: calibration-in-the-large, or the model intercept (A); calibration slope (B); discrimination, with a concordance statistic (C); and clinical usefulness, with decision-curve analysis (D). As an application, we develop and validate prediction models for 30-day mortality in patients with an acute myocardial infarction. This illustrates the usefulness of the proposed framework to strengthen the methodological rigour and quality for prediction models in cardiovascular research.
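The first three of the ABCD validation measures named above can be computed directly from outcomes and predicted risks. The sketch below is a simplified illustration under my own conventions: calibration-in-the-large (A) is taken as observed minus mean predicted risk (a common alternative fixes the slope at 1 and estimates an intercept), the calibration slope (B) comes from a small Newton-Raphson logistic fit of the outcome on the logit of the prediction, and discrimination (C) is the concordance statistic; decision-curve analysis (D) is omitted.

```python
import numpy as np

def logit(p):
    return np.log(p / (1.0 - p))

def fit_logistic(x, y, iters=30):
    """Fit y ~ sigmoid(a + b*x) by Newton-Raphson; returns (a, b)."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.zeros(2)
    for _ in range(iters):
        mu = 1.0 / (1.0 + np.exp(-X @ beta))
        W = mu * (1.0 - mu)
        beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - mu))
    return beta

def c_statistic(y, p):
    """Concordance (C): probability a random event gets a higher
    predicted risk than a random non-event; ties count half."""
    pos, neg = p[y == 1][:, None], p[y == 0][None, :]
    return float((pos > neg).mean() + 0.5 * (pos == neg).mean())

def abc_validation(y, p):
    a = y.mean() - p.mean()           # calibration-in-the-large (A)
    _, b = fit_logistic(logit(p), y)  # calibration slope (B)
    c = c_statistic(y, p)             # discrimination (C)
    return a, b, c

# toy validation set: observed outcomes and predicted risks
y = np.array([0, 1, 1, 0, 1, 0, 1, 0])
p = np.array([0.2, 0.3, 0.6, 0.7, 0.8, 0.4, 0.9, 0.5])
a, b, c = abc_validation(y, p)
```

Ideal values are A near 0, B near 1, and C well above 0.5; a slope below 1 signals predictions that are too extreme (overfitting).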

                Author and article information

                Journal
                J Strength Cond Res (jscr)
                Journal of Strength and Conditioning Research
                ISSN: 1064-8011 (print); 1533-4287 (electronic)
                September 2022 (issue); published online 15 January 2021
                36(9): 2523-2529
                Affiliations
                [1 ]Amsterdam UMC Location AMC, Epidemiology and Data Science, Master Evidence Based Practice in Health Care, Amsterdam, the Netherlands; and
                [2 ]Defense Health Care Organization, Netherlands Armed Forces, Utrecht, the Netherlands
                Author notes
                Address correspondence to Iris Dijksma, i.dijksma@amsterdamumc.nl.
                Article
                JSCR-08-14828 00022
                DOI: 10.1519/JSC.0000000000003910
                PMCID: PMC9394493
                PMID: 33470603
                bb23c729-4be0-4ba1-9a05-e2c929c36aae
                Copyright © 2021 The Author(s). Published by Wolters Kluwer Health, Inc. on behalf of the National Strength and Conditioning Association.

                This is an open access article distributed under the Creative Commons Attribution License 4.0 (CCBY), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

                History
                Categories
                Original Research
                Keywords
                dropout, monitoring, landmarking
