
Performance Analysis Of Regularized Linear Regression Models For Oxazolines And Oxazoles Derivative Descriptor Dataset

      Preprint

          Abstract

Regularized regression techniques for linear regression have been developed over the last few decades to overcome the shortcomings of ordinary least squares regression with respect to prediction accuracy. In this paper, new methods for applying regularized regression to model selection are introduced, and we identify the conditions under which regularized regression improves our ability to discriminate between models. We applied five penalty-based (regularization) shrinkage methods to an Oxazolines and Oxazoles derivatives descriptor dataset with far more predictors than observations. The lasso, ridge, elastic net, LARS, and relaxed lasso also possess the desirable property of simultaneously selecting relevant predictive descriptors and estimating their effects. Here, we comparatively evaluate the performance of these five regularized linear regression methods. Assessing the performance of each model by means of benchmark experiments is an established practice: cross-validation and resampling methods are generally used to arrive at point estimates of efficiency, which are then compared to identify methods with favourable properties. Predictive accuracy was evaluated using the root mean squared error (RMSE) and the squared correlation between predicted and observed mean inhibitory concentration of antitubercular activity (R square). We found that all five regularized regression models produced feasible models and efficiently captured the linearity in the data. The elastic net and LARS had similar accuracies, as did the lasso and relaxed lasso, and both pairs outperformed ridge regression in terms of the RMSE and R square metrics.
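
The benchmarking protocol described in the abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' code: the scikit-learn estimators, the cv_scores helper, the hyperparameter values, and the synthetic stand-in descriptor matrix are all assumptions, and the relaxed lasso is omitted because scikit-learn ships no built-in implementation of it.

# Illustrative sketch (assumed names and values, not the authors' code): fit
# four penalized linear models to a descriptor matrix with more descriptors
# than compounds, then compare cross-validated RMSE and R^2.
import numpy as np
from sklearn.linear_model import Ridge, Lasso, ElasticNet, Lars
from sklearn.model_selection import KFold
from sklearn.metrics import mean_squared_error, r2_score

def cv_scores(model, X, y, n_splits=5, seed=0):
    """Return mean cross-validated RMSE and R^2 for one regression model."""
    rmse, r2 = [], []
    for train, test in KFold(n_splits, shuffle=True, random_state=seed).split(X):
        model.fit(X[train], y[train])
        pred = model.predict(X[test])
        rmse.append(np.sqrt(mean_squared_error(y[test], pred)))
        r2.append(r2_score(y[test], pred))
    return np.mean(rmse), np.mean(r2)

# Hyperparameters below are placeholders; in practice they would be tuned,
# e.g. with LassoCV / ElasticNetCV or a grid search inside the CV loop.
models = {
    "ridge": Ridge(alpha=1.0),
    "lasso": Lasso(alpha=0.01, max_iter=10000),
    "elastic net": ElasticNet(alpha=0.01, l1_ratio=0.5, max_iter=10000),
    "lars": Lars(n_nonzero_coefs=20),
}

# Synthetic stand-in for the Oxazolines/Oxazoles descriptor data:
# 40 compounds, 200 descriptors (far more predictors than observations).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200))
y = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=40)

for name, model in models.items():
    rmse, r2 = cv_scores(model, X, y)
    print(f"{name:12s}  RMSE={rmse:.3f}  R^2={r2:.3f}")

With real descriptor data the columns would normally be standardized before fitting, since the penalty strength of these estimators is scale-dependent.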


Most cited references (5)

• Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
• The Adaptive Lasso and Its Oracle Properties (Hui Zou, 2006)
• Atomic Decomposition by Basis Pursuit

                Author and article information

Article type: Preprint
Date: 10 December 2013
DOI: 10.5121/ijcsity.2013.1408
arXiv: 1312.2789
License: http://creativecommons.org/licenses/by-nc-sa/3.0/
Published in: International Journal of Computational Science and Information Technology (IJCSITY), Vol. 1, No. 4, November 2013
Subject: cs.LG
