      Metamodel Construction for Sensitivity Analysis

      Preprint


          Abstract

          We propose to estimate a metamodel and the sensitivity indices of a complex model m in the Gaussian regression framework. Our approach combines methods for the sensitivity analysis of complex models with statistical tools for sparse non-parametric estimation in the multivariate Gaussian regression model. It rests on the construction of a metamodel approximating the Hoeffding-Sobol decomposition of m. This metamodel belongs to a reproducing kernel Hilbert space constructed as a direct sum of Hilbert spaces, leading to a functional ANOVA decomposition. The metamodel is estimated via a penalized least-squares minimization that selects the subsets of variables contributing to the prediction of the output, which in turn yields estimates of the sensitivity indices of m. We establish an oracle-type inequality for the risk of the estimator, describe the procedure for estimating the metamodel and the sensitivity indices, and assess the performance of the procedure via a simulation study.
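
          For readers who want the central objects pinned down, the Hoeffding-Sobol decomposition and the sensitivity (Sobol) indices mentioned in the abstract are standard; in the usual notation (independent inputs X_1, ..., X_d and a square-integrable model m), they read:

```latex
% Hoeffding-Sobol decomposition of m (independent inputs, m square-integrable):
\[
  m(X) \;=\; m_0 \;+\; \sum_{\emptyset \neq a \subseteq \{1,\dots,d\}} m_a(X_a),
  \qquad \mathbb{E}\big[\, m_a(X_a)\, m_b(X_b) \,\big] = 0 \quad (a \neq b),
\]
% and the Sobol sensitivity index of the variable subset a:
\[
  S_a \;=\; \frac{\operatorname{Var}\big[\, m_a(X_a) \,\big]}
                 {\operatorname{Var}\big[\, m(X) \,\big]} .
\]
```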

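          A minimal sketch of the estimation strategy, under stated assumptions: the paper's RKHS components are replaced here by finite tensor-product basis expansions, and a group-lasso penalty stands in for the paper's penalized least-squares criterion. The Legendre basis, the proximal-gradient solver, and all names below are illustrative choices, not the authors' implementation.

```python
# Sketch (not the authors' code): approximate each Hoeffding-Sobol component
# m_a by a small basis expansion, fit all components jointly by penalized
# least squares with a group penalty that zeroes out whole subsets a, and
# read off empirical sensitivity indices from the selected components.
import itertools
import numpy as np

def legendre_features(x, degree=3):
    """1-d features: Legendre polynomials P_1..P_degree evaluated on [-1, 1]."""
    cols = [np.polynomial.legendre.Legendre.basis(k)(x) for k in range(1, degree + 1)]
    return np.column_stack(cols)

def build_design(X, max_order=2, degree=3):
    """One feature block per variable subset a with |a| <= max_order."""
    n, d = X.shape
    blocks, subsets = [], []
    for order in range(1, max_order + 1):
        for a in itertools.combinations(range(d), order):
            B = np.ones((n, 1))
            for j in a:  # tensor-product basis over the subset a
                Fj = legendre_features(X[:, j], degree)
                B = np.einsum('np,nq->npq', B, Fj).reshape(n, -1)
            blocks.append(B - B.mean(axis=0))  # centering mimics ANOVA constraints
            subsets.append(a)
    return blocks, subsets

def group_lasso_fit(blocks, y, lam=0.1, n_iter=500):
    """Proximal gradient for 0.5*||y - sum_a B_a w_a||^2 / n + lam * sum_a ||w_a||."""
    n = len(y)
    Phi = np.hstack(blocks)
    idx = np.cumsum([0] + [B.shape[1] for B in blocks])
    w = np.zeros(Phi.shape[1])
    step = 1.0 / (np.linalg.norm(Phi, 2) ** 2 / n)  # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ w - y) / n
        z = w - step * grad
        for g in range(len(blocks)):  # block soft-thresholding (the group prox)
            sl = slice(idx[g], idx[g + 1])
            nrm = np.linalg.norm(z[sl])
            z[sl] = max(0.0, 1.0 - step * lam / nrm) * z[sl] if nrm > 0 else 0.0
        w = z
    return [w[idx[g]:idx[g + 1]] for g in range(len(blocks))]

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 4))
y = X[:, 0] + 2 * X[:, 1] * X[:, 2] + 0.05 * rng.standard_normal(500)

blocks, subsets = build_design(X)
weights = group_lasso_fit(blocks, y - y.mean(), lam=0.05)

# Empirical sensitivity indices: variance of each fitted component m_a.
var_hat = {a: np.var(B @ w) for a, B, w in zip(subsets, blocks, weights)}
total = sum(var_hat.values())
for a, v in sorted(var_hat.items(), key=lambda kv: -kv[1]):
    if v > 1e-6:
        print(a, round(v / total, 3))
```

          Groups whose coefficient blocks are thresholded to zero correspond to variable subsets excluded from the metamodel; the printed ratios are empirical analogues of the indices S_a above.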

                Author and article information

                Preprint: arXiv:1701.04671
                Published: 2017-01-17
                License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

                Custom metadata
                Categories: math.ST, stat.TH
                Source: ccsd
                Subject: Statistics theory
