
      Optimal Dimensioning of Retaining Walls Using Explainable Ensemble Learning Algorithms

      Materials
      MDPI AG


          Abstract

This paper develops predictive models for the optimal dimensions that minimize the construction cost of reinforced concrete retaining walls. Random Forest, Extreme Gradient Boosting (XGBoost), Categorical Gradient Boosting (CatBoost), and Light Gradient Boosting Machine (LightGBM) algorithms were applied to obtain the predictive models. The models were trained on a comprehensive dataset generated with the Harmony Search (HS) algorithm. Each sample in this dataset consists of a unique combination of the soil density, friction angle, ultimate bearing pressure, surcharge, and unit cost of concrete, together with six dimensions that describe an optimal retaining wall geometry. The influence of these design features on the optimal dimensioning, and their interdependence, are explained and visualized using the SHapley Additive exPlanations (SHAP) algorithm. The prediction accuracy of the ensemble learning methods is evaluated with several accuracy metrics, including the coefficient of determination (R2), root mean square error, and mean absolute error. Comparing predicted and actual optimal dimensions on a test set showed that an R2 score of 0.99 could be achieved. In terms of computational speed, the LightGBM algorithm was the fastest, with an average execution time of 6.17 s for training and testing the model, while the highest accuracy was achieved by the CatBoost algorithm. The availability of open-source machine learning algorithms and high-quality datasets makes it possible for designers to supplement traditional design procedures with newly developed machine learning techniques. The methodology proposed in this paper aims to produce larger datasets, thereby increasing the applicability and accuracy of machine learning algorithms for the optimal dimensioning of structures.
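The workflow the abstract describes (train a gradient-boosting regressor on tabular design features, then score it with R2, RMSE, and MAE on a held-out test set) can be sketched as follows. This is a minimal illustration, not the paper's method: scikit-learn's GradientBoostingRegressor stands in for XGBoost/CatBoost/LightGBM, and the synthetic data below is only a placeholder for the HS-generated dataset, with feature ranges and the target function chosen arbitrarily.

```python
# Minimal sketch of the train-and-evaluate loop described in the abstract.
# ASSUMPTIONS: GradientBoostingRegressor substitutes for the paper's
# XGBoost/CatBoost/LightGBM; the data below is synthetic, NOT the paper's
# Harmony Search dataset; feature ranges and the target are placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)
n = 500
# Placeholder design features: soil density, friction angle,
# ultimate bearing pressure, surcharge, unit cost of concrete.
X = rng.uniform([17, 25, 200, 0, 40], [22, 40, 600, 50, 100], size=(n, 5))
# Placeholder target: one "optimal dimension", an arbitrary smooth
# function of the features used purely for illustration.
y = 0.004 * X[:, 2] + 0.05 * X[:, 1] - 0.02 * X[:, 0] + 0.01 * X[:, 3]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

print(f"R2   = {r2_score(y_te, pred):.3f}")
print(f"RMSE = {mean_squared_error(y_te, pred) ** 0.5:.3f}")
print(f"MAE  = {mean_absolute_error(y_te, pred):.3f}")
```

In practice one model of this kind would be fitted per output dimension (the paper predicts six), and the paper's reported R2 of 0.99 refers to its own dataset and tuned models, not to this toy setup.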


Most cited references: 45


          XGBoost


            A Unified Approach to Interpreting Model Predictions

Understanding why a model makes a certain prediction can be as crucial as the prediction's accuracy in many applications. However, the highest accuracy for large modern datasets is often achieved by complex models that even experts struggle to interpret, such as ensemble or deep learning models, creating a tension between accuracy and interpretability. In response, various methods have recently been proposed to help users interpret the predictions of complex models, but it is often unclear how these methods are related and when one method is preferable over another. To address this problem, we present a unified framework for interpreting predictions, SHAP (SHapley Additive exPlanations). SHAP assigns each feature an importance value for a particular prediction. Its novel components include: (1) the identification of a new class of additive feature importance measures, and (2) theoretical results showing there is a unique solution in this class with a set of desirable properties. The new class unifies six existing methods, notable because several recent methods in the class lack the proposed desirable properties. Based on insights from this unification, we present new methods that show improved computational performance and/or better consistency with human intuition than previous approaches. To appear in NIPS 2017.
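The "additive feature importance" idea at the core of SHAP can be illustrated by computing exact Shapley values for a tiny model by brute force over all feature coalitions. This is a didactic sketch, not the SHAP library: the model `f`, the baseline, and the input are toy assumptions chosen here, and the quadratic-time-in-coalitions enumeration is only feasible for a handful of features.

```python
# Brute-force Shapley values for a toy model, illustrating the additive
# feature-attribution property SHAP builds on: the per-feature values
# sum to f(x) - f(baseline) ("local accuracy").
# ASSUMPTIONS: f, baseline, and x are invented for this example.
from itertools import combinations
from math import factorial

def f(x):
    # Toy model with one interaction term between features 1 and 2.
    return 2 * x[0] + x[1] * x[2]

baseline = [0.0, 0.0, 0.0]   # reference input (features "absent")
x = [1.0, 2.0, 3.0]          # input being explained
n = len(x)

def value(S):
    # Evaluate f with features in coalition S taken from x,
    # and all remaining features taken from the baseline.
    return f([x[i] if i in S else baseline[i] for i in range(n)])

phi = []
for i in range(n):
    total = 0.0
    others = [j for j in range(n) if j != i]
    for k in range(n):
        for S in combinations(others, k):
            # Shapley kernel weight for a coalition of size |S|.
            w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
            total += w * (value(set(S) | {i}) - value(set(S)))
    phi.append(total)

print("Shapley values:", phi)                      # feature attributions
print("Sum:", sum(phi), "vs f(x)-f(base):", f(x) - f(baseline))
```

For this `f`, the additive term gives feature 0 an attribution of exactly 2, while the interaction `x[1] * x[2] = 6` is split evenly (3 and 3) between features 1 and 2, and the three values sum to `f(x) - f(baseline) = 8`. Practical SHAP implementations avoid this exponential enumeration with model-specific approximations (e.g. for trees).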

              LightGBM: a highly efficient gradient boosting decision tree


                Author and article information

Journal
Journal ID: MATEG9
Title: Materials
Publisher: MDPI AG
ISSN: 1996-1944
Publication date: July 18 2022 (July 2022 issue)
Volume: 15
Issue: 14
Article number: 4993
DOI: 10.3390/ma15144993
Record ID: 75fdded6-1eb7-4f69-b8a0-2acd6c284458
Copyright: © 2022
License: https://creativecommons.org/licenses/by/4.0/
