Author: Ahmad, Ayaz; Farooq, Furqan; Niewiadomski, Pawel; Ostrowski, Krzysztof; Akbar, Arslan; Aslam, Fahid; Alyousef, Rayed
Title: Prediction of Compressive Strength of Fly Ash Based Concrete Using Individual and Ensemble Algorithm
Cord-id: kuji0jmu
Document date: 2021-02-08
ID: kuji0jmu
Snippet: Machine learning techniques are widely used for predicting the mechanical properties of concrete. This study compares individual algorithms with ensemble approaches such as bagging. The bagging model is optimized by building 20 sub-models and selecting the most accurate one. Variables such as cement content, fine and coarse aggregate, water, binder-to-water ratio, fly ash, and superplasticizer are used for modeling. Model performance is evaluated by various statistic …
Document: Machine learning techniques are widely used for predicting the mechanical properties of concrete. This study compares individual algorithms with ensemble approaches such as bagging. The bagging model is optimized by building 20 sub-models and selecting the most accurate one. Variables such as cement content, fine and coarse aggregate, water, binder-to-water ratio, fly ash, and superplasticizer are used for modeling. Model performance is evaluated with statistical indicators such as mean absolute error (MAE), mean square error (MSE), and root mean square error (RMSE). The individual algorithms show moderately biased results, whereas the ensemble model performs better, with R(2) = 0.911, compared with the decision tree (DT) and gene expression programming (GEP). K-fold cross-validation, assessed via R(2), MAE, MSE, and RMSE, confirms the model’s accuracy. Statistical checks reveal that the bagged decision-tree ensemble yields 25%, 121%, and 49% improvements in the MAE, MSE, and RMSE errors, respectively, between the target and outcome response.
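The workflow the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' code: it builds a bagging ensemble of 20 decision-tree sub-models and scores it with MAE, MSE, RMSE, and k-fold cross-validated R(2). The synthetic data is an assumption standing in for the study's concrete-mix dataset, and the seven input columns merely echo the variables the abstract lists.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(0)
# Seven illustrative inputs (cement, fine/coarse aggregate, water,
# binder-to-water ratio, fly ash, superplasticizer) -- synthetic values.
X = rng.uniform(size=(200, 7))
y = X @ rng.uniform(size=7) + rng.normal(scale=0.05, size=200)

# BaggingRegressor's default base learner is a decision tree;
# n_estimators=20 mirrors the paper's 20 sub-models.
model = BaggingRegressor(n_estimators=20, random_state=0).fit(X, y)
pred = model.predict(X)

# The three error indicators named in the abstract.
mae = mean_absolute_error(y, pred)
mse = mean_squared_error(y, pred)
rmse = np.sqrt(mse)

# K-fold cross-validated R^2 (k=10 is an assumption; the abstract
# does not state k), analogous to the paper's validation step.
cv_r2 = cross_val_score(model, X, y, cv=10, scoring="r2")
print(f"MAE={mae:.4f} MSE={mse:.4f} RMSE={rmse:.4f} CV_R2={cv_r2.mean():.3f}")
```

Note that RMSE is the square root of MSE rather than an independent metric, which is why a given percentage improvement in MSE (121% in the paper) corresponds to a smaller one in RMSE (49%).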
Search related documents:
Co-phrase search for related documents:
- actual performance and machine learning: 1
- adaboost gradient boosting and machine ensemble: 1
- adaboost gradient boosting and machine learning: 1, 2, 3, 4, 5, 6, 7, 8
- adaboost gradient boosting and machine learning model: 1, 2
- adaboost gradient boosting and mae mean absolute error: 1
- machine ensemble and mae mean absolute error: 1, 2
- machine learning and mae mean absolute error: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23
- machine learning approach and mae mean absolute error: 1, 2
- machine learning model and mae mean absolute error: 1, 2, 3, 4