Author: Balogun, Abdullateef O.; Lafenwa-Balogun, Fatimah B.; Mojeed, Hammed A.; Adeyemo, Victor E.; Akande, Oluwatobi N.; Akintola, Abimbola G.; Bajeh, Amos O.; Usman-Hamza, Fatimah E.
Title: SMOTE-Based Homogeneous Ensemble Methods for Software Defect Prediction
Cord-id: kvgj6n76
Document date: 2020_8_24
ID: kvgj6n76
Snippet: Class imbalance is a prevalent problem in machine learning which affects the prediction performance of classification algorithms. Software Defect Prediction (SDP) is no exception to this latent problem. Solutions such as data sampling and ensemble methods have been proposed to address the class imbalance problem in SDP. This study proposes a combination of Synthetic Minority Oversampling Technique (SMOTE) and homogeneous ensemble (Bagging and Boosting) methods for predicting software defects. …
Document: Class imbalance is a prevalent problem in machine learning that degrades the prediction performance of classification algorithms, and Software Defect Prediction (SDP) is no exception. Solutions such as data sampling and ensemble methods have been proposed to address the class imbalance problem in SDP. This study proposes a combination of the Synthetic Minority Oversampling Technique (SMOTE) and homogeneous ensemble (Bagging and Boosting) methods for predicting software defects. The proposed approach was implemented with Decision Tree (DT) and Bayesian Network (BN) base classifiers on defect datasets drawn from the NASA software corpus. The experimental results showed that the proposed approach outperformed the other methods evaluated: an accuracy of 86.8% and an area under the receiver operating characteristic (ROC) curve of 0.93 affirmed its ability to distinguish defective from non-defective modules without bias.
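The pipeline the abstract describes (SMOTE oversampling followed by a homogeneous Bagging or Boosting ensemble over a base learner) can be sketched as follows. This is a minimal illustration using scikit-learn and imbalanced-learn, with a synthetic imbalanced dataset standing in for the NASA defect corpus and a Decision Tree base learner; the paper also evaluates a Bayesian Network base learner, which scikit-learn does not provide directly. It is an assumption-laden sketch, not the authors' implementation.

# Minimal sketch: SMOTE + homogeneous ensembles (Bagging and Boosting) with a
# Decision Tree base classifier. Synthetic data stands in for the NASA corpus.
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a defect dataset: roughly 10% defective (minority) modules.
X, y = make_classification(
    n_samples=2000, n_features=20, weights=[0.9, 0.1], random_state=42
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)

# Oversample only the training split so the test set keeps its natural imbalance.
X_res, y_res = SMOTE(random_state=42).fit_resample(X_train, y_train)

ensembles = {
    "SMOTE+Bagging(DT)": BaggingClassifier(
        DecisionTreeClassifier(), n_estimators=50, random_state=42
    ),
    "SMOTE+Boosting(DT)": AdaBoostClassifier(
        DecisionTreeClassifier(max_depth=1), n_estimators=50, random_state=42
    ),
}

for name, model in ensembles.items():
    model.fit(X_res, y_res)
    preds = model.predict(X_test)
    proba = model.predict_proba(X_test)[:, 1]
    print(f"{name}: accuracy={accuracy_score(y_test, preds):.3f}, "
          f"AUC={roc_auc_score(y_test, proba):.3f}")

Applying SMOTE only to the training split is a deliberate choice in this sketch: resampling before the split would leak synthetic minority samples into the evaluation set and inflate the reported accuracy and AUC.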