Oktaviani, Avi (2025) Optimasi XGBoost dengan Hyperparameter GridSearch dan RandomSearch pada Klasifikasi Status Gizi Balita [XGBoost Optimization with GridSearch and RandomSearch Hyperparameters for Classifying the Nutritional Status of Children Under Five]. Undergraduate thesis, UPN "Veteran" Jawa Timur.
Files:

- Text: skripsi Avi v2-1-23.pdf (877 kB), download available
- Text (Chapter 1): AVI FIK UPN_merged-24-29_Bab_1.pdf (271 kB), download available
- Text (Chapter 2): AVI FIK UPN_merged-30-48_Bab_2.pdf (672 kB), restricted to repository staff only
- Text (Chapter 3): AVI FIK UPN_merged-50-83_Bab_3.pdf (782 kB), restricted to repository staff only until 8 December 2028
- Text (Chapter 4): AVI FIK UPN_merged-84-153_Bab_4.pdf (3 MB), restricted to repository staff only until 8 December 2028
- Text (Chapter 5): AVI FIK UPN_merged-154-157_Bab_5.pdf (259 kB), download available
- Text (Bibliography): AVI FIK UPN_merged-158-159_Dafpus.pdf (233 kB), download available
- Text (Appendix): AVI FIK UPN_merged-160.pdf (3 MB), restricted to repository staff only until 8 December 2028
Abstract
This study compares the performance of Grid Search and Random Search for classifying the nutritional status of children under five into six classes: severe undernutrition, undernutrition, normal nutrition, at risk of overweight, overweight, and obesity. To address class imbalance, SMOTE is applied to the training data. Experiments test three train–test split proportions (80:20, 75:25, and 70:30) and several key model parameters: learning rates of 0.05, 0.1, and 0.2; max depths of 4, 6, and 8; 50, 100, and 200 estimators; and subsample values of 0.7, 0.9, and 1.0. The best parameter combination for both search strategies is an 80:20 data split, a learning rate of 0.1, a max depth of 6, 200 estimators, and a subsample of 0.7.

The tuned models are evaluated with accuracy, precision, recall, and F1-score under 3-, 5-, and 10-fold cross-validation. The XGBoost model without optimization achieves an accuracy of 0.869, precision of 0.802, recall of 0.831, and F1-score of 0.814. After optimization with Grid Search, the model attains an accuracy of 0.877, precision of 0.817, recall of 0.813, and F1-score of 0.812, while Random Search yields an accuracy of 0.869, precision of 0.796, recall of 0.807, and F1-score of 0.812.

In terms of efficiency, Grid Search requires 388.40 seconds, whereas Random Search takes only 123.73 seconds with nearly comparable performance. XGBoost optimization thus provides a moderate performance improvement: Grid Search performs slightly better in accuracy and precision, while Random Search is more time-efficient and better suited to limited computational resources. The final model is integrated into a web application as a decision-support system for predicting the nutritional status of children under five.
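The tuning setup described in the abstract can be sketched with scikit-learn's search utilities. This is a minimal illustration, not the thesis code: it uses `GradientBoostingClassifier` as a stand-in where the thesis uses `xgboost.XGBClassifier` (both accept the same `learning_rate`, `max_depth`, `n_estimators`, and `subsample` parameters, and the search API is identical), synthetic data in place of the under-five nutritional-status dataset, and a grid trimmed from the full one in the abstract for speed. SMOTE oversampling of the training split (e.g. `imblearn.over_sampling.SMOTE`) is noted in a comment but omitted here.

```python
import time
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import (GridSearchCV, RandomizedSearchCV,
                                     train_test_split)

# Synthetic 6-class data standing in for the nutritional-status dataset.
X, y = make_classification(n_samples=300, n_features=8, n_informative=6,
                           n_classes=6, random_state=42)

# 80:20 split, the best-performing proportion reported in the abstract.
# In the thesis, SMOTE would be applied to (X_tr, y_tr) at this point.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=42)

# Trimmed version of the thesis grid (full grid: learning_rate {0.05, 0.1,
# 0.2}, max_depth {4, 6, 8}, n_estimators {50, 100, 200}, subsample
# {0.7, 0.9, 1.0}).
param_grid = {"learning_rate": [0.1, 0.2],
              "max_depth": [4, 6],
              "n_estimators": [50],
              "subsample": [0.7, 1.0]}

base = GradientBoostingClassifier(random_state=42)

# Grid Search: exhaustively fits every combination in the grid.
t0 = time.time()
grid = GridSearchCV(base, param_grid, cv=3, scoring="f1_macro").fit(X_tr, y_tr)
grid_time = time.time() - t0

# Random Search: samples a fixed number of combinations from the same grid.
t0 = time.time()
rand = RandomizedSearchCV(base, param_grid, n_iter=4, cv=3,
                          scoring="f1_macro", random_state=42).fit(X_tr, y_tr)
rand_time = time.time() - t0

print(f"Grid Search:   best CV macro-F1 = {grid.best_score_:.3f} "
      f"({grid_time:.1f}s)")
print(f"Random Search: best CV macro-F1 = {rand.best_score_:.3f} "
      f"({rand_time:.1f}s)")
print("Best params (Grid Search):", grid.best_params_)
```

Because Random Search samples a subset of the same grid, its best cross-validated score can never exceed Grid Search's, but it typically finishes in a fraction of the time — the trade-off the abstract reports (123.73 s vs. 388.40 s for nearly comparable metrics).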
Item Type: Thesis (Undergraduate)
Contributors:
Subjects: T Technology > T Technology (General); T Technology > T Technology (General) > T58.6-58.62 Management Information Systems
Divisions: Faculty of Computer Science > Department of Informatics
Depositing User: Avi Oktaviani
Date Deposited: 12 Dec 2025 07:10
Last Modified: 12 Dec 2025 07:10
URI: https://repository.upnjatim.ac.id/id/eprint/48229