The Effect of Hyperparameter Tuning on VGG16 for Classification of Basurek Bengkulu Batik Images
DOI: https://doi.org/10.36085/jsai.v8i3.9778

Abstract
This study aimed to analyze the effect of hyperparameter combinations, namely the optimizer and batch size, on the performance of the VGG16 model in classifying Batik Basurek images. The dataset consisted of 250 images divided into five motif classes, with 50 images per class, split into training, validation, and testing sets in proportions of 70%, 15%, and 15%, respectively. The study employed a transfer learning approach with the VGG16 model, varying the hyperparameters across the RMSProp, Adam, and SGD optimizers and batch sizes of 16, 32, and 64. The results showed that the Adam optimizer consistently delivered the best accuracy across all testing scenarios. The optimal performance was achieved with the combination of Adam and a batch size of 32, yielding a training accuracy of 97.55%, a validation accuracy of 93.25%, and a testing accuracy of 92.80%. RMSProp performed reasonably well but remained below Adam, while SGD produced the lowest accuracy at every evaluation stage. Regarding batch size, 32 provided the most stable and accurate performance, whereas 64 tended to reduce the model's generalization capability. The combination of Adam and a batch size of 32 was therefore identified as the optimal hyperparameter configuration for Batik Basurek image classification with the VGG16 model.
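For readers who want to reproduce the setup, the following is a minimal Keras sketch, not the authors' code, of VGG16 transfer learning with the best-performing configuration reported above (Adam, batch size 32). The 224x224 input size, the classification head, and the default Adam learning rate are assumptions, since the abstract does not specify them.

```python
# Minimal sketch of the described setup. Assumptions: 224x224 inputs,
# a simple Dense classification head, and default Adam settings; the
# paper's exact head architecture and learning rate are not given.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 5  # five Batik Basurek motif classes, 50 images each

# Load VGG16 pretrained on ImageNet, without its classifier head.
base = VGG16(weights="imagenet", include_top=False,
             input_shape=(224, 224, 3))
base.trainable = False  # transfer learning: freeze the convolutional base

model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),  # assumed head size
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

# Best-performing configuration in the abstract: Adam optimizer;
# the batch size of 32 is set when the datasets are built below.
model.compile(
    optimizer=tf.keras.optimizers.Adam(),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)

# Hypothetical training call, assuming a directory of class subfolders
# already split 70/15/15 into train/val/test:
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "batik_basurek/train", image_size=(224, 224), batch_size=32,
#     label_mode="categorical")
# model.fit(train_ds, validation_data=val_ds, epochs=...)
```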
License
Copyright (c) 2025 Vina Ayumi

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.