Image Classification of Traditional Kaganga Bengkulu Script Using ResNet50 Architecture Optimization
DOI: https://doi.org/10.36085/jsai.v8i3.9780

Abstract
This study analyzed the performance of a transfer-learning ResNet50 model in classifying 19 classes of Kaganga script, and evaluated the effect of L1, L2, and dropout regularization on the model's ability to generalize and minimize overfitting. In addition, the study examined the impact of varying batch sizes (16, 32, and 64) on training stability and overall model performance. The experiments were conducted by freezing the initial layers of ResNet50 as a feature extractor and modifying the final layers for the classification task. Model performance was evaluated on the test dataset using accuracy, precision, recall, F1-score, and the confusion matrix. All model configurations achieved high training and validation accuracy; however, the combination of L2 regularization with a batch size of 32 yielded the best performance, with a testing accuracy of 86.10%, indicating the strongest generalization among the configurations tested. Batch size 64, by contrast, produced a more noticeable drop in accuracy, making it less effective for this dataset. These findings indicate that appropriate selection of regularization technique and batch size plays an important role in improving training stability and classification accuracy for traditional script image recognition.
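The setup described in the abstract (a frozen ResNet50 backbone used as a feature extractor, a modified classification head for 19 classes, L2 regularization, batch size 32) could be sketched in Keras roughly as follows. This is a minimal illustration, not the authors' code: the regularization strength, optimizer, input size, and head layout are assumptions, and `weights=None` is used here only to keep the sketch self-contained (transfer learning as in the paper would use `weights="imagenet"`).

```python
import tensorflow as tf
from tensorflow.keras import layers, models, regularizers
from tensorflow.keras.applications import ResNet50

NUM_CLASSES = 19  # Kaganga script classes, per the abstract

# Backbone: ResNet50 without its top classifier.
# The paper would use weights="imagenet"; None avoids a download here.
base = ResNet50(include_top=False, weights=None, input_shape=(224, 224, 3))
base.trainable = False  # freeze initial layers as a feature extractor

# Modified final layers for the 19-class task,
# with L2 regularization (strength 1e-4 is an assumed value).
inputs = layers.Input(shape=(224, 224, 3))
x = base(inputs, training=False)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(
    NUM_CLASSES,
    activation="softmax",
    kernel_regularizer=regularizers.l2(1e-4),
)(x)

model = models.Model(inputs, outputs)
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# Training would then use the batch size under study, e.g.:
# model.fit(train_ds, validation_data=val_ds, batch_size=32, epochs=...)
```

The same scaffold accommodates the other configurations the study compares: swapping `regularizers.l2` for `regularizers.l1`, inserting a `Dropout` layer before the output, or varying `batch_size` in `fit`.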
License
Copyright (c) 2025 Vina Ayumi

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.