Training VGG16, MobileNetV1 and Simple CNN Models from Scratch for Balinese Inscription Recognition
Main Authors: | , , , , |
---|---|
Format: | Article |
Language: | English |
Published: | Udayana University, Institute for Research and Community Services, 2025-01-01 |
Series: | Lontar Komputer |
Online Access: | https://ojs.unud.ac.id/index.php/lontar/article/view/116841 |
Summary: | Many inscriptions in Bali are damaged, whether by natural disasters, by the growth of moss, algae, and bacteria, by warfare, or by deliberate erasure. These inscriptions record the knowledge and civilization of the ancestors, so it is very important that their contents remain readable. Motivated by this problem, this research trained three CNN models from scratch: VGG16, MobileNetV1, and a simple CNN. The purpose of this research is to select the recognition model with the best performance and the highest recognition rate to proceed to the inscription restoration stage. The dataset used is the Balinese inscription dataset Isolated Character Recognition of Balinese Script in Palm Leaf Manuscript Images (Challenge-3-ForTrain.zip). Training the three models with five different training files showed that VGG16 achieved the highest accuracy in training, testing, and validation with the fewest epochs. This research contributes to specific datasets such as Isolated Character Recognition of Balinese Script by training VGG16 from scratch through all stages of the process, which produces the best model performance compared with the other four training models. |
ISSN: | 2088-1541, 2541-5832 |
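The abstract's selection criterion (highest validation accuracy, fewest epochs as the tie-breaker) can be sketched as a small comparison. This is a minimal illustrative sketch only: the function name `pick_best` and all metric values below are hypothetical placeholders, not figures reported in the article.

```python
# Hypothetical sketch of the model-selection step described in the abstract:
# choose the model with the highest validation accuracy, preferring the one
# trained in fewer epochs when accuracies tie. All numbers are placeholders.

def pick_best(results):
    """Return the model name with the highest validation accuracy,
    breaking ties by the smaller number of training epochs."""
    return max(results, key=lambda name: (results[name]["val_acc"],
                                          -results[name]["epochs"]))

# Illustrative placeholder metrics (NOT the paper's reported results).
results = {
    "VGG16":       {"val_acc": 0.97, "epochs": 20},
    "MobileNetV1": {"val_acc": 0.95, "epochs": 35},
    "SimpleCNN":   {"val_acc": 0.90, "epochs": 50},
}

print(pick_best(results))  # the model that would proceed to restoration
```

The tuple key makes the comparison lexicographic: accuracy dominates, and the negated epoch count rewards faster convergence only when accuracies are equal.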