Handwritten Digit Recognition Using Self Organizing Maps (SOM)
Abstract
Handwriting recognition is a form of character pattern recognition and remains an active research area. Many types of characters can be recognized by computers, and a variety of algorithms have been applied to the problem. Pattern recognition has been applied successfully in fields such as voice recognition, face detection, fingerprint recognition, and handwriting recognition. Handwriting recognition is divided into two types: online and offline. Online handwriting recognition requires special electronic equipment, with the handwriting captured on a pressure-sensitive tablet. Offline handwriting recognition needs no special device, because the handwriting is taken from previously written text, such as images digitized with a scanner. Several methods have been developed to recognize handwriting with varying degrees of accuracy. This research uses United Moment Invariant (UMI) feature extraction and Self Organizing Maps (SOM). Based on the software experiments over the full test sets, the primary data yielded an accuracy of 88% on 50 images, and the first secondary dataset yielded 98.2% on 500 images. The second secondary-data experiment, with 50 test images, reached 90%, and the third, with 500 test images, reached 89%. The primary data therefore shows lower accuracy than the two secondary datasets of different sizes. The variation in accuracy across datasets confirms that handwritten characters are highly variable and inconsistent; this is caused by stroke thickness and writing form that differ from person to person, and by habits that shape each individual's handwriting. The primary data were collected directly and passed through a scanner, so the handwritten digit images still contain substantial noise, whereas the secondary data have already undergone grayscale processing, so the handwritten images are free of noise.
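The abstract describes the pipeline only at a high level: grayscale preprocessing, United Moment Invariant (UMI) feature extraction, and classification with a Self Organizing Map. The sketch below is not the paper's implementation; it is a minimal illustration of that kind of pipeline, assuming Python with NumPy, OpenCV, and scikit-learn's load_digits as placeholder data, and using Hu's seven moment invariants as a simple stand-in for the UMI features.

```python
# Minimal sketch (not the paper's implementation): moment-invariant features
# plus a small self-organizing map, with load_digits as placeholder data.
import numpy as np
import cv2
from sklearn.datasets import load_digits

def hu_features(image):
    """7 Hu moment invariants of a grayscale digit image (log-scaled)."""
    hu = cv2.HuMoments(cv2.moments(image.astype(np.float32))).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)

class SOM:
    """Tiny rectangular self-organizing map trained with plain NumPy."""
    def __init__(self, rows, cols, dim, lr=0.5, sigma=1.5, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=(rows, cols, dim))
        self.grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                         indexing="ij"), axis=-1)
        self.lr, self.sigma = lr, sigma

    def winner(self, x):
        # Best-matching unit: node whose weight vector is closest to x.
        d = np.linalg.norm(self.w - x, axis=-1)
        return np.unravel_index(np.argmin(d), d.shape)

    def train(self, data, epochs=20):
        for t in range(epochs):
            lr = self.lr * (1 - t / epochs)
            sigma = self.sigma * (1 - t / epochs) + 0.5
            for x in data:
                bmu = np.array(self.winner(x))
                dist2 = np.sum((self.grid - bmu) ** 2, axis=-1)
                h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
                self.w += lr * h * (x - self.w)

digits = load_digits()                      # placeholder for the paper's datasets
X = np.array([hu_features(img) for img in digits.images])
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)

som = SOM(10, 10, X.shape[1])
som.train(X)

# Label each SOM node by majority vote of the training samples it wins,
# then classify a sample by the label of its best-matching unit.
labels = np.full((10, 10), -1)
votes = {}
for x, y in zip(X, digits.target):
    votes.setdefault(som.winner(x), []).append(y)
for node, ys in votes.items():
    labels[node] = np.bincount(ys).argmax()

pred = np.array([labels[som.winner(x)] for x in X])
print("training-set agreement:", (pred == digits.target).mean())
```

The map nodes are labelled by majority vote over the training samples they win; a proper train/test split and the paper's own primary and secondary datasets would be needed to reproduce the reported accuracies.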
Pages: 31-42
Copyright (c) 2021 Gita Fadila Fitriana

This work is licensed under a Creative Commons Attribution 4.0 International License.