Working Paper Article, Version 1 (this version is not peer-reviewed)

Learning by Injection: Attention Embedded Recurrent Neural Network for Amharic Text-image Recognition

Version 1: Received: 13 October 2020 / Approved: 15 October 2020 / Online: 15 October 2020 (13:42:28 CEST)

How to cite: Belay, B.; Habtegebrial, T.; Belay, G.; Mesheshsa, M.; Liwicki, M.; Stricker, D. Learning by Injection: Attention Embedded Recurrent Neural Network for Amharic Text-image Recognition. Preprints 2020, 2020100324.

Abstract

Today, the growth of digitization and worldwide communication makes OCR systems for exotic languages an important task. In this paper, we develop an OCR system for one of these exotic languages with a unique script, Amharic. Motivated by the recent success of the attention mechanism in Neural Machine Translation (NMT), we extend the attention mechanism to Amharic text-image recognition. The proposed model consists of CNNs and attention-embedded recurrent encoder-decoder networks, integrated following the seq2seq framework. The attention network parameters are trained in an end-to-end fashion, and the context vector is injected, together with the previously predicted output, at each decoding time step. Unlike existing OCR models that minimize the CTC objective function, the new model minimizes the categorical cross-entropy loss. The proposed attention-based model is evaluated on the test sets of the ADOCR database, which consist of printed and synthetically generated Amharic text-line images, and achieves promising character error rates (CER) of 1.54% and 1.17%, respectively.
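To make the decoding scheme concrete, below is a minimal PyTorch sketch of one attention-injected decoding step: the attention context vector is concatenated ("injected") with the embedding of the previously predicted character before entering the recurrent cell, and training minimizes categorical cross-entropy rather than CTC. The GRU cell, additive (Bahdanau-style) attention, layer sizes, and variable names are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of attention injection at decoding time (not the authors'
# exact implementation; all dimensions and names below are assumptions).
import torch
import torch.nn as nn


class AttentionDecoderStep(nn.Module):
    def __init__(self, vocab_size, enc_dim=256, dec_dim=256, emb_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Additive attention over the encoder's per-frame features.
        self.attn_enc = nn.Linear(enc_dim, dec_dim, bias=False)
        self.attn_dec = nn.Linear(dec_dim, dec_dim, bias=False)
        self.attn_v = nn.Linear(dec_dim, 1, bias=False)
        # The GRU input is [previous-token embedding ; context vector]:
        # this concatenation is the "injection" described in the abstract.
        self.gru = nn.GRUCell(emb_dim + enc_dim, dec_dim)
        self.out = nn.Linear(dec_dim, vocab_size)

    def forward(self, prev_token, hidden, enc_feats):
        # enc_feats: (batch, seq_len, enc_dim) CNN/encoder features of a text line.
        scores = self.attn_v(torch.tanh(
            self.attn_enc(enc_feats) + self.attn_dec(hidden).unsqueeze(1)))
        alpha = torch.softmax(scores, dim=1)          # attention weights (batch, seq_len, 1)
        context = (alpha * enc_feats).sum(dim=1)      # context vector (batch, enc_dim)
        rnn_in = torch.cat([self.embed(prev_token), context], dim=-1)
        hidden = self.gru(rnn_in, hidden)
        return self.out(hidden), hidden               # per-character logits, new state


# Toy usage: one decoding step, trained with cross-entropy instead of CTC.
vocab, batch, T = 320, 4, 48                          # e.g. a few hundred Amharic glyphs
step = AttentionDecoderStep(vocab)
enc_feats = torch.randn(batch, T, 256)                # stand-in encoder output
hidden = torch.zeros(batch, 256)
prev = torch.zeros(batch, dtype=torch.long)           # <sos> index assumed to be 0
logits, hidden = step(prev, hidden, enc_feats)
loss = nn.CrossEntropyLoss()(logits, torch.randint(vocab, (batch,)))
```

Because each step conditions on the previous prediction, cross-entropy can supervise every character directly, which is what distinguishes this decoder from CTC-based Amharic OCR models that require no explicit alignment between input frames and output characters.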

Keywords

Amharic script; Attention mechanism; OCR; Encoder-decoder; Text-image

Subject

Computer Science and Mathematics, Algebra and Number Theory
