Evaluating Sentence Alignment Methods in a Low-Resource Setting: An English-Yorùbá Study Case

Title in English Evaluating Sentence Alignment Methods in a Low-Resource Setting: An English-Yorùbá Study Case
Authors

SIGNORONI Edoardo, RYCHLÝ Pavel

Year of publication 2023
Type Article in Proceedings
Conference Proceedings of the Sixth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2023)
MU Faculty or unit

Faculty of Informatics

Citation
Web https://aclanthology.org/2023.loresmt-1.10.pdf
DOI http://dx.doi.org/10.18653/v1/2023.loresmt-1.10
Keywords NLP; low-resource; sentence alignment
Description Parallel corpora are still crucial to train effective Machine Translation systems. This is even more true for low-resource language pairs, for which Neural Machine Translation has been shown to be less robust to domain mismatch and noise. Due to time and resource constraints, parallel corpora are mostly created with sentence alignment methods which automatically infer alignments. Recent work focused on state-of-the-art methods based on pre-trained sentence embeddings, which are available only for a tiny fraction of the world's languages. In this paper, we evaluate the performance of four widely used algorithms on the low-resource English-Yorùbá language pair against a multidomain benchmark parallel corpus in two experiments involving 1-to-1 alignments with and without reordering. We find that, at least for this language pair, earlier and simpler methods are more suited to the task, while not requiring additional data or resources. We also report that the methods we evaluated perform differently across distinct domains, indicating that some approaches may be better suited to a specific domain or textual structure.
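Evaluations like the one described above typically score a method's predicted 1-to-1 alignments against a gold-standard benchmark. The following is a minimal illustrative sketch, not the paper's actual evaluation code: it assumes alignments are represented as (source index, target index) pairs and scores them with standard precision, recall, and F1, a common choice for this task. The example pairs are invented for illustration.

```python
def alignment_scores(gold, predicted):
    """Precision, recall, and F1 over sets of (src_idx, tgt_idx) pairs.

    A predicted pair counts as a true positive only if the exact same
    pair appears in the gold standard.
    """
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)  # correctly predicted alignment links
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1


# Illustrative example: four gold links, one prediction misaligned.
gold = [(0, 0), (1, 1), (2, 2), (3, 3)]
pred = [(0, 0), (1, 1), (2, 3), (3, 3)]
p, r, f = alignment_scores(gold, pred)  # 3 of 4 links recovered
```

Because the experiments with reordering shuffle the target side, the gold pairs there are no longer monotonic (e.g. (2, 5) rather than (2, 2)), but the same set-based scoring applies unchanged.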
