IMPLEMENTING A TRANSFORMER ALGORITHM FOR THE PROBLEM OF TRANSLATION FROM ENGLISH TO SERBIAN

Authors

  • Драган Зарић

DOI:

https://doi.org/10.24867/30BE32Zaric

Keywords:

Natural language processing, Sequence-to-sequence model

Abstract

The topic of this paper is the implementation of a transformer algorithm for the problem of translation from English to Serbian and the construction of a pretrained model that can be used for this and other natural language processing tasks. The architecture implemented in this paper is the one proposed in [1].
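As an illustration of the encoder-decoder structure described in [1] that the paper implements, the following is a minimal PyTorch sketch, not the authors' code. The dimensions (d_model = 512, 8 heads, 6 layers, feed-forward size 2048) are the defaults from [1]; the vocabulary sizes, the tokenization, and the random token ids in the usage example are purely illustrative assumptions. A real system would train this model with cross-entropy loss on a tokenized parallel corpus such as the OPUS-100 en-sr data in [4].

# Minimal sketch of an encoder-decoder Transformer for English-to-Serbian
# translation, following the architecture of [1]. Illustrative only; not the
# authors' implementation. Vocabulary sizes and token ids below are made up.
import math
import torch
import torch.nn as nn


class PositionalEncoding(nn.Module):
    """Adds the fixed sinusoidal positional encodings from [1] to embeddings."""

    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)
        div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); broadcast-add the first seq_len encodings.
        return x + self.pe[: x.size(1)]


class TranslationTransformer(nn.Module):
    """Encoder-decoder Transformer mapping English token ids to Serbian token logits."""

    def __init__(self, src_vocab: int, tgt_vocab: int, d_model: int = 512,
                 nhead: int = 8, num_layers: int = 6, dim_ff: int = 2048):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, d_model)
        self.tgt_embed = nn.Embedding(tgt_vocab, d_model)
        self.pos_enc = PositionalEncoding(d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            dim_feedforward=dim_ff, batch_first=True,
        )
        self.generator = nn.Linear(d_model, tgt_vocab)

    def forward(self, src_ids: torch.Tensor, tgt_ids: torch.Tensor) -> torch.Tensor:
        # Causal mask so each target position attends only to earlier positions.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        src = self.pos_enc(self.src_embed(src_ids))
        tgt = self.pos_enc(self.tgt_embed(tgt_ids))
        out = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.generator(out)  # (batch, tgt_len, tgt_vocab) logits


if __name__ == "__main__":
    # Toy forward pass with random token ids; real training would use a
    # tokenized parallel corpus such as OPUS-100 en-sr [4].
    model = TranslationTransformer(src_vocab=8000, tgt_vocab=8000)
    src = torch.randint(0, 8000, (2, 10))   # two English sentences, 10 tokens each
    tgt = torch.randint(0, 8000, (2, 12))   # two Serbian sentences, 12 tokens each
    logits = model(src, tgt)
    print(logits.shape)  # torch.Size([2, 12, 8000])

At inference time the decoder would be run autoregressively, feeding back its own predictions token by token; the encoder weights of such a model are what the abstract refers to as a pretrained model reusable for other natural language processing tasks.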

References

[1] A. Vaswani, N. Shazeer et al., Attention Is All You Need, Advances in Neural Information Processing Systems, 2017, pp. 5998-6008. https://papers.nips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf
[2] D. Jurafsky and J. H. Martin, Speech and Language Processing: An Introduction to Natural Language Processing, 3rd edition, 2024. https://web.stanford.edu/~jurafsky/slp3/
[3] J. L. Ba, J. R. Kiros and G. E. Hinton, Layer Normalization, 2016. https://arxiv.org/abs/1607.06450
[4] OPUS-100 corpus, English-Serbian (en-sr) supervised subset: https://data.statmt.org/opus-100-corpus/v1.0/supervised/en-sr/
[5] Image taken from: https://www.brainlabsdigital.com/blog/what-word2vec-means-forseo/

Published

2025-04-04

Issue

Section

Electrotechnical and Computer Engineering