dc.contributor.author | Kepsha, Volodymyr |
dc.date.accessioned | 2021-08-19T08:44:01Z |
dc.date.available | 2021-08-19T08:44:01Z |
dc.date.issued | 2021-08-19 |
dc.identifier.issn | 2021/I/D/ |
dc.identifier.uri | https://repin.pjwstk.edu.pl/xmlui/handle/186319/818 |
dc.description.abstract | Abstract: In this paper I present the idea behind neural machine translation, starting from sequence-to-sequence models based on recurrent neural networks and long short-term memory networks, and finishing with the state-of-the-art Transformer model. I also explain the complete process of preparing ready-to-use translation models, which consists of language-pair selection, corpus processing, and model training. The main idea is to show the evolution of machine translation architectures and how they improve translation quality. | pl_PL
dc.language.iso | en | pl_PL
dc.relation.ispartofseries | ;Nr 6037 |
dc.title | Neural machine translation | pl_PL
dc.title.alternative | Neuronowe tłumaczenie maszynowe | pl_PL
dc.type | Thesis | pl_PL