Document Type

Article

Keywords

Arabic machine translation, deep learning, neural MT, sulfur texts, transformer model

Abstract

The field of machine translation (MT) has seen significant advances through deep learning (DL) techniques for translating texts between different languages. Despite the wealth of studies, there is a noticeable gap in research dedicated to translating sulfur manufacture texts, primarily hindered by resource scarcity and the intricate grammatical structures inherent to these texts. This paper explores the application of transformer-based Arabic MT to sulfur manufacture texts, including the attention mechanisms and encoder-decoder framework, focusing on the model's ability to handle linguistic and syntactic complexities such as morphological richness and context dependence, and on how the transformer's self-attention mechanism addresses these issues. It also discusses the specific challenges faced by the proposed translation model. The obtained results indicate that the model is effective, achieving an accuracy of 90.7% compared with 84.9% for the Mishraq application on the same test samples.
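The self-attention mechanism referred to in the abstract is, in standard transformer architectures, scaled dot-product attention. The snippet below is a minimal NumPy sketch of that general mechanism, not the paper's actual implementation; the toy dimensions and the use of the same matrix for queries, keys, and values are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard transformer attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # pairwise token similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # weighted mix of value vectors

# Toy self-attention example (hypothetical sizes): 3 tokens, embedding dim 4.
# In self-attention, queries, keys, and values all come from the same sequence.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4): one contextualized vector per token
```

Because each output row is a softmax-weighted mixture over all input tokens, every token can attend to every other token in one step, which is how transformers capture the long-range morphological and contextual dependencies the abstract mentions for Arabic.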
