Selected article for: "additional challenge and long term memory"

Author: Messaoudi, Abir; Haddad, Hatem; Ben HajHmida, Moez; Fourati, Chayma; Ben Hamida, Abderrazak
Title: Learning Word Representations for Tunisian Sentiment Analysis
  • Cord-id: sta6gy88
  • Document date: 2021-02-22
  • ID: sta6gy88
    Document: Tunisians on social media tend to express themselves in their local dialect using Latin script (TUNIZI). This raises an additional challenge to the process of exploring and recognizing online opinions. To date, very little work has addressed TUNIZI sentiment analysis due to scarce resources for training an automated system. In this paper, we focus on sentiment analysis of the Tunisian dialect used on social media. Most of the previous work used machine learning techniques combined with handcrafted features. More recently, Deep Neural Networks have been widely used for this task, especially for the English language. In this paper, we explore the importance of various unsupervised word representations (word2vec, BERT) and we investigate the use of Convolutional Neural Networks and Bidirectional Long Short-Term Memory. Without using any kind of handcrafted features, our experimental results on two publicly available datasets [18, 19] showed performances comparable to those reported for other languages.
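    The abstract pairs pretrained word representations (word2vec, BERT) with a Bidirectional LSTM classifier. As a rough illustration of that pipeline, the sketch below runs a tiny BiLSTM over a sequence of word vectors and feeds the concatenated final states to a sigmoid polarity classifier. This is a minimal NumPy sketch of the general architecture, not the authors' exact model; all dimensions, initializations, and the single-layer output are assumptions for illustration.

```python
import numpy as np

# Assumed toy architecture: BiLSTM over word vectors + sigmoid classifier.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """One LSTM step: input/forget/output gates and candidate state,
    with all four gate weight matrices stacked into one array."""
    def __init__(self, d_in, d_h):
        s = 1.0 / np.sqrt(d_in + d_h)
        self.W = rng.uniform(-s, s, (4 * d_h, d_in + d_h))
        self.b = np.zeros(4 * d_h)
        self.d_h = d_h

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)     # new cell state
        h = o * np.tanh(c)             # new hidden state
        return h, c

def bilstm_encode(seq, fwd, bwd):
    """Run one LSTM left-to-right and one right-to-left,
    then concatenate the two final hidden states."""
    h = c = np.zeros(fwd.d_h)
    for x in seq:
        h, c = fwd.step(x, h, c)
    hb = cb = np.zeros(bwd.d_h)
    for x in reversed(seq):
        hb, cb = bwd.step(x, hb, cb)
    return np.concatenate([h, hb])

# Toy usage: a 5-token sentence of 8-dim vectors (stand-ins for
# word2vec/BERT embeddings), 6 hidden units per direction.
d_in, d_h = 8, 6
sentence = [rng.normal(size=d_in) for _ in range(5)]
fwd, bwd = LSTMCell(d_in, d_h), LSTMCell(d_in, d_h)
features = bilstm_encode(sentence, fwd, bwd)      # shape (2 * d_h,)
w_out = rng.normal(size=features.shape[0])
p_positive = sigmoid(w_out @ features)            # polarity score in (0, 1)
```

In practice the embeddings would come from a pretrained word2vec or BERT model rather than random vectors, and the gate weights would be learned by backpropagation; the sketch only shows the forward pass.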

    Search related documents:
    Co-phrase search for related documents
    • accuracy achieve and long bi lstm short term memory: 1
    • accuracy achieve and low resource: 1, 2, 3
    • accuracy achieve and machine model: 1, 2, 3, 4, 5, 6, 7
    • accuracy good performance and achieve accuracy: 1
    • accuracy good performance and machine model: 1
    • accuracy performance achieve and achieve accuracy: 1, 2, 3, 4
    • accuracy performance and achieve accuracy: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
    • accuracy performance and long bi lstm short term memory: 1, 2
    • accuracy performance and low resource: 1
    • accuracy performance and machine model: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21
    • achieve accuracy and long bi lstm short term memory: 1
    • achieve accuracy and low resource: 1, 2, 3
    • achieve accuracy and machine model: 1, 2, 3, 4, 5, 6, 7
    • long bi lstm short term memory and lstm representation: 1