Selected article for: "art state and long short term memory"

Author: Lobo Neto, Vicente Coelho; Passos, Leandro Aparecido; Papa, João Paulo
Title: Evolving Long Short-Term Memory Networks
  • Cord-id: dtb959ei
  • Document date: 2020-06-15
    Snippet: Machine learning techniques have been widely employed in recent years across a broad variety of applications, especially those based on deep learning, which has obtained state-of-the-art results in several research fields. Despite this success, such techniques still suffer from shortcomings, such as sensitivity to their hyperparameters, whose proper selection is context-dependent, i.e., a model may perform best on each dataset with a specific set of hyperparameters. Therefore,
    Document: Machine learning techniques have been widely employed in recent years across a broad variety of applications, especially those based on deep learning, which has obtained state-of-the-art results in several research fields. Despite this success, such techniques still suffer from shortcomings, such as sensitivity to their hyperparameters, whose proper selection is context-dependent, i.e., a model may perform best on each dataset with a specific set of hyperparameters. Therefore, we propose an approach based on evolutionary optimization techniques for fine-tuning Long Short-Term Memory networks. Experiments were conducted on three public word-processing datasets for part-of-speech tagging. The results show the robustness of the proposed approach for this task.
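    The abstract describes evolutionary optimization for fine-tuning LSTM hyperparameters. A minimal sketch of that idea follows; the search space, operator choices, and the stand-in fitness function are illustrative assumptions, not the paper's actual method (which would train an LSTM tagger and use its accuracy as fitness):

    ```python
    import random

    # Hypothetical LSTM hyperparameter search space (names and ranges
    # are illustrative, not taken from the paper).
    SPACE = {
        "hidden_size": [64, 128, 256, 512],
        "learning_rate": [1e-1, 1e-2, 1e-3, 1e-4],
        "dropout": [0.0, 0.2, 0.5],
    }

    def random_candidate(rng):
        # Sample one value per hyperparameter.
        return {k: rng.choice(v) for k, v in SPACE.items()}

    def mutate(cand, rng):
        # Resample a single, randomly chosen hyperparameter.
        child = dict(cand)
        key = rng.choice(list(SPACE))
        child[key] = rng.choice(SPACE[key])
        return child

    def toy_fitness(cand):
        # Stand-in for "train the LSTM, measure tagging accuracy";
        # a synthetic score peaking at one arbitrary configuration.
        return (
            -abs(cand["hidden_size"] - 256) / 256
            - abs(cand["learning_rate"] - 1e-2)
            - abs(cand["dropout"] - 0.2)
        )

    def evolve(pop_size=8, generations=20, seed=0):
        rng = random.Random(seed)
        pop = [random_candidate(rng) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=toy_fitness, reverse=True)
            parents = pop[: pop_size // 2]  # keep the fittest half (elitism)
            pop = parents + [mutate(rng.choice(parents), rng) for _ in parents]
        return max(pop, key=toy_fitness)

    best = evolve()
    ```

    Because the fittest half survives unchanged each generation, the best fitness found never decreases; swapping `toy_fitness` for a real train-and-evaluate routine turns this into a basic hyperparameter optimizer.
    
    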

    Search related documents:
    Co phrase search for related documents
    • accuracy loss and loss function: 1, 2, 3, 4, 5, 6, 7, 8, 9
    • activation function and loss function: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12