Author: Cheng, Mingxi; Nazarian, Shahin; Bogdan, Paul
Title: VRoC: Variational Autoencoder-aided Multi-task Rumor Classifier Based on Text
Cord-id: bxh2w78b
Document date: 2021-01-28
ID: bxh2w78b
Snippet: Social media has become popular and has percolated into almost all aspects of our daily lives. While online posting is very convenient for individual users, it also fosters the fast spreading of various rumors. The rapid and wide percolation of rumors can cause persistent adverse or detrimental impacts. Therefore, researchers invest great effort in reducing the negative impacts of rumors. Towards this end, a rumor classification system aims to detect, track, and verify rumors in social media.
Document: Social media has become popular and has percolated into almost all aspects of our daily lives. While online posting is very convenient for individual users, it also fosters the fast spreading of various rumors. The rapid and wide percolation of rumors can cause persistent adverse or detrimental impacts. Therefore, researchers invest great effort in reducing the negative impacts of rumors. Towards this end, a rumor classification system aims to detect, track, and verify rumors in social media. Such systems typically include four components: (i) a rumor detector, (ii) a rumor tracker, (iii) a stance classifier, and (iv) a veracity classifier. To improve the state of the art in rumor detection, tracking, and verification, we propose VRoC, a tweet-level variational autoencoder-based rumor classification system. VRoC consists of a co-train engine that jointly trains variational autoencoders (VAEs) and the rumor classification components. The co-train engine helps the VAEs tune their latent representations to be classifier-friendly. We also show that VRoC is able to classify unseen rumors with high accuracy. On the PHEME dataset, VRoC consistently outperforms several state-of-the-art techniques, on both observed and unobserved rumors, by up to 26.9% in terms of macro-F1 score.
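The abstract describes co-training a VAE with a classifier so that the latent representation of each tweet remains useful for classification. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration, under assumed layer sizes, vocabulary size, and loss weights, of how an LSTM-based text VAE can be trained jointly with a classifier head on its latent code via a combined reconstruction + KL + classification objective.

```python
# Hedged sketch (not VRoC's actual code): an LSTM text VAE whose latent vector
# is shared with a classifier head and optimized with a joint loss, so the
# latent space stays "classifier-friendly". All names and sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAEClassifier(nn.Module):
    def __init__(self, vocab_size=5000, embed_dim=128, hidden_dim=256,
                 latent_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc_mu = nn.Linear(hidden_dim, latent_dim)
        self.fc_logvar = nn.Linear(hidden_dim, latent_dim)
        self.decoder = nn.LSTM(latent_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)
        # Classifier head on the latent code (e.g. rumor / non-rumor).
        self.classifier = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, num_classes))

    def forward(self, tokens):
        emb = self.embed(tokens)                       # (B, T, E)
        _, (h, _) = self.encoder(emb)                  # h: (1, B, H)
        h = h.squeeze(0)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        dec_in = z.unsqueeze(1).repeat(1, tokens.size(1), 1)      # feed z at each step
        dec_out, _ = self.decoder(dec_in)
        recon_logits = self.out(dec_out)               # (B, T, V)
        class_logits = self.classifier(z)              # (B, num_classes)
        return recon_logits, class_logits, mu, logvar

def co_train_loss(recon_logits, class_logits, tokens, labels,
                  mu, logvar, beta=1.0, lam=1.0):
    # Joint objective: VAE reconstruction + KL, plus classification (co-training).
    recon = F.cross_entropy(recon_logits.transpose(1, 2), tokens)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    clf = F.cross_entropy(class_logits, labels)
    return recon + beta * kl + lam * clf

# Toy usage with random token ids and binary labels.
model = VAEClassifier()
tokens = torch.randint(0, 5000, (8, 20))
labels = torch.randint(0, 2, (8,))
recon_logits, class_logits, mu, logvar = model(tokens)
loss = co_train_loss(recon_logits, class_logits, tokens, labels, mu, logvar)
loss.backward()
```

Because the classification loss backpropagates through the encoder, the latent codes are shaped by both reconstruction and label information, which is the intuition behind the co-train engine described above; the exact architecture and weighting used in VRoC are not specified in this record.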