Author: Zhang, Xu; Lu, Wenpeng; Zhang, Guoqiang; Li, Fangfang; Wang, Shoujin
Title: Chinese Sentence Semantic Matching Based on Multi-Granularity Fusion Model
Document date: 2020-04-17
ID: 6rw3atf1
Snippet: Sentence semantic matching is the cornerstone of many natural language processing tasks, including Chinese language processing. It is well known that Chinese sentences with different polysemous words or word order may have totally different semantic meanings. Thus, to represent and match the sentence semantic meaning accurately, one challenge that must be solved is how to capture the semantic features from the multi-granularity perspective, e.g., characters and words. To address the above challenge …
Document: Sentence semantic matching is the cornerstone of many natural language processing tasks, including Chinese language processing. It is well known that Chinese sentences with different polysemous words or different word orders may have totally different semantic meanings. Thus, to represent and match sentence semantic meaning accurately, one challenge that must be solved is how to capture semantic features from a multi-granularity perspective, e.g., characters and words. To address this challenge, we propose a novel sentence semantic matching model based on the fusion of semantic features at character granularity and word granularity. In particular, the multi-granularity fusion is intended to extract more semantic features and thereby better support the downstream sentence semantic matching. In addition, we propose the equilibrium cross-entropy, a novel loss function, which uses mean square error (MSE) as an equilibrium factor of cross-entropy. Experimental results on a Chinese open data set demonstrate that our proposed model, combined with the binary equilibrium cross-entropy loss function, is superior to existing state-of-the-art sentence semantic matching models.
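The abstract describes the loss only at a high level: MSE is used as an "equilibrium factor" of cross-entropy. Below is a minimal sketch of one plausible reading of that idea, assuming the factor is applied element-wise to binary cross-entropy; the function name binary_equilibrium_cross_entropy and the exact combination are illustrative assumptions, not the authors' published formulation.

    import numpy as np

    def binary_equilibrium_cross_entropy(y_pred, y_true, eps=1e-7):
        # Hypothetical sketch: binary cross-entropy weighted element-wise by the
        # squared error, so poorly predicted sentence pairs contribute more.
        # The exact combination used in the paper is not given in the abstract.
        y_pred = np.clip(y_pred, eps, 1.0 - eps)
        ce = -(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))
        mse_factor = (y_pred - y_true) ** 2  # assumed per-pair equilibrium factor
        return np.mean(mse_factor * ce)

    # Usage: match probabilities for two sentence pairs vs. gold labels (1 = match).
    print(binary_equilibrium_cross_entropy(np.array([0.9, 0.2]), np.array([1.0, 0.0])))

Under this reading, confidently correct pairs are down-weighted while badly mis-scored pairs dominate the gradient, which is one way an MSE term could "balance" plain cross-entropy.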