Author: Spinks, Graham; Moens, Marie-Francine
Title: Structured (De)composable Representations Trained with Neural Networks
Cord-id: uemlyxbv
Document date: 2020-08-05
Document: The paper proposes a novel technique for representing templates and instances of concept classes. A template representation refers to the generic representation that captures the characteristics of an entire class. The proposed technique uses end-to-end deep learning to learn structured and composable representations from input images and discrete labels. The obtained representations are based on distance estimates between the distributions given by the class label and those given by contextual information, which are modeled as environments. We prove that the representations have a clear structure that allows them to be decomposed into factors representing classes and environments. We evaluate our novel technique on classification and retrieval tasks involving different modalities (visual and language data).
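The idea of a representation that decomposes into a class (template) factor and an environment factor can be illustrated with a minimal sketch. This is a hypothetical additive model, not the paper's actual learned formulation: all names (`class_templates`, `environments`, `compose`, `decompose_class`) and the assumption that composition is vector addition are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Assumed setup: one template vector per concept class, one vector per
# environment (contextual condition). In the paper these are learned
# end-to-end; here they are random stand-ins.
class_templates = {c: rng.normal(size=dim) for c in ["cat", "dog"]}
environments = {e: rng.normal(size=dim) for e in ["indoor", "outdoor"]}

def compose(class_label, environment):
    """Compose an instance representation from its two factors
    (additive composition is an assumption for illustration)."""
    return class_templates[class_label] + environments[environment]

def decompose_class(instance, environment):
    """Recover the class factor by removing the environment factor."""
    return instance - environments[environment]

instance = compose("cat", "outdoor")
recovered = decompose_class(instance, "outdoor")
assert np.allclose(recovered, class_templates["cat"])
```

Under this toy additive assumption, removing the environment factor exactly recovers the class template, which mirrors the decomposability property the abstract claims; the paper's actual structure comes from distance estimates between class- and environment-conditioned distributions rather than plain vector addition.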