Rights statement: This is the author’s version of a work that was accepted for publication in Neurocomputing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neurocomputing, 329, 2019 DOI: 10.1016/j.neucom.2018.10.069
Available under license: CC BY-NC-ND
Research output: Contribution to Journal/Magazine › Journal article › peer-review
TY - JOUR
T1 - Class-specific synthesized dictionary model for Zero-Shot Learning
AU - Ji, Z.
AU - Wang, J.
AU - Yu, Y.
AU - Pang, Y.
AU - Han, J.
N1 - This is the author’s version of a work that was accepted for publication in Neurocomputing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neurocomputing, 329, 2019 DOI: 10.1016/j.neucom.2018.10.069
PY - 2019/2/15
Y1 - 2019/2/15
N2 - Zero-Shot Learning (ZSL) aims at recognizing unseen classes that are absent during the training stage. Unlike existing approaches that learn a visual-semantic embedding model to bridge the low-level visual space and the high-level class prototype space, we propose a novel synthesis-based approach that addresses ZSL within a dictionary learning framework. Specifically, it learns both a dictionary matrix and a class-specific encoding matrix for each seen class, and uses them to synthesize pseudo instances for the unseen classes with the aid of the seen class prototypes. This allows us to train classifiers for the unseen classes on these pseudo instances. In this way, ZSL can be treated as a traditional classification task, making the method applicable to both the traditional and the generalized ZSL settings. Extensive experiments on four benchmark datasets (AwA, CUB, aPY, and SUN) demonstrate that our method yields competitive performance compared to state-of-the-art methods in both settings.
AB - Zero-Shot Learning (ZSL) aims at recognizing unseen classes that are absent during the training stage. Unlike existing approaches that learn a visual-semantic embedding model to bridge the low-level visual space and the high-level class prototype space, we propose a novel synthesis-based approach that addresses ZSL within a dictionary learning framework. Specifically, it learns both a dictionary matrix and a class-specific encoding matrix for each seen class, and uses them to synthesize pseudo instances for the unseen classes with the aid of the seen class prototypes. This allows us to train classifiers for the unseen classes on these pseudo instances. In this way, ZSL can be treated as a traditional classification task, making the method applicable to both the traditional and the generalized ZSL settings. Extensive experiments on four benchmark datasets (AwA, CUB, aPY, and SUN) demonstrate that our method yields competitive performance compared to state-of-the-art methods in both settings.
KW - Dictionary learning
KW - Image recognition
KW - Synthesized model
KW - Zero-Shot Learning
KW - Benchmark datasets
KW - Classification tasks
KW - Competitive performance
KW - Encoding matrix
KW - State-of-the-art methods
KW - Visual semantics
KW - Semantics
U2 - 10.1016/j.neucom.2018.10.069
DO - 10.1016/j.neucom.2018.10.069
M3 - Journal article
VL - 329
SP - 339
EP - 347
JO - Neurocomputing
JF - Neurocomputing
SN - 0925-2312
ER -