Electronic data

  • NEUCOM-D-18-01958.R1

    Rights statement: This is the author’s version of a work that was accepted for publication in Neurocomputing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neurocomputing, 329, 2019 DOI: 10.1016/j.neucom.2018.10.069

    Accepted author manuscript, 2.09 MB, PDF document

    Available under license: CC BY-NC-ND

Links

Text available via DOI:

Class-specific synthesized dictionary model for Zero-Shot Learning

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Class-specific synthesized dictionary model for Zero-Shot Learning. / Ji, Z.; Wang, J.; Yu, Y. et al.
In: Neurocomputing, Vol. 329, 15.02.2019, p. 339-347.

Harvard

Ji, Z, Wang, J, Yu, Y, Pang, Y & Han, J 2019, 'Class-specific synthesized dictionary model for Zero-Shot Learning', Neurocomputing, vol. 329, pp. 339-347. https://doi.org/10.1016/j.neucom.2018.10.069

Vancouver

Ji Z, Wang J, Yu Y, Pang Y, Han J. Class-specific synthesized dictionary model for Zero-Shot Learning. Neurocomputing. 2019 Feb 15;329:339-347. Epub 2018 Nov 5. doi: 10.1016/j.neucom.2018.10.069

Author

Ji, Z.; Wang, J.; Yu, Y. et al. / Class-specific synthesized dictionary model for Zero-Shot Learning. In: Neurocomputing. 2019; Vol. 329, pp. 339-347.

Bibtex

@article{60b37c472638461e9dcc2724f3be1cd1,
title = "Class-specific synthesized dictionary model for Zero-Shot Learning",
abstract = "Zero-Shot Learning (ZSL) aims at recognizing unseen classes that are absent during the training stage. Unlike existing approaches that learn a visual-semantic embedding model to bridge the low-level visual space and the high-level class prototype space, we propose a novel synthesized approach that addresses ZSL within a dictionary learning framework. Specifically, it learns both a dictionary matrix and a class-specific encoding matrix for each seen class, and synthesizes pseudo instances for unseen classes with the aid of the seen class prototypes. This allows us to train classifiers for the unseen classes on these pseudo instances. In this way, ZSL can be treated as a traditional classification task, which makes the method applicable to the traditional and generalized ZSL settings simultaneously. Extensive experimental results on four benchmark datasets (AwA, CUB, aPY, and SUN) demonstrate that our method yields competitive performance compared to state-of-the-art methods in both settings.",
keywords = "Dictionary learning, Image recognition, Synthesized model, Zero-Shot Learning, Benchmark datasets, Classification tasks, Competitive performance, Encoding matrix, State-of-the-art methods, Visual semantics, Semantics, article, classifier, embedding, learning, sun",
author = "Z. Ji and J. Wang and Y. Yu and Y. Pang and J. Han",
note = "This is the author{\textquoteright}s version of a work that was accepted for publication in Neurocomputing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neurocomputing, 329, 2019 DOI: 10.1016/j.neucom.2018.10.069",
year = "2019",
month = feb,
day = "15",
doi = "10.1016/j.neucom.2018.10.069",
language = "English",
volume = "329",
pages = "339--347",
journal = "Neurocomputing",
issn = "0925-2312",
publisher = "Elsevier Science B.V."
}
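The abstract above describes the core idea: learn a dictionary and a class-specific encoding matrix from seen classes, synthesize pseudo instances for unseen classes from their prototypes, and then train ordinary classifiers on those pseudo instances. The following is a minimal, hypothetical sketch of that pipeline, not the authors' implementation: for brevity it collapses the dictionary and per-class encoding matrices into a single ridge-regressed linear map from prototype space to feature space, and runs on synthetic toy data with made-up dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; not taken from the paper).
d_vis, d_sem = 16, 4        # visual feature / class prototype dimensions
n_seen, n_unseen, n_per = 6, 2, 40

protos = rng.normal(size=(n_seen + n_unseen, d_sem))
seen_protos, unseen_protos = protos[:n_seen], protos[n_seen:]

# Simulate seen-class visual features as a noisy linear image of prototypes.
M_true = rng.normal(size=(d_vis, d_sem))
X_seen = np.concatenate(
    [(M_true @ p)[None, :] + 0.1 * rng.normal(size=(n_per, d_vis))
     for p in seen_protos])
y_seen = np.repeat(np.arange(n_seen), n_per)

# Fit one ridge-regressed map from prototypes to class means; this stands in
# for the paper's dictionary / class-specific encoding factorization.
means = np.stack([X_seen[y_seen == c].mean(axis=0) for c in range(n_seen)])
lam = 1e-2
M = means.T @ seen_protos @ np.linalg.inv(
    seen_protos.T @ seen_protos + lam * np.eye(d_sem))

# Synthesize pseudo instances for the unseen classes from their prototypes,
# then train an ordinary classifier (here: nearest centroid) on them.
pseudo = np.concatenate(
    [(M @ p)[None, :] + 0.1 * rng.normal(size=(n_per, d_vis))
     for p in unseen_protos])
y_pseudo = np.repeat(np.arange(n_unseen), n_per)
centroids = np.stack([pseudo[y_pseudo == c].mean(axis=0)
                      for c in range(n_unseen)])

# Evaluate on freshly generated "real" unseen-class features.
X_test = np.concatenate(
    [(M_true @ p)[None, :] + 0.1 * rng.normal(size=(20, d_vis))
     for p in unseen_protos])
y_test = np.repeat(np.arange(n_unseen), 20)
pred = ((X_test[:, None, :] - centroids[None, :, :]) ** 2).sum(-1).argmin(1)
acc = (pred == y_test).mean()
print(f"unseen-class accuracy: {acc:.2f}")
```

Because the unseen classes never contribute real training features, any accuracy above chance comes entirely from the synthesized pseudo instances, which is the point the abstract makes: once synthesis is done, ZSL reduces to a conventional classification task.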

RIS

TY - JOUR

T1 - Class-specific synthesized dictionary model for Zero-Shot Learning

AU - Ji, Z.

AU - Wang, J.

AU - Yu, Y.

AU - Pang, Y.

AU - Han, J.

N1 - This is the author’s version of a work that was accepted for publication in Neurocomputing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neurocomputing, 329, 2019 DOI: 10.1016/j.neucom.2018.10.069

PY - 2019/2/15

Y1 - 2019/2/15

N2 - Zero-Shot Learning (ZSL) aims at recognizing unseen classes that are absent during the training stage. Unlike existing approaches that learn a visual-semantic embedding model to bridge the low-level visual space and the high-level class prototype space, we propose a novel synthesized approach that addresses ZSL within a dictionary learning framework. Specifically, it learns both a dictionary matrix and a class-specific encoding matrix for each seen class, and synthesizes pseudo instances for unseen classes with the aid of the seen class prototypes. This allows us to train classifiers for the unseen classes on these pseudo instances. In this way, ZSL can be treated as a traditional classification task, which makes the method applicable to the traditional and generalized ZSL settings simultaneously. Extensive experimental results on four benchmark datasets (AwA, CUB, aPY, and SUN) demonstrate that our method yields competitive performance compared to state-of-the-art methods in both settings.

AB - Zero-Shot Learning (ZSL) aims at recognizing unseen classes that are absent during the training stage. Unlike existing approaches that learn a visual-semantic embedding model to bridge the low-level visual space and the high-level class prototype space, we propose a novel synthesized approach that addresses ZSL within a dictionary learning framework. Specifically, it learns both a dictionary matrix and a class-specific encoding matrix for each seen class, and synthesizes pseudo instances for unseen classes with the aid of the seen class prototypes. This allows us to train classifiers for the unseen classes on these pseudo instances. In this way, ZSL can be treated as a traditional classification task, which makes the method applicable to the traditional and generalized ZSL settings simultaneously. Extensive experimental results on four benchmark datasets (AwA, CUB, aPY, and SUN) demonstrate that our method yields competitive performance compared to state-of-the-art methods in both settings.

KW - Dictionary learning

KW - Image recognition

KW - Synthesized model

KW - Zero-Shot Learning

KW - Benchmark datasets

KW - Classification tasks

KW - Competitive performance

KW - Encoding matrix

KW - State-of-the-art methods

KW - Visual semantics

KW - Semantics

KW - article

KW - classifier

KW - embedding

KW - learning

KW - sun

U2 - 10.1016/j.neucom.2018.10.069

DO - 10.1016/j.neucom.2018.10.069

M3 - Journal article

VL - 329

SP - 339

EP - 347

JO - Neurocomputing

JF - Neurocomputing

SN - 0925-2312

ER -