NU author(s): Dr Shidong Wang (ORCiD)
Full text for this publication is not currently held within this repository. Alternative links are provided below where available.
© 2023 Elsevier B.V.
Abstract: Generalized Zero-Shot Learning (GZSL) has become an important research topic owing to its ability to recognize unseen objects. Generative methods, which convert conventional GZSL into fully supervised learning, can achieve competitive performance, and most of them combine semantic attributes with Gaussian noise to enrich the generated features. The visual features obtained in this way are consistent with the semantic description. In reality, however, the visual features of images in the same category should correspond to different semantic descriptions, because the appearances of the images differ even though they belong to the same category; that is, the mapping from semantic attributes to visual features should be "many to many" rather than "one to many". We therefore propose a novel method that generates diverse augmented attributes, which are subsequently used to synthesize features. We construct a semantic generator based on a pre-trained semantic mapper, which augments the category semantics. Using the augmented category semantics to generate visual features yields a better fit between the generated features and the distribution of real features. The proposed method effectively addresses the pseudo-diversity of the visual features produced by most generative GZSL methods. We evaluate it on five popular benchmark datasets, and the results show that it achieves state-of-the-art performance.
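The core idea in the abstract, that one class attribute vector should yield many distinct attribute vectors before feature synthesis, can be illustrated with a minimal sketch. This is not the authors' implementation: the dimensions, the linear stand-in generator, and the helper names (`augment_attributes`, `synthesize_features`) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not taken from the paper).
ATTR_DIM, FEAT_DIM = 85, 2048

def augment_attributes(class_attr, n_aug, scale=0.1):
    """Hypothetical semantic generator: perturb one class attribute
    vector into several distinct augmented attributes, making the
    attribute-to-feature mapping many-to-many rather than one-to-many."""
    noise = rng.normal(size=(n_aug, class_attr.shape[0]))
    return class_attr[None, :] + scale * noise  # one row per augmented attribute

# Toy linear map standing in for the learned feature synthesizer.
W = rng.normal(scale=0.01, size=(ATTR_DIM, FEAT_DIM))

def synthesize_features(aug_attrs):
    """Map each augmented attribute vector to a synthetic visual feature."""
    return aug_attrs @ W

class_attr = rng.uniform(size=ATTR_DIM)      # one class's semantic attributes
aug = augment_attributes(class_attr, n_aug=5)
feats = synthesize_features(aug)
print(feats.shape)  # five distinct features for a single class
```

In the paper's pipeline the augmentation and synthesis steps are learned networks rather than random perturbations and a fixed linear map; the sketch only shows why diversifying the attributes before generation yields diverse, rather than pseudo-diverse, features.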
Author(s): Zhao X, Shen Y, Wang S, Zhang H
Publication type: Article
Publication status: Published
Journal: Pattern Recognition Letters
Year: 2023
Volume: 166
Pages: 126-133
Print publication date: 01/02/2023
Online publication date: 11/01/2023
Acceptance date: 08/01/2023
ISSN (print): 0167-8655
ISSN (electronic): 1872-7344
Publisher: Elsevier B.V.
URL: https://doi.org/10.1016/j.patrec.2023.01.005
DOI: 10.1016/j.patrec.2023.01.005