ETRI Knowledge Sharing Platform

Disentangled Representation Learning for Unsupervised Neural Quantization
Cited 1 time in Scopus
Authors
Haechan Noh, Sangeek Hyun, Woojin Jeong, Hanshin Lim, Jae-Pil Heo
Issue Date
2023-06
Citation
Conference on Computer Vision and Pattern Recognition (CVPR) 2023, pp.12001-12010
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/CVPR52729.2023.01155
Abstract
The inverted index is a widely used data structure for avoiding infeasible exhaustive search. It accelerates retrieval significantly by splitting the database into multiple disjoint sets and restricting distance computation to a small fraction of the database. Moreover, it can even improve search quality by allowing quantizers to exploit the compact distribution of the residual vector space. However, we first point out a problem: existing deep learning-based quantizers hardly benefit from the residual vector space, unlike conventional shallow quantizers. To cope with this problem, we introduce a novel disentangled representation learning method for unsupervised neural quantization. Similar in spirit to the residual vector space, the proposed method enables a more compact latent space by disentangling the inverted-index information from the vectors. Experimental results on large-scale datasets confirm that our method outperforms state-of-the-art retrieval systems by a large margin.
KSP Keywords
Data structure, Disjoint sets, Distance computation, Large-scale datasets, Latent space, Learning-based, Representation learning, deep learning(DL), exhaustive search(ES), inverted index, large margin
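The inverted-index scheme described in the abstract can be sketched in a few lines: partition the database into cells around coarse centroids, and at query time scan only the few closest cells instead of the whole database. This is a minimal illustrative sketch using toy random data and Euclidean distance; the centroids here are sampled rather than learned by k-means, the `nprobe` parameter name follows common IVF usage, and the residual vectors are shown only to illustrate the compact residual space the paper builds on — none of this is the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy database and coarse centroids (in practice learned, e.g. via k-means).
db = rng.normal(size=(1000, 16)).astype(np.float32)
centroids = db[rng.choice(len(db), 8, replace=False)]

# Build the inverted index: assign each vector to its nearest centroid,
# splitting the database into disjoint cells.
assign = np.argmin(((db[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
index = {c: np.where(assign == c)[0] for c in range(len(centroids))}

# Residual vectors: what a fine quantizer inside each cell would encode.
# Their distribution is more compact than the raw vectors' (unused below).
residuals = db - centroids[assign]

def search(query, nprobe=2, k=5):
    """Scan only the nprobe closest cells instead of the full database."""
    cell_d = ((centroids - query) ** 2).sum(-1)
    ids = np.concatenate([index[c] for c in np.argsort(cell_d)[:nprobe]])
    d = ((db[ids] - query) ** 2).sum(-1)
    return ids[np.argsort(d)[:k]]

print(search(rng.normal(size=16).astype(np.float32)))
```

With `nprobe` equal to the number of cells, the search degenerates to exhaustive scan; smaller values trade a little recall for much less distance computation.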