TY - GEN
T1 - Hierarchical semantic indexing for large scale image retrieval
AU - Deng, Jia
AU - Berg, Alexander C.
AU - Fei-Fei, Li
PY - 2011
Y1 - 2011
N2 - This paper addresses the problem of similar image retrieval, especially in the setting of large-scale datasets with millions to billions of images. The core novel contribution is an approach that exploits prior knowledge of a semantic hierarchy. When semantic labels and a hierarchy relating them are available during training, significant improvements over the state of the art in similar image retrieval are attained. While some of this advantage comes from the ability to use additional information, experiments on a special case where no additional data is provided show that the new approach can still outperform OASIS [6], the current state of the art for similarity learning. Exploiting hierarchical relationships is most important for larger-scale problems, where scalability becomes crucial. The proposed learning approach is fundamentally parallelizable and as a result scales more easily than previous work. An additional contribution is a novel hashing scheme for bilinear similarity on vectors of probabilities, optionally taking the hierarchy into account, that reduces the computational cost of retrieval. Experiments are performed on Caltech256 and the larger ImageNet dataset.
UR - http://www.scopus.com/inward/record.url?scp=80052910977&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=80052910977&partnerID=8YFLogxK
U2 - 10.1109/CVPR.2011.5995516
DO - 10.1109/CVPR.2011.5995516
M3 - Conference contribution
AN - SCOPUS:80052910977
SN - 9781457703942
T3 - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
SP - 785
EP - 792
BT - 2011 IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2011
PB - IEEE Computer Society
ER -