TY - GEN
T1 - A SIMPLE SUPERVISED HASHING ALGORITHM USING PROJECTED GRADIENT AND OPPOSITIONAL WEIGHTS
AU - Hemati, Sobhan
AU - Mehdizavareh, Mohammad Hadi
AU - Babaie, Morteza
AU - Kalra, Shivam
AU - Tizhoosh, H. R.
N1 - Publisher Copyright:
© 2021 IEEE
PY - 2021
Y1 - 2021
N2 - Learning to hash is the task of generating similarity-preserving binary representations of images, which is, among other things, an efficient way to enable fast image retrieval. Two-step hashing has become a common approach because it simplifies learning by separating binary code inference from hash function training. However, binary code inference typically leads to an intractable optimization problem with binary constraints. Different relaxation methods, generally based on complicated optimization techniques, have been proposed to address this challenge. In this paper, a simple relaxation scheme based on projected gradient is proposed. To this end, in each iteration we update the optimization variable as if there were no binary constraint and then project the updated solution onto the feasible set. We formulate the projection step as finding the closest binary matrix to the updated matrix and take advantage of the closed-form solution of the projection step to complete our learning algorithm. Inspired by opposition-based learning, pairwise opposite weights between data points are incorporated into the proposed objective function to impose a stronger penalty on data instances with a higher misclassification probability. We show that this simple learning algorithm leads to binary codes that achieve competitive results on both the CIFAR-10 and NUS-WIDE datasets compared to state-of-the-art benchmarks.
AB - Learning to hash is the task of generating similarity-preserving binary representations of images, which is, among other things, an efficient way to enable fast image retrieval. Two-step hashing has become a common approach because it simplifies learning by separating binary code inference from hash function training. However, binary code inference typically leads to an intractable optimization problem with binary constraints. Different relaxation methods, generally based on complicated optimization techniques, have been proposed to address this challenge. In this paper, a simple relaxation scheme based on projected gradient is proposed. To this end, in each iteration we update the optimization variable as if there were no binary constraint and then project the updated solution onto the feasible set. We formulate the projection step as finding the closest binary matrix to the updated matrix and take advantage of the closed-form solution of the projection step to complete our learning algorithm. Inspired by opposition-based learning, pairwise opposite weights between data points are incorporated into the proposed objective function to impose a stronger penalty on data instances with a higher misclassification probability. We show that this simple learning algorithm leads to binary codes that achieve competitive results on both the CIFAR-10 and NUS-WIDE datasets compared to state-of-the-art benchmarks.
KW - Binary representation
KW - Image search
KW - Supervised hashing
KW - Two-step hashing
UR - http://www.scopus.com/inward/record.url?scp=85125586744&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85125586744&partnerID=8YFLogxK
U2 - 10.1109/ICIP42928.2021.9506441
DO - 10.1109/ICIP42928.2021.9506441
M3 - Conference contribution
AN - SCOPUS:85125586744
T3 - Proceedings - International Conference on Image Processing, ICIP
SP - 2748
EP - 2752
BT - 2021 IEEE International Conference on Image Processing, ICIP 2021 - Proceedings
PB - IEEE Computer Society
T2 - 2021 IEEE International Conference on Image Processing, ICIP 2021
Y2 - 19 September 2021 through 22 September 2021
ER -