Code for: Ferro, D., Gripon, V., & Jiang, X. (2016, July). Nearest neighbour search using binary neural networks. In 2016 International Joint Conference on Neural Networks (IJCNN) (pp. 5106-5112). IEEE.

The code finds nearest neighbours in terms of Euclidean distance and uses them for classification. The search is accelerated by combining Product Quantization (PQ) with binary neural associative memories (Willshaw Neural Networks).

DOI: http://dx.doi.org/10.1109/IJCNN.2016.7727873

Repository contents:

Figures/
MNIST/ (Neural Networks over PQ, including the PQ Library and the Yael Library)
TEXMEX/ (PQ Library, including the Yael Library)
LICENSE
README.md

README.md

Nearest Neighbour Search using binary Neural Networks and Product Quantization

We address the problem of Nearest Neighbour Search (NNS) under the Euclidean distance, the Hamming distance, or other distance metrics. To accelerate the search for nearest neighbours in large collections, many methods rely on a coarse-fine approach: a cheap coarse stage narrows the database down to a small set of candidates, which an exact fine stage then re-ranks. In this work we propose to combine Product Quantization (PQ) with Willshaw Neural Networks (WNN), i.e. binary neural associative memories.
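
The Python sketch below illustrates these two building blocks under simplifying assumptions; it is not the repository's actual implementation (which relies on the Yael library), and every function and class name in it is hypothetical. A product quantizer compresses each vector into a few centroid indices, and a Willshaw-style binary associative memory stores the resulting sparse activity patterns and retrieves the units consistent with a (possibly partial) cue.

```python
# Illustrative sketch only, not the authors' exact pipeline.
# PQ: one k-means codebook per subvector block; a vector becomes m centroid indices.
# Willshaw memory: binary matrix storing cliques of co-active units.
import numpy as np
from scipy.cluster.vq import kmeans2, vq

def train_pq(X, m=4, k=16):
    """Learn one k-means codebook per block (m blocks, k centroids each)."""
    d = X.shape[1] // m
    return [kmeans2(X[:, i*d:(i+1)*d], k, minit='points')[0] for i in range(m)]

def encode_pq(X, codebooks):
    """Encode each row of X as m centroid indices (one per block)."""
    m = len(codebooks)
    d = X.shape[1] // m
    return np.stack([vq(X[:, i*d:(i+1)*d], codebooks[i])[0] for i in range(m)], axis=1)

def code_to_units(code, k):
    """Map a PQ code (m indices) to one active unit per block: unit = block*k + index."""
    return np.arange(len(code)) * k + np.asarray(code)

class WillshawMemory:
    """Binary associative memory: stores sparse patterns as fully connected cliques."""
    def __init__(self, n_units):
        self.W = np.zeros((n_units, n_units), dtype=bool)

    def store(self, active_units):
        idx = np.asarray(active_units)
        self.W[np.ix_(idx, idx)] = True  # connect all co-active units pairwise

    def retrieve(self, cue_units, threshold):
        """Return units supported by at least `threshold` active cue units."""
        scores = self.W[:, np.asarray(cue_units)].sum(axis=1)
        return np.flatnonzero(scores >= threshold)
```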

This work was published at the 2016 International Joint Conference on Neural Networks (IJCNN), IEEE.

To be cited as:

Ferro, D., Gripon, V., & Jiang, X. (2016, July). Nearest neighbour search using binary neural networks. In 2016 International Joint Conference on Neural Networks (IJCNN) (pp. 5106-5112). IEEE. DOI: 10.1109/IJCNN.2016.7727873

We show the main results, i.e. performance versus computational cost, for two main applications (a rough usage sketch of the coarse-to-fine search follows the list):

1) Classification of handwritten digits on the MNIST (NIST, USA) dataset (60k training, 10k test images).

Figure: performance vs. computational cost for MNIST handwritten digit classification.

2) Euclidean-metric NNS of SIFT image descriptors on the TEXMEX (IRISA, FR) dataset (1M training, 10k test vectors).

Figure: performance vs. computational cost for Euclidean NNS on the TEXMEX SIFT1M dataset.
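
As a rough illustration of how the fine (exact) stage could consume the coarse candidates in both applications above, the sketch below re-ranks a candidate subset by exact Euclidean distance and, for the MNIST case, classifies a query by the label of its nearest neighbour. It is not taken from the repository; the names and the candidate-generation step are assumptions consistent with the PQ/Willshaw sketch earlier in this README.

```python
# Hypothetical fine stage: exact Euclidean re-ranking of coarse candidates,
# plus 1-NN classification for the MNIST digit experiment.
import numpy as np

def exact_rerank(query, database, candidate_ids, n_neighbours=1):
    """Compute exact Euclidean distances only for the candidate subset."""
    candidate_ids = np.asarray(candidate_ids)
    diffs = database[candidate_ids] - query
    dists = np.einsum('ij,ij->i', diffs, diffs)  # squared Euclidean distances
    order = np.argsort(dists)[:n_neighbours]
    return candidate_ids[order], np.sqrt(dists[order])

def classify_1nn(query, database, labels, candidate_ids):
    """Assign the label of the closest candidate (e.g. an MNIST digit class)."""
    nn_ids, _ = exact_rerank(query, database, candidate_ids, n_neighbours=1)
    return labels[nn_ids[0]]
```

In such a coarse-fine setting, the computational cost is governed mainly by how many candidates survive the coarse stage and reach the exact re-ranking step.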

The project was developed between April and September 2015 during my internship at Télécom Bretagne, Brest (FR), under the supervision of Prof. Claude Berrou and Vincent Gripon.