Train function in IndexLSH? #4023
Comments
Hello, I also have this question. When I look at the source code of IndexLSH, it doesn't seem to use a random hyperplane?
Hi @Phoenix0726! About the random hyperplane: they seem to use a (n_hash_tables x hash_code_size) random matrix that they call a "rotation matrix". They fix the seed here. The training still puzzles me -- it would be great to know more about this training and the pre-processing it calls. Btw, given that we seem to be looking at the same implementation, have you found the point in IndexLSH where they set the code size to "ceil(d / 8)", as suggested here?
The statement in the demo isn't wrong: LSH does not require training. It can function without any training, unlike indexes such as PQ. However, training tunes the thresholds, so it is expected to have a positive effect on accuracy.
It depends on the constructor parameters (https://github.com/facebookresearch/faiss/blob/main/faiss/IndexLSH.h#L33): if train_thresholds is set, the thresholds that decide between 0 and 1 are set to the medians over the training set. Otherwise they are set to 0.
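The effect of median thresholds can be sketched outside of Faiss. Below is a minimal NumPy illustration (not Faiss code; all names here are made up for the sketch) of why thresholding each bit at the training-set median, rather than at 0, tends to help when the data is not centered:

```python
import numpy as np

rng = np.random.default_rng(42)
# Toy data, deliberately shifted off the origin.
x_train = rng.normal(loc=1.0, scale=0.5, size=(1000, 16))

# Project onto random hyperplane directions (rows of a random matrix),
# standing in for the rotation that IndexLSH can apply.
nbits = 8
hyperplanes = rng.normal(size=(nbits, 16))
proj = x_train @ hyperplanes.T                 # shape (1000, nbits)

# Untrained: threshold every projection at 0.
codes_untrained = proj > 0.0

# "Trained": threshold each bit at the median of the training projections,
# which is conceptually what train_thresholds aims for.
thresholds = np.median(proj, axis=0)           # shape (nbits,)
codes_trained = proj > thresholds

# Median thresholds balance each bit: roughly half 0s and half 1s,
# so each bit of the hash code carries close to maximal information.
print(codes_untrained.mean(axis=0))  # typically unbalanced for shifted data
print(codes_trained.mean(axis=0))    # close to 0.5 per bit
```

With off-center data, zero thresholds can leave some bits almost constant (nearly all 0 or all 1), which wastes code capacity; the median split avoids that.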
@mdouze @gtwang01 Thanks for your reply! I've read that IndexLSH is based on LSH random projection. I may have missed it, but I didn't see where in the code you compute the dot products between the dataset and the hyperplanes. Could you please point me to that? At first I thought it was this, but I'm not sure, also because this seems to be called only if […]. (A minor question: is nbits the number of hash tables, or the number of hyperplanes per hash table, i.e., the number of bits of the hash code? I assume the latter. If so, must the idea of using more hash tables be implemented at a higher level, by constructing multiple indexes?) Thank you!
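On the multi-table part of the question: if nbits is indeed the number of bits per code, then multiple hash tables would be built at a higher level, one set of hyperplanes per table. The following is a toy NumPy sketch of that idea (hypothetical code, not the Faiss implementation): each table hashes a vector to the sign pattern of its projections, and a query's candidates are the union of its buckets across tables.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
d, nbits, n_tables = 32, 12, 4
x = rng.normal(size=(5000, d))

# One "table" per independent set of nbits random hyperplanes.
tables = []
for _ in range(n_tables):
    h = rng.normal(size=(nbits, d))           # hyperplane normals
    buckets = defaultdict(list)
    for i, bits in enumerate(x @ h.T > 0):
        buckets[bits.tobytes()].append(i)     # bucket key = bit pattern
    tables.append((h, buckets))

def candidates(q):
    """Union of the query's bucket across all tables."""
    cand = set()
    for h, buckets in tables:
        cand.update(buckets.get((q @ h.T > 0).tobytes(), []))
    return cand

# A vector always lands in its own bucket; using more tables raises
# the chance that a true neighbour shares at least one bucket.
print(len(candidates(x[0])))
```

This is the standard multi-table LSH recipe: each extra table trades memory for recall, and it composes naturally from several single-table indexes.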
This issue is stale because it has been open for 7 days with no activity. |
@Hillier98 Thanks for your reply! |
Yes, the (optional) transformation is applied here: https://github.com/facebookresearch/faiss/blob/main/faiss/IndexLSH.cpp#L111. This transformation can be a rotation, in which case it's equivalent to computing dot products with the hyperplane directions.
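That equivalence can be checked numerically. Here is a small NumPy sketch (illustrative only, not the Faiss code path): rotating the data and thresholding each output coordinate at 0 gives exactly the same bits as taking the dot product with each row of the rotation matrix, where the rows act as hyperplane normals.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8
x = rng.normal(size=(100, d))

# A random rotation: orthonormal matrix from the QR decomposition
# of a Gaussian matrix (one standard way to sample one).
R, _ = np.linalg.qr(rng.normal(size=(d, d)))

# "Rotate, then threshold each coordinate at 0" ...
bits_rotated = x @ R.T > 0

# ... equals "dot product with each row of R (a hyperplane normal),
# then take the sign".
bits_dot = np.stack([x @ r > 0 for r in R], axis=1)

assert np.array_equal(bits_rotated, bits_dot)
```

So there is no separate dot-product loop to find in the code: applying the (rotation) transform is the dot product with all hyperplane directions at once.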
@mdouze Hello, can you point to where the code does the dot products with the hyperplane directions? Thank you!
Hi, could you please help me to better understand your work? I read in one of your demos that "Unlike PQ, LSH does not require training". However, I see a train function in the code of IndexLSH. My assumption was that the function was used to adjust the thresholds (i.e., the hyperplanes) given the dataset, in order to get better hash buckets. So I assumed that it's called after constructing IndexLSH and before searching for a query point. Is this not the case? Could you please let me know the purpose of the function and when it should be called when using IndexLSH, if at all? Thanks!