First of all, let me state that I'm an engineer, not a mathematician, and I finished my studies 25 years ago, so I'm quite rusty; please forgive me if I'm asking something obvious or stupid. My problem is this:
I have over 100,000 vectors in a high-dimensional space (over 1000 dimensions; yes, it comes from a deep learning problem), and I want to find the n vectors closest to a given vector under cosine distance. The naive approach is to calculate the distance to every vector one by one, but that takes a lot of compute.
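For concreteness, here is what I mean by the naive approach, as a minimal NumPy sketch (the array sizes and names are just placeholders I made up for illustration):

```python
import numpy as np

# Made-up data: 100,000 vectors of dimension 1000, plus one query vector.
rng = np.random.default_rng(0)
vectors = rng.standard_normal((100_000, 1000))
query = rng.standard_normal(1000)

def top_n_cosine(vectors, query, n):
    """Brute force: compare the query against every single vector.

    Cosine similarity = dot(v, q) / (|v| * |q|); the n closest vectors
    under cosine distance are the n with the largest similarity.
    """
    norms = np.linalg.norm(vectors, axis=1) * np.linalg.norm(query)
    sims = vectors @ query / norms
    return np.argsort(-sims)[:n]  # indices of the n most similar vectors

closest = top_n_cosine(vectors, query, n=5)
```

Even vectorised like this, it is still a full scan over all vectors for every query, which is the cost I'd like to avoid.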
Now the part where I might be asking the stupid question: if these vectors were in 3-D space, I would first fix the maximum distance I care about, say a radius x, and look for the vectors inside the sphere of radius x around the given vector. Before computing any distances, I would filter out every vector that differs from the given vector by more than x in any single dimension, since such a vector cannot lie inside the sphere. For example, if the given vector sits at (0, 0, 0) and I'm looking for all vectors within radius 2, I would filter out vectors like (0, 0, 3), (0, 3, 0), or (3, 0, 0), because their distance from (0, 0, 0) is obviously greater than 2. Then I would calculate the cosine or Euclidean distance only for the remaining vectors.
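The bounding-box pre-filter I have in mind would look something like this in NumPy (my own sketch of the idea, not a claim that it is efficient; the function name is mine):

```python
import numpy as np

def box_filter_then_distance(vectors, query, radius):
    """Cheap pre-filter: keep only vectors inside the axis-aligned box of
    half-width `radius` around the query (the box that encloses the
    sphere), then compute exact Euclidean distances for the survivors."""
    inside_box = np.all(np.abs(vectors - query) <= radius, axis=1)
    candidates = vectors[inside_box]
    dists = np.linalg.norm(candidates - query, axis=1)
    keep = dists <= radius
    return candidates[keep], dists[keep]

# The 3-D example from the text: query at the origin, radius 2.
pts = np.array([[0.0, 0.0, 3.0],   # rejected by the box test
                [1.0, 1.0, 1.0],   # inside the box and the sphere
                [0.5, 0.5, 0.5]])  # inside the box and the sphere
kept, d = box_filter_then_distance(pts, np.zeros(3), radius=2.0)
```

The box test only needs comparisons, no multiplications, which is why it looks like a cheap way to discard distant vectors early.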
I suspect this approach does not carry over well to higher dimensions, so is there any way to quickly filter out distant vectors before calculating the exact distances?