Kmeans.java code mistakes #1

Open
francoisleduc opened this issue May 23, 2018 · 2 comments

Comments

@francoisleduc commented May 23, 2018

Hi, I was reading your code the other day because I was interested in seeing an implementation of Hamerly's k-means in Java.
While reading it I spotted a few mistakes that you might want to take a look at.
Based on these errors, here is an example which shows that the code does not work as intended:

  1. Let's take the points represented on the graph below (8 points, in two dimensions)
    (attached image: scatter plot of the 8 points)

  2. Initialize the centers as (7.5, 6.5) and (11.5, 4.5) (close to each other in the upper right part of the graph)

  3. Now run the algorithm

  4. Output (I changed the printing format):

Center of cluster = [4.0, 4.166666666666667]
Point number 0: [1.0, 1.5]
Point number 1: [3.5, 2.0]
Point number 2: [2.0, 4.0]
Point number 3: [1.0, 6.5]
Point number 4: [7.5, 6.5]
Point number 5: [9.0, 4.5]

Center of cluster = [11.0, 5.5]
Point number 0: [11.5, 4.5]
Point number 1: [10.5, 6.5]

Here it is clear that there is a problem, since the correct answer should obviously put 4 points in each cluster, especially given these final center coordinates.

Let's take a step back:
In fact, you use the squared Euclidean distance everywhere. That is actually a good choice, since it avoids computing square roots here and there, and as long as you only need to compare distances, keeping them squared is fine from a mathematical point of view.

However, it is really important to keep in mind that in Hamerly's optimization (with the two bounds) you also ADD and SUBTRACT distances before comparing them; that is where the nightmare begins and where you have to be careful.
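To make the problem concrete, here is a minimal standalone illustration (not from the repository; the values are made up) of why squared distances cannot simply be added:

final double a = 3.0, b = 4.0;                   // two true (non-squared) distances
final double sumOfSquares = a * a + b * b;       // 25
final double squareOfSum = (a + b) * (a + b);    // 49, the value the bounds actually need
// 25 != 49, so adding squared distances does not give a valid bound on the summed distance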

If you look at the pseudocode your implementation is based on, the errors can be pointed out at line 6 of the Kmeans(dataset x, initial centers c) function
Here: m = max(s(a(i))/2, l(i))
and at lines 4, 6 and 8 of the Update-Bounds(p, a, u, l) function
Here: u(i) = u(i) + p(a(i))
Here: l(i) = l(i) - p(r')
Here: l(i) = l(i) - p(r)

The solution, if you want to keep your squared Euclidean distance mechanism as it is (which I recommend), is to edit the lines I mentioned above.

In the k-means function:

final double m = Math.max(s[a[i]] * 0.5f, l[i]);

Since we work with squared distances, the halving coefficient must be squared as well ((d/2)^2 = d^2/4), so it becomes

final double m = Math.max(s[a[i]] * 0.25f, l[i]);
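A quick sanity check (with a made-up value, just to illustrate the arithmetic): if s holds a squared distance, halving the true distance and re-squaring gives s/4, which is exactly what the 0.25 factor computes.

final double s = 16.0;                                               // squared distance; true distance is 4
final double halvedThenSquared = Math.pow(Math.sqrt(s) / 2.0, 2.0);  // (4 / 2)^2 = 4
System.out.println(halvedThenSquared == s * 0.25);                   // prints true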

In the update-bounds function:

u[i] += p[a[i]];

Becomes

final double sumSqrt = Math.sqrt(u[i]) + Math.sqrt(p[a[i]]);
u[i] = sumSqrt * sumSqrt;

And

l[i] -= p[rp];

Becomes

final double diffSqrt = Math.sqrt(l[i]) - Math.sqrt(p[rp]);
l[i] = diffSqrt * diffSqrt;

And

l[i] -= p[r];

Becomes

final double diffSqrt = Math.sqrt(l[i]) - Math.sqrt(p[r]);
l[i] = diffSqrt * diffSqrt;
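
Putting the three edits together, here is a rough sketch of how the corrected update-bounds step could look. This is my reconstruction, not the repository's exact function; it assumes the usual Hamerly convention that r is the index of the center that moved the most, rp the index of the one that moved second most, and that u, l and p all store squared distances.

static void updateBounds(double[] p, int[] a, double[] u, double[] l, int r, int rp) {
    for (int i = 0; i < u.length; i++) {
        // upper bound: add the true distances, then re-square
        final double sumSqrt = Math.sqrt(u[i]) + Math.sqrt(p[a[i]]);
        u[i] = sumSqrt * sumSqrt;

        // lower bound: subtract the true distances, then re-square
        final double moved = (a[i] == r) ? p[rp] : p[r];
        final double diffSqrt = Math.sqrt(l[i]) - Math.sqrt(moved);
        l[i] = diffSqrt * diffSqrt;
    }
}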

With these edits applied, the final output is now correct:

Center of cluster = [1.875, 3.5]
Point number 0: [1.0, 1.5]
Point number 1: [3.5, 2.0]
Point number 2: [2.0, 4.0]
Point number 3: [1.0, 6.5]

Center of cluster = [9.625, 5.5]
Point number 0: [7.5, 6.5]
Point number 1: [9.0, 4.5]
Point number 2: [11.5, 4.5]
Point number 3: [10.5, 6.5]

I hope I am not mistaken and that this will help people who use your code!
I'm open to any questions regarding these modifications.
Have a nice day.

@e-n-f (Owner) commented May 23, 2018

Thanks for the bug report. I'm not sure whether this error was in the original code or in my port of it to Java.

@Ashlanfox commented May 24, 2018

You should also consider using Math.max for the difference between the two square roots; otherwise you could get a negative value and square it, which invalidates the lower bound.

Also be careful while calculating the new center, as you might divide by zero if a cluster is empty.
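
For example (a sketch only, with hypothetical names like clusterSize, sums, centers and dims that may not match the actual code), the two points above could be handled like this:

// Clamp the lower-bound update at zero before re-squaring
final double diffSqrt = Math.max(0.0, Math.sqrt(l[i]) - Math.sqrt(p[r]));
l[i] = diffSqrt * diffSqrt;

// Only recompute a center from its accumulated sum if the cluster is non-empty
if (clusterSize[j] > 0) {
    for (int d = 0; d < dims; d++) {
        centers[j][d] = sums[j][d] / clusterSize[j];
    }
}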
