Abstract
The K-means algorithm is a commonly used technique in cluster analysis. In this paper, several questions about the algorithm are addressed. The clustering problem is first cast as a nonconvex mathematical program. Then, a rigorous proof of the finite convergence of the K-means-type algorithm is given for any metric. It is shown that under certain conditions the algorithm may fail to converge to a local minimum, and that under differentiability conditions it converges to a Kuhn-Tucker point. Finally, a method for obtaining a local-minimum solution is given.
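The K-means-type iteration whose finite convergence the paper analyzes alternates two steps: assign each point to its nearest center, then move each center to the mean of its assigned points. As a rough illustration only (not the paper's formulation, which treats general metrics), here is a minimal sketch of that alternation using squared Euclidean distance; the function name `kmeans` and all parameters are illustrative choices, not from the paper:

```python
import random

def kmeans(points, k, max_iter=100, seed=0):
    """Alternate assignment and update steps; each step does not increase
    the within-cluster sum of squared distances, so the partition reaches
    a fixed point in finitely many iterations."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # illustrative init: k distinct points
    clusters = [[] for _ in range(k)]
    for _ in range(max_iter):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(p, centers[i])))
            clusters[j].append(p)
        # Update step: move each center to the mean of its cluster.
        new_centers = []
        for j, cl in enumerate(clusters):
            if cl:
                new_centers.append(tuple(sum(c) / len(cl) for c in zip(*cl)))
            else:
                new_centers.append(centers[j])  # keep an empty cluster's center
        if new_centers == centers:  # stable partition: a fixed point
            break
        centers = new_centers
    return centers, clusters
```

As the abstract notes, such a fixed point need not be a local minimum of the underlying nonconvex program; the sketch stops at the first stable partition it finds.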
| Original language | English |
|---|---|
| Pages (from-to) | 81-87 |
| Number of pages | 7 |
| Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Volume | PAMI-6 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 1984 |
Keywords
- Basic ISODATA
- K-means algorithm
- K-means convergence
- cluster analysis
- numerical taxonomy
ASJC Scopus subject areas
- Software
- Computer Vision and Pattern Recognition
- Computational Theory and Mathematics
- Artificial Intelligence
- Applied Mathematics