Abstract: This paper presents another necessary condition for the optimal partition of a finite set of samples. From this condition, a corresponding generalized sequential hard k-means (GSHKM) clustering algorithm is constructed, and many well-known clustering algorithms are shown to be special cases of it. Under some assumptions, MacQueen's well-known SHKM (Sequential Hard K-Means) algorithm, the FSCL (Frequency Sensitive Competitive Learning) algorithm, and the RPCL (Rival Penalized Competitive Learning) algorithm are derived. It is shown that FSCL still belongs to the class of GSHKM clustering algorithms and is more suitable for producing the means of a K-partition of the sample data, which is illustrated by numerical experiments. Some improvements to these algorithms are also given.
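For readers unfamiliar with the three algorithms named above, the sketch below illustrates their standard sequential update rules under squared-Euclidean distance. This is a minimal illustration, not the paper's GSHKM formulation: the function names, learning rates lr and delr, and the count initialization are assumptions made here for clarity.

```python
import numpy as np

def shkm_update(x, centers, counts):
    """MacQueen's sequential hard k-means: move the nearest center
    toward the sample by the running-mean step 1/n_c."""
    c = np.argmin(((centers - x) ** 2).sum(axis=1))   # winner = nearest center
    counts[c] += 1
    centers[c] += (x - centers[c]) / counts[c]        # incremental mean update
    return c

def fscl_update(x, centers, counts, lr=0.05):
    """Frequency Sensitive Competitive Learning: the competition is
    biased by how often each unit has already won (conscience term)."""
    d = ((centers - x) ** 2).sum(axis=1)
    c = np.argmin(counts * d)                         # frequency-weighted winner
    counts[c] += 1
    centers[c] += lr * (x - centers[c])
    return c

def rpcl_update(x, centers, counts, lr=0.05, delr=0.005):
    """Rival Penalized Competitive Learning: the winner is attracted to
    the sample while the runner-up (rival) is pushed slightly away."""
    d = ((centers - x) ** 2).sum(axis=1)
    order = np.argsort(counts * d)
    winner, rival = order[0], order[1]
    counts[winner] += 1
    centers[winner] += lr * (x - centers[winner])
    centers[rival] -= delr * (x - centers[rival])     # de-learning of the rival
    return winner

# Illustrative usage on a toy sample stream.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
K = 3
centers = X[rng.choice(len(X), K, replace=False)].copy()
counts = np.ones(K)   # start at 1 so the FSCL/RPCL frequency weights are non-zero
for x in X:
    fscl_update(x, centers, counts)
```

All three procedures process samples one at a time and differ only in how the winning center is selected and how strongly it (and, for RPCL, its rival) is moved, which is what allows them to be treated within a single sequential framework such as GSHKM.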