Deep Architectures for Joint Clustering and Visualization with Self-Organizing Maps. Forest, F., Lebbah, M., Azzag, H., & Lacaille, J. In Workshop on Learning Data Representations for Clustering (LDRC), PAKDD, 2019.
Recent research has demonstrated how deep neural networks are able to learn representations to improve data clustering. By considering representation learning and clustering as a joint task, models learn clustering-friendly spaces and achieve superior performance, compared with standard two-stage approaches where dimensionality reduction and clustering are performed separately. We extend this idea to topology-preserving clustering models, known as self-organizing maps (SOM). First, we present the Deep Embedded Self-Organizing Map (DESOM), a model composed of a fully-connected autoencoder and a custom SOM layer, where the SOM code vectors are learnt jointly with the autoencoder weights. Then, we show that this generic architecture can be extended to image and sequence data by using convolutional and recurrent architectures, and present variants of these models. First results demonstrate advantages of the DESOM architecture in terms of clustering performance, visualization and training time.
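The following is a minimal, illustrative sketch (not the authors' implementation) of the joint training idea described in the abstract: an autoencoder whose reconstruction loss is combined with a SOM loss, so that latent codes and SOM code vectors are learned together. It is written in PyTorch; names and values such as latent_dim, map_size, the neighborhood temperature T and the weight gamma are assumptions chosen for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DESOMSketch(nn.Module):
    """Sketch of a DESOM-style model: autoencoder + SOM prototypes in latent space."""

    def __init__(self, input_dim=784, latent_dim=10, map_size=(8, 8)):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 500), nn.ReLU(),
            nn.Linear(500, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 500), nn.ReLU(),
            nn.Linear(500, input_dim),
        )
        n_nodes = map_size[0] * map_size[1]
        # SOM code vectors (prototypes) live in the latent space and are
        # trained jointly with the autoencoder weights.
        self.prototypes = nn.Parameter(torch.randn(n_nodes, latent_dim) * 0.05)
        # Fixed 2-D grid coordinates, used only to compute neighborhood weights.
        ii, jj = torch.meshgrid(
            torch.arange(map_size[0]), torch.arange(map_size[1]), indexing="ij")
        self.register_buffer(
            "grid", torch.stack([ii.flatten(), jj.flatten()], dim=1).float())

    def forward(self, x, T=1.0, gamma=0.001):
        z = self.encoder(x)                       # latent codes
        x_rec = self.decoder(z)
        recon_loss = F.mse_loss(x_rec, x)
        # Squared distances between latent codes and every SOM prototype.
        d = torch.cdist(z, self.prototypes) ** 2  # shape (batch, n_nodes)
        bmu = d.argmin(dim=1)                     # best matching unit per sample
        # Gaussian neighborhood on the map grid, centered on each BMU.
        grid_d = torch.cdist(self.grid[bmu], self.grid) ** 2
        h = torch.exp(-grid_d / (2 * T ** 2))     # shape (batch, n_nodes)
        som_loss = (h * d).sum(dim=1).mean()
        # Joint objective: reconstruction + weighted topology-preserving SOM term.
        return recon_loss + gamma * som_loss
```

In a training loop, this single scalar loss would be backpropagated through both the autoencoder and the prototypes, which is what makes the clustering and the learned representation mutually adapt; the paper itself should be consulted for the exact layer, loss and scheduling details.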
