Kohonen's Self-Organizing Map (SOM)

PyTorch implementation of Kohonen's Self-Organizing Map.


Installation

pip install kohonen-som

Background

The original paper, written by Teuvo Kohonen in 1990, described one of the first neural network models capable of unsupervised learning.

Out of the different implementations of the algorithm, this one follows the original paper almost entirely. The update function is defined as

$$w_i(t+1) = w_i(t) + \alpha(t)\, e^{-\frac{\lVert x(t) - w_i(t) \rVert^2}{2\sigma^2(t)}}\, \big(x(t) - w_i(t)\big)$$

where

$$\alpha(t) = \alpha_0\, e^{-t/t_\alpha}, \qquad \sigma(t) = \sigma_0\, e^{-t/t_\sigma}$$

and $t$ is the current epoch.

Also, each neuron is connected to all the others, hence the map is a complete graph $K_n$, where $n$ is the number of neurons.
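
For illustration, here is a minimal PyTorch sketch of one epoch of this update rule. It is an assumption-laden sketch, not the package's internal code: it assumes a batch of inputs X of shape (n_samples, d), a weight matrix W of shape (n_points, d), and that, since the map has no fixed lattice, the neighborhood term is computed directly from the distance between the input and each weight vector.

import math
import torch

def som_epoch(X, W, t, alpha0=0.5, t_alpha=25, sigma0=2.0, t_sigma=25):
    # Illustrative sketch only -- not the package's internal implementation.
    # X: (n_samples, d) inputs, W: (n_points, d) neuron weights, t: current epoch.
    alpha = alpha0 * math.exp(-t / t_alpha)   # decayed learning rate alpha(t)
    sigma = sigma0 * math.exp(-t / t_sigma)   # decayed neighborhood width sigma(t)
    for x in X:
        diff = x - W                                    # broadcast to (n_points, d)
        dist_sq = (diff ** 2).sum(dim=1, keepdim=True)  # squared distance to each neuron
        h = torch.exp(-dist_sq / (2 * sigma ** 2))      # Gaussian neighborhood weights
        W = W + alpha * h * diff                        # pull every neuron toward x
    return W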

Example

from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

import matplotlib.pyplot as plt
import numpy as np

from som.mapping import SOM

dataset = load_iris()
train = dataset.data

# Reducing the dimensionality of the train set from
# 4 to 2 with PCA
pca       = PCA(n_components=2)
train_pca = pca.fit_transform(train)

parameters = {'n_points'  : 500,
              'alpha0'    : 0.5,
              't_alpha'   : 25,
              'sigma0'    : 2,
              't_sigma'   : 25,
              'epochs'    : 300,
              'seed'      : 124,
              'scale'     : True,
              'shuffle'   : True,
              'history'   : True}

# Load and train the model
model = SOM()
model.set_params(parameters)
model.fit(train_pca)

weights = model.get_weights()

# Plot the train dataset and the weights
fig, ax = plt.subplots()
fig.suptitle("Train set (PCA-reduced) and weights")
t = ax.scatter(train_pca[:,0], train_pca[:,1])
w = ax.scatter(weights[:, 0], weights[:, 1])
fig.legend((t, w), ("Train", "Weights"))
plt.show()
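
As a follow-up, each training point can be mapped to its best-matching unit (BMU), i.e. the neuron whose weight vector is closest. This is a sketch using only NumPy and the weights array returned by get_weights() above, not an additional package API.

# Assign every PCA-reduced sample to its best-matching unit (BMU):
# the neuron whose weight vector is closest in Euclidean distance.
distances = np.linalg.norm(train_pca[:, None, :] - weights[None, :, :], axis=2)
bmu = distances.argmin(axis=1)   # shape (n_samples,), one neuron index per sample

# Count how many neurons ended up "winning" at least one sample.
print("Active neurons:", np.unique(bmu).size, "out of", weights.shape[0])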
