Intro to Self Organizing Map and SOM Python Implementation

Self Organizing Map Python Tutorial

Self Organizing Map (SOM), also known as a Kohonen Map, is an Artificial Neural Network model that resembles characteristics of the mammalian cerebral cortex. SOM is an unsupervised learning algorithm that employs the vector quantization method. In this tutorial, we are going to learn the core concepts of SOM and how to implement one using the Python 3 programming language. To follow this tutorial, you should have a good understanding of vector spaces and a basic idea of Machine Learning. If you need to learn the basics of ML, you can refer to the Zero to Hero in ML book.

Vector Quantization

Now you may ask what vector quantization is. Vector quantization attempts to quantize the vector space by distributing prototype vectors across it. After segregating the vector space into regions, each region is represented by a vector called the centroid. The simplified quantization algorithm is

  1. Pick a random data point (row in the dataset)
  2. Find the nearest neuron
  3. Move the nearest neuron towards the data point by a small fraction of the distance
  4. Repeat
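The steps above can be sketched in a few lines of NumPy. This is a minimal sketch: the toy dataset, the number of prototype vectors, the learning rate, and the iteration count are all illustrative assumptions, not part of the original tutorial.

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.random((100, 2))        # 100 data points in a 2D vector space
prototypes = rng.random((5, 2))    # 5 prototype vectors
lr = 0.1                           # the "small fraction of the distance"

for _ in range(1000):
    x = data[rng.integers(len(data))]                            # 1. pick a random data point
    nearest = np.argmin(np.linalg.norm(prototypes - x, axis=1))  # 2. find the nearest prototype
    prototypes[nearest] += lr * (x - prototypes[nearest])        # 3. move it towards the point
```

After enough iterations, each prototype settles into a region of the space and acts as its centroid-like representative.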

Self Organizing Map vs K Means Clustering

Both SOM and K Means can be used for clustering data. In the K Means algorithm, you must define the K value (the number of clusters you want) in advance. In SOM, you don't need to pre-define the number of clusters because SOM identifies the internal organization of the data points. The Euclidean distances between neurons, calculated after training the SOM, reveal how many clusters there are, because SOM uses vector quantization to segregate the vector space into regions. So, alongside the elbow method, you can also use SOM to choose the K value for the K Means clustering algorithm.

SOM Artificial Neural Network Structure

SOM can be thought of as a grid-like 2D ANN (Artificial Neural Network). Generally, we represent these neural networks as a rectangular grid or a hexagonal grid.

SOM Neural Network Structures

When we initialize the SOM structure, we have to decide whether it is a rectangular grid or a hexagonal grid. Then we should define the dimensions of the structure (the size of the 2D neural network). It can be a 10*10 neural network or a 15*20 neural network; you can define it as you need. The next thing to consider when initializing the SOM structure is whether we are going to assign random weights (random positions in the vector space) to the neurons or assign the weights of randomly chosen data points (data points' positions in the vector space) to the neurons.
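The two initialization options might look like this in NumPy (the 10*10 grid size, the feature count, and the toy dataset are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
grid_rows, grid_cols = 10, 10   # a 10*10 rectangular grid of neurons
n_features = 4                  # dimensionality of the input vectors

# Option 1: purely random weights (random positions in the vector space)
weights = rng.random((grid_rows, grid_cols, n_features))

# Option 2: place each neuron at a randomly chosen data point's position
data = rng.random((150, n_features))
idx = rng.integers(0, len(data), size=grid_rows * grid_cols)
weights = data[idx].reshape(grid_rows, grid_cols, n_features)
```

Option 2 often speeds up convergence because the neurons already start inside the region of space the data occupies.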

Self Organizing Map Kohonen Algorithm

  1. Define the structure of the neural network
  2. Initialize weights for the neurons
  3. Randomly pick an input vector x (a data point)
  4. Select the Best Matching Unit (BMU), i.e. the neuron closest to x
  5. Define the neighborhood N for the BMU
  6. Update weights of all the neurons within N to pull them closer to the selected input vector
  7. Repeat from step 3 to step 6 for the given number of iterations
In this animation, you can see data points (vectors) as red rectangles and neurons as black circles.

How to select the BMU - the closest neuron for the selected data point

The 4th step of the algorithm is to find the BMU of the selected x. This is done by a simple calculation: measure the Euclidean distance between x and each neuron in the SOM network, then select the neuron closest to x as the BMU.
BMU(x) = argmin_i ‖x − W_i‖ (the SOM equation to find the BMU of a vector: the neuron whose weight vector W_i is closest to x)
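As a quick sketch, that BMU search can be written with NumPy (the helper name `find_bmu` and the toy 3*3 grid are my own illustrations, not taken from the repo):

```python
import numpy as np

def find_bmu(weights, x):
    """Return the grid coordinates of the Best Matching Unit for input x."""
    # Euclidean distance from x to every neuron in the 2D grid at once
    dists = np.linalg.norm(weights - x, axis=-1)
    # index of the smallest distance, converted back to (row, col)
    return tuple(int(i) for i in np.unravel_index(np.argmin(dists), dists.shape))

# Toy 3*3 grid of 2D neurons, all at the origin except neuron (2, 1)
weights = np.zeros((3, 3, 2))
weights[2, 1] = [0.9, 0.9]
print(find_bmu(weights, np.array([1.0, 1.0])))  # → (2, 1)
```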

How to define the neighborhood for the BMU

After finding the BMU, we also have to find the neighbor neurons around the BMU, because their weights should be updated as well for the map to organize properly. To do that, we define an initial N value, and this value shrinks monotonically with time (with each iteration). You can think of the N value as the radius around the BMU that defines the neighborhood area.
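The tutorial only requires that the radius shrinks monotonically; exponential decay is one common choice for that. Here is a sketch (the function name, starting radius, and time constant are illustrative):

```python
import numpy as np

def neighborhood_radius(initial_radius, t, time_constant):
    """Neighborhood radius around the BMU; shrinks monotonically with iteration t."""
    return initial_radius * np.exp(-t / time_constant)

# The radius starts at its initial value and decays towards 0
print(neighborhood_radius(5.0, 0, 30.0))    # → 5.0
print(neighborhood_radius(5.0, 100, 30.0))  # much smaller than 5.0
```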

How to adjust weights of the BMU neuron and the neighbor neurons

Updating weights (positions in the vector space) is the main part of the SOM algorithm. We have to update the weights of the neurons within N to pull them closer to the data point randomly selected in the 3rd step. Below is the equation used to adjust the weights of the BMU and the neurons within the neighborhood area at iteration/time t. So, the below equation is applied to all neurons within N.
Wi,j(t+1) = Wi,j(t) + 𝛼(t) · 𝜃i,j(t) · (x − Wi,j(t))
  • Wi,j(t+1) = New weight of the neuron
  • Wi,j(t) = Current weight of the neuron
  • 𝛼(t) = Learning rate, which is a function of time/iterations. So, the weight adjustment decreases with time.
  • 𝜃i,j(t) = Neighborhood function, which gives neurons closer to the BMU a larger update
  • x = The randomly selected input vector
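That update can be sketched in NumPy. I assume a Gaussian neighborhood function here, which is a common choice; the function name and parameter values are illustrative:

```python
import numpy as np

def update_weights(weights, x, bmu, learning_rate, radius):
    """Pull every neuron within the neighborhood radius of the BMU closer to x."""
    rows, cols, _ = weights.shape
    for i in range(rows):
        for j in range(cols):
            # squared distance on the grid between neuron (i, j) and the BMU
            grid_dist_sq = (i - bmu[0]) ** 2 + (j - bmu[1]) ** 2
            if grid_dist_sq <= radius ** 2:
                # Gaussian neighborhood function: closer neurons move more
                theta = np.exp(-grid_dist_sq / (2 * radius ** 2))
                weights[i, j] += learning_rate * theta * (x - weights[i, j])
    return weights

weights = np.zeros((3, 3, 2))
weights = update_weights(weights, np.array([1.0, 1.0]),
                         bmu=(0, 0), learning_rate=0.5, radius=1.0)
print(weights[0, 0])  # the BMU itself moved halfway towards x
```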

Visualize The Number of Clusters by Using Self Organizing Maps - U Matrix

After training the Artificial Neural Network of the SOM implementation, we can visualize it as a 2D grid to view the final weights of the neurons. These weights imply how clusters are distributed in the vector space. So we can create a matrix of the average Euclidean distance between each neuron and its neighbor neurons. This matrix is called the U-Matrix, and it is the most used visualization for viewing the results of SOM applications.
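A straightforward way to compute that matrix from the trained weights is sketched below. I use 4-connected grid neighbors for simplicity; real implementations may also include diagonal neighbors:

```python
import numpy as np

def u_matrix(weights):
    """Average Euclidean distance from each neuron to its grid neighbors."""
    rows, cols, _ = weights.shape
    umat = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            dists = []
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    dists.append(np.linalg.norm(weights[i, j] - weights[ni, nj]))
            umat[i, j] = np.mean(dists)
    return umat
```

Plotting this matrix as a heatmap gives the U-Matrix image described below.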

If we use a Self Organizing Map Python implementation to cluster a part of the iris dataset, then the U Matrix is generated like in the below image (the first image shows how the data points are distributed in the vector space, and the second shows the U Matrix). To keep it simple, consider that we used only 2 features from the iris dataset.

SOM Kohonen Map visualization U matrix

The U Matrix tells us where neurons are close to their neighbors (darker means closer) and where neurons are isolated in the vector space (lighter means more isolated). After checking this U Matrix, we can clearly see there are 2 good clusters, and one of them could be divided into 2, but it would not be a good separation.

Self Organizing Map Python Implementation with Tensorflow 2.0

Implementing a SOM in Python 3 is very easy with the Tensorflow 2.0 library because we can define tensors as neurons in the SOM ANN. There are also many Python libraries for using SOM. I will write another article about them and how to use them.

But if you want to implement your own SOM to see how it really works, you can follow the https://github.com/ravinduramesh/tensorflow-som/tree/tfv2 GitHub code. This is the best Self Organizing Map Python implementation I could find during my intern period. In this GitHub repo, you can find the SOM Python implementation and how to use it for clustering applications. It may look like a long piece of code, but most of the lines are just comments that explain what happens there. So, it's actually a small and understandable piece of code.


It will be very easy to understand this code because now you know the algorithm and equations used in SOM. You just need to learn Tensorflow 2.0 before referring to this GitHub code. Otherwise, you may get confused by the classes, methods, and functions used in Tensorflow.

Evaluate SOM Applications

After using this SOM application for a clustering problem, you can use the U Matrix to evaluate how your SOM Kohonen map clustered the data points. If you need to evaluate the accuracy of the U Matrix, you can use metrics like topographic error, quantization error, and population-based convergence. I'm not going to discuss how to calculate these values here because you have learned a lot for today, and the maths behind these evaluation metrics is more complex than the maths behind the SOM algorithm ;) You can find the code to calculate these metrics in that GitHub repo, and you can refer to the https://digitalcommons.uri.edu/cgi/viewcontent.cgi?article=2013&context=theses PDF to read more about evaluating a SOM model.
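As a small taste, quantization error is the simplest of the three: it is the mean distance from each data point to its BMU (lower is better). This is my own sketch, not the repo's implementation:

```python
import numpy as np

def quantization_error(weights, data):
    """Mean Euclidean distance from each data point to its BMU."""
    # flatten the 2D grid of neurons into a list of weight vectors
    flat = weights.reshape(-1, weights.shape[-1])
    # pairwise distances between every data point and every neuron
    dists = np.linalg.norm(data[:, None, :] - flat[None, :, :], axis=-1)
    # distance to the BMU is the minimum over neurons, averaged over points
    return dists.min(axis=1).mean()
```

If every data point lies exactly on some neuron, the quantization error is 0; it grows as the map represents the data less faithfully.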

That's all for today... Happy Coding!