Diploma thesis

Basic Info

About

Goal: Study the existing semi-supervised learning models for classification, with a focus on the Temporal Ensembling and Mean Teacher models. Give an overview of the current state of the art, propose an extension of this semi-supervised model using self-organizing maps, and evaluate the new model.

Annotation: Deep neural networks are currently the most widely used and researched models in machine learning, with applications in many different domains. However, training such models requires an abundance of adequately labeled data, and labels for real-world data are scarce. The semi-supervised learning paradigm aims to alleviate this problem via various techniques that typically capture and evaluate the distance between feature vectors of labeled and unlabeled data, so that learning is driven by similarity. This approach is used, for example, in the popular Mean Teacher (MT) model. Self-organizing maps are one of the classical neural network models that require no training signal and yet capture relationships among the data presented to the network, preserving similarity in a topological fashion. This mechanism could be utilized to improve semi-supervised learning with information coming from the structure of the data or of its feature vectors.
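As background, a minimal sketch of one Mean Teacher training step is given below, assuming PyTorch. The function and parameter names (and the omission of the separate input perturbations used in the full method) are illustrative assumptions, not the thesis code.

```python
# Minimal sketch of the Mean Teacher idea (Tarvainen & Valpola, 2017),
# assuming PyTorch; names and hyperparameters are illustrative.
import torch
import torch.nn.functional as F

def ema_update(teacher, student, alpha=0.99):
    """Teacher weights are an exponential moving average of student weights."""
    with torch.no_grad():
        for t_p, s_p in zip(teacher.parameters(), student.parameters()):
            t_p.mul_(alpha).add_(s_p, alpha=1.0 - alpha)

def mean_teacher_step(student, teacher, x_labeled, y_labeled, x_unlabeled,
                      optimizer, consistency_weight=1.0):
    # Supervised loss on the (scarce) labeled batch.
    sup_loss = F.cross_entropy(student(x_labeled), y_labeled)

    # Consistency loss: student predictions should match the teacher's
    # predictions on unlabeled data (the full method also applies different
    # noise/augmentation to the two inputs; omitted here for brevity).
    with torch.no_grad():
        teacher_probs = F.softmax(teacher(x_unlabeled), dim=1)
    student_probs = F.softmax(student(x_unlabeled), dim=1)
    cons_loss = F.mse_loss(student_probs, teacher_probs)

    loss = sup_loss + consistency_weight * cons_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    ema_update(teacher, student)
    return loss.item()
```

The teacher is typically initialized as a copy of the student and is never updated by gradient descent, only by the EMA rule, which is what makes its predictions a stable target for the consistency loss.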

Progress

Task
winter semester 2022: study literature about MT, run the BMT experiment
summer semester 2022, 1st half: study literature about other models, define our model, brainstorm the SOM loss, straightforward implementation of MT + SOM (experiments ran too long, fine-tuning was not possible)
summer semester 2022, 2nd half: change to a smaller architecture (Sarmad's), still too slow; visualizations and SOM metrics; writing the KUZ article
winter semester 2023, weeks 1-3: design the SOM loss, design an experiment to show that the SOM loss improves performance, implement and run the experiment
winter semester 2023, weeks 4-6: redefine the SOM loss and the experiment several times, new implementation and bug fixing
winter semester 2023, weeks 7-8: run experiments, record results
winter semester 2023, weeks 9-11: write the chapter about the experiment and the overview chapter

Code, text and slides

Articles

Image results

SOM training

Fig. 1 - SOM metrics during SOM training.
Fig. 2 - Visualization of the SOM after 10 epochs.

MLP+SOM training

Fig. 3 - Model loss during 30 epochs of training.
Fig. 4 - Model accuracy during 30 epochs of training.
Fig. 5 - Confusion matrix of the model.
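The exact SOM loss designed in the thesis is defined in the thesis text and is not reproduced here. As a purely hypothetical illustration of how a SOM-derived regularizer could enter the MLP+SOM training objective behind Figs. 3-5, the sketch below pulls each feature vector toward its best matching unit on a pre-trained SOM; all names and the weighting are assumptions.

```python
# Hypothetical SOM-derived loss term; the SOM loss actually used in the
# thesis may differ. Assumes a PyTorch MLP that exposes an intermediate
# feature layer and a fixed, pre-trained SOM codebook.
import torch
import torch.nn.functional as F

def som_pull_loss(features, som_weights):
    """Pull each feature vector toward its best matching SOM unit.

    features:    (batch, dim) activations from the MLP's feature layer
    som_weights: (units, dim) SOM codebook, treated as constant here
    """
    dists = torch.cdist(features, som_weights)        # (batch, units)
    bmu = dists.argmin(dim=1)                         # BMU index per sample
    return F.mse_loss(features, som_weights[bmu].detach())

# In a training step this term would simply be weighted and added to the
# supervised (and, for MT, consistency) loss, e.g.:
#   loss = sup_loss + consistency_weight * cons_loss + som_weight * som_pull_loss(f, som_w)
```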