Μελέτη μοντέλων παράλληλης εκτέλεσης για την εκπαίδευση συνελικτικών νευρωνικών δικτύων
Study of parallel execution models for training convolutional neural networks

Keywords
Convolutional neural networks ; Data parallelism ; Model parallelism ; AlexNet ; DCGAN

Abstract
This thesis investigates parallel execution strategies for training convolutional neural networks. The study focuses on two models: AlexNet for image classification and DCGAN for generating synthetic images. The aim was to implement both the baseline models and their data-parallel and model-parallel variants. CIFAR-10 was used as the dataset and TensorFlow as the implementation framework. The results highlight the significant time savings of parallel training, while analyzing its impact on classification accuracy and on the quality of the generated images. For instance, training AlexNet with 2-way data parallelism reduced training time by 46%, with a 5.7% drop in classification accuracy.
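The core idea behind the data parallelism the abstract describes can be sketched in a framework-agnostic way: each worker holds a full copy of the model, computes gradients on its own shard of the batch, and the gradients are averaged before a single shared weight update. The sketch below illustrates this with a linear model and mean squared error in NumPy; the function names (`grad_mse`, `data_parallel_step`) and the sequential "workers" are illustrative assumptions, not the thesis's TensorFlow implementation.

```python
import numpy as np

def grad_mse(w, X, y):
    # Gradient of mean squared error for a linear model y_hat = X @ w
    return (2.0 / len(X)) * X.T @ (X @ w - y)

def data_parallel_step(w, X, y, lr=0.1, n_workers=2):
    # Split the batch into equal shards, one per "device"
    X_shards = np.array_split(X, n_workers)
    y_shards = np.array_split(y, n_workers)
    # Each worker computes a gradient on its own shard
    # (run sequentially here; real frameworks run these concurrently)
    grads = [grad_mse(w, Xs, ys) for Xs, ys in zip(X_shards, y_shards)]
    # All-reduce: average the per-worker gradients, then apply one
    # update to the replicated weights, keeping all copies in sync
    return w - lr * np.mean(grads, axis=0)
```

With equal shard sizes, the averaged gradient equals the full-batch gradient, so parallel and single-worker steps produce the same update; the speedup comes from computing the shard gradients concurrently, at the cost of the communication needed for the all-reduce.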