Leopold Thomas

Otto-von-Taube-Gymnasium

 

Title of the research project: Comparison of Straggler Mitigation Algorithms in Distributed Machine Learning Systems

Faculty: Fakultät für Elektro- und Informationstechnik

Chair: Professur für Codierung und Kryptographie

Supervision: Dr. Rawad Bitar

Abstract of the research project

We consider a distributed machine learning algorithm in the presence of stragglers. We focus on algorithms running approximate gradient descent. We compare the performance of two straggler mitigation algorithms, both based on gradient coding, namely ErasureHead and Stochastic Gradient Coding. Although each approach comes with theoretical guarantees for its performance, we compare their practical performance in the same environment, using the same data and the same loss function. We run the simulations in the Python programming language on a single machine with parallelized worker threads. We find that ErasureHead is the faster solution for our purpose, as it decreases the loss function faster. However, Stochastic Gradient Coding achieves a smaller margin of error after the same number of iterations, at the cost of a much longer runtime.
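To make the simulation setup concrete, the following is a minimal sketch (not the authors' implementation) of the straggler problem in distributed approximate gradient descent, run on a single machine with parallelized worker threads as described in the abstract. All names (`partial_gradient`, `approximate_gd_step`), the least-squares loss, and the parameter values are illustrative assumptions; the key idea shown is that the master waits only for the fastest k of n partial gradients and ignores the straggling workers.

```python
# Sketch of straggler mitigation in distributed gradient descent.
# Assumptions (not from the abstract): least-squares loss, n = 4 data
# chunks, stragglers simulated as random sleep delays, and the master
# averages only the first k partial gradients that arrive.
import time
import random
import numpy as np
from concurrent.futures import ThreadPoolExecutor, as_completed

def partial_gradient(w, X, y, delay):
    """Least-squares gradient on one data chunk; `delay` simulates a straggler."""
    time.sleep(delay)
    return X.T @ (X @ w - y) / len(y)

def approximate_gd_step(w, chunks, lr=0.2, k=3, straggler_prob=0.3):
    """One descent step: use only the fastest k of n partial gradients."""
    with ThreadPoolExecutor(max_workers=len(chunks)) as pool:
        futures = [
            pool.submit(partial_gradient, w, X, y,
                        0.05 if random.random() < straggler_prob else 0.0)
            for X, y in chunks
        ]
        grads = []
        for fut in as_completed(futures):
            grads.append(fut.result())
            if len(grads) == k:      # stop waiting: remaining workers straggle
                break
    return w - lr * np.mean(grads, axis=0)

rng = np.random.default_rng(0)
w_true = rng.normal(size=5)
chunks = []
for _ in range(4):                   # n = 4 workers, one data chunk each
    X = rng.normal(size=(50, 5))
    chunks.append((X, X @ w_true))   # noiseless labels for the sketch

w = np.zeros(5)
for _ in range(80):
    w = approximate_gd_step(w, chunks)
print(np.linalg.norm(w - w_true))    # approaches 0 despite dropped gradients
```

Because each step drops up to n - k partial gradients, the descent direction is only an approximation of the full gradient, which is exactly the trade-off (convergence speed versus accuracy) the comparison in the abstract investigates.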