
Electrotechnical and Computer Engineering

Vol. 38 No. 01 (2023): Proceedings of Faculty of Technical Sciences

PARALLELISATION OF DFA ALGORITHM FOR DEEP NEURAL NETWORK TRAINING

  • Aleksandar Grahovac
DOI:
https://doi.org/10.24867/21BE04Grahovac
Submitted
2023-01-07
Published
2023-01-07

Abstract

This paper presents a multiprocessor parallelisation of the Direct Feedback Alignment (DFA) algorithm for neural network training. The parallelisation is implemented for deep neural networks of arbitrary dimensions.
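DFA lends itself to parallelisation because, unlike backpropagation, each hidden layer's error signal is computed directly from the output error through a fixed random feedback matrix, with no dependence on the layers above it. The following NumPy sketch illustrates the sequential DFA update rule on a toy regression task; the layer sizes, learning rate, tanh activation, and target function are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def dtanh(x):
    """Derivative of tanh, used for the local error modulation."""
    return 1.0 - np.tanh(x) ** 2

# Illustrative network dimensions: input -> h1 -> h2 -> output
n_in, n_h1, n_h2, n_out = 4, 16, 16, 2

# Forward weights (trained)
W1 = rng.standard_normal((n_h1, n_in)) * 0.1
W2 = rng.standard_normal((n_h2, n_h1)) * 0.1
W3 = rng.standard_normal((n_out, n_h2)) * 0.1

# Fixed random feedback matrices: DFA projects the output error
# directly to each hidden layer instead of propagating it backwards
# through the transposed forward weights.
B1 = rng.standard_normal((n_h1, n_out)) * 0.1
B2 = rng.standard_normal((n_h2, n_out)) * 0.1

def train_step(x, y, lr=0.01):
    global W1, W2, W3
    # Forward pass
    a1 = W1 @ x;  h1 = np.tanh(a1)
    a2 = W2 @ h1; h2 = np.tanh(a2)
    y_hat = W3 @ h2              # linear output layer

    e = y_hat - y                # output error (MSE gradient)

    # DFA error signals: d1 and d2 each depend only on e, not on each
    # other, so in a multiprocessor setting the per-layer delta and
    # weight-update computations can run concurrently.
    d2 = (B2 @ e) * dtanh(a2)
    d1 = (B1 @ e) * dtanh(a1)

    W3 -= lr * np.outer(e, h2)
    W2 -= lr * np.outer(d2, h1)
    W1 -= lr * np.outer(d1, x)
    return float(0.5 * e @ e)

# Toy target: a fixed linear map (an assumption for demonstration)
T = rng.standard_normal((n_out, n_in))
losses = []
for _ in range(2000):
    x = rng.standard_normal(n_in)
    y = 0.5 * (T @ x)
    losses.append(train_step(x, y))
```

In a parallel implementation, the forward pass remains sequential across layers, but once the output error `e` is available, the hidden-layer updates are mutually independent and can be dispatched to separate processors.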

References

[1] Nøkland A., “Direct Feedback Alignment Provides Learning in Deep Neural Networks”, 30th Conference on Neural Information Processing Systems (NIPS 2016), Barcelona, Spain.
[2] Brown T. B. et al. (OpenAI), “Language Models are Few-Shot Learners”, Advances in Neural Information Processing Systems 33 (NeurIPS 2020).
[3] Grother P. J., “NIST Special Database 19”, National Institute of Standards and Technology, 1995.
[4] Launay J., Poli I., Boniface F., Krzakala F., “Direct Feedback Alignment Scales to Modern Deep Learning Tasks and Architectures”, 34th Conference on Neural Information Processing Systems (NeurIPS 2020), Vancouver, Canada.