Predictive Coding: Towards a Future of Deep Learning beyond Backpropagation?
Millidge B., Salvatori T., Song Y., Bogacz R., Lukasiewicz T.
The backpropagation of error algorithm used to train deep neural networks has been fundamental to the successes of deep learning. However, it requires sequential backward updates and non-local computations, which make it challenging to parallelize at scale and are unlike how learning works in the brain. In contrast, neuroscience-inspired learning algorithms such as predictive coding, which utilize local learning, have the potential to overcome these limitations and to advance beyond current deep learning technologies. While predictive coding originated in theoretical neuroscience as a model of information processing in the cortex, recent work has developed the idea into a general-purpose algorithm able to train deep neural networks using only local computations. In this survey, we review works that have contributed to this perspective and demonstrate the close connection between predictive coding and backpropagation in terms of generalization quality, as well as works that highlight the multiple advantages of using predictive coding over backpropagation-trained neural networks. Specifically, we show the substantially greater flexibility of predictive coding networks, which, unlike standard deep neural networks, can function as classifiers, generators, and associative memories simultaneously and can be defined on arbitrary graph topologies. Finally, we review direct benchmarks of predictive coding networks on machine learning classification tasks, as well as their close connections to control theory and applications in robotics.
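To make the notion of "only local computations" concrete, the following is a minimal sketch of the standard Gaussian-energy formulation of predictive coding for supervised learning. The layer widths, tanh nonlinearity, learning rates, and number of inference iterations are illustrative assumptions, not specifics taken from the survey; the point is that every node and weight update depends only on the activity and prediction errors of adjacent layers, with no global backward pass.

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [4, 16, 3]                       # illustrative input/hidden/output widths
W = [rng.normal(0.0, 0.1, (sizes[l + 1], sizes[l]))
     for l in range(len(sizes) - 1)]

def f(v):  return np.tanh(v)             # illustrative activation choice
def df(v): return 1.0 - np.tanh(v) ** 2

def pc_train_step(x_in, y_target, n_infer=20, lr_x=0.1, lr_w=0.01):
    """One predictive coding step: relax value nodes, then update weights.

    Minimizes the energy F = 0.5 * sum_l ||eps_l||^2, where the prediction
    error is eps_l = x_{l+1} - W_l f(x_l).
    """
    L = len(W)
    # Initialize value nodes with a forward sweep; clamp input and target.
    x = [x_in]
    for l in range(L):
        x.append(W[l] @ f(x[l]))
    x[-1] = y_target

    # Inference phase: gradient descent on F with respect to hidden nodes.
    for _ in range(n_infer):
        eps = [x[l + 1] - W[l] @ f(x[l]) for l in range(L)]
        for l in range(1, L):
            # Uses only the errors of the two adjacent layers: local.
            x[l] += lr_x * (-eps[l - 1] + df(x[l]) * (W[l].T @ eps[l]))

    # Learning phase: a Hebbian-like product of presynaptic activity and
    # postsynaptic prediction error, again purely local.
    eps = [x[l + 1] - W[l] @ f(x[l]) for l in range(L)]
    for l in range(L):
        W[l] += lr_w * np.outer(eps[l], f(x[l]))
    return 0.5 * sum(float(e @ e) for e in eps)   # energy, for monitoring

# Example usage: the energy should decrease over repeated steps.
x_in, y = rng.normal(size=4), np.eye(3)[1]
for _ in range(100):
    energy = pc_train_step(x_in, y)
```

Because each update reads only from neighboring layers, the per-layer computations can in principle run in parallel, which is the scaling advantage the abstract contrasts with backpropagation's sequential backward sweep.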