Title
A Robust Backpropagation-Free Framework for Images
Authors
Abstract
While current deep learning algorithms have been successful for a wide variety of artificial intelligence (AI) tasks, including those involving structured image data, they present deep neurophysiological conceptual issues due to their reliance on the gradients computed by backpropagation of errors (backprop). Gradients are required to obtain synaptic weight adjustments, but the backward pass requires knowledge of the feed-forward weights and activities, a biologically implausible process. This is known as the "weight transport problem". Therefore, in this work, we present a more biologically plausible approach towards solving the weight transport problem for image data. This approach, which we name the error kernel driven activation alignment (EKDAA) algorithm, accomplishes this through the introduction of locally derived error transmission kernels and error maps. Like standard deep learning networks, EKDAA performs the standard forward process via weights and activation functions; however, its backward error computation involves adaptive error kernels that propagate local error signals through the network. The efficacy of EKDAA is demonstrated by performing visual-recognition tasks on the Fashion MNIST, CIFAR-10, and SVHN benchmarks, along with demonstrating its ability to extract visual features from natural color images. Furthermore, in order to demonstrate its non-reliance on gradient computations, results are presented for an EKDAA-trained CNN that employs a non-differentiable activation function.
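The core mechanism described above (a standard forward pass, with backward error signals carried by separate error-transmission kernels rather than transposed forward weights) can be illustrated with a minimal sketch. This is not the paper's actual EKDAA algorithm: the two-layer fully connected network, the fixed random error kernel `E1` (the paper's kernels are adaptive), the hard `sign` activation, and the Hebbian-style outer-product updates are all simplifying assumptions made here for illustration. Note that no derivative of the activation is used anywhere, consistent with the abstract's claim of training with a non-differentiable activation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: 8-dim inputs, 4-dim targets, 64 samples (columns).
X = rng.normal(size=(8, 64))
Y = rng.normal(size=(4, 64))

# Forward weights, plus a separate error-transmission matrix E1 that
# carries output error back to the hidden layer. Here E1 is a fixed
# random kernel (feedback-alignment style); EKDAA's kernels are
# adaptive, which this sketch does not model.
W1 = rng.normal(scale=0.1, size=(16, 8))
W2 = rng.normal(scale=0.1, size=(4, 16))
E1 = rng.normal(scale=0.1, size=(16, 4))

def hard_sign(z):
    # Non-differentiable activation: the updates below never need
    # its derivative, only forward activities and local error maps.
    return np.sign(z)

lr = 0.01
n = X.shape[1]
losses = []
for step in range(200):
    # Standard forward pass through weights and activations.
    h = hard_sign(W1 @ X)
    y_hat = W2 @ h

    e2 = y_hat - Y    # output error map
    e1 = E1 @ e2      # local hidden-layer error map via the error kernel

    # Local, gradient-free, Hebbian-style outer-product updates.
    W2 -= lr * (e2 @ h.T) / n
    W1 -= lr * (e1 @ X.T) / n

    losses.append(float(np.mean(e2 ** 2)))

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Even with a fixed random error kernel and a non-differentiable activation, the mean-squared error on this toy problem falls over training, which is the qualitative behavior the abstract claims for kernel-driven error transport.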