Paper Title

Parameter Convex Neural Networks

Authors

Jingcheng Zhou, Wei Wei, Xing Li, Bowen Pang, Zhiming Zheng

Abstract


Deep learning with deep neural networks (DNNs) has recently achieved great success in many important areas such as computer vision, natural language processing, and recommendation systems. The lack of convexity in DNNs has been seen as a major drawback for many optimization methods, such as stochastic gradient descent, and it greatly reduces the generalization of neural network applications. We observe that convexity is meaningful in neural networks and propose the exponential multilayer neural network (EMLP), a class of parameter convex neural network (PCNN) that is convex with respect to the parameters of the neural network under some realizable conditions. In addition, we propose a convexity metric for the two-layer EGCN and test the accuracy as the convexity metric changes. For the later experiments, we use the same architecture to build the exponential graph convolutional network (EGCN) and run experiments on graph classification datasets, on which our model EGCN outperforms the graph convolutional network (GCN) and the graph attention network (GAT).
