Paper Title

Numerically Stable Binary Gradient Coding

Paper Authors

Neophytos Charalambides, Hessam Mahdavifar, Alfred O. Hero III

Paper Abstract

A major hurdle in machine learning is scalability to massive datasets. One approach to overcoming this is to distribute the computational tasks among several workers. \textit{Gradient coding} has recently been proposed in distributed optimization to compute the gradient of an objective function using multiple, possibly unreliable, worker nodes. By designing distributed coded schemes, gradient coded computations can be made resilient to \textit{stragglers}, nodes whose response times are longer than those of the other nodes in the distributed network. Most such schemes rely on operations over the real or complex numbers and are inherently numerically unstable. We present a binary scheme which avoids such operations, thereby enabling numerically stable distributed computation of the gradient. In addition, some restrictive assumptions made in prior work are dropped, and a more efficient decoding is given.
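To make the idea concrete, below is a minimal, illustrative sketch of binary gradient coding for a least-squares objective, using a simple fractional-repetition style assignment with 0/1 coefficients. This is not the construction proposed in the paper; the assignment, the straggler set, and the function names (e.g. `fractional_repetition_assignment`, `decode`) are assumptions chosen only to show how straggler-resilient decoding can reduce to summing a subset of worker responses, with no real- or complex-valued inversion involved.

```python
# Illustrative sketch only (NOT the paper's construction): binary (0/1)
# gradient coding via a fractional-repetition style assignment, where
# decoding is just a sum over one responsive worker per data part.

import numpy as np

def least_squares_partial_gradient(A_block, b_block, x):
    """Gradient of 0.5*||A_block @ x - b_block||^2 on one data partition."""
    return A_block.T @ (A_block @ x - b_block)

def fractional_repetition_assignment(n_workers, s_stragglers):
    """Split workers into groups of size s+1 that share one data part each.
    Any s stragglers leave at least one responsive worker per group."""
    assert n_workers % (s_stragglers + 1) == 0
    n_parts = n_workers // (s_stragglers + 1)
    return [w % n_parts for w in range(n_workers)]  # part index per worker

def decode(responses, assignment, n_parts):
    """Binary decoding: add one received response per data part."""
    chosen, total = set(), None
    for worker, grad in responses.items():
        part = assignment[worker]
        if part in chosen:
            continue
        chosen.add(part)
        total = grad if total is None else total + grad
    assert len(chosen) == n_parts, "too many stragglers to decode"
    return total

# Toy run: n = 6 workers tolerate s = 2 stragglers.
rng = np.random.default_rng(0)
n, s = 6, 2
A = rng.standard_normal((12, 4))
b = rng.standard_normal(12)
x = rng.standard_normal(4)
assignment = fractional_repetition_assignment(n, s)
n_parts = n // (s + 1)
A_parts, b_parts = np.split(A, n_parts), np.split(b, n_parts)

# Each worker computes the partial gradient of its part; workers 1 and 4
# straggle (an arbitrary choice for this example) and never respond.
responses = {w: least_squares_partial_gradient(A_parts[assignment[w]],
                                               b_parts[assignment[w]], x)
             for w in range(n) if w not in {1, 4}}

full_gradient = least_squares_partial_gradient(A, b, x)
assert np.allclose(decode(responses, assignment, n_parts), full_gradient)
```

Because both encoding and decoding use only 0/1 coefficients, the recovered gradient matches the uncoded one up to floating-point summation order; avoiding real- or complex-valued decoding matrices is what gives binary schemes their numerical-stability advantage.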
