Paper Title

Adaptive In-network Collaborative Caching for Enhanced Ensemble Deep Learning at Edge

Authors

Yana Qin, Danye Wu, Zhiwei Xu, Jie Tian, Yujun Zhang

Abstract

To enhance the quality and speed of data processing and to protect the privacy and security of the data, edge computing has been extensively applied to support data-intensive intelligent processing services at the edge. Among these data-intensive services, ensemble learning-based services can naturally leverage the distributed computation and storage resources of edge devices to achieve efficient data collection, processing, and analysis. To make full use of the limited resources at edge devices in support of high-performance ensemble learning solutions, collaborative caching has been applied in edge computing to serve requests close to the data source. To this end, we propose an adaptive in-network collaborative caching scheme for ensemble learning at the edge. First, an efficient data representation structure is proposed to record the data cached among different nodes. In addition, we design a collaboration scheme that helps edge nodes cache data valuable for local ensemble learning, by scheduling local caching according to a summarization of the data representations from different edge nodes. Our extensive simulations demonstrate the high performance of the proposed collaborative caching scheme, which significantly reduces learning latency and transmission overhead.
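To illustrate the idea of scheduling local caching by summaries exchanged between edge nodes, here is a minimal toy sketch (not the paper's actual algorithm or data structure): each node keeps an LRU cache, publishes a summary of its cached keys as a stand-in for the paper's data representation, and on a miss fetches from a peer whose summary contains the key before falling back to the remote origin. All class and variable names are hypothetical.

```python
from collections import OrderedDict

class EdgeNode:
    """Toy edge node with an LRU cache and a shareable key summary.
    Hypothetical sketch, assuming a simple set-based summary rather
    than the compact representation structure proposed in the paper."""
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity
        self.cache = OrderedDict()   # key -> data, in LRU order
        self.peers = []              # other EdgeNode instances

    def summary(self):
        # Stand-in for the exchanged data representation:
        # here, simply the set of locally cached keys.
        return set(self.cache)

    def put(self, key, data):
        self.cache[key] = data
        self.cache.move_to_end(key)
        while len(self.cache) > self.capacity:
            self.cache.popitem(last=False)   # evict least-recently-used

    def get(self, key, origin):
        # 1) local hit
        if key in self.cache:
            self.cache.move_to_end(key)
            return self.cache[key], "local"
        # 2) peer hit, guided by the exchanged summaries
        for peer in self.peers:
            if key in peer.summary():
                data = peer.cache[key]
                self.put(key, data)          # keep a local copy
                return data, "peer:" + peer.name
        # 3) fall back to the (slow) origin/cloud
        data = origin[key]
        self.put(key, data)
        return data, "origin"

# Usage: two collaborating nodes avoid a second origin fetch.
origin = {"sample%d" % i: "data%d" % i for i in range(10)}
a, b = EdgeNode("A", 4), EdgeNode("B", 4)
a.peers, b.peers = [b], [a]
a.get("sample3", origin)            # miss everywhere -> origin fetch
_, source = b.get("sample3", origin)
print(source)                       # -> peer:A
```

In this sketch the summary exchange is what turns an origin fetch into a nearby peer fetch, which is the mechanism behind the reported reductions in learning latency and transmission overhead.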
