Paper Title

Joint Data Deepening-and-Prefetching for Energy-Efficient Edge Learning

Paper Authors

Sujin Kook, Won-Yong Shin, Seong-Lyun Kim, Seung-Woo Ko

Abstract

The vision of pervasive machine learning (ML) services can be realized by training ML models in a timely manner on real-time data collected by Internet of Things (IoT) devices. To this end, IoT devices need to offload their data to a nearby edge server. However, high-dimensional data of large volume impose a significant burden on an IoT device with a limited energy budget. To cope with this limitation, we propose a novel offloading architecture, called joint data deepening-and-prefetching (JD2P), which offloads data feature by feature and comprises two key techniques. The first is data deepening, where each data sample's features are offloaded sequentially in order of importance, as determined by a data-embedding technique such as principal component analysis (PCA). Offloading stops as soon as the features offloaded so far suffice to classify the sample, thereby reducing the amount of offloaded data. The second is data prefetching, where features likely to be required in the future are offloaded in advance, achieving high efficiency via precise prediction and parameter optimization. To verify the effectiveness of JD2P, we conduct experiments on the MNIST and Fashion-MNIST datasets. The results demonstrate that JD2P significantly reduces the expected energy consumption compared with several benchmarks, without degrading learning accuracy.
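
To make the data-deepening idea concrete, below is a minimal Python sketch of PCA-ordered, feature-by-feature classification with an early-stopping rule. It is not the paper's implementation: the scikit-learn digits dataset (a small stand-in for MNIST), the per-depth SVM classifiers, the 0.9 confidence threshold, and the `deepen` helper are all illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_digits      # small stand-in for MNIST
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Rank features by importance: principal component 1 carries the most
# variance, component 2 the next, and so on.
pca = PCA(n_components=8).fit(X_tr)
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)

# Train one classifier per "depth", i.e., per number of components
# offloaded so far (an illustrative choice, not the paper's classifier).
clfs = [SVC(probability=True).fit(Z_tr[:, : d + 1], y_tr)
        for d in range(Z_tr.shape[1])]

THRESH = 0.9  # hypothetical confidence threshold for stopping

def deepen(z):
    """Offload one principal component at a time; stop early once the
    classifier is confident enough, saving transmission energy."""
    for d, clf in enumerate(clfs):
        proba = clf.predict_proba(z[: d + 1].reshape(1, -1))[0]
        if proba.max() >= THRESH:
            return d + 1, int(clf.classes_[proba.argmax()])
    # All components offloaded; fall back to the deepest classifier.
    return len(clfs), int(clf.classes_[proba.argmax()])

depths, preds = zip(*(deepen(z) for z in Z_te))
print(f"mean components offloaded: {np.mean(depths):.1f} / {Z_te.shape[1]}")
print(f"accuracy: {np.mean(np.array(preds) == y_te):.3f}")
```

The energy saving comes from the early exit: easy samples are classified after only a few components, so on average far fewer than the full set of features is transmitted. Data prefetching would additionally predict, before the stopping rule fires, which further components are likely to be needed and send them during otherwise idle channel time.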
