Paper Title

AoI-Aware Markov Decision Policies for Caching

Paper Authors

Park, Soohyun; Jung, Soyi; Choi, Minseok; Kim, Joongheon

Paper Abstract

We consider a scenario that utilizes road side units (RSUs) as distributed caches in connected vehicular networks. The goal of using caches in our scenario is to rapidly provide contents to connected vehicles under various traffic conditions. During this operation, due to the rapidly changing road environment and user mobility, the concept of age-of-information (AoI) is considered for (1) updating the cached information as well as (2) maintaining the freshness of the cached information. Frequent updates of cached information maintain its freshness at the expense of network resources: they increase the number of data transmissions between RSUs and the MBS, and thus increase the system cost. Therefore, a tradeoff exists between the AoI of cached information and the system cost. Based on this observation, the algorithm proposed in this paper aims at reducing the system cost fundamentally required for content delivery while minimizing the content AoI, based on a Markov Decision Process (MDP) and Lyapunov optimization.
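To make the cost–AoI tradeoff concrete, the sketch below simulates a per-slot Lyapunov drift-plus-penalty rule for a single cached content at an RSU: a virtual queue tracks violations of an assumed time-average AoI target, and in each slot the RSU refreshes the content from the MBS only when the queue-weighted AoI reduction outweighs the transmission cost weighted by V. This is an illustrative sketch under stated assumptions, not the paper's algorithm; the constants V, A_MAX, and UPDATE_COST and all function names are invented for this example.

```python
# Illustrative drift-plus-penalty caching sketch (not the authors' implementation).
# Assumptions: one content, slotted time, AoI resets to 1 after an update and
# grows by 1 per slot otherwise; V, A_MAX, and UPDATE_COST are made-up constants.

V = 5.0            # tradeoff weight: larger V favors lower cost over lower AoI
A_MAX = 4.0        # assumed time-average AoI target for the cached content
UPDATE_COST = 1.0  # assumed cost of one RSU<->MBS refresh


def decide_update(aoi: float, q: float) -> bool:
    """Return True if refreshing the cache minimizes drift-plus-penalty this slot."""
    aoi_if_update = 1.0          # AoI after a fresh copy is fetched
    aoi_if_skip = aoi + 1.0      # AoI if the stale copy is kept one more slot
    score_update = V * UPDATE_COST + q * aoi_if_update
    score_skip = q * aoi_if_skip
    return score_update < score_skip


def simulate(slots: int = 1000) -> tuple[float, float]:
    """Run the per-slot policy and report (average AoI, average cost per slot)."""
    aoi, q = 1.0, 0.0            # current AoI and virtual queue for the AoI constraint
    total_aoi, total_cost = 0.0, 0.0
    for _ in range(slots):
        if decide_update(aoi, q):
            aoi = 1.0
            total_cost += UPDATE_COST
        else:
            aoi += 1.0
        # Virtual queue grows when AoI exceeds the target and drains otherwise.
        q = max(q + aoi - A_MAX, 0.0)
        total_aoi += aoi
    return total_aoi / slots, total_cost / slots


if __name__ == "__main__":
    avg_aoi, avg_cost = simulate()
    print(f"average AoI: {avg_aoi:.2f}, average cost per slot: {avg_cost:.2f}")
```

Increasing V in this toy model makes updates rarer (lower cost, higher AoI), while decreasing it does the opposite, which mirrors the AoI-versus-system-cost tradeoff described in the abstract.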
