Paper Title

Asymptotic Privacy Loss due to Time Series Matching of Dependent Users

Paper Authors

Nazanin Takbiri, Minting Chen, Dennis L. Goeckel, Amir Houmansadr, Hossein Pishro-Nik

Abstract

The Internet of Things (IoT) promises to improve user utility by tuning applications to user behavior, but revealing the characteristics of a user's behavior presents a significant privacy risk. Our previous work has established the challenging requirements for anonymization to protect users' privacy in a Bayesian setting in which we assume a powerful adversary who has perfect knowledge of the prior distribution for each user's behavior. However, even sophisticated adversaries do not often have such perfect knowledge; hence, in this paper, we turn our attention to an adversary who must learn user behavior from past data traces of limited length. We also assume there exists dependency between data traces of different users, and the data points of each user are drawn from a normal distribution. Results on the lengths of training sequences and data sequences that result in a loss of user privacy are presented.
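The setting described above can be illustrated with a small simulation. The sketch below is a simplified toy model, not the paper's exact construction: it assumes each user's data points are i.i.d. samples from a normal distribution with a user-specific mean (ignoring inter-user dependency), lets the adversary learn each mean from a labeled training trace of length m, and then matches anonymized data traces of length n to users by nearest learned mean, which is the maximum-likelihood rule for equal-variance Gaussians. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model parameters (illustrative, not from the paper):
# num_users users, training traces of length m, data traces of length n.
num_users, m, n = 20, 200, 200

# Each user's data points ~ Normal(true_mean, 1), with user-specific means.
true_means = rng.normal(0.0, 1.0, size=num_users)

# Training phase: the adversary observes labeled past traces
# and estimates each user's mean from them.
training = true_means[:, None] + rng.normal(0.0, 1.0, size=(num_users, m))
learned_means = training.mean(axis=1)

# Matching phase: fresh traces are anonymized by a random permutation
# of user identities; the adversary tries to undo it.
perm = rng.permutation(num_users)
data = true_means[perm, None] + rng.normal(0.0, 1.0, size=(num_users, n))

# For equal-variance Gaussians, ML matching reduces to assigning each
# anonymized trace to the closest learned mean.
sample_means = data.mean(axis=1)
guesses = np.abs(sample_means[:, None] - learned_means[None, :]).argmin(axis=1)

# Fraction of users the adversary de-anonymizes correctly.
accuracy = (guesses == perm).mean()
```

As m and n grow, the sample means concentrate around the true per-user means, so matching accuracy rises well above the 1/num_users chance level; shortening the traces degrades the adversary's accuracy, which is the trade-off the paper's asymptotic results quantify.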
