Biblio

Filters: Author is Wang, Weina
D
Akbay, Abdullah Basar, Wang, Weina, Zhang, Junshan.  2019.  Data Collection from Privacy-Aware Users in the Presence of Social Learning. 2019 57th Annual Allerton Conference on Communication, Control, and Computing (Allerton). :679–686.
We study a model in which a data collector obtains data from users through a payment mechanism in order to learn an underlying state from the elicited data. The private signal of each user represents her individual knowledge about the state. Through social interactions, each user can also learn noisy versions of her friends' signals, called group signals. Based on both her private signal and her group signals, each user strategically reports a privacy-preserved version of her data to the data collector. We develop a Bayesian game-theoretic framework to study the impact of social learning on users' data reporting strategies and devise the data collector's payment mechanism accordingly. Our findings reveal that the Bayesian Nash equilibrium can take the form of either a symmetric randomized response (SR) strategy or an informative non-disclosive (ND) strategy. Each user applies a generalized majority voting rule to her noisy group signals to determine which strategy to follow. When a user plays the ND strategy, she reports privacy-preserving data based entirely on her group signals, independent of her private signal, so her privacy cost is zero. Both the data collector and the users benefit from social learning, which drives down privacy costs and improves state estimation at a given payment budget. We derive bounds on the minimum total payment required to achieve a given level of state estimation accuracy.
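As a loose illustration of the equilibrium structure the abstract describes, not the paper's formal mechanism, the sketch below models a binary state: the user applies a majority vote with a margin threshold to her noisy group signals, and either reports based on the group signals alone (ND) or applies symmetric randomized response to her private signal (SR). The function names, the `threshold` parameter, and the truth probability `p` are illustrative assumptions, not quantities from the paper.

```python
import random

def randomized_response(bit, p):
    """Symmetric randomized response: report the true bit with
    probability p, the flipped bit otherwise."""
    return bit if random.random() < p else 1 - bit

def user_report(private_signal, group_signals, threshold, p):
    """Hypothetical sketch of a user's strategy choice.

    A generalized majority vote over the noisy group signals decides
    which strategy the user follows (binary signals, values 0/1).
    """
    ones = sum(group_signals)
    zeros = len(group_signals) - ones
    if abs(ones - zeros) > threshold:
        # Group signals are decisive: play the non-disclosive (ND)
        # strategy, ignoring the private signal (privacy cost is zero).
        return 1 if ones > zeros else 0
    # Otherwise play symmetric randomized response (SR) on the
    # private signal.
    return randomized_response(private_signal, p)
```

With a decisive group majority the private signal never enters the report, which is why the ND strategy incurs no privacy cost in the model.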
V
Wang, Weina, Ying, Lei, Zhang, Junshan.  2016.  The Value of Privacy: Strategic Data Subjects, Incentive Mechanisms and Fundamental Limits. Proceedings of the 2016 ACM SIGMETRICS International Conference on Measurement and Modeling of Computer Systems. :249–260.
We study the value of data privacy in a game-theoretic model of trading private data, where a data collector purchases private data from strategic data subjects (individuals) through an incentive mechanism. The private data of each individual represents her knowledge about an underlying state, which is the information the data collector desires to learn. Unlike most existing work on privacy-aware surveys, our model does not assume the data collector is trustworthy: each individual retains full control of her own data privacy and reports only a privacy-preserving version of her data. In this paper, the value of ε units of privacy is measured by the minimum payment over all nonnegative payment mechanisms under which an individual's best response at a Nash equilibrium is to report her data with a privacy level of ε. The higher ε is, the less private the reported data is. We derive lower and upper bounds on the value of privacy that are asymptotically tight as the number of data subjects grows large. Specifically, the lower bound shows that no mechanism can purchase ε units of privacy with a smaller payment, and the upper bound is achieved by a payment mechanism that we design. Based on these fundamental limits, we further derive lower and upper bounds on the minimum total payment for the data collector to achieve a given learning accuracy target, and show that the total payment of the designed mechanism is at most one individual's payment away from the minimum.
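To make the "higher ε means less private" convention concrete, here is a minimal sketch, assuming the standard ε-differentially-private symmetric randomized response for binary data (this is a textbook construction, not the paper's specific mechanism): the true bit is reported with probability e^ε/(1+e^ε), and the collector debiases the empirical mean to estimate the underlying fraction. The function names are illustrative.

```python
import math
import random

def flip_prob(eps):
    """Probability of reporting the true bit under symmetric randomized
    response at privacy level eps; tends to 1/2 as eps -> 0 (fully
    private) and to 1 as eps grows (fully revealing)."""
    return math.exp(eps) / (1.0 + math.exp(eps))

def report(bit, eps):
    """An individual's privacy-preserving report of a binary signal."""
    return bit if random.random() < flip_prob(eps) else 1 - bit

def unbiased_estimate(reports, eps):
    """Collector side: debias the empirical mean of randomized reports
    to estimate the true fraction of ones among the data subjects."""
    p = flip_prob(eps)
    mean = sum(reports) / len(reports)
    return (mean - (1 - p)) / (2 * p - 1)
```

Lower ε forces the collector to average over more (noisier) reports for the same accuracy, which is the trade-off behind the payment-versus-accuracy bounds in the abstract.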