Security and Privacy Issues in Biomedical AI Systems and Potential Solutions
Published in Saravanan Krishnan, Ramesh Kesavan, B. Surendiran, G. S. Mahalakshmi, Handbook of Artificial Intelligence in Biomedical Engineering, 2021
Several variants of this concept exist: beyond pure differential privacy, there are useful relaxations such as Rényi differential privacy (Mironov et al., 2017; Geumlek et al., 2017). Another technique for designing DP algorithms is the exponential mechanism (McSherry et al., 2007). The notions of "local" versus centralized differential privacy, "multiparty" differential privacy, and differential privacy with interactive mechanisms have also been studied in this regard. Another example of an adaptive algorithm based on differential privacy has been suggested by Alnemari et al. (2017), who show that a partitioning technique can help improve the privacy guarantee. Considering the sensitivity of the queries prior to making predictions is also shown to be essential in this regard.
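As an illustration of how query sensitivity enters the design of a DP mechanism, the following is a minimal Python sketch of the exponential mechanism named above. The candidate set, utility function, sensitivity value, and epsilon are placeholder assumptions for the example, not details taken from the chapter.

```python
import numpy as np

def exponential_mechanism(candidates, utility, sensitivity, epsilon, rng=None):
    """Select one candidate with probability proportional to
    exp(epsilon * utility / (2 * sensitivity)) (McSherry & Talwar, 2007)."""
    rng = rng if rng is not None else np.random.default_rng()
    scores = np.array([utility(c) for c in candidates], dtype=float)
    # Shift by the max score for numerical stability; this does not change
    # the selection probabilities.
    weights = np.exp(epsilon * (scores - scores.max()) / (2.0 * sensitivity))
    probs = weights / weights.sum()
    return candidates[rng.choice(len(candidates), p=probs)]

# Hypothetical example: privately release the most frequent diagnosis code.
counts = {"A10": 41, "B20": 37, "C30": 12}
release = exponential_mechanism(
    candidates=list(counts),
    utility=lambda code: counts[code],  # utility = raw count of the code
    sensitivity=1.0,                    # one patient changes any count by at most 1
    epsilon=0.5,
)
```

The sensitivity argument is exactly the quantity the paragraph refers to: it bounds how much the utility of any candidate can change when a single individual's record changes, and it scales how sharply the mechanism favors high-utility outputs.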
Privacy in Internet of Healthcare Things
Published in Ahmed Elngar, Ambika Pawar, Prathamesh Churi, Data Protection and Privacy in Healthcare, 2021
Mohammad Wazid, Ashok Kumar Das
Kim et al. [68] proposed a scheme for the privacy-preserving collection of personal health-related data streams, which are characterized as temporal data. The data are collected at fixed intervals under the protection of "Local Differential Privacy (LDP)." Each data contributor operates under an LDP privacy budget and reports only a small amount of salient data extracted from its health data stream.
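As a rough illustration only (the excerpt does not describe Kim et al.'s actual mechanism), the sketch below uses Laplace noise as a stand-in local randomizer for bounded readings and splits a contributor's total privacy budget evenly across fixed reporting intervals. The bounds, budget, interval count, and readings are assumptions made for the example.

```python
import numpy as np

def ldp_report(value, lower, upper, epsilon, rng=None):
    """Perturb one salient reading with the Laplace mechanism so that the
    report satisfies epsilon-LDP for values bounded in [lower, upper]."""
    rng = rng if rng is not None else np.random.default_rng()
    sensitivity = upper - lower              # worst-case change of one bounded value
    clipped = min(max(value, lower), upper)  # enforce the assumed bounds
    return clipped + rng.laplace(0.0, sensitivity / epsilon)

# Each contributor splits a total budget across T reporting intervals.
total_epsilon, T = 1.0, 24
per_report_epsilon = total_epsilon / T
heart_rates = [72, 75, 71]                   # example salient readings from the stream
reports = [ldp_report(v, 40, 180, per_report_epsilon) for v in heart_rates]
```

Splitting the budget across intervals reflects the sequential-composition property of differential privacy: the noise per report grows as the contributor reports more often, which is why the scheme only reports a small quantity of salient data rather than the full stream.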
DPWeVote: differentially private weighted voting protocol for cloud-based decision-making
Published in Enterprise Information Systems, 2019
Ziqi Yan, Jiqiang Liu, Shaowu Liu
In the LDP model, each distributed partner first perturbs his/her data locally with a randomized mechanism (also called a local randomizer) that is provided by the semi-honest cloud server and satisfies $\varepsilon$-differential privacy, and then uploads the perturbed data to the cloud server, which cannot infer the sensitive information of any single partner but can post-process those data to obtain useful population statistics for further analysis. LDP can be formally defined as follows:

Definition 2.7 (Local Differential Privacy, Bassily et al. 2017). An algorithm $\mathcal{A}$ satisfies $\varepsilon$-Local Differential Privacy (LDP) if it accesses the database $D = (x_1, \ldots, x_n)$ only via invocations of a local randomizer $R$, and if, for every $i \in [n]$, the composition of the algorithm's invocations of $R$ on the data sample $x_i$ is $\varepsilon$-differentially private. That is, for any pair of data samples $x, x'$ and any output $y$,
$$\Pr[R(x) = y] \le e^{\varepsilon} \, \Pr[R(x') = y].$$
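To make the definition concrete, here is a minimal sketch of a local randomizer that satisfies $\varepsilon$-LDP. Randomized response on a single bit is the textbook example of such a randomizer; it is used here purely for illustration and is not claimed to be the mechanism used in DPWeVote.

```python
import math
import random

def randomized_response(bit, epsilon, rng=random):
    """Report the true bit with probability e^eps / (e^eps + 1), flip it
    otherwise. The ratio of output probabilities for any two inputs is at
    most e^eps, so the report satisfies eps-LDP."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if rng.random() < p_truth else 1 - bit

def estimate_proportion(reports, epsilon):
    """Unbiased estimate of the true fraction of 1s from perturbed reports,
    illustrating the cloud server's post-processing step."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)
```

Each partner runs only the randomizer on their own bit before uploading; the server never sees raw inputs, yet the debiasing step recovers an accurate population statistic when the number of partners is large.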
Improving the performance of deep learning-based classification when a sample has various appearances
Published in Journal of Experimental & Theoretical Artificial Intelligence, 2022
Guanghao Jin, Yuming Jiao, Jianming Wang, Ming Ma, Qingzeng Song
Xue et al. proposed a method based on joint distribution estimation under local differential privacy (Xue et al., 2020). They conduct extensive experiments to evaluate the effectiveness of the schemes designed for joint distribution estimation and Naive Bayes classification. In these methods, the joint probability distribution is applied to the following function abstracts: