Machine Learning
Published in Seyedeh Leili Mirtaheri, Reza Shahbazian, Machine Learning Theory to Applications, 2022
Seyedeh Leili Mirtaheri, Reza Shahbazian
Traditionally in machine learning, to train a model and make predictions, the data pipeline relies on a central server, hosted on-premises or in the cloud, that holds the trained model. In this architecture, all the sensors and devices collect data and send it to the server, which processes it and returns the results to the devices. This round trip limits a model’s ability to learn in real time. In other words, standard machine learning approaches require centralizing the training data on one machine or in a data center. Consider models that need to learn from user interaction on devices such as mobile phones or tablets. Federated learning was introduced to train models in this situation: it enables devices such as mobile phones to collaboratively learn a shared prediction model while keeping all the training data on the device.
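A minimal sketch of the idea described above, assuming a toy linear model trained with NumPy: the training data stays on the device, and only the model update, not the data, is sent back to the server. The function and variable names are illustrative assumptions, not part of any specific framework.

```python
import numpy as np

def local_update(global_weights, local_X, local_y, lr=0.1, epochs=5):
    """Train a linear model on data that never leaves the device."""
    w = global_weights.copy()
    for _ in range(epochs):
        preds = local_X @ w                                   # forward pass
        grad = local_X.T @ (preds - local_y) / len(local_y)   # MSE gradient
        w -= lr * grad                                        # local SGD step
    return w - global_weights                                 # only the delta is sent upstream

# Example: a device computes an update from its private data.
rng = np.random.default_rng(0)
X_device, y_device = rng.normal(size=(32, 4)), rng.normal(size=32)
delta = local_update(np.zeros(4), X_device, y_device)
```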
IT Governance and Enterprise Security Policy in the 6G Era
Published in Mohiuddin Ahmed, Nour Moustafa, Abu Barkat, Paul Haskell-Dowland, Next-Generation Enterprise Security and Governance, 2022
Mohsen Aghabozorgi Nafchi, Zahra Alidousti Shahraki
With the remarkable success of artificial intelligence in fields such as computer vision, natural language processing, and autonomous driving, the use of artificial intelligence capabilities in 6G has received much attention. For example, with the rapid expansion of smart mobile gadgets and Internet of Things devices (e.g., self-driving cars, drones, and auto-robots), many intelligent applications at the edge of wireless networks will be developed in the near future. However, one of the most important issues when people use smart apps is privacy. Therefore, instead of uploading application data to the cloud for the model training process, federated learning technology is used [22,23]. Federated learning is defined as “a machine learning setting where multiple entities (clients) collaborate in solving a machine learning problem, under the coordination of a central server or service provider. Each client's raw data is stored locally and not exchanged or transferred; instead, focused updates intended for immediate aggregation are used to achieve the learning objective. Focused updates are updates narrowly scoped to contain the minimum information necessary for the specific learning task at hand; aggregation is performed as early as possible in the service of data minimization” [24].
Privacy Breaches through Cyber Vulnerabilities
Published in Amit Kumar Tyagi, Ajith Abraham, A. Kaklauskas, N. Sreenath, Gillala Rekha, Shaveta Malik, Security and Privacy-Preserving Techniques in Wireless Robotics, 2022
S.U. Aswathy, Amit Kumar Tyagi
One group of methods for protecting privacy through attention to the data itself is “differential privacy.” Differential privacy is a type of mathematical perturbation introduced into a data set to ensure that a specific individual’s inclusion in that data cannot be detected when summary statistics generated from the true data set and the perturbed data set are compared with one another. Other methods for increasing the privacy protection of data prior to use in ML models include homomorphic encryption, secure multi-party computation, and federated learning. Homomorphic encryption preserves data privacy by allowing analysis to be performed directly on encrypted data. Secure multi-party computation is a protocol that lets parties holding information they prefer to keep private from one another collaborate without the intervention of a trusted third party. Federated learning allows data to be stored and analyzed locally, with models, or segments of models, sent to the user’s device.
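A minimal sketch of the differential-privacy idea described above, using the standard Laplace mechanism: a count query is perturbed with noise calibrated to its sensitivity, so the released statistic looks nearly the same whether or not any single individual is included. The data set and the predicate are made-up examples.

```python
import numpy as np

def laplace_count(data, predicate, epsilon=1.0):
    """Release a differentially private count of records matching a predicate."""
    true_count = sum(1 for record in data if predicate(record))
    sensitivity = 1.0  # adding or removing one person changes a count by at most 1
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

ages = [34, 29, 41, 52, 38]
# Noisy number of people over 40; including or excluding any one person
# shifts the true answer by at most 1, which the added noise masks.
print(laplace_count(ages, lambda a: a > 40, epsilon=0.5))
```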
Privacy protection against attack scenario of federated learning using internet of things
Published in Enterprise Information Systems, 2023
Kusum Yadav, Elham Kariri, Shoayee Dlaim Alotaibi, Wattana Viriyasitavat, Gaurav Dhiman, Amandeep Kaur
Federated learning provides a training model that can protect user data privacy, thus realising a mechanism in which data is not shared between participants; only the model is shared. However, some recent work has shown that federated learning may not guarantee sufficient privacy protection. For example, in the parameter-communication updates of federated learning training, some sensitive information may be leaked. Information may leak during the iterative training process to a third-party attacker, or it may be leaked through the central server. For example, reference (Qu et al. 2020) introduces a privacy-disclosure method that uses a small part of the original gradient information to reversely deduce the original data. Reference (Liu et al. 2020b) introduces a malicious attacker who steals the original data from partially updated gradient information.
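A minimal sketch of the kind of gradient leakage discussed above, under the simplifying assumption of a single fully connected layer with a bias trained on one example: the gradients shared during a federated update are enough to reconstruct the private input exactly, because each row of the weight gradient is the input scaled by the corresponding bias gradient. The layer sizes, data, and loss are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4)                       # private input held on the device
W, b = rng.normal(size=(3, 4)), rng.normal(size=3)

# Forward pass and a simple squared-error loss against an arbitrary target.
y = W @ x + b
target = np.zeros(3)
dL_dy = 2 * (y - target)                     # gradient at the layer output

# Gradients that would be shared with the server in naive federated learning.
dL_dW = np.outer(dL_dy, x)                   # each row i equals dL_db[i] * x
dL_db = dL_dy

# An attacker who only sees the gradients recovers the private input:
i = int(np.argmax(np.abs(dL_db)))            # pick a row with a nonzero bias gradient
x_reconstructed = dL_dW[i] / dL_db[i]
print(np.allclose(x, x_reconstructed))       # True
```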
Alzheimer disease classification using tawny flamingo based deep convolutional neural networks via federated learning
Published in The Imaging Science Journal, 2022
Umakant Mandawkar, Tausif Diwan
Federated learning is an emerging learning model for handling large datasets that are distributed globally, or across distributed sources, and that would otherwise need to be organized in a single location for testing or training. The federation is a group of processors that each process and execute on their data individually and then organize this information so it can be processed from a single source. Computational efficiency can be improved because remote sources of data are processed in place, which generates better models [7]. When various sources are connected at the same time to train the deep learning model with their locally processed data, performance levels improve. A master model is provided at each server location, downloaded by the processors, and later uploaded to the new master server [8].
A reward response game in the blockchain-powered federated learning system
Published in International Journal of Parallel, Emergent and Distributed Systems, 2022
Federated learning has enabled models to be collaboratively trained across multiple devices using decentralised data samples without actual data exchange, thereby protecting data privacy and security. Meanwhile, the growth of mobile devices has brought machine learning to the end of the network in practice. Therefore, mobile-crowd federated learning has emerged as a new business trend. Figure 1 shows a typical federated learning system, consisting of a central server acting as a model requester and a set of mobile devices acting as model trainers. In such a system, the server distributes a global model to the devices. The devices train the model on locally available data. All updated models, instead of the data, are then sent back to the server, where they are averaged to produce a new global model. This new model now acts as the primary model and is again distributed to the devices. This process is repeated indefinitely, or until the global model achieves a satisfactory result from the server's side. Usually, each newly aggregated global model is slightly better than the one before it. In this way, model training moves to the edge of the network so that the data never leaves the device, while training remains under the central server's orchestration.
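A minimal sketch of the aggregation step described above, in the style of federated averaging: the server averages the clients' locally trained weights, weighting each client by the number of samples it trained on. The function name, shapes, and sample counts are illustrative assumptions.

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Weighted average of client model weights -> new global model."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# One round: the server distributed global_model, devices trained it locally
# and returned their updated weights along with their local sample counts.
global_model = np.zeros(4)
updates_from_devices = [np.array([0.9, 1.1, 1.0, 1.2]),
                        np.array([1.1, 0.9, 1.0, 0.8])]
samples_per_device = [200, 100]
global_model = federated_average(updates_from_devices, samples_per_device)
```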