Introduction
Published in Wen Sun, Qubeijian Wang, Nan Zhao, Haibin Zhang, Chao Shen, Ultra-Dense Heterogeneous Networks, 2023
Wen Sun, Haibin Zhang, Nan Zhao, Chao Shen, Lawrence Wai-Choong Wong
With the development of wireless mobile networks, a variety of smart terminals and emerging applications have brought severe challenges to the system. On the one hand, the types of terminals accessing the network have expanded from traditional devices (e.g., mobile phones and computers) to ever-more-capable devices (e.g., smart wearable devices, autonomous vehicles, and UAVs), resulting in explosive growth of data traffic in communication systems. On the other hand, diverse applications including high-definition television, online video, and VR/AR have placed high requirements on network bandwidth and channel capacity in communication systems. The 5G system is expected to support communication for more than 100 billion devices, with data transmission rates higher than 1 Gbps and transmission delays lower than 10 ms. This data proliferation drives the densification of base stations from macrocells to microcells, picocells, femtocells, and even tiny cells.
New challenges for user privacy in cyberspace
Published in Abbas Moallem, Human-Computer Interaction and Cybersecurity Handbook, 2018
Adam Wójtowicz, Wojciech Cellary
Redundant data storage in multiple physical locations is a common feature of cloud computing services. This can lead to the data proliferation phenomenon. Detailed information about the location of a user's data is unavailable or not disclosed to the user. Therefore, it is often unclear which party is responsible for ensuring legal requirements and data handling standards for PII processing, or whether it is possible to audit them for compliance with these requirements and standards. Moreover, it is not clear to what extent cloud subcontractors involved in processing can be identified and verified as trustworthy, particularly in a dynamic environment [35]. Because trust is not transitive, such third-party contracts should be disclosed before an agreement with the cloud provider is reached, and the terms of these contracts maintained throughout the agreement. In practice, this requirement is rarely fulfilled, so privacy guarantees can become an issue with composite cloud services [36]. If cloud computing providers outsource certain tasks to third parties, the level of privacy protection offered by the cloud provider depends on the level of privacy protection of each "supply chain" link and on the provider's degree of dependency on the third party. Any corruption in this chain, or a lack of coordination of responsibilities between any parties involved, can lead to loss of data privacy [37].
Foundations of Opportunistic Networks
Published in Khaleel Ahmad, Nur Izura Udzir, Ganesh Chandra Deka, Opportunistic Networks, 2018
Musaeed Abouaroek, Khaleel Ahmad
In ZebraNet, once the data samples are acquired, they are accumulated and analyzed. Zebras are highly mobile, only a few are collared, and they range across distances of several kilometers, so data accumulation becomes difficult. To transfer data in the network, nodes communicate in a pairwise fashion. The main objective is to get position logs back to the biologists, and because collars come into contact with each other frequently, single-hop transmissions between collars suffice; a dedicated pairwise communication scheme is therefore used instead of multi-hop transmissions. This introduces latency, but it is tolerated. A manned base station intermittently comes into contact with a zebra, so that it can download data from all the zebras. Flash memory is used to store position logs, which compensates for the latency of data proliferation.
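The pairwise, store-and-forward exchange described above can be sketched as a toy simulation. Everything here is illustrative rather than taken from ZebraNet itself: the collar count, contact schedule, and log format are assumptions made for the example.

```python
import random

random.seed(1)

class Collar:
    """A collar that logs positions to flash and shares them pairwise."""
    def __init__(self, cid):
        self.cid = cid
        self.flash = set()              # stored (collar_id, sample_time) logs

    def sample(self, t):
        self.flash.add((self.cid, t))   # log own position at time t

    def exchange(self, other):
        # Pairwise, single-hop sync: both ends keep the union of their logs.
        merged = self.flash | other.flash
        self.flash, other.flash = set(merged), set(merged)

collars = [Collar(i) for i in range(4)]
base_station = set()                    # logs delivered to the biologists

for t in range(50):
    for c in collars:
        c.sample(t)
    a, b = random.sample(collars, 2)    # a random pair comes into contact
    a.exchange(b)
    if t % 5 == 4:                      # base station intermittently meets one zebra
        base_station |= random.choice(collars).flash

# Logs spread with latency, but each base-station contact delivers
# every log the visiting collar has accumulated so far.
coverage = len(base_station) / (4 * 50)
```

The union in `exchange` is what makes flash storage compensate for latency: a log can sit on several collars for many rounds before any of them happens to meet the base station.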
Big Earth data: disruptive changes in Earth observation data management and analysis?
Published in International Journal of Digital Earth, 2020
Martin Sudmanns, Dirk Tiede, Stefan Lang, Helena Bergstedt, Georg Trost, Hannah Augustin, Andrea Baraldi, Thomas Blaschke
Although these parts of the EO data analysis workflow are relatively new or utilise newer Web-based opportunities when compared to the 1990s, the dominating strategy is still to process downloaded data sets locally. The enormous increase in data, the proliferation of cloud service architectures, and the opportunities offered by state-of-the-art Web technologies now make it possible for users to access remote sensing data more easily. This trend is likely to continue, but the data volumes used in the analysis will be a limiting factor as long as the processing in EO data workflows continues to occur locally in the client. Thus, there is a need for substantial technological progress in big Earth data analysis, or even a complete, disruptive change in workflows, as illustrated in Figure 1.
Assessing supply chain responsiveness, resilience and robustness (Triple-R) by computer simulation: a systematic review of the literature
Published in International Journal of Production Research, 2023
Pranesh Saisridhar, Matthias Thürer, Balram Avittathur
The widespread use of Artificial Intelligence (AI) in manufacturing is becoming a reality (Kuo and Kusiak 2019), driven by algorithmic advances, data proliferation due to increased digitalisation, reduced data storage costs, and a tremendous increase in computing power. Machine learning is a subset of AI that includes, for example, reinforcement learning (Rolf et al. 2023) and deep learning (Kusiak 2020). While deep learning learns from a training set and then applies that learning to new data, reinforcement learning (RL) learns dynamically, adjusting actions based on continuous feedback to maximise a reward. RL thus captures causal aspects by continuously evaluating the outcomes of actions. Through the analysis of causal effects, one can predict how systems will respond to potential interventions (Pearl 2000), which is of utmost importance in the context of Triple-R. But RL agents cannot learn directly from the physical world: they require a virtual environment (or simulation) to allow for counterfactuals and to learn through trial and error (MacCarthy and Ivanov 2022). A combination of model-based and data-driven approaches, for example as part of a digital twin, can ensure end-to-end visibility and permanent information accessibility for an RL agent (Ivanov and Dolgui 2021b; Burgos and Ivanov 2021). We can thus expect an increase in simulation applications that use AI to improve decision making in dynamic environments, and to self-learn how to improve and adapt the underlying simulation model itself. This kind of adaptive decision support system can play a major role in creating Triple-R capabilities.
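The trial-and-error loop the passage describes — an agent adjusting actions from reward feedback inside a simulated environment rather than from a fixed training set — can be illustrated with a minimal tabular Q-learning sketch. The corridor environment, rewards, and hyperparameters below are assumptions invented for illustration; they are not drawn from any of the cited works.

```python
import random

random.seed(0)

# Toy 1-D corridor: states 0..4, goal at state 4, reward 1 on arrival.
N_STATES, ACTIONS = 5, (-1, +1)      # actions: step left / step right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.2    # learning rate, discount, exploration

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Deterministic simulated dynamics: clamp to the corridor, reward only at the goal."""
    nxt = max(0, min(N_STATES - 1, state + action))
    return nxt, (1.0 if nxt == N_STATES - 1 else 0.0)

for episode in range(300):
    s = 0
    while s != N_STATES - 1:
        # Epsilon-greedy: mostly exploit the current estimate, sometimes explore.
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2, r = step(s, a)
        # Q-learning update: move toward reward plus discounted best future value.
        best_next = max(Q[(s2, act)] for act in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = s2

# The learned greedy policy steps right (toward the goal) from every state.
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)}
```

The `step` function plays the role the text assigns to the virtual environment: the agent probes counterfactual actions against it at no real-world cost, and the Q-update accumulates the observed outcomes of those actions into a decision policy.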
Data availability issues: decisions as patterns of action
Published in Journal of Decision Systems, 2022
Arif Wibisono, David Sammon, Ciara Heavin
Further research could examine patterns of action produced by other data issues: data accuracy, duplication, access, completeness, privacy, meaning, and leakage. This examination would allow us to appreciate which data issues produce the most complex patterns of action. Later, we could isolate similar patterns across data issues as a starting point for developing a better approach to coping with data issues and WCDA. Also, data proliferation has now become more observable in many organisations. Future research could investigate emerging patterns of action to better manage data issues in such a data flood.