Migrating e-Science Applications to the Cloud: Methodology and Evaluation
Published in Cloud Computing with e-Science Applications (Olivier Terzo and Lorenzo Mossucca, eds.), 2017
Steve Strauch, Vasilios Andrikopoulos, Dimka Karastoyanova, Karolina Vukojevic-Haupt
Apart from the vendor-specific migration methodologies and guidelines, there are also proposals independent of a specific cloud provider. Reddy and Kumar proposed a methodology for data migration that consists of the following phases: design, extraction, cleansing, import, and verification. Moreover, they categorized data migration into storage migration, database migration, application migration, business process migration, and digital data retention (Reddy and Kumar, 2011). In our proposal, we focus on storage and database migration, as we address the database layer. Morris specifies four golden rules of data migration, concluding that IT staff often do not know the semantics of the data to be migrated, which causes considerable overhead (Morris, 2012). With our proposal of a step-by-step methodology, we provide detailed guidance and recommendations on both data migration and the required application refactoring to minimize this overhead. Tran et al. adapted the function point method to estimate the costs of cloud migration projects and classified the applications potentially migrated to the cloud (Tran et al., 2011). As we assume that the decision to migrate to the cloud has already been taken, we do not consider aspects such as cost. We abstract from the classification of applications in order to define the cloud data migration scenarios, and we reuse distinctions such as complete versus partial migration to refine a chosen migration scenario.
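The phased structure of Reddy and Kumar's methodology maps naturally onto a small extract-cleanse-import-verify pipeline. The sketch below is illustrative only and not taken from the chapter; the CSV source, the SQLite target, and the `users` schema are assumptions made for the example.

```python
import csv
import sqlite3

def extract(path):
    """Extraction: read raw records from the flat-file source."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def cleanse(rows):
    """Cleansing: drop incomplete records and normalize fields."""
    return [
        {"id": int(r["id"]), "name": r["name"].strip()}
        for r in rows
        if r.get("id") and r.get("name")
    ]

def load(rows, conn):
    """Import: write cleansed records into the target database."""
    conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO users (id, name) VALUES (:id, :name)", rows)
    conn.commit()

def verify(rows, conn):
    """Verification: compare source and target record counts."""
    (count,) = conn.execute("SELECT COUNT(*) FROM users").fetchone()
    assert count == len(rows), f"expected {len(rows)} rows, found {count}"

if __name__ == "__main__":
    # Design phase: the target schema is fixed above; sample data stands in
    # for a real flat-file export (hypothetical file and contents).
    with open("users.csv", "w", newline="") as f:
        f.write("id,name\n1,Ada\n2, Grace \n,missing-id\n")
    conn = sqlite3.connect(":memory:")
    rows = cleanse(extract("users.csv"))
    load(rows, conn)
    verify(rows, conn)
```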
Optimal Data Placement for Scientific Workflows in Cloud
Published in Journal of Computer Information Systems, 2023
The table analysis shows that the existing data placement methods are affected by different problems. In [20], the proposed method is hampered by transport dependency, complexity, and lack of flexibility. The method in [21] did not consider a varying number of experiments to demonstrate its effectiveness. In [22], the data center failed because the time available for data migration was limited. The data placement problem is not considered in [23], and security mechanisms are not provided in [24] and [27]. Likewise, data movement is not minimized by the security-aware intermediate data placement policy [25], and the CEDP method [26] is affected by resource-constraint issues. These limitations of existing methods constitute the research gap for the proposed model, and the above-mentioned issues are addressed in this research to achieve secure and optimized data placement in cloud computing.
Requirements of a data storage infrastructure for effective land administration systems: case study of Victoria, Australia
Published in Journal of Spatial Science, 2022
Davood Shojaei, Farshad Badiee, Hamed Olfat, Abbas Rajabifard, Behnam Atazadeh
It is very important to ensure data is completely and correctly stored in databases from flat files. Therefore, various checks are required to validate this transformation from files to database tables. These include internal data quality, external data quality, and data migration quality. Internal data quality looks at logical consistency for spatial and non-spatial features, data issues (e.g. gaps and slivers), spatial and attribute accuracy, completeness, and lineage (Teng et al. 2014). External data quality investigates the consistency of neighbouring features to ensure acceptable geometrical relations and avoid errors such as intersections, gaps, and slivers. Shojaei et al. (2017) discussed data quality in cadastral objects and proposed methods to test and validate various scenarios. Karki et al. (2013) proposed internal and external validation rules for checking lodged plans. Data migration quality is the quality of transferring data from flat files to database tables. This is critically important to ensure that no data is lost during the migration process.
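One common way to check migration quality, not specific to this case study, is to compare row counts and an order-independent digest of record keys between the flat file and the loaded table. The sketch below assumes a CSV source and a SQLite target; the function names and the key field are illustrative.

```python
import csv
import hashlib
import sqlite3

def file_fingerprint(path, key_field):
    """Source-side fingerprint: row count plus an order-independent key digest."""
    count, digest = 0, 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            count += 1
            digest ^= int(hashlib.sha256(row[key_field].encode()).hexdigest(), 16)
    return count, digest

def table_fingerprint(conn, table, key_column):
    """Target-side fingerprint computed the same way over the loaded table."""
    count, digest = 0, 0
    for (key,) in conn.execute(f"SELECT {key_column} FROM {table}"):
        count += 1
        digest ^= int(hashlib.sha256(str(key).encode()).hexdigest(), 16)
    return count, digest

def migration_quality_check(path, conn, table, key):
    """Data migration quality: flag loss or alteration during the transfer."""
    src = file_fingerprint(path, key)
    dst = table_fingerprint(conn, table, key)
    return {"row_count_ok": src[0] == dst[0], "keys_ok": src[1] == dst[1]}
```

A mismatch in `row_count_ok` indicates dropped or duplicated records, while a mismatch in `keys_ok` with matching counts points to altered key values during loading.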
A survey of intrusion detection from the perspective of intrusion datasets and machine learning techniques
Published in International Journal of Computers and Applications, 2022
Privacy preservation and cyber-attack identification in cloud and IoT environments are challenging tasks. A Deep Blockchain Framework (DBF) enforces security and privacy in the cloud and IoT context [61]. A deep learning-based CIDS model is built to cope with the security threats that migrated network data imposes on cloud and IoT systems. This model employs the Bidirectional Long Short-Term Memory (BiLSTM) technique to develop the CIDS, and its performance is evaluated using the UNSW-NB15 and BoT-IoT datasets. Smart contracts and blockchain techniques protect the distributed intrusion detection engines. The framework offers a decision support tool for secure data migration, and simple, transparent, and safe data exchange are the outcomes of this experiment.
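For readers unfamiliar with the modeling choice, a bidirectional LSTM reads each window of traffic records both forwards and backwards before classifying it. The Keras sketch below is a generic BiLSTM binary classifier, not the authors' DBF architecture; the feature count, window length, and layer widths are assumptions.

```python
import tensorflow as tf

# Placeholder dimensions, not values from the DBF paper (assumed).
NUM_FEATURES = 42   # flow statistics per traffic record
WINDOW = 10         # consecutive records per sequence fed to the BiLSTM

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, NUM_FEATURES)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # attack vs. normal traffic
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

Training such a model on UNSW-NB15 or BoT-IoT would require converting the labeled flow records into fixed-length sequences of numeric features, which is a preprocessing step outside this sketch.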