Analysis: Four Levels for Validation
Published in Tamara Munzner, Visualization Analysis and Design, 2014
web hosting service. Four validation methods were used in this paper, shown in Figure 4.13. At the domain situation level, the paper explains the roles and activities of system management professionals and their existing workflow and tools. The validation approach was interviews with the target audience. The phased design methodology, where management approval was necessary for access to the true target users, led to a mix of immediate and downstream timing for this validation: many of these interviews occurred after a working prototype was developed. This project is a good example of the iterative process alluded to in Section 4.3. At the abstraction level, the choice of a collection of time-series data for data type is discussed early in the paper. The rationale is presented in the opposite manner from the discussion above: rather than justifying that time-series data is the correct choice for the system management domain, the authors justify that this domain is an appropriate one for studying this data type. The paper also contains a set of explicit design requirements, which includes abstract tasks like search, sort, and filter. The downstream validation for the abstraction level is a longitudinal field study of the system deployed to the target users, life cycle engineers for managed hosting services inside a large corporation. At the visual encoding and interaction level, there is an extensive discussion of design choices, with immediate validation by jus-
Flooding Control in Named Data Networking
Published in IETE Technical Review, 2018
Shatarupa Dash, Bharat J.R. Sahu, Navrati Saxena, Abhishek Roy
When a consumer node generates an Interest and forwards it to neighbour nodes, the current behaviour is that a neighbour node discards the Interest if its Forwarding Information Base (FIB) contains no matching entry. However, with the proliferation of smart devices, device-to-device (D2D) communication, and low-cost web hosting, versatile new contents and sites are frequently added to the Internet. In such a scenario, the consumer will not be able to access new content if the FIB discards the Interest. Interest flooding cannot be avoided completely, because the rapidly growing volume of newly generated content is not recorded in every router's FIB immediately. In the normal case, newly generated content is published into the FIB of each involved router across the Internet by the routing protocols within a certain short period called the "convergence time". Although this process achieves better stability, the convergence time may be as long as tens of minutes [10]. In such cases, forwarding Interests on multiple interfaces may retrieve data quickly through the best paths [10]. Thus, forwarding Interests to multiple neighbour nodes cannot be avoided completely. Hence, deciding the limit of Interest forwarding is both difficult and highly significant, as sketched below.
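To make the trade-off concrete, the following minimal Python sketch (the names Fib, MAX_FLOOD_FACES, and forward_interest are illustrative assumptions, not taken from the paper or any NDN implementation) shows a router that follows the FIB when a longest-prefix match exists and otherwise floods the Interest to a capped number of neighbour faces instead of discarding it:

```python
# Illustrative sketch: FIB lookup with capped flooding for unmatched Interests.
# Names and the flooding limit are assumptions for demonstration only.

class Fib:
    """Forwarding Information Base: maps name prefixes to next-hop face IDs."""

    def __init__(self):
        self.entries = {}  # prefix -> list of face IDs

    def add(self, prefix, faces):
        self.entries[prefix] = list(faces)

    def longest_prefix_match(self, name):
        # NDN names are hierarchical, e.g. "/example/video/seg1";
        # return the faces of the longest registered prefix, if any.
        components = name.strip("/").split("/")
        for i in range(len(components), 0, -1):
            prefix = "/" + "/".join(components[:i])
            if prefix in self.entries:
                return self.entries[prefix]
        return None


MAX_FLOOD_FACES = 3  # assumed cap on how many neighbours receive a flooded Interest


def forward_interest(fib, interest_name, incoming_face, all_faces):
    """Return the list of faces the Interest should be forwarded to."""
    faces = fib.longest_prefix_match(interest_name)
    if faces:  # known prefix: follow the FIB, excluding the incoming face
        return [f for f in faces if f != incoming_face]
    # Unknown prefix (e.g. newly published content not yet in the FIB):
    # flood to a limited number of neighbour faces rather than discarding.
    candidates = [f for f in all_faces if f != incoming_face]
    return candidates[:MAX_FLOOD_FACES]


# Usage
fib = Fib()
fib.add("/example/video", [2, 5])
print(forward_interest(fib, "/example/video/seg1", incoming_face=1,
                       all_faces=[1, 2, 3, 4, 5]))  # [2, 5] via FIB
print(forward_interest(fib, "/new/site/page", incoming_face=1,
                       all_faces=[1, 2, 3, 4, 5]))  # [2, 3, 4] capped flood
```

The cap on flooded faces stands in for the "limit of Interest forwarding" the paper identifies as the difficult design decision: too low and newly published content stays unreachable during convergence, too high and the network suffers Interest flooding.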
A novel home automation distributed server management system using Internet of Things
Published in International Journal of Ambient Energy, 2022
P. Manojkumar, M. Suresh, Alim Al Ayub Ahmed, Hitesh Panchal, Christopher Asir Rajan, A. Dheepanchakkravarthy, A. Geetha, B. Gunapriya, Suman Mann, Kishor Kumar Sadasivuni
An Internet Service Provider (ISP) offers the support needed to access and use the web. ISPs may be organised in various forms, for example commercial, non-profit, or privately owned. Their services include Internet access, Internet transit, web hosting, and colocation, and may also cover e-mail, online storage, cloud services, or physical server operation. A wireless ISP is based mainly on wireless networking; this may involve commonplace Wi-Fi wireless mesh technology or modern equipment designed to operate over very high frequencies. Advanced ISPs integrate a wide array of surveillance capabilities that allow real-time monitoring of Internet traffic.
A survey of phishing attack techniques, defence mechanisms and open research challenges
Published in Enterprise Information Systems, 2022
Step 2: Phishing Site Construction: The phisher then creates a phishing website that looks similar to the official website. Various online tools are available that generate a replica of a well-known website (HTTrack Website Copier - Free Software Offline Browser (GNU GPL) 2017; GNU Wget 2017). Cui et al. found that 90% of malicious sites are replicas or modifications of previous phishing attacks (Cui et al. 2017). After constructing the website, the phisher uploads these files to a web-hosting server.