Architecture (A2): SDL Activities and Best Practices
Published in James F. Ransome, Anmol Misra, Mark S. Merkow, Practical Core Software Security, 2023
James F. Ransome, Anmol Misra, Mark S. Merkow
The first step of the threat modeling process is to develop a visual representation of the threat flows in the form of a diagram, typically drawn during a whiteboard session. It is important to provide a structure for this process; structure helps avoid mistakes, and without a good diagram you are unlikely to produce a good threat model. Understand, first, that this exercise is about data flow, not code flow. Confusing the two is a mistake often made by developers on the team, because they live, breathe, and eat code development and are not typically focused on the security of the data their code handles. It should be no surprise, then, that the diagram produced in this stage of the threat modeling process is called a data flow diagram, or DFD. The focus of the DFD is on how data moves through the software solution and what happens to the data as it moves, giving us a better understanding of how the software works and of its underlying architecture. The representation is hierarchical, so it allows you to decompose the software architecture into subsystems and then lower-level subsystems. At a high level, this clarifies the scope of the application being modeled; at the lower levels, it lets you focus on the specific processes involved in handling specific data.
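To make the DFD's building blocks concrete, here is a minimal sketch, in Python, of the element types a threat-modeling DFD typically captures (external entities, processes, data stores, and the data flows between them), applied to a toy login scenario. The element names and the trust-boundary flag are illustrative assumptions, not taken from the book.

```python
from dataclasses import dataclass

# Core DFD element types used in threat modeling (illustrative names):
# external entities, processes, data stores, and the data flows that connect them.

@dataclass
class Element:
    name: str
    kind: str  # "external_entity", "process", or "data_store"

@dataclass
class DataFlow:
    source: Element
    target: Element
    data: str                          # what data moves along this flow
    crosses_trust_boundary: bool = False

# A toy web-login scenario, modeled as data moving between elements.
browser = Element("Browser", "external_entity")
login   = Element("Login service", "process")
user_db = Element("User database", "data_store")

flows = [
    DataFlow(browser, login, "credentials", crosses_trust_boundary=True),
    DataFlow(login, user_db, "credential lookup"),
    DataFlow(user_db, login, "password hash"),
    DataFlow(login, browser, "session token", crosses_trust_boundary=True),
]

# Flows that cross a trust boundary are a natural starting point for
# threat enumeration in later steps of the threat modeling process.
for f in flows:
    marker = "  <-- trust boundary" if f.crosses_trust_boundary else ""
    print(f"{f.source.name} -> {f.target.name}: {f.data}{marker}")
```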
Clause 4: Context of the Organization
Published in Sid Ahmed Benraouane, H. James Harrington, Using the ISO 56002 Innovation Management System, 2021
Sid Ahmed Benraouane, H. James Harrington
The Context Diagram shows the system under consideration as a single high-level process and then shows the relationships that the system has with other external entities (systems, organizational groups, external data stores, etc.). Another name for a Context Diagram is a Context-Level Data-Flow Diagram or a Level-0 Data-Flow Diagram. Since a Context Diagram is a specialized version of a Data-Flow Diagram (DFD), understanding a bit about data-flow diagrams can be helpful. A Data-Flow Diagram is a graphical visualization of the movement of data through an information system. DFDs are one of the three essential components of the Structured Systems Analysis and Design Method (SSADM).
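As an illustration of the "single high-level process plus external entities" shape of a Context Diagram, the following sketch emits Graphviz DOT text for a hypothetical Level-0 DFD. The system, entity, and flow names are invented for the example.

```python
# Hypothetical example: emit Graphviz DOT text for a Context Diagram
# (Level-0 DFD): the whole system as a single process, surrounded by the
# external entities it exchanges data with.

system = "Order Processing System"
external_flows = [
    ("Customer", system, "order details"),
    (system, "Customer", "order confirmation"),
    ("Warehouse", system, "stock levels"),
    (system, "Shipping Provider", "dispatch request"),
]

lines = ["digraph context {",
         "  rankdir=LR;",
         f'  "{system}" [shape=circle];']        # the single high-level process
for src, dst, data in external_flows:
    for name in (src, dst):
        if name != system:
            lines.append(f'  "{name}" [shape=box];')   # external entity
    lines.append(f'  "{src}" -> "{dst}" [label="{data}"];')
lines.append("}")

print("\n".join(lines))  # paste the output into any Graphviz renderer
```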
Managing System Models
Published in John P.T. Mo, Ronald C. Beckett, Engineering and Operations of System of Systems, 2018
A data flow diagram (DFD) shows the types of data, the data path, actions taken based on the data, and where the data is stored. Data flow diagrams are an integral part of the hierarchical functional modeling process. When a function is decomposed to its lower levels, the corresponding data flows in and out of the function are also decomposed to represent data flowing in and out of the lower-level functions.
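The levelling idea described above, in which the data flowing in and out of a parent function must reappear among the flows of its lower-level functions, can be checked mechanically. The sketch below assumes a simple set-based representation; the function and flow names are invented.

```python
# Illustrative sketch of DFD levelling/balancing: when a function is
# decomposed, the data flows into and out of the parent must be accounted
# for by the flows of its lower-level functions.

parent_flows = {
    "Process Payroll": {
        "in":  {"timesheets", "pay rates"},
        "out": {"payslips", "bank transfer file"},
    }
}

child_flows = {
    "Calculate Gross Pay": {"in": {"timesheets", "pay rates"}, "out": {"gross pay"}},
    "Apply Deductions":    {"in": {"gross pay"},               "out": {"net pay"}},
    "Produce Outputs":     {"in": {"net pay"},                 "out": {"payslips", "bank transfer file"}},
}

def is_balanced(parent, children):
    """Every parent-level flow must appear somewhere at the child level;
    purely internal flows (e.g. 'gross pay') may exist only at the lower level."""
    child_in  = set().union(*(c["in"]  for c in children.values()))
    child_out = set().union(*(c["out"] for c in children.values()))
    return parent["in"] <= child_in and parent["out"] <= child_out

print(is_balanced(parent_flows["Process Payroll"], child_flows))  # True
```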
Improving data quality during ERP implementation based on information product map
Published in Enterprise Information Systems, 2019
For ease of understanding the data manufacturing system, IPMAP is designed to systematically visualise the important phases of the IPs in the manufacturing process (Nasution and Albarda 2013). For comparison, it is useful to introduce some classical data-analysis models from previous research. The Data Flow Diagram (DFD) is widely used for modelling the flow of data from data sources and data stores to processes, and from processes to data stores and data sinks (Li and Chen 2009). However, a DFD represents all data processes in the same way, so it cannot distinguish between the different types of data processes needed to represent the manufacture of different IPs. Another similar model is the workflow model, which represents the sequence of work activities, data, and data flows for a business process (Sun, Zeng, and Wang 2011). The major difference between workflow and IPMAP is that the former deals with a much lower level of process granularity, whereas the latter deals with all the processes related to manufacturing an IP. However, most current research on IPMAP focuses on how to construct an IPMAP (Shankaranarayanan 2006; Effendi 2017) or how to assess DQ with IPMAP (Thi and Helfert 2007; Nasution and Albarda 2013), without going deeply into the root causes of poor DQ. Therefore, as an extension of the previous research, IPMAP is preferred in our paper to describe the manufacturing processes of IPs in ERP systems, which provides a basis for analysing and identifying the root causes of poor DQ.
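To make the contrast concrete, the sketch below pairs the single process construct of a DFD with the finer-grained block types commonly described in the IP-MAP literature (source, processing, quality check, storage, decision, boundary, sink). These block names are drawn from the general IP-MAP literature rather than from the cited paper, and the ERP data path shown is invented for illustration.

```python
from enum import Enum

# A DFD offers one "process" construct, whereas the IP-MAP literature
# distinguishes several block types when modeling how an information
# product (IP) is manufactured. Block names below are illustrative.

class DFDConstruct(Enum):
    EXTERNAL_ENTITY = "external entity"
    PROCESS = "process"          # one symbol for every kind of processing
    DATA_STORE = "data store"
    DATA_FLOW = "data flow"

class IPMAPBlock(Enum):
    SOURCE = "raw data source"
    PROCESSING = "processing"
    QUALITY_CHECK = "data quality check"
    STORAGE = "data storage"
    DECISION = "decision"
    BOUNDARY = "organizational/system boundary"
    SINK = "information product consumer"

# A toy ERP data path expressed with IP-MAP block types: the finer-grained
# vocabulary is what lets root causes of poor data quality be localized.
ip_path = [
    (IPMAPBlock.SOURCE,        "supplier master data entry"),
    (IPMAPBlock.QUALITY_CHECK, "validate tax codes"),
    (IPMAPBlock.PROCESSING,    "merge with purchasing records"),
    (IPMAPBlock.STORAGE,       "ERP vendor table"),
    (IPMAPBlock.SINK,          "spend analysis report"),
]
for block, step in ip_path:
    print(f"{block.value:35s} {step}")
```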