Software Testing
Published in Kim H. Pries, Jon M. Quigley, Testing Complex and Embedded Systems, 2018
Data flow analysis requires a review of the source code with the objective of understanding the values possible at any point in the code execution. We review the contents of variables throughout the function, calculating the values at each execution point that can modify the data, predicting the values that will be produced during the execution, and comparing those values with what are believed to be valid values. The data flow diagram is one tool we can use to model the flow of data in our code.
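To make that concrete, here is a minimal sketch of the kind of walkthrough the excerpt describes; the scale_reading() function, its parameter, and the value ranges are hypothetical and not taken from the book. Each comment records the values we predict at that execution point and compares them with the range believed to be valid.

```python
def scale_reading(raw):
    # Assumed precondition for this sketch: raw is a 10-bit ADC count, 0 <= raw <= 1023.
    volts = raw * 5.0 / 1023      # predicted: 0.0 <= volts <= 5.0
    offset = volts - 2.5          # predicted: -2.5 <= offset <= 2.5
    gain = 10.0
    signal = offset * gain        # predicted: -25.0 <= signal <= 25.0
    # Believed-valid output range is -20.0 .. 20.0, so the review flags a
    # mismatch: raw values near the rails produce out-of-range results.
    return signal
```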
Modeling and Simulation Concepts
Published in Gabriel A. Wainer, Discrete-Event Modeling and Simulation, 2017
Static techniques are used to validate the design of the model and the source code (they do not execute the model). These techniques include data flow analysis, graph-based analysis, and syntactic and semantic analyses.
Design and Development (A4): SDL Activities and Best Practices
Published in James F. Ransome, Anmol, Mark S. Merkow, Practical Core Software Security, 2023
Using a question-driven approach can help with the review activity. A list of standard questions can help you focus on common security vulnerabilities that are not unique to your software's architecture. This approach can be used in conjunction with techniques such as control flow and data flow analysis to optimize your ability to trace the paths through the code that are most likely to reveal security issues. Questions should address at least the most common coding vulnerabilities, and you should ask them while performing control flow and data flow analysis. Keep in mind that finding some vulnerabilities requires contextual knowledge of control and data flow, whereas others are context-free and can be found using simple pattern matching. Some of the following techniques may be combined when doing a manual security review of the code:

Control flow analysis. Control flow analysis is the mechanism used to step through the logical conditions in the code. The process is as follows:
1. Examine a function and determine each branch condition. These may include loops, switch statements, "if" statements, and "try/catch" blocks.
2. Understand the conditions under which each block will execute.
3. Move to the next function and repeat.

Data flow analysis. Data flow analysis is the mechanism used to trace data from the points of input to the points of output. Because there can be many data flows in an application, use your code review objectives and the flagged areas from Step 2 to focus your work. The process is as follows:
1. For each input location, determine how much you trust the source of input. When in doubt, give it no trust.
2. Trace the flow of data to each possible output, noting any attempts at data validation.
3. Move to the next input and continue.30
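The sketch below illustrates both techniques on a made-up request handler; the function, its parameter names, and the trust annotations are assumptions for this example, not material from the book. The comments mark each branch condition (control flow) and trace the untrusted input to the output point, noting the one validation attempt along the way (data flow).

```python
import html

def handle_request(params):          # input point: params comes from an untrusted query string
    name = params.get("name", "")    # data flow: name <- untrusted input, no trust granted
    if not name:                     # control flow: branch 1, empty input
        return "Hello, guest"
    if len(name) > 64:               # control flow: branch 2, length check only (not sanitization)
        return "Name too long"
    greeting = "Hello, " + name      # data flow: untrusted data concatenated into output string
    # Output point: html.escape() is the only validation attempt noted on this path.
    return "<p>" + html.escape(greeting) + "</p>"
```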
Data change analysis based on function call path
Published in International Journal of Computers and Applications, 2018
Cao Yong, Mu Yongmin, Shen Meie
In 1977, L.J. Osterweil and L.D. Fosdick [2] put forward the concept of data flow analysis. Data flow analysis is a technique [3] used to obtain information about how related data flow along the executed paths. In 2003, Bogdan Korel [4] proposed a slicing technique based on a state-based model; his research focused on system testing, and he reduced the set of regression test cases using an extended finite state machine (EFSM). In 2011, JinHui [5] introduced data flow analysis, drawing lessons from hardware fault diagnosis systems.
Mixed-language automatic differentiation
Published in Optimization Methods and Software, 2018
Valérie Pascual, Laurent Hascoët
Static data-flow analysis [10] is an essential step to achieve efficient differentiation with a source-to-source AD tool. The goal of static data-flow analysis is to provide information on the data computed and returned by a program without knowing the values of the program's run-time inputs. In other words, static data-flow analysis extracts useful information about the program at compile time, this information being thus valid for any run-time execution on any inputs. Obviously, such information can only be partial and must often resort to the undecidable ‘I don't know’ reply in addition to ‘yes’ or ‘no’. Abstract interpretation [4] is a framework for static data-flow analysis in which the values computed in the original code are replaced with abstract values containing the propagated abstract information. Classical examples of the abstract information that one may want to propagate are the interval in which a run-time value will range, or the set of possible destinations of each pointer variable. Starting from some abstract information on the inputs or outputs (which may be empty), abstract interpretation propagates it through the program, possibly guided by its control-flow structure. Instead of a true execution of the program, possible only at run time, this propagation must stand for every possible execution path. Some data-flow analyses follow these paths forward, others need to follow them backward. As call graphs may be cyclic (recursion) and flow graphs may be cyclic (loops), completion of the analysis requires reaching a fixed point both on the call graph and on each flow graph. The abstract domain in which the propagated information ranges is designed in such a way that this fixed point is reached in a finite number of iterations. Most of the classical data-flow analyses prove useful for AD, as do AD-specific analyses such as activity and TBR analyses [8,14]. In most AD-specific data-flow analyses, the abstract information is, for each variable v, a boolean value (e.g. does v influence the output in a differentiable way?) or a set of other variables (e.g. which input variables have a differentiable influence on v?).
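As a rough illustration of these ideas, here is a minimal sketch, assuming a toy interval domain and a hypothetical two-statement program (s = x; an unbounded loop executing s = s + k); it is not the implementation behind the cited AD tools. Concrete values are replaced by intervals, the paths meeting at the loop head are merged with a join, and widening forces the fixed point to be reached in finitely many iterations.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def join(self, other):
        # Least upper bound in the abstract domain: covers both merged paths.
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

    def add(self, other):
        # Abstract counterpart of '+' on concrete values.
        return Interval(self.lo + other.lo, self.hi + other.hi)

def widen(old, new):
    # Classical interval widening: push unstable bounds to infinity so the
    # fixed-point iteration terminates in a finite number of steps.
    return Interval(old.lo if new.lo >= old.lo else -math.inf,
                    old.hi if new.hi <= old.hi else math.inf)

# Abstract inputs: only the ranges are known at compile time.
x = Interval(0.0, 10.0)
k = Interval(1.0, 2.0)

# Forward propagation through:  s = x;  while <unknown condition>: s = s + k
s = x
while True:
    body = s.add(k)                # abstract effect of one loop iteration
    nxt = widen(s, s.join(body))   # merge paths at the loop head, then widen
    if nxt == s:                   # fixed point reached on this flow graph
        break
    s = nxt

print(s)  # Interval(lo=0.0, hi=inf): sound for every possible execution path
```

The printed interval over-approximates the concrete behaviour, which is exactly the partial, always-valid information the excerpt describes; a backward analysis would propagate abstract information from outputs toward inputs in the same fashion.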