Two-Dimensional Microfluidic Bioarray for Nucleic Acid Analysis
Published in Krzysztof Iniewski, Integrated Microsystems, 2017
Microfluidics also offers the advantage of multisample capabilities on one chip. Conventional microarray experiments usually employ one sample per glass slide [12]. However, in applications such as genetic mutation analysis, clinical diagnostics, or microorganism identification, a direct comparison between different samples on the same chip is preferable because the quality of slides with probe arrays varies from batch to batch [5]. Microfluidics allows the delivery of controlled volumes of samples and reagents to the DNA microarray, and by integrating multiple channels into one chip, high-throughput multisample analysis has been achieved [18,19]. Moreover, DNA hybridization assays depend on parameters such as temperature, hybridization stringency, and washing buffer conditions, and this parameter space can be too large to explore efficiently using conventional approaches. By contrast, because the microfluidic method combines multiple-sample injection with accurate control of liquid flow and temperature, it has been developed for the automated selection of optimal assay parameters [20].
What Should Be Modeled in Cancer
Published in Vittorio Cristini, Eugene J. Koay, Zhihui Wang, An Introduction to Physical Oncology, 2017
Alejandra C. Ventura, Sofia D. Merajver
Given the enormity of the unknowns, what is necessary to begin unraveling the problem? Successful identification of transmembrane receptors, intracellular signaling proteins, and transcription factors that mediate the responses of cells to intra- and extracellular ligands has generated a wealth of information about the biochemistry of signal transduction [18]. It is important to note that most biochemical and molecular biology experiments tend to be biased toward ascertaining large static differences in the expression (or modification) of proteins or genes, rather than subtle steady-state differences or significant dynamical profiles. In this manner, current thinking about signaling is driven by, yet limited to, those types of data. Moreover, accumulation of molecular detail does not automatically yield improved understanding of the ways in which signaling circuits process complementary and opposing inputs to control diverse physiological responses. For this, network-level perspectives are required [19]. When the number of species in the network is large, parameter estimation becomes very challenging [20]. A plausible alternative approach is to depict the pathway as a collection of modules connected to each other through their input–output properties. Another useful approach when parameter estimation is challenging is to explore the parameter space exhaustively.
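The exhaustive exploration just mentioned can be sketched in a few lines. The following is a toy illustration only, assuming a hypothetical Hill-type input–output module with made-up parameter names (K, n) and grid ranges; it is not any model from the chapter:

```python
# Toy sketch of exhaustive parameter-space exploration for a signaling module.
# The Hill-type input-output function, parameter names (K, n), and grid
# ranges below are illustrative assumptions, not taken from the source.

def module_output(signal, K, n):
    """Steady-state output of a hypothetical Hill-type input-output module."""
    return signal ** n / (K ** n + signal ** n)

# Exhaustive scan: characterize each (K, n) pair by how switch-like the
# module's response is around its half-saturation point.
results = {}
for K in [0.1, 0.5, 1.0, 2.0]:
    for n in [1, 2, 4, 8]:
        # Steepness proxy: output change across a four-fold input step around K.
        low = module_output(0.5 * K, K, n)
        high = module_output(2.0 * K, K, n)
        results[(K, n)] = high - low

# Parameter combination with the steepest (most switch-like) response:
steepest = max(results, key=results.get)
print(steepest)
```

Scanning a regular grid like this sidesteps parameter estimation entirely: instead of fitting one parameter set, the module's qualitative behavior is mapped over the whole (discretized) parameter space.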
Framework for Biomedical Algorithm Designs
Published in Pietro Salvo, Miguel Hernandez-Silveira, Krzysztof Iniewski, Wireless Medical Systems and Algorithms, 2017
Su-Shin Ang, Miguel Hernandez-Silveira
There are two aspects to an optimization problem—the objective function and the set of constraints. The optimization problem is defined over the parameter space, where each point in the parameter space has a certain cost or benefit, which is quantified by means of an objective function. The scope of this parameter space is in turn defined by the constraints, which are characterized by a set of equalities or inequalities. More formally, the objective function can be specified according to Equation 6.1:

Minimize f(x), f : ℝⁿ → ℝ (6.1)
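The two aspects above can be made concrete with a minimal sketch. The quadratic objective and single inequality constraint below are invented for illustration (they are not from the chapter), and the constrained minimum is found by brute-force search over a discretized parameter space:

```python
# Minimal sketch of an optimization problem in the form of Equation 6.1:
# minimize f(x) subject to an inequality constraint g(x) <= 0.
# The objective, constraint, and search range are illustrative assumptions.

def f(x):
    """Objective function: the cost assigned to each point of the parameter space."""
    return (x - 3.0) ** 2 + 1.0

def g(x):
    """Constraint g(x) <= 0 restricts the feasible parameter space to x <= 2."""
    return x - 2.0

# Brute-force search over a discretized parameter space x in [-5, 5].
candidates = [i / 100 for i in range(-500, 501)]
feasible = [x for x in candidates if g(x) <= 0.0]  # constraints define the scope
best = min(feasible, key=f)                        # objective ranks the points
print(best, f(best))
```

The unconstrained minimum of f sits at x = 3, but the constraint excludes it, so the search returns the boundary point x = 2.0 with cost f(2) = 2.0, illustrating how the constraints reshape where the optimum lies.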
Effects of grid-size on effective parameters and model performance of SHETRAN for estimation of streamflow and sediment yield
Published in International Journal of River Basin Management, 2021
The SHETRAN model demands a multitude of input parameters, and the determination of these parameter values involves large uncertainties (Bahremand & De Smedt, 2008). Knowing which parameters the results are most sensitive to makes it possible to build hydrologic models with reduced parameter uncertainty that produce better simulations (Lenhart et al., 2002). Sensitivity analysis reduces the parameter dimension, thereby assisting the model calibration procedure (Demaria et al., 2007). Sensitivity analysis methods can be grouped into local techniques, in which sensitivity to parameters is estimated around one point by varying one value at a time, and global techniques, which examine sensitivity over the whole parameter space (Arnold et al., 2012; Pappenberger et al., 2008; Yang, 2011). A two-stage sensitivity analysis consisting of the Morris method (Morris, 1991) followed by a local sensitivity analysis helped reduce the SHETRAN model parameter space and was found to be effective in manual calibration of streamflow in the Netravathi river basin (Sreedevi & Eldho, 2019).
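The Morris screening idea, averaging one-at-a-time "elementary effects" over random starting points, can be sketched as follows. The two-parameter model, parameter names, ranges, step size, and trajectory count below are illustrative stand-ins, not SHETRAN:

```python
# Illustrative sketch of Morris-style elementary effects for screening which
# parameters a model is most sensitive to. The toy linear "model", parameter
# names, ranges, and settings are assumptions for demonstration only.
import random

def model(params):
    """Toy model: output strongly driven by 'roughness', weakly by 'storage'."""
    return 10.0 * params["roughness"] + 0.1 * params["storage"]

bounds = {"roughness": (0.0, 1.0), "storage": (0.0, 1.0)}
delta = 0.25          # fixed one-at-a-time step, as in the Morris design
random.seed(0)

effects = {name: [] for name in bounds}
for _ in range(20):                          # 20 random starting points
    base = {n: random.uniform(lo, hi - delta) for n, (lo, hi) in bounds.items()}
    y0 = model(base)
    for name in bounds:                      # perturb one parameter at a time
        perturbed = dict(base, **{name: base[name] + delta})
        effects[name].append(abs(model(perturbed) - y0) / delta)

# Mean absolute elementary effect (the mu* statistic) ranks parameter influence.
mu_star = {n: sum(v) / len(v) for n, v in effects.items()}
print(mu_star)
```

Parameters with small mu* can be fixed at nominal values, shrinking the parameter space before the more expensive local analysis and manual calibration stages.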
Reverse dosimetry modeling of toluene exposure concentrations based on biomonitoring levels from the Canadian health measures survey
Published in Journal of Toxicology and Environmental Health, Part A, 2018
Honesty Tohon, Andy Nong, Marjory Moreau, Mathieu Valcke, Sami Haddad
Global sensitivity analyses were conducted for each simulated exposure duration to identify the model parameters that were critical to the current analysis and thus most influential on model outputs. Such analysis enabled identification of the critical determinants of toluene TK simulated in our models. It was conducted by evaluating how much a given change in the values of the model's input parameters, such as body height (BH), cardiac output (Qc), alveolar ventilation (Qalv), liver blood flow (Ql), fat blood flow (Qf), liver volume (Vl), CYP2E1 concentration (CYP), and partition coefficients, influences the output of interest, namely the venous blood concentration of toluene (Cv), the internal dose measure of interest. Specifically, the global sensitivity coefficients (GSC, here Fourier coefficients) were computed automatically for the selected input parameters in the ACSL software, based on the extended Fourier amplitude sensitivity test (EFAST) method. This global sensitivity analysis method estimates the model output variability arising from variability in model inputs over the entire parameter space and accounts for interaction effects between the parameters. This is not the case with the often-used local sensitivity analyses, which are computationally efficient but can neglect these interactions and simultaneous variations in the input parameters, potentially leading to misleading results (Hsieh et al., 2018).
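The variance-apportioning idea behind EFAST can be illustrated with a crude Monte Carlo estimate of first-order sensitivity indices, Var(E[Y|X])/Var(Y). The toy output function standing in for Cv, the two parameters kept (Qalv, CYP), and their ranges are invented for this sketch and are not the authors' toluene model or the EFAST algorithm itself:

```python
# Hedged sketch of variance-based global sensitivity (the idea behind EFAST):
# apportion output variance among inputs varied jointly over their full ranges.
# The toy output function and parameter ranges are illustrative assumptions.
import random

def toy_cv(qalv, cyp):
    """Stand-in for a simulated venous blood concentration Cv (illustrative)."""
    return 5.0 / qalv + 0.5 * cyp + 0.2 * qalv * cyp  # includes an interaction term

def first_order_index(xs, ys, bins=40):
    """Crude first-order index Var(E[Y|X]) / Var(Y), estimated by binning X."""
    lo, hi = min(xs), max(xs)
    buckets = [[] for _ in range(bins)]
    for x, yv in zip(xs, ys):
        i = min(int((x - lo) / (hi - lo) * bins), bins - 1)
        buckets[i].append(yv)
    n = len(ys)
    mean_y = sum(ys) / n
    var_y = sum((v - mean_y) ** 2 for v in ys) / n
    var_cond = sum(len(b) / n * (sum(b) / len(b) - mean_y) ** 2 for b in buckets if b)
    return var_cond / var_y

random.seed(1)
N = 20000
qalv = [random.uniform(4.0, 8.0) for _ in range(N)]  # assumed Qalv range
cyp = [random.uniform(0.5, 2.0) for _ in range(N)]   # assumed CYP2E1 range
y = [toy_cv(q, c) for q, c in zip(qalv, cyp)]

s_qalv = first_order_index(qalv, y)
s_cyp = first_order_index(cyp, y)
print(s_qalv, s_cyp)
```

Because all inputs vary simultaneously, the interaction term contributes variance that no single first-order index captures, which is exactly the effect a one-at-a-time local analysis would miss.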
Metaheuristic-based crack detection in beam-type structures using peridynamics theory: A comparative study
Published in Mechanics of Advanced Materials and Structures, 2023
Ehsan Afshari, Farshid Mossaiby, Taha Bakhshpoori
Adjusting or tuning the parameters of a metaheuristic within its parameter space is crucial and can itself be considered an optimization task. However, a deep sensitivity analysis of algorithm parameter configurations is unattainable here. Therefore, the best parameter values reported by other researchers for these algorithms in structural optimization problems are used. The parameters used in the algorithms are tabulated in Table 1, where np stands for the population size of the algorithms. For the definitions of the other parameters, the reader may consult the study by Kaveh and Bakhshpoor [50] for more information.
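Treating tuning as an optimization task, as noted above, can be sketched as an outer loop that scores candidate configurations of an inner search. The "metaheuristic" below is a bare-bones greedy random walk invented for illustration, and the configuration names and ranges are assumptions, not the algorithms or values of Table 1:

```python
# Sketch of metaheuristic parameter tuning posed as an outer optimization:
# each (step, np) configuration is scored by running a toy search on a
# benchmark. The toy search and configuration grid are assumptions only.
import random

def sphere(x):
    """Benchmark objective used to judge a parameter configuration."""
    return sum(xi ** 2 for xi in x)

def toy_search(step, np_, iters=200, dim=5, seed=0):
    """Toy metaheuristic: a greedy population random walk.
    Its tunable parameters are the step size and population size np_."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(np_)]
    best = min(pop, key=sphere)
    for _ in range(iters):
        for i, x in enumerate(pop):
            cand = [xi + rng.gauss(0, step) for xi in x]
            if sphere(cand) < sphere(pop[i]):   # greedy acceptance
                pop[i] = cand
        best = min(pop + [best], key=sphere)
    return sphere(best)

# Outer tuning loop over the metaheuristic's own parameter space.
configs = [(s, n) for s in (0.01, 0.1, 1.0) for n in (5, 20)]
scores = {c: toy_search(*c) for c in configs}
best_config = min(scores, key=scores.get)
print(best_config, scores[best_config])
```

This outer loop is exactly the tuning problem the text describes; in practice its cost is why published configurations, such as those collected in Table 1, are reused instead.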