Development of Autonomous Vehicles
Published in Diego Galar, Uday Kumar, Dammika Seneviratne, Robots, Drones, UAVs and UGVs for Operation and Maintenance, 2020
Diego Galar, Uday Kumar, Dammika Seneviratne
Sampling is a broad methodology for gathering statistics about physical and social phenomena; it provides a data source for predictive modeling in oceanography and meteorology. Adaptive sampling denotes sampling strategies that can change depending on prior measurements or analysis and thus allow adaptation to dynamic or unknown scenarios. One such scenario involves the deployment of multiple underwater vehicles for the environmental monitoring of large bodies of water, such as oceans, harbors, lakes, rivers, and estuaries. Predictive models and maps can be created by repeated measurements of physical characteristics, such as water temperature, dissolved oxygen, current strength and direction, and bathymetry. However, because the sampling volume could be quite large, only a limited number of measurements are usually available. Intuitively, a deliberate sampling strategy based on models will be more efficient than a random sampling strategy (Popa et al., 2004).
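The contrast between deliberate, model-guided sampling and random sampling can be sketched with a toy example. Here the "field" being measured, the 1-D transect, and the use of distance-to-previous-samples as an uncertainty proxy are all hypothetical simplifications, not the method of Popa et al.:

```python
def field(x):
    # Hypothetical "true" water-temperature profile (degrees C) along a transect.
    return 15.0 + 5.0 * (x / 100.0) ** 2

def adaptive_sample(n_samples, domain=(0.0, 100.0)):
    """Greedy adaptive strategy: take each new measurement at the candidate
    location farthest from all previous measurements, a crude proxy for
    'highest model uncertainty'. A random strategy would instead draw
    locations uniformly, possibly clustering them and wasting measurements."""
    xs = [sum(domain) / 2.0]                      # start mid-transect
    candidates = [domain[0] + i for i in range(101)]
    while len(xs) < n_samples:
        next_x = max(candidates, key=lambda c: min(abs(c - x) for x in xs))
        xs.append(next_x)
    return [(x, field(x)) for x in xs]

samples = adaptive_sample(5)
```

With only five measurements the adaptive strategy spreads them evenly over the transect, which is the intuition behind preferring a deliberate strategy when the sampling volume is large and measurements are scarce.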
Recent Trends in Adaptive Control Theory
Published in V. V. Chalam, Adaptive Control Systems, 2017
In a later paper, de la Sen [73] discusses another approach for improving behavior during the adaptation transient. The method is based on adaptive sampling, taking the tracking (or regulation) error between the reference-model and controlled-system outputs as the signal to which the sampling is adapted. No auxiliary input to the adaptive scheme is considered, and no extra hypothesis is made on the input to the plant or on the reference-model output. The adaptation algorithms used are of the same type as those commonly applied with constant sampling. The adaptation mechanism consists of (1) the usual feedforward-feedback controller (first control level), (2) the adaptation algorithms (second control level), (3) a sampling controller, which adapts the sampling period on-line to the error signal between the reference model and the controlled plant, and (4) an induced sampling controller acting on the output (while the sampling controller acts on the input). Simulation results show that adaptive sampling yields better performance than periodic sampling. Although attention is given to the response during the adaptation transient, the stability of the overall scheme is preserved. De la Sen [73a] has summarized the guidelines used for updating the free parameters and the sampling rate in adaptive systems to improve the adaptation transients, and has also focused on practical implementation using appropriate algorithms. Another paper of interest is de la Sen [73b].
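The sampling-controller idea can be illustrated with a minimal sketch. The mapping below (its gains, bounds, and the sample error trace) is a hypothetical stand-in, not de la Sen's actual adaptation law; it only shows the qualitative behavior of a period adapted to the model-following error:

```python
def adapt_period(error, t_min=0.01, t_max=0.5, k=10.0):
    """Hypothetical sampling law: a large error between reference model and
    plant shortens the sampling period (faster adaptation); a small error
    lengthens it. The result is clamped to [t_min, t_max] seconds."""
    return max(t_min, min(t_max, t_max / (1.0 + k * abs(error))))

# During an adaptation transient the tracking error typically decays;
# the sampling controller responds by progressively lengthening the period.
errors = [1.0, 0.5, 0.1, 0.01, 0.0]
periods = [adapt_period(e) for e in errors]
```

As the error shrinks the period grows toward its upper bound, so computational and control effort is concentrated where the transient demands it, which is the intuition behind the reported advantage over periodic sampling.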
Communication, Localization, Coverage, Error and Control, Time Synchronization, Naming and Addressing, and Cross-Layer Issues
Published in Vidushi Sharma, Anuradha Pughat, Energy-Efficient Wireless Sensor Networks, 2017
Anuradha Pughat, Parul Tiwari, Vidushi Sharma, Neeta Singh
These approaches can be broadly classified as data reduction and energy-efficient data acquisition. Data reduction is the process of converting the large body of data collected from sensors into a smaller, useful representation from which the original data can later be retrieved without loss. This is important for power management because transmitting data from nodes consumes considerable power, so reducing data size reduces power consumption. The data-reduction approach also aims to prevent nodes from transmitting unnecessary data to the sink, which lowers the transmission load on the node as well as the communication and processing overheads at the sink. The other approach, energy-efficient data acquisition, reduces data acquisition itself through energy-efficient algorithms. It is not limited to cutting the energy consumed by sensing; it also reduces the number of communications along with the amount of sensed data. Adaptive sampling exploits correlated, gradually changing data to reduce the number of sensing operations. Hierarchical sampling considers node characteristics such as accuracy and energy consumption, balancing these attributes against each other. Another energy-saving approach is model-based active sampling, in which data is represented by models built from the samples, so fewer communications between nodes are needed.
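A common way to exploit gradually changing data is to stretch the sampling interval while consecutive readings stay close and shrink it when they diverge. The doubling/halving rule, the threshold, and the temperature trace below are illustrative assumptions, not a scheme from the chapter:

```python
def next_interval(prev, curr, interval, delta=0.5, i_min=1, i_max=60):
    """Hypothetical adaptive-sampling rule for a sensor node: if consecutive
    readings differ by less than `delta`, double the sampling interval
    (the data is changing slowly); otherwise halve it to track the dynamics.
    The interval is kept within [i_min, i_max] seconds."""
    if abs(curr - prev) < delta:
        return min(i_max, interval * 2)
    return max(i_min, interval // 2)

# A mostly stable temperature trace with one abrupt change: the node backs
# off while readings are correlated, then re-engages when they jump.
readings = [20.0, 20.1, 20.1, 20.2, 23.0, 23.1]
interval = 4
intervals = []
for prev, curr in zip(readings, readings[1:]):
    interval = next_interval(prev, curr, interval)
    intervals.append(interval)
```

Fewer sensing operations mean fewer radio transmissions, which is where the bulk of the energy saving comes from on a typical node.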
Development of a reduced order model for severe accident analysis codes by singular value decomposition aiming probabilistic safety margin analysis
Published in Journal of Nuclear Science and Technology, 2020
Masaki Matsushita, Tomohiro Endo, Akio Yamamoto
The purpose of a ROM is quick reproduction of the results of an SA code. In constructing the ROM, it is desirable to minimize the number of training data points obtained from the SA code while maintaining the ROM's prediction accuracy, because an SA code requires long computational times. To apply the ROM to the CDP distribution calculation, the boundary between core intact and core damage in the input-parameter space (named the CD boundary in this study) should be accurately captured by the ROM. A data-sampling method that samples training data in the vicinity of the CD boundary is therefore desirable for ROM construction; however, the actual CD boundary is unknown at the beginning of ROM construction. Since a ROM can quickly reproduce many simulation results, an approximate CD boundary can be estimated from multiple ROM reproductions (named the CDROM boundary in this study). The CDROM boundary is used in place of the actual CD boundary when sampling training data. By sampling training data on the CDROM boundary, the prediction accuracy of the ROM improves because the selected data are concentrated around the CDROM boundary. This sampling method is classified as an adaptive sampling method.
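The idea of sampling new training points where the surrogate places the boundary can be sketched in one dimension. Everything below is a hypothetical stand-in: `expensive_code` and its threshold play the role of the SA code, and the midpoint of the differing-label pair plays the role of the CDROM boundary; the actual method works in a multi-dimensional input-parameter space with a real ROM:

```python
def expensive_code(x):
    # Stand-in for one SA-code run: "core damage" occurs above a
    # hypothetical threshold in a one-dimensional input parameter.
    return x > 0.37

def adaptive_boundary_sampling(n_runs):
    """Each iteration estimates the boundary from the current samples
    (here: the midpoint of the adjacent pair with differing labels) and
    spends the next expensive run exactly there, so training data
    accumulate around the boundary instead of covering the whole space."""
    samples = {0.0: expensive_code(0.0), 1.0: expensive_code(1.0)}
    for _ in range(n_runs):
        xs = sorted(samples)
        for a, b in zip(xs, xs[1:]):
            if samples[a] != samples[b]:
                mid = (a + b) / 2.0          # estimated boundary
                samples[mid] = expensive_code(mid)
                break
    xs = sorted(samples)
    for a, b in zip(xs, xs[1:]):
        if samples[a] != samples[b]:
            return (a + b) / 2.0             # final boundary estimate

est = adaptive_boundary_sampling(10)
```

After ten expensive runs the estimate brackets the true threshold tightly, whereas ten uniformly spaced runs would locate it only to within a tenth of the parameter range; this concentration of samples is what improves the ROM's accuracy near the CD boundary.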