Intelligent Systems
Published in R.S. Chauhan, Kavita Taneja, Rajiv Khanduja, Vishal Kamra, Rahul Rattan, Evolutionary Computation with Intelligent Systems, 2022
J. Senthil Kumar, G. Sivasankar
An autonomous mobile robot navigation scheme moves the robot from its starting position to a specified destination on the map using sensor data. Sensors mounted on autonomous mobile robots capture the state of the environment and generate large volumes of raw data; some sensors also capture the internal state of the robot's own parameters. Based on the data from these exteroceptive and proprioceptive sensors, learning algorithms can improve the accuracy of the perception modules and help steer the robotic system in the desired direction. For example, from the odometric sensors mounted on the TurtleBot3 robot, its pose and heading can be estimated using the odometry data. Beyond odometry, other sensor data can be fused to estimate the robot's position in the environment more accurately. On the TurtleBot3, the odometry information is updated in the map using the wheel optical encoders, the IMU, and the distance sensor, and the ROS odometry topic provides the estimated pose of the robot. Figure 6.4 shows the ROS navigation architecture for the proposed social navigation application.
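As an illustrative sketch (not from the chapter), the estimated pose published on the ROS odometry topic can be read with a simple rospy subscriber. The topic name `/odom` and the `euler_from_quaternion` helper are assumptions based on the standard TurtleBot3 / ROS 1 setup.

```python
# Minimal sketch: reading the TurtleBot3 odometry topic with rospy (ROS 1)
# and converting the quaternion orientation to a yaw angle.
# The topic name '/odom' is the TurtleBot3 default; adapt as needed.
import rospy
from nav_msgs.msg import Odometry
from tf.transformations import euler_from_quaternion

def odom_callback(msg):
    # Pose estimated by fusing the wheel encoders and IMU on the TurtleBot3.
    x = msg.pose.pose.position.x
    y = msg.pose.pose.position.y
    q = msg.pose.pose.orientation
    _, _, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])
    rospy.loginfo("pose: x=%.2f m, y=%.2f m, yaw=%.2f rad", x, y, yaw)

if __name__ == "__main__":
    rospy.init_node("odom_listener")
    rospy.Subscriber("/odom", Odometry, odom_callback)
    rospy.spin()
```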
Multisensor Precise Positioning for Automated and Connected Vehicles
Published in Hussein T. Mouftah, Melike Erol-Kantarci, Sameh Sorour, Connected and Autonomous Vehicles in Smart Cities, 2020
Mohamed Elsheikh, Aboelmagd Noureldin
Odometry is the calculation of the change in position and the speed of an object using motion sensors. The distance traveled and the speed of a land vehicle can be determined by measuring the rotation of its wheels. Traditionally, an odometer fitted to the transmission shaft of the vehicle measured its speed; however, most new vehicles have a sensor on each wheel, known as a wheel speed sensor, which is used by the antilock braking system [12]. Knowing the forward speed of the vehicle can enhance the navigation solution, either by using it to reduce the inertial system model [16,19] or as a measurement update. In most vehicles, the forward speed can be accessed through the on-board diagnostics interface. However, the speed obtained from this interface is usually provided at a low output rate and with low resolution, which causes quantization errors.
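A hedged sketch of the basic wheel-odometry computation follows: distance and speed derived from encoder ticks over one sampling interval. The wheel radius, encoder resolution, and sampling interval are assumed values for illustration; the integer tick count mirrors the resolution limit that produces quantization errors.

```python
# Sketch of wheel odometry: distance and speed from encoder ticks.
import math

WHEEL_RADIUS_M = 0.30   # assumed wheel radius
TICKS_PER_REV = 100     # assumed encoder resolution
DT_S = 0.1              # assumed sampling interval (10 Hz)

def wheel_speed(tick_count_prev, tick_count_now):
    """Return distance [m] and speed [m/s] over one sampling interval."""
    delta_ticks = tick_count_now - tick_count_prev            # integer -> quantized
    distance = 2.0 * math.pi * WHEEL_RADIUS_M * delta_ticks / TICKS_PER_REV
    return distance, distance / DT_S

# Example: 7 ticks in 0.1 s -> ~0.13 m travelled, ~1.3 m/s
print(wheel_speed(120, 127))
```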
Navigation, Environment Description, and Map Building
Published in Marina Indri, Roberto Oboe, Mechatronics and Robotics, 2020
Henry Carrillo, Yasir Latif, José A. Castellanos
Over the years, a vast number of sensors with different capabilities have been employed to give the robot a better understanding of its surroundings. The initial SLAM solutions used sonar range finders that provided range and (very noisy) bearing measurements. Algorithms that extract features such as corners or lines representing flat walls [81] are part of the front-end. The operation of the front-end is heavily sensor dependent, and front-end algorithms often deal with a specific sensing modality. Computing quantities such as laser odometry from a laser range finder, the incremental change in camera position between two frames, position estimates from inertial measurement units (IMUs), the fusion of multiple sensor measurements, etc., are all done at the front-end. With each new sensing capability, new front-end algorithms are developed to abstract away the sensor and extract the needed measurements (such as relative motion constraints).
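As a minimal sketch (not from the chapter) of one such front-end computation, the incremental 2D rigid motion between two laser scans can be estimated by a single least-squares alignment step, assuming point correspondences are already known (the inner step of ICP-style laser odometry):

```python
# One least-squares rigid alignment step between corresponding 2D points,
# as used inside ICP-style laser odometry.
import numpy as np

def relative_motion(src, dst):
    """src, dst: (N, 2) arrays of corresponding points; returns (R, t)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)      # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Example: a scan rotated by 10 degrees and shifted by (0.5, 0.1)
theta = np.deg2rad(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
scan0 = np.random.rand(50, 2)
scan1 = scan0 @ R_true.T + np.array([0.5, 0.1])
R_est, t_est = relative_motion(scan0, scan1)
```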
Fast Bayesian graph update for SLAM
Published in Advanced Robotics, 2022
Shun Taguchi, Hideki Deguchi, Noriaki Hirose, Kiyosumi Kidono
Simultaneous localization and mapping (SLAM) is an extremely important technology for autonomous mobile robots and has been the subject of much research. In recent years, graph-based SLAM [1] has become the most successful SLAM framework, and it has been demonstrated to perform well with light detection and ranging (LiDAR) systems and cameras [2]. Graph-based SLAM expresses the positional relationships between the robot and landmarks as a graph: the poses of the robot and the landmarks in the environment are nodes, and any constraint between them is an edge. The edge constraints are estimated using odometry and sensor observations, and the node values are computed by graph optimization subject to the edge constraints. In general, even when environmental sensors are used, odometry errors grow with the distance traveled. With graph-based SLAM, however, an edge between a past node and the current node can be estimated from the external information provided by the environmental sensors, enabling correction of the entire graph. This process, called loop closure, is a significant advantage of the graph-based model: even when the traveled distance is long, loop closure prevents error accumulation by revisiting previously visited features.
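A minimal sketch (not from the paper) of this idea in one dimension: nodes are robot positions, edges are relative-motion constraints from odometry plus one loop-closure constraint, and the node values are obtained by linear least squares. All measurement values below are made up for illustration.

```python
# 1D pose-graph optimization: odometry edges plus one loop-closure edge.
import numpy as np

# Edges: (i, j, measured displacement x_j - x_i)
edges = [
    (0, 1, 1.05),   # odometry, slightly overestimated
    (1, 2, 1.07),
    (2, 3, 1.04),
    (0, 3, 3.00),   # loop closure: node 3 observed 3.0 m from node 0
]

n = 4
A = np.zeros((len(edges) + 1, n))
b = np.zeros(len(edges) + 1)
for row, (i, j, z) in enumerate(edges):
    A[row, i], A[row, j], b[row] = -1.0, 1.0, z
A[-1, 0] = 1.0                      # anchor node 0 at the origin
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)   # corrected node positions after the loop closure spreads the error
```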
About One Way to Increase the Accuracy of Navigation System for Ground Wheeled Robot Used in Aircraft Parking
Published in Smart Science, 2020
Alexander I. Chernomorsky, Konstantin S. Lelkov, Eduard D. Kuris
The task of modeling the navigation system under consideration was to verify the developed algorithm for correcting the operation of the autonomous integrated navigation system of the GWR as it moves along an arbitrary trajectory in the aircraft parking area, for the selected characteristic parameters of the GWR components. The circumferences of the right and left wheels are both 1.57 m. The measurement frequency of the odometer system is 100 Hz. The odometric measurements of the angular speeds of the wheels contain errors caused by slippage of the GWR wheels; these errors are harmonic oscillatory processes. The simulation time is 500 s. Figures 4 and 5 show graphs of the angular velocities of the GWR, taking into account their measurement errors.
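A hedged sketch of such simulated odometer measurements is given below: a true wheel angular speed corrupted by a harmonic error representing slippage. Only the 100 Hz measurement frequency, 500 s simulation time, and 1.57 m wheel circumference come from the text; the true speed, error amplitude, and error frequency are assumed values.

```python
# Simulated odometer measurements with a harmonic slippage error.
import numpy as np

F_ODO_HZ = 100.0                  # odometer measurement frequency (from text)
T_SIM_S = 500.0                   # simulation time (from text)
WHEEL_CIRCUMFERENCE_M = 1.57      # wheel circumference (from text)

t = np.arange(0.0, T_SIM_S, 1.0 / F_ODO_HZ)
omega_true = 2.0 * np.ones_like(t)                 # assumed true angular speed [rad/s]
slip_error = 0.05 * np.sin(2.0 * np.pi * 0.2 * t)  # assumed harmonic slippage error
omega_meas = omega_true + slip_error

# Forward speed of each wheel implied by the measured angular speed
v_meas = omega_meas * WHEEL_CIRCUMFERENCE_M / (2.0 * np.pi)
```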
Near-optimal sliding mode control for multi-robot consensus under dynamic events
Published in Advanced Robotics, 2023
Anuj Nandanwar, Narendra Kumar Dhar, Laxmidhar Behera, Rajesh Sinha
We use three Pioneer P3-DX robots in an obstacle-free MRS framework for real-time experiments. Of the three robots, two are followers and the third is the leader. All of the robots have onboard sonar sensors and position encoders; the odometry sensor provides the position and orientation of each robot. Each robot carries an onboard computer running the Robot Operating System (ROS) on Ubuntu 14. The robots are capable of handling several control tasks and bidirectional data transfers. A dedicated Wi-Fi router (Tinda) maintains network connectivity across the framework. Communication between the onboard computer and the motion control card takes place over an RS-232 serial port.
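As an illustrative sketch only (not the paper's near-optimal sliding mode controller), a plain discrete-time leader-follower consensus update in which each follower steers toward the leader's pose reported by odometry; the gain, control period, and poses are made-up values.

```python
# Plain leader-follower consensus update on 2D odometry positions.
import numpy as np

K = 0.5                              # assumed consensus gain
DT = 0.1                             # assumed control period [s]

leader = np.array([2.0, 1.0])        # leader (x, y) from its odometry
followers = [np.array([0.0, 0.0]), np.array([0.5, -1.0])]

for step in range(100):
    followers = [p + K * DT * (leader - p) for p in followers]

print(followers)   # both followers converge toward the leader's position
```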