For a robot to perform even simple movements, it must be aware of its own structure and how to actuate its different parts. As an example, a legged robot needs to actuate its leg joints so that they follow specific trajectories which result in a walking gait. In its simplest form, a robot navigates to a different location by integrating measurements of its own motion, such as joint encoder readings, over time — a technique known as odometry. However, inaccuracies in the assumed knowledge of the robot's state, together with external disturbances such as obstacles or foot slippage, cause the robot to end up in a different location from where it perceives itself to be. Moreover, for a robot to be more autonomous, it has to plan its motion through more complex environments. This requires building a map of the environment and recognising where the robot is with respect to this map: simultaneous localisation and mapping (SLAM). SLAM algorithms can use different types of sensors, such as stereo cameras or LiDARs.
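To illustrate how odometry drifts, the following sketch dead-reckons a 2D pose by Euler-integrating body velocities. The velocities, slippage values, and time step are all hypothetical, chosen only to show how a small unmodelled disturbance makes the believed and actual poses diverge:

```python
import math

def integrate_odometry(pose, v, omega, dt):
    """Dead-reckon a 2D pose (x, y, theta) from forward and angular velocity."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# The robot believes it drives straight at 0.5 m/s for 10 s...
believed = (0.0, 0.0, 0.0)
actual = (0.0, 0.0, 0.0)
for _ in range(100):
    believed = integrate_odometry(believed, 0.5, 0.0, 0.1)
    # ...but slippage reduces its true speed and adds a small heading bias
    # (illustrative numbers, not measured values).
    actual = integrate_odometry(actual, 0.45, 0.01, 0.1)

drift = math.hypot(believed[0] - actual[0], believed[1] - actual[1])
print(f"believed x: {believed[0]:.2f} m, position drift: {drift:.2f} m")
```

After only ten seconds the two estimates differ by roughly half a metre, which is why odometry alone is not enough and exteroceptive sensing, as in SLAM, is needed to correct the estimate.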
We are currently working on advanced motion planning algorithms that would allow our hexapod robot, Corin, to navigate through normally inaccessible areas such as narrow pathways. We have also implemented SLAM algorithms on our other robot platforms, including Carma and Mirrax.