Author: Cassandra Marchant

Autonomous Exploration and Radiation Mapping with Mobile Robots

As part of its participation in the RAIN Hub, the Dynamic Robot Systems Group (DRS) of the Oxford Robotics Institute (ORI) has developed a general-purpose autonomous exploration planner dubbed the Terrain Explorer. The Terrain Explorer integrates with external volumetric sensors (either a 3D scanning LIDAR or depth cameras) to build a 3D representation of the robot's surroundings and to autonomously expand it. To date, the modular system has been integrated with several robots, including the ANYbotics ANYmal, the Ross Robotics EXTRM rover and a Clearpath Husky. The Terrain Explorer has enabled the ORI to carry out fully autonomous robotic inspections in complex environments, including a UKAEA nuclear storage facility and a vast underground mine. The system integrates with the ORI's VILENS SLAM system to produce complete LIDAR maps of the explored environment.


The Terrain Explorer finds frontiers to unexplored areas (yellow and red spheres in the image) and directs the robot to the most desirable one. Our SLAM system computes the robot's trajectory (yellow path) and produces a graph-based map of the environment.

To employ this system in a radiation environment, the ORI has partnered with Createc to deploy a modularised version of their NVisage® / RECON system as part of an Innovate UK funded project. This system adds an overlay to the LIDAR maps detailing the radioactivity of every point in the observed map. Modularisation means that only those software and hardware components not already native to the robotic platform need to be integrated. For the majority of the trials performed, Createc provided a CZT gamma detector and the NVisage® radiation mapping software.

The Terrain Explorer is an open-ended planner which explores an arbitrary space. The system can be constrained by setting a perimeter or a time limit for exploration. The planner automatically returns the robot to its starting position once exploration is complete, or should other external factors such as a battery warning be triggered. To do this it takes advantage of the pose-graph representation within Oxford's SLAM system, with the poses acting as a highway that enables efficient backtracking and re-routing. Additionally, the system can handle communications blackouts by returning to areas where communications previously met a quality-of-service baseline. Further features, such as preferentially following the perimeter of the space, can be configured to take advantage of prior knowledge of the environment.
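The core loop of a frontier-based planner of this kind can be sketched in a few lines: score each frontier by expected gain minus the travel cost computed over the pose graph, and drive to the best one. The following Python toy is purely illustrative (the graph, gain values and scoring rule are invented for the example; it is not the ORI implementation):

```python
import heapq

def dijkstra(graph, start):
    """Shortest-path costs from start over a pose-graph adjacency dict."""
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if d > dist.get(node, float("inf")):
            continue
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(pq, (nd, nbr))
    return dist

def pick_frontier(graph, robot_pose, frontiers, gains):
    """Score frontiers by information gain minus travel cost; return the best."""
    dist = dijkstra(graph, robot_pose)
    best, best_score = None, float("-inf")
    for f in frontiers:
        score = gains[f] - dist.get(f, float("inf"))
        if score > best_score:
            best, best_score = f, score
    return best

# Toy pose graph: nodes are past poses, edges carry travel cost.
# Because every past pose is a node, returning home (or re-routing around a
# blackout) is just another shortest-path query over the same graph.
graph = {
    "home": [("p1", 1.0)],
    "p1": [("home", 1.0), ("f_a", 2.0), ("p2", 1.0)],
    "p2": [("p1", 1.0), ("f_b", 0.5)],
}
goal = pick_frontier(graph, "p1", ["f_a", "f_b"], {"f_a": 5.0, "f_b": 4.0})
```

The same shortest-path query over the pose graph is what makes the "highway" backtracking cheap: no new planning problem is solved, only graph search over poses the robot has already visited.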


Here the NVisage® radiation point cloud is displayed while the robot is exploring. A blue point means that no radiation was measured there; warmer colours mean that the sensor has detected radiation. The source is located at the green cube.

Several trials have been performed to prove the capabilities of the integrated system. It was deployed to an underground mine complex to demonstrate the robustness of the planner in an unstructured and irregular environment. This video shows the robot exploring the entirely unlit facility, using only its own on-board lighting.

The system has also been tested in a low-hazard drummed waste store at the UKAEA site in Culham, as shown here.

A final demo for the project is planned at Createc’s Cockermouth testing facility on 18th February to demonstrate the results of the collaboration through the RAIN Hub’s seminar series.

Professor Robert Buckingham Celebrated in New Year Honours List

RAIN's Professor Rob Buckingham has been appointed an Officer of the Order of the British Empire (OBE) for his services to robotic engineering. The UK Atomic Energy Authority's (UKAEA) Robotics Director was one of those named in the full New Year Honours list for 2021, which recognises the achievements and service of extraordinary people across the United Kingdom. The awards will be presented by Queen Elizabeth II or her vice-regal representative.

You can read the full article here.

Remote Inspection of the Reactor 4 Shelter’s Western Wall

In October 2020, a University of Bristol research team of engineers and scientists from the Interface Analysis Centre deployed a semi-autonomous robot for remote radiation inspection inside the Chornobyl Nuclear Power Plant (ChNPP) in Ukraine. The objective of the visit was to create a radiation map of the western wall of the Shelter Object, known as the Sarcophagus, a temporary containment structure hastily erected over the remains of Reactor 4 in the aftermath of the accident on 26 April 1986. The Shelter is facing structural integrity challenges and will need to be dismantled in the upcoming years so that the contained reactor remains can be taken apart.

By deploying a specialised sensor system called YanDavos on Boston Dynamics' quadrupedal robotics platform Spot, the researchers were able to acquire an accurate dose-rate map of the Shelter's wall without exposing humans to high radiation levels. Spot moved the sensor system into position in an environment where the dose rate exceeds 100 µSv/h, a hazardous radiation level for humans, and remained there until the scan was completed. YanDavos was designed and developed at the University of Bristol by RAIN researchers, and combines state-of-the-art gamma spectroscopy, simultaneous single-point LiDAR ranging and 15 FPS solid-state LiDAR ranging with 4K cameras to produce detailed radiation maps that can be used to plan decommissioning activities. Spot also completed a radiation mapping test around the outer perimeter of the Shelter as a training exercise for its radiation mapping algorithms.
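The essential idea of pairing a gamma detector with LiDAR ranging is that a dose-rate reading at the detector can be attributed to the surface point the LiDAR is looking at. A deliberately simplified Python sketch of that back-projection is below (it assumes a point source on the surface and pure inverse-square fall-off, neglecting air attenuation and detector response; it is not YanDavos's actual processing):

```python
def dose_rate_at_surface(detector_dose_uSv_h, range_m, reference_m=1.0):
    """Back-project a dose-rate reading taken at the detector onto the
    surface point measured by the LiDAR, assuming a point source on that
    surface and inverse-square fall-off. The result is the estimated dose
    rate at `reference_m` metres from the surface point."""
    return detector_dose_uSv_h * (range_m / reference_m) ** 2

# e.g. 100 µSv/h measured 2 m from the wall implies roughly 400 µSv/h
# at 1 m from the source under these (strong) assumptions.
estimate = dose_rate_at_surface(100.0, 2.0)
```

Repeating this for every LiDAR return yields the kind of per-point dose-rate map described above, which can then be draped over the 3D model of the wall.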

You can watch the documentary excerpt for Ukrainian TV here, and a feature on the RT news network Ruptly here.

Dr Cardoso and Dr Ferrando's secondment to NIST – US

RAIN researchers Dr Rafael C. Cardoso and Dr Angelo Ferrando were unable to travel to their secondment at the National Institute of Standards and Technology (NIST) in the US because of the COVID-19 pandemic. Instead, they have been collaborating with NIST virtually on two different projects.

Dr Cardoso is leading the project on agile tasking of robotic systems with explicit verifiable autonomy. Task agility is an increasingly desirable feature for robots in application domains such as manufacturing. The Canonical Robot Command Language (CRCL) is a lightweight information model developed by NIST for agile tasking of robotic systems. CRCL replaces the underlying complex, proprietary robot programming interfaces with a standard interface. In this project, we replace the automated planning component previously used with CRCL with a rational agent written in the Gwendolen agent programming language, thus providing greater possibilities for formal verification and explicit autonomy. We have evaluated our approach by performing agile tasking in a kitting case study.
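The shape of the approach, a deliberative agent deciding what to do and emitting standard low-level commands, can be illustrated with a small Python toy. The command names below are only loosely modelled on CRCL, and the loop is a stand-in for the Gwendolen agent, not a translation of it; part names and slots are invented:

```python
def kitting_agent(kit_contents, tray):
    """Minimal deliberation loop for kitting: while the goal (a complete
    kit) is unmet, pick the next missing part and emit abstract robot
    commands. Hypothetical command vocabulary, loosely CRCL-like."""
    commands, kit = [], []
    for part in kit_contents:
        if part not in tray:
            continue  # belief: part unavailable; a real agent would replan
        commands += [
            ("MoveTo", tray[part]),   # move above the part's tray slot
            ("CloseGripper", part),   # grasp the part
            ("MoveTo", "kit_tray"),   # carry it to the kit
            ("OpenGripper", part),    # release it into the kit
        ]
        kit.append(part)
    return commands, kit

cmds, kit = kitting_agent(["gear", "bolt"],
                          {"gear": "slot_1", "bolt": "slot_2"})
```

The point of the separation is that the agent's decision logic (which part next, what to do when one is missing) is explicit and small enough to verify, while the emitted commands stay in a standard, controller-independent vocabulary.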

The NIST agility lab, with two robot arms performing kitting operations in the real world (left) and in simulation (right).

Dr Ferrando is leading the project on runtime verification for ARIAC (the Agile Robotics for Industrial Automation Competition). ARIAC is a robotics competition which aims to advance robotic agility in industry. Participants are required to implement a robot control system to overcome agility challenges in a simulated environment. ARIAC comes with a set of score metrics to evaluate the performance of each control system during task execution. In this project, we show how such task-oriented evaluation can be problematic and how adding runtime monitors to verify properties drawn from ISO/TS safety standards can help reduce the resulting reality gap. In particular, we focused on an initial case study in which a safety property from ISO/TS 15066:2016 is used to synthesise a monitor.
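To give a flavour of what such a monitor checks, here is a deliberately simplified Python sketch in the spirit of the ISO/TS 15066 speed-and-separation-monitoring property: the human-robot separation must never drop below a protective distance derived from the two speeds and the robot's reaction and stopping times. The constants and the crude braking term are assumptions for illustration; the full standard includes position-uncertainty terms omitted here, and this is not the monitor synthesised in the project:

```python
def protective_distance(v_human, v_robot, t_react, t_stop, intrusion=0.1):
    """Simplified protective separation distance (metres): distance covered
    by human and robot during the robot's reaction and stopping phases,
    plus an intrusion margin. Uncertainty terms from the standard omitted."""
    return (v_human * (t_react + t_stop)   # human keeps approaching
            + v_robot * t_react            # robot moves before reacting
            + 0.5 * v_robot * t_stop       # crude braking-distance term
            + intrusion)                   # assumed intrusion margin

def monitor(separation, v_human, v_robot, t_react=0.1, t_stop=0.3):
    """Runtime monitor verdict at one time step: True means safe."""
    return separation >= protective_distance(v_human, v_robot,
                                             t_react, t_stop)
```

A monitor like this runs alongside the controller, consuming the same state the simulator exposes, and raises a verdict independently of the competition's task score, which is exactly the gap the project targets.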


The ARIAC competition.

ISCF Robots for a Safer World Cross-Hub Activities

The Use of Digital Twins for Robotic Inspection, Maintenance and Repair 

The use of robots and AI represents the future of critical infrastructure inspection, repair and maintenance. Through the ISCF's 'Robots for a Safer World' scheme, four world-leading research hubs (FAIRSPACE, NCNR, ORCA and RAIN) have been developing solutions for use in hazardous and challenging environments, such as those found in the nuclear, offshore and space sectors.

Significant research has been undertaken across the Hubs on the use of digital twins as part of robotic inspections. To maximise the impact of this research, a new cross-hub initiative has been launched on 'Digital Twins and Digital Tissue for Robotic Inspections, Maintenance and Repair'.

As well as enabling the sharing of state-of-the-art research between the hubs and fostering a collaborative community, the initiative is also allowing researchers to continue their work during the Covid-19 pandemic lockdown.

For example, NCNR researchers from the University of Lincoln have been able to control a mobile robot from the RAIN Hub at the University of Oxford. Teams from the University of Manchester are working to control a mobile robot from the ORCA Hub at Heriot-Watt University, as well as robot manipulators from the FAIRSPACE Hub at the University of Edinburgh.

This video showcases some of the cutting-edge research which forms the foundation of the cross-hub initiative.

ISCF Robots for a Safer World – Cross-Hub Activities on Digital Twins for Robotic IMR