To develop highly redundant, remote situational awareness using mobile robots carrying traditional and nuclear-specific sensors.
1. Improved dynamic mobility is essential to allow legged vehicles to cross rough terrain, climb ladders, etc. This requires responsive re-planning of high-dimensional robot systems; current approaches struggle to marry the uncertainty of real-world sensing with the limits of high-frequency feedback control.
2. Robust situational awareness in low-light conditions and poor visibility. This requires the fusion of inertial, LIDAR, vision, radiation sensing and other measurements to achieve reliable and accurate navigation.
3. Autonomous navigation algorithms should pair effectively with human operators. Map representations communicated to operators should be labelled with estimates of physical properties (radiation, temperature) as well as geometric structure. Reconstructions should support change detection at a semantic level to identify subsidence and fissures.
Work will be spread across the following initial work-packages:
A. Dynamic Motion Planning and Control [OXF1, MAN1] (T1RC1, link to T2) for dynamic mobile robots (legged, wheeled, etc.). Dynamic trajectory optimisation and motion planning formulations will be developed that consider terrain morphology at multiple levels of resolution to determine gait and to enable robust real-time re-planning of routes.
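The optimisation formulations in WP A are still to be developed; purely as an illustrative sketch (the cost model, function names and gradient-descent scheme below are assumptions, not the project's method), one minimal form of terrain-aware path refinement is gradient descent on a smoothness term plus a terrain-cost term:

```python
import numpy as np

def optimise_path(waypoints, terrain_grad, iters=200, lr=0.05,
                  w_smooth=1.0, w_terrain=1.0):
    """Refine the interior waypoints of a 2-D path by gradient descent.

    Cost = sum of squared segment differences (smoothness) plus a terrain
    cost whose gradient is supplied by `terrain_grad(p)` (an assumed
    interface, e.g. backed by a traversability costmap). Endpoints are
    held fixed.
    """
    path = np.asarray(waypoints, dtype=float).copy()
    for _ in range(iters):
        # Gradient of the smoothness term at interior point i:
        # 2*p_i - p_{i-1} - p_{i+1} (up to a constant scale factor).
        smooth = 2.0 * path[1:-1] - path[:-2] - path[2:]
        terrain = np.array([terrain_grad(p) for p in path[1:-1]])
        path[1:-1] -= lr * (w_smooth * smooth + w_terrain * terrain)
    return path
```

Under a terrain gradient that penalises one region, the interior of an initially straight path bows away from that region while the endpoints stay fixed; a real formulation would add dynamics and gait constraints.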
B. Robust On-line State Estimation and Situational Awareness [OXF2, MAN2, MAN7, LAN1,2] (T1RC1, 2, link to T2) using accurate, low-latency state estimation that fuses multiple sensor sources. Robust estimation and terrain reconstruction in low light and degenerate structures will be coupled with machine-learning-based visual SLAM and radiological situational awareness.
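WP B does not prescribe an estimator; as a minimal sketch of the fusion step only (a linear Kalman filter with an assumed constant-velocity model and position-only fix — the real system would fuse IMU, LIDAR and vision, typically with a more sophisticated factor-graph or error-state filter), one predict/update cycle looks like:

```python
import numpy as np

def kf_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter.

    x, P : prior state estimate and covariance
    z    : measurement (e.g. a LIDAR-derived position fix)
    F, Q : state-transition model and process noise
    H, R : measurement model and measurement noise
    """
    # Predict: propagate the state and inflate uncertainty.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: weigh the measurement by the Kalman gain.
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

With this model the update pulls the predicted position toward the fix and shrinks the position covariance, which is the behaviour any fused estimator in WP B would need at high rate.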
C. Semantic Reconstruction and Re-navigation [OXF2,3] (T1RC2, 3 and link to T5). Dense visual mapping to build centimetre-accurate reconstructions of structures to be decommissioned. 3D trajectory re-navigation will allow robots to avoid high-dose areas and update the facility’s knowledge representation using standard formats.
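How re-navigation trades path length against accumulated dose is not specified here; one minimal way to sketch the idea (a 2-D occupancy grid, Dijkstra's algorithm, and dose folded into the edge cost — all of these are illustrative assumptions, including the `dose_weight` trade-off parameter) is:

```python
import heapq

def dose_aware_path(grid, dose, start, goal, dose_weight=10.0):
    """Dijkstra over a 2-D grid; edge cost = unit step + weighted dose.

    grid : 0 = free cell, 1 = obstacle
    dose : per-cell dose-rate estimate (same shape as grid)
    """
    rows, cols = len(grid), len(grid[0])
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        r, c = u
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1.0 + dose_weight * dose[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = u
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Reconstruct the route from goal back to start.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]
```

Raising `dose_weight` makes the planner accept longer detours around hot cells, which is the qualitative behaviour WP C's 3D re-navigation requires.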
D. Change Detection and Integrity Analysis [RACE1, BRI1] (T1RC2, 3 and link to T2). Long-term change analysis to detect, for example, deterioration of absorber tiles, localised gamma emissions and movement of radionuclides.
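As a toy illustration of the comparison step in WP D (two co-registered raster surveys, e.g. elevation or dose maps, compared against a fixed threshold — the threshold, the raster representation and the median bias correction are assumptions for the sketch, not the project's pipeline):

```python
import numpy as np

def detect_changes(ref, new, threshold=0.05):
    """Return (row, col) indices of cells that changed between surveys.

    `ref` and `new` must be co-registered rasters of the same shape.
    A global median offset is subtracted first so that a uniform
    calibration drift between surveys is not flagged as change.
    """
    diff = np.asarray(new, dtype=float) - np.asarray(ref, dtype=float)
    diff -= np.median(diff)  # discount a survey-wide calibration offset
    return [tuple(ij) for ij in np.argwhere(np.abs(diff) > threshold)]
```

A real long-term pipeline would add registration error models and semantic labels, but the artificially implanted changes in SMART measure #1 could be scored against exactly this kind of cell-level detection.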
E. Co-active Design of Semi-autonomous Systems [SHE1, NOT1, BRI1] (T1RC1, 3 and link to T3 and T4). Sympathetic development of autonomous mobility and manipulation algorithms that consider interaction with operators.
F. In-Bore Robotic Delivery [RACE3, BRI2] (T2RC3 and link to T3, 5). Development of high-reliability service joining and propulsion techniques, focusing on in-bore robotic delivery of laser cutting, welding and inspection equipment.
User Engagement: In Y1, users will provide details of challenges and environments and identify initial benchmarks. In Y2-4, users will provide technical support, particularly with actuation (Moog), and will support evaluation and benchmarking. NPL and BSI will be asked to support the publication of standard terrain courses / challenges.
SMART Measurement of Success: 1. Artificially implanted physical changes or radiation sources can be detected (in a challenge situation). [M8]; 2. Locomotion trials over standardised terrain courses, benchmarked against specific scenarios at Sellafield. [M12]; 3. Reconstruction of challenging structures from hand-held platforms. [M12]; 4. Reconstruction from legged and aerial platforms (separately) with active coverage control, following on from #3. [M24]