D1 - University of Birmingham

Extreme Robotics Lab

  • The National Centre Director and Principal Investigator: Prof. Rustam Stolkin
  • NCNR Project manager: Peter Brewer

Focus (or foci) of the group for NCNR:

The University of Birmingham Extreme Robotics Lab has established one of the largest and best-equipped robotics research facilities in the UK. This comprises a 1,000 m² academic lab on campus, and a large rig-hall space at the Birmingham Energy Innovation Centre containing large-scale, heavy-duty manipulators.

The ERL team is known internationally as a leader in applying advanced robotics to extreme-environment industrial challenges. The team is particularly well known for its work on autonomous robotic manipulation driven by computer vision; however, the research portfolio spans a wide range of core robotics and AI technologies, including:

  • Robotic manipulation, including autonomous motion planning and control for grasping, cutting and other tool use.
  • Remotely operated robot vehicles and mobile manipulators.
  • Advanced computer vision, and other perception modalities such as tactile sensing.
  • AI, machine learning and neural networks.
  • Human-robot interaction, including haptic exoskeleton interfaces and virtual, augmented and mixed reality.
  • Human-AI collaboration for controlling remote robots, including: human-supervised autonomy, variable autonomy and shared control paradigms.
  • Landmark demonstrations at high TRL on nuclear industry sites.
  • Applications to other domains, including recycling and circular economy, disaster response and others.

“Significant advances are needed, beyond the technologies currently available, to develop robotic systems that can carry out complex tasks in hazardous environments, remote from human operators. In such cases situational awareness may be limited and communications difficult, requiring robots to have greater on-board intelligence and autonomous control capabilities. Novel sensors and advanced machine vision and perception will be needed for robots to navigate their environment and do useful work. High-consequence, safety-critical interventions may still need a human in control, but these may be too complex for familiar forms of tele-operation, e.g. a joystick. This creates an opportunity for novel approaches where operators and AI collaborate, in real time, to control remote machinery in very challenging environments.”

Professor Rustam Stolkin