STEP inside an operating theatre and there are many items you would expect to see surrounding you – bright lights suspended from a stand, sterilised instruments lined up on a tray, monitors ready to display the patient’s pulse, blood oxygenation levels, and other vital signs. In robot-assisted surgery, you would even see surgeon-controlled robots operating on the patient. But what about technology that couples sensing with artificial intelligence (AI)?
How engineering will help robot-assisted cancer surgeons
Dr Yuhang Chen, Associate Professor of Biomedical Engineering at the National Robotarium and the School of Engineering and Physical Sciences at Heriot-Watt University, is bringing together academics, clinicians, and industrialists for a new project that uses mechanical sensing and computer vision to give cancer surgeons a more precise intraoperative assessment of tissue condition and surgical margins during robot-assisted surgery.
Operating theatres are changing. The advent of keyhole surgery means that surgeons can now make smaller incisions and then insert tiny cameras with lights to view what’s going on inside the patient’s body. They can then make further small incisions to deploy slender tools to help with diagnosis or treatment.
Yet keyhole surgery also poses a problem. When it comes to removing a tumour, a cancer surgeon will use their years of knowledge and experience to decide how big to make the “surgical margin” around the growth. That surgical margin is an area of healthy tissue that will be removed along with the tumour – a safety zone to make sure everything goes.
Remove too much and it will take longer for the patient to heal, increase their pain, and could lead to the loss of too much functional tissue; remove too little and there’s a risk that some of the tumour will be left behind, potentially leading to more surgery or other aggressive treatments.
Surgeons can draw on lots of information when it comes to deciding on the size of that surgical margin. They can use computerised tomography (CT) scans and other pre-operative images, and they can even remove small samples of the surrounding tissue and have them rushed to a pathology laboratory for analysis during surgery.
In traditional open operations, surgeons can also physically feel the characteristics of the surrounding tissue, using their years or decades of knowledge and experience to help guide their decisions. Keyhole surgery removes that ability to feel the tissue – but a new tool being developed by my team aims to give surgeons back that sense of touch during keyhole surgery.
We’ve brought together a team of academics, clinicians, and industrialists to build a probe that can take mechanical measurements of the tumour and the tissue that surrounds it. Our team has experience in a range of disciplines – from fibre-optic sensing, micro-mechanical engineering, and laser manufacturing through to computer modelling and vision – and we’ll draw on all that knowledge during the project.
The probe is being designed to work in the types of confined spaces typically found in keyhole surgery – known more formally as, for example, “laparoscopic” surgery when performed on the abdomen. Those confined spaces are especially relevant when working around the pelvis or rectum, where there are risks to important blood vessels and vital nerves, along with the bladder, bowel, sexual organs, and lower limbs.
Our device will also be paired with software containing ‘mechanical intelligence’ algorithms, which will allow the program to compare and contrast the data received from the probe with information from previous tumours, tissue, and surgical margins, helping it to interpret what it’s touching during the cancer surgery. Together, the hardware and software will give the surgeon reliable quantitative data in real time. Coupled with surgical robotics, our technology would offer exciting opportunities for robot-assisted surgery in which surgeons are informed, in real time, about the tissue condition and optimal surgical margin.
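To give a flavour of what “comparing probe data with information from previous tissue” could mean in practice, here is a deliberately simplified sketch. It classifies a single stiffness reading by its nearest reference-class average. Every number, name, and threshold below is hypothetical and for illustration only; it is not the project’s actual algorithm or data.

```python
# Toy illustration of a "mechanical intelligence" step: compare a probe
# stiffness reading against reference measurements from previously
# characterised tissue. All values are hypothetical.

from statistics import mean

# Hypothetical reference stiffness readings (kPa) from prior cases.
REFERENCE = {
    "healthy": [2.1, 2.4, 2.0, 2.6, 2.3],
    "tumour": [8.9, 9.4, 10.1, 8.5, 9.8],
}

def classify_reading(stiffness_kpa: float) -> str:
    """Label a probe reading by its nearest reference-class mean."""
    centroids = {label: mean(vals) for label, vals in REFERENCE.items()}
    return min(centroids, key=lambda label: abs(centroids[label] - stiffness_kpa))

print(classify_reading(2.2))   # nearest to the healthy centroid
print(classify_reading(9.0))   # nearest to the tumour centroid
```

In reality the software would draw on far richer measurements and models than a single stiffness value, but the principle – quantitative comparison against previously characterised tissue – is the same.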
Drawing on diverse expertise is crucial to the success of the project. Hugh Paterson, a consultant colorectal surgeon at the Western General Hospital in Edinburgh, and his colleagues, including Daniel Good, a consultant urological surgeon, are sharing their years of experience in minimally invasive (with or without robot-assisted) surgery. On the industrial side, IntelliPalp Dx was spun out from Heriot-Watt and Edinburgh universities and NHS Lothian in 2020 to help diagnose prostate cancer in doctors’ clinics, while Cambridge-based CMR Surgical developed the Versius robotic surgery system.
Cancer surgery is among the many areas of research and development (R&D) taking place at the National Robotarium, which opened its flagship building on Heriot-Watt’s Edinburgh Campus on 28 September 2022. The purpose-built facility will help us take our cutting-edge R&D from the bench to the bedside.
As well as our medical research, the National Robotarium is home to robots being developed to assist people who need care while living in their own homes, and to devices that can hold conversations with residents in care homes. Our health and care systems are becoming more closely intertwined, and our facility is helping to develop the next generation of technology to assist in both settings.
Our robot-assisted surgery project was awarded £1.25 million by the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI), through its ‘Healthcare Impact Partnership’ initiative. The wider National Robotarium is part of the Data-Driven Innovation (DDI) initiative, supported by £21 million from the UK Government and £1.4 million from the Scottish Government. The DDI initiative aims to establish Edinburgh as the data capital of Europe as part of the wider £1.3 billion Edinburgh & South-East Scotland City-Region Deal.