Surgeons can use AI chatbot to tell robots to help with suturing

A virtual assistant for surgeons translates text prompts into commands for a robot, offering a simple way to instruct machines to carry out small tasks during operations.

Surgeons could use a ChatGPT-like interface to instruct a robot to carry out small tasks, such as suturing wounds and dilating blood vessels.

An AI tool makes it possible to issue text commands to surgical robots
Gorodenkoff/Shutterstock


Surgical robots have been in use for decades, but they are normally controlled entirely by a human. Researchers are now developing autonomous versions that can perform parts of an operation unaided, though such systems can be difficult for surgeons to work with because they offer little fine control.

To address this, Animesh Garg at the University of Toronto, Canada, and his colleagues have developed a virtual assistant, called SuFIA, that can translate simple text prompts into commands for a surgical robot and defer to a human surgeon when it gets stuck.


SuFIA uses OpenAI’s GPT-4 large language model to break down requests from a surgeon, such as “pick up the needle and move it”, into a sequence of smaller subtasks. Each subtask then triggers software that controls another tool, such as the surgical robot or a camera.
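
To make that decompose-and-dispatch loop concrete, here is a minimal sketch in Python. It is not SuFIA’s actual code: the subtask vocabulary, the prompt and the RobotController class are illustrative assumptions, and only the OpenAI chat-completions call reflects a real API.

```python
# Sketch of an LLM decomposing a surgeon's free-text request into subtasks
# that are dispatched to lower-level robot control software. The subtask
# names and RobotController interface are hypothetical, not SuFIA's real API.
import json
from openai import OpenAI  # official OpenAI Python SDK

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a surgical assistant. Decompose the surgeon's request into a "
    "JSON list of subtasks, each an object with an 'action' (one of: "
    "'locate', 'grasp', 'move', 'release') and a 'target'. Reply with JSON only."
)

def plan_subtasks(request: str) -> list[dict]:
    """Ask GPT-4 to break a free-text request into structured subtasks."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": request},
        ],
    )
    # A real system would validate the model's output before acting on it.
    return json.loads(response.choices[0].message.content)

class RobotController:
    """Hypothetical stand-in for the software driving the robot or camera."""
    def execute(self, action: str, target: str) -> bool:
        print(f"robot: {action} -> {target}")
        return True  # a real controller would report success or failure

def run(request: str) -> None:
    robot = RobotController()
    for task in plan_subtasks(request):
        if not robot.execute(task["action"], task["target"]):
            # Like SuFIA, hand control back to the surgeon when stuck.
            print("Subtask failed; deferring to the human surgeon.")
            break

run("pick up the needle and move it")
```

The key design point the sketch illustrates is that the language model only plans at the level of named actions; the fine motor control stays inside the robot software, which is what makes Garg’s video-game analogy below apt.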


“It’s like when you play video games, you have a much lower dimensional controller even though the actual game you’re playing is much more complex,” says Garg. “If you’re playing soccer, you’re not really controlling the humanoid player, you just press kick and shoot and jump and pass.”


Garg and his team tested SuFIA on four tasks in a simulated environment, including picking up and moving needles and dilating blood vessels. They then tested the needle-handling tasks on a real da Vinci surgical robot.


Automating simpler surgical tasks, such as moving a needle around, could help surgeons better perform procedures, says Danail Stoyanov at University College London. However, it might take some time before regulatory bodies are convinced that this should be used in clinical trials, he says.


“The question is, what real value does it add to the patient care?” says Stoyanov. “That’s the bit that, until it’s in place safely, is very difficult to answer because sometimes we develop technology with one indication, and then you put it in the clinic and you realise that actually it doesn’t quite function as you wanted.”


Using large language models in surgical assistants will also require very careful safety features to mitigate the risk of “hallucinations”, where the AI convincingly makes up facts, says Stoyanov.


Reference:

arXiv DOI: 10.48550/arXiv.2405.05226
