Watch a robot operate on a pork loin

It learned to pick up needles by itself.


Robots can already mimic surgeons to a certain degree, but training them to do so often involves complex programming and time-consuming trial and error. Now, for the first time, a machine has learned to replicate fundamental surgical tasks simply by analyzing video footage of medical experts. Before it gets to work on human patients, though, the tiny robotic arms practiced on a pork loin.

Doctors have increasingly integrated the da Vinci Surgical System into an array of procedures since the device’s debut in 2000. Its small pair of robotic arms, each ending in a tweezer-like grasper, is already used in prostatectomies, cardiac valve repairs, and renal and gynecologic operations. But the device has its limitations, particularly when it comes to teaching it new tasks.

“It [was] very limiting,” Johns Hopkins University assistant professor of mechanical engineering Axel Krieger explained in a November 11th profile. Krieger added that programming previously required experts to hand-code every step of a surgery, meaning that a single form of surgical suturing could take as much as a decade to perfect.

As Krieger and colleagues explained at this year’s Conference on Robot Learning in Munich, Germany, that painstaking era may be nearing its end. Drawing on machine learning principles similar to those underpinning models like ChatGPT, Krieger’s team recently developed a new training approach for the da Vinci Surgical System. Instead of a large language model’s word-based datasets, it relies on kinematics, which translates robotic motions and angles into mathematical computations. After amassing hundreds of videos of human surgeons overseeing da Vinci robots, the researchers then tasked the system with analyzing the archival trove in order to imitate the correct movements. The results surprised even the programmers.
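The broad idea described above is often called imitation learning (or behavior cloning): a policy is trained on recorded demonstrations to map what the camera sees to the kinematic command a surgeon would issue. Below is a minimal, purely illustrative sketch of that pattern using a linear policy fit by least squares; the variable names, dimensions, and model are assumptions for demonstration, not the team's actual architecture.

```python
# Illustrative sketch of imitation learning (behavior cloning):
# learn a policy mapping camera observations to kinematic actions
# (e.g. a 6-DOF arm pose) by regressing on demonstration data.
# The linear model and synthetic data here are assumptions, not
# the researchers' actual system.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "demonstrations": 200 frames of image features paired
# with the surgeon-commanded kinematic action at each frame.
n_demos, obs_dim, action_dim = 200, 16, 6
true_policy = rng.normal(size=(obs_dim, action_dim))
observations = rng.normal(size=(n_demos, obs_dim))
actions = observations @ true_policy + 0.01 * rng.normal(size=(n_demos, action_dim))

# "Training": fit a linear policy to the demos by least squares.
learned_policy, *_ = np.linalg.lstsq(observations, actions, rcond=None)

# At run time, the robot sees a new frame and predicts the next
# kinematic command -- "image input in, right action out."
new_frame = rng.normal(size=obs_dim)
predicted_action = new_frame @ learned_policy
print(predicted_action.shape)  # one 6-dimensional kinematic command
```

In the real system, the linear regression would be replaced by a far more expressive model trained on actual surgical video, but the demonstrate-then-imitate structure is the same.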

[Related: First remote, zero-gravity surgery performed on the ISS from Earth (on rubber).]

“All we need is image input and then this AI system finds the right action,” postdoctoral researcher Ji Woong Kim said. “We find that even with a few hundred demos, the model is able to learn the procedure and generalize [to] new environments it hasn’t encountered.”

Krieger added that their model also excels at learning behaviors no human actually demonstrated in the videos. “Like, if it drops the needle, it will automatically pick it up and continue. This isn’t something I taught it [to] do.”

To test their system upgrade, Krieger’s team instructed a newly trained da Vinci robot to complete various tasks on a pork loin, chosen for its biological similarity to human tissue. The small grippers then demonstrated their ability to pick up dropped needles, tie knots, and complete surgical sutures almost exactly like their human trainers. What’s more, the robot did so after initially being trained on silicone skin stand-ins, meaning it transferred its skills to biological tissue without additional training.

Instead of waiting years for robots to learn new surgical techniques, Krieger believes the new learning model will allow da Vinci Systems to perfect procedures “in a couple days.” Although the autonomous robot system currently operates between 14 and 18 times slower than a human, the researchers believe it won’t be long until their machines pick up the pace as well.

“It’s really magical to have this model and all we do is feed it camera input and it can predict the robotic movements needed for surgery,” Krieger said.
