Sitting on a stool several feet from a long-armed robot, Dr. Danyal Fer wrapped his fingers around two metal handles near his chest.
As he moved the handles — up and down, left and right — the robot mimicked each small motion with its own two arms. Then, when he pinched his thumb and forefinger together, one of the robot’s tiny claws did much the same. This is how surgeons like Dr. Fer have long used robots when operating on patients. They can remove a prostate from a patient while sitting at a computer console across the room.
But after this brief demonstration, Dr. Fer and his fellow researchers at the University of California, Berkeley, showed how they hope to advance the state of the art. Dr. Fer let go of the handles, and a new kind of computer software took over. As he and the other researchers looked on, the robot started to move entirely on its own.
With one claw, the machine lifted a tiny plastic ring from an equally tiny peg on the table, passed the ring from one claw to the other, moved it across the table and gingerly hooked it onto a new peg. Then the robot did the same with several more rings, completing the task as quickly as it had when guided by Dr. Fer.
The peg transfer task is a standard part of how surgeons learn to operate robots like the one in Berkeley. Now, an automated robot performing the test can match or even exceed a human in dexterity, precision and speed, according to a new research paper from the Berkeley team.
The project is a part of a much wider effort to bring artificial intelligence into the operating room. Using many of the same technologies that underpin self-driving cars, autonomous drones and warehouse robots, researchers are working to automate surgical robots too. These methods are still a long way from everyday use, but progress is accelerating.
The goal is not to replace surgeons but to assist them where there is room for improvement, by automating particular phases of surgery.