STAR: The Smart Tissue Autonomous Robot

At the Children’s National Medical Center in Washington, D.C., surgeons and nurses led by associate chief surgeon Peter Kim filed into an operating room, where they quickly put the patient, a pig, under anesthesia and prepared to operate.

Scalpels gleamed in the bright lights. Yet the suturing needles to sew the patient back up were nowhere in sight. Instead, a robotic arm with a long shaft hovered over the patient. The Smart Tissue Autonomous Robot, or STAR, would be doing the sewing today.

You can think of it as driverless surgery.

STAR is Kim’s vision of the future of surgery. Surgeons, he argues, vary greatly in training, dexterity, experience, and decision-making. That variability, he says, is why 30% of the world’s 232 million soft tissue surgeries result in complications. By embedding the knowledge of the best surgeons in digital systems, autonomous and semiautonomous robots could deliver universal access to the best surgical techniques.

In addition to his surgical duties, Kim is also vice president of the hospital’s Sheikh Zayed Institute for Pediatric Surgical Innovation. He guides what the medical field calls “translational research,” applying new science to improve human health. To make his vision a reality, Kim recruited a team of engineers to build a robot that could perform soft tissue surgery.

After years of testing on plastic pads and dead tissues, STAR was ready for prime time. And today it would try to prove that it could, at least by some measures, do a better job of stitching severed tissue together than experienced surgeons.

To make his case, Kim chose a complex surgery called circular anastomosis. It involved stitching together two severed ends of an intestine. To heal properly, every one of the sutures had to be perfect. Make them too far apart or too loose and they will bleed. Tie them too closely or too tightly and they will strangle and kill the tissue.

Because the anatomy of a pig resembles that of human beings, surgeons are comfortable working with them. The doctors rapidly severed the pig’s intestines, extended them through a cloth covering, and stretched them open for easy access.

Then STAR went to work. Its U-shaped stitching head pushed a curved needle through adjacent intestinal walls and tied off the knot. Then it went on to the next.

Robots in the surgical suite are nothing new. The best known of them, Intuitive Surgical’s da Vinci, is more than 15 years old and has performed 2 million operations worldwide. Its multiple hand-like effectors, which look like a prop from the movie “Alien” and require a team of surgeons to operate, act as extensions of the surgeons’ hands. They can perform incredibly precise maneuvers, but humans make every decision and control every move.

Autonomous robots are smarter, and they are beginning to edge their way into the operating room. These robots are savvy helpers. They locate and machine bone for hip and knee implants, make the slices in Lasik vision surgery, target radiation at tumors, and provide guidance during back surgery. They rarely, if ever, operate independently. They work exclusively with solid objects, such as bones or eyes, which remain stationary during surgery.

In contrast, soft tissues vary in shape, size, and location from patient to patient, and they are, by definition, pliant. Stitch them together and each stitch will alter their shape. Sometimes, the sewn seams cover the previous stitch or hide the location of the next stitch. Other times, leaking blood obscures the tissue.

While all surgeons have excellent hand-eye coordination, their most important skill is making good decisions while navigating this complex and changing environment. An autonomous robot must not only manipulate a needle and thread, but follow—and react to—the shifting shapes that it creates in real time.

This is the Grand Prix, the Super Bowl, and the World Cup of surgical robot challenges all rolled into one. An autonomous robot that masters these challenges would change the game.

“Just imagine having the best technology and technique available any time and any place for any surgeon and for any patient,” Kim said. “Having these intelligent systems working with surgeons will ultimately decrease complications and save lives.”

Stitching

Kim pitched those benefits to recruit Axel Krieger, who led the STAR engineering team. The German-educated mechanical engineer came to the United States to automate a Bosch AG automotive facility, but was drawn to bioengineering while earning his Ph.D. at Johns Hopkins. There, he was part of a team that developed a nonmetallic robot that could work within the high magnetic fields of an MRI machine. The team went on to found a company that commercialized the technology and was later acquired. After five years of corporate life, Krieger was ready for a new challenge.

Despite his experience in medical robotics, Krieger was unprepared for soft-tissue surgery: “It was so bloody, and there were flaps of tissue obscuring everything. It was a mess, and so difficult to see what was going on or where the next stitch would go.”

Yet Krieger knew how to attack it. “As an engineer, you try to break difficult tasks down to small, manageable pieces.”

Before he could build a robot, he needed to learn how surgeons went about their business. Fortunately, he worked in a hospital filled with some of the nation’s top pediatric surgeons. He asked question after question: What was a surgical plan? How did they prepare the surgical area? Did they move right to left or left to right? Did they stitch outside-in or inside-out? How did they manage corners? Robots, even autonomous ones, do not improvise, so Krieger needed an answer to each question to program the robot effectively.
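
Robots need those answers written down as explicit, machine-readable parameters. As a purely illustrative sketch, the structure, names, and values below are assumptions rather than anything from the STAR software, but they show what "programming the robot" amounts to:

```python
# Hypothetical encoding of a suturing plan; names and values are
# illustrative assumptions, not part of the actual STAR codebase.
from dataclasses import dataclass

@dataclass
class SuturePlan:
    direction: str          # "left_to_right" or "right_to_left"
    needle_path: str        # "outside_in" or "inside_out"
    spacing_mm: float       # target distance between adjacent stitches
    bite_depth_mm: float    # how far from the tissue edge the needle bites
    corner_strategy: str    # e.g. place an extra stitch at each corner

plan = SuturePlan(
    direction="left_to_right",
    needle_path="outside_in",
    spacing_mm=3.0,
    bite_depth_mm=2.0,
    corner_strategy="extra_stitch",
)
print(plan)
```

Every field corresponds to one of the questions above; once they are pinned down, the robot can follow the plan without improvising.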

Some questions were harder to answer than others. Sutures vary with type, spacing, and tension, depending on the organ or tissue. Surgeons study some rules of thumb, but they also learn to recognize when a stitch feels right. Since surgeons don’t take precise measurements of tension, Krieger could not find hard data for his robots. Fortunately, his wife, an optometrist, remembered a reference on eye surgery that contained the force tables he needed.

Surgeons also linearize circular tissues like intestines, tying them to the abdominal wall to form the straight, easy-to-stitch lines of a triangle. This also keeps the rest of the tissue away from the work surface. The STAR team replicated the approach by placing a ring over the patient. Once they elevated the intestines out of the abdomen so the robot could see them more clearly, they tied them onto the ring.
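
To picture what the ring buys the robot, here is a rough geometric sketch, assuming the three tie-down points are known; this is illustrative geometry, not STAR's planner. Stitch targets can then be sampled at even spacing along the straight sides of the triangle.

```python
import numpy as np

def stitch_targets(corners_mm, spacing_mm=3.0):
    """Sample evenly spaced stitch points along the three straight sides
    of the triangle formed by the tie-down points. Illustrative only;
    the coordinates and spacing are assumptions."""
    corners = np.asarray(corners_mm, dtype=float)
    targets = []
    for i in range(3):
        a, b = corners[i], corners[(i + 1) % 3]
        n = max(int(np.linalg.norm(b - a) // spacing_mm), 1)
        for t in np.linspace(0.0, 1.0, n, endpoint=False):
            targets.append(a + t * (b - a))
    return np.array(targets)

# Hypothetical tie-down points on the ring, in millimeters
print(stitch_targets([[0, 0, 0], [30, 0, 0], [15, 26, 0]]).shape)
```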

Sewing, Looping, Threading

Then Krieger tackled suturing. It involved looping thread, pushing a curved needle through tissue, and knotting lines of stitches. Autonomous robots find this challenging and rarely get even half the stitches right.

Krieger needed a simpler method. He considered surgical glue, staples, and tape. Ultimately, he settled on the Endo360°, a suturing tool developed for laparoscopic, or minimally invasive, surgery. With its pistol grip, trigger, and long, thin shaft, it looked like the wand of a power washer—with levers, knobs, and cables to orient the moveable head. Once in position, the surgeon flicks a switch and the head shoots a surgical needle through folds of skin to make the stitch.

“It works like a nail gun,” said Simon Leonard, a Johns Hopkins University computer scientist who worked on the project. “We just aim and shoot. We don’t need to build a robot with the dexterity to move a needle to the right place and push it through.”
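
In control terms, the nail-gun approach means each stitch reduces to two steps: position the head, then fire. Below is a minimal sketch of that flow; the stub classes and method names are placeholders, not the real KUKA or Endo360° interfaces.

```python
# Hypothetical "aim and shoot" control flow; the classes and methods are
# stand-ins, not the actual robot or suturing-tool APIs.
class ArmStub:
    def move_to(self, pose):
        print(f"arm moved to {pose}")

class StitcherStub:
    def fire_needle(self):
        print("needle fired through the tissue fold")

    def tension_thread(self):
        print("thread tensioned")

def place_stitch(arm, stitcher, target_pose):
    arm.move_to(target_pose)     # positioning is the only dexterity required
    stitcher.fire_needle()       # the tool itself drives the curved needle
    stitcher.tension_thread()    # set tension before moving to the next stitch

place_stitch(ArmStub(), StitcherStub(), (120.0, 45.0, 80.0))
```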

The engineers replaced the Endo360°’s manual controls with motorized devices and attached the tool to a human-safe robot arm from Germany’s KUKA. They spent weeks straightening and calibrating the tool, since its length magnified every imperfection.
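
A quick back-of-the-envelope calculation shows why a long shaft is so unforgiving; the numbers here are assumed purely for illustration.

```python
import math

shaft_length_mm = 300.0      # assumed shaft length, for illustration only
angular_error_deg = 0.1      # small orientation error where the tool mounts

# A tiny angular error at the base sweeps the distant tip through an arc.
tip_error_mm = shaft_length_mm * math.tan(math.radians(angular_error_deg))
print(f"tip displacement: {tip_error_mm:.2f} mm")   # about 0.52 mm
```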

STAR also needed eyes that could resolve locations in three dimensions and follow blood-soaked tissue as it changed during surgery. The team opted for two separate cameras and a clever trick.

The first was a plenoptic camera, which uses an array of small microlenses that work like a bug’s eye. Each microlens sees the image from a slightly different angle. Computer algorithms reconstruct those vantage points into a single 3D image. The camera has a wider focus than a single lens and it is compact enough to hover above an operating table. Even then, it took months to calibrate the camera to the millimeter accuracy needed to control the long Endo360°.
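
At its core, the reconstruction works because neighboring microlens views see the scene from slightly shifted positions, like many tiny stereo pairs. The sketch below uses the standard stereo relation depth = focal length × baseline / disparity; it is a deliberately simplified model with assumed numbers, not the camera’s actual reconstruction pipeline.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Simplified stereo-style depth model: larger disparity between
    neighboring views means the point is closer. Illustrative only;
    disparities are assumed to be positive."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    return focal_px * baseline_mm / disparity_px

# Assumed focal length (pixels) and microlens baseline (mm), for illustration
print(depth_from_disparity([2.0, 4.0, 8.0], focal_px=500.0, baseline_mm=1.0))
```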

Still, the camera was not fast enough to track tissue deformation accurately in real time. The engineers finessed the problem. They dabbed drops of an FDA-approved fluorescent glue onto the edges of the tissues. When illuminated, it fluoresced brightly in the near-infrared range, even when covered with blood or other tissues.

A near-infrared camera imaged the dabs, and a computer drew imaginary lines between them. Instead of trying to track the deformation of the tissue, the computer followed the dots and changes in the imaginary lines. It could do this in real time with capacity to spare.
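
In spirit, the trick turns a hard deformable-tissue tracking problem into a simple one: follow a handful of bright dots each frame and re-interpolate the line between them. Here is a hedged sketch of that idea, with made-up coordinates and no claim to match the actual STAR tracking code.

```python
import numpy as np

def suture_line_from_markers(marker_xy_mm, points_per_segment=10):
    """Connect the tracked fluorescent markers with straight segments and
    sample intermediate target points along them. Illustrative only."""
    markers = np.asarray(marker_xy_mm, dtype=float)
    line = []
    for a, b in zip(markers[:-1], markers[1:]):
        for t in np.linspace(0.0, 1.0, points_per_segment, endpoint=False):
            line.append(a + t * (b - a))
    line.append(markers[-1])
    return np.array(line)

# Marker positions detected by the NIR camera in one frame (assumed, in mm)
frame_markers = [[0, 0], [10, 2], [20, 1], [30, 4]]
print(suture_line_from_markers(frame_markers).shape)   # (31, 2)
```

Because only the dots move between frames, the interpolated line can be refreshed in real time as the tissue deforms.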

“That was key to solving the problem of tissue recognition and tracking,” Krieger said.

Yet neither the plenoptic camera nor the NIR camera was accurate enough to determine the depth of a stitch. The STAR team solved that by mounting a force sensor between the jaws of the Endo360°’s stitching head. The sensor told STAR when the head was at the right depth, and provided the feedback needed to slide along tissue and into the difficult-to-suture corners.
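
The depth cue is essentially a force threshold: advance the head until the sensor reports firm contact. Below is a minimal feedback-loop sketch, assuming made-up thresholds and a placeholder sensor callback rather than STAR’s real interfaces.

```python
def advance_until_contact(read_force_n, step_mm=0.2,
                          contact_threshold_n=1.5, max_travel_mm=10.0):
    """Step the stitching head forward until the force sensor reports firm
    contact, then return the depth reached. All values are illustrative."""
    depth = 0.0
    while depth < max_travel_mm:
        if read_force_n(depth) >= contact_threshold_n:
            return depth          # head is pressing on tissue at the right depth
        depth += step_mm
    raise RuntimeError("no contact detected within travel limit")

# Toy force model: force ramps up once the head touches tissue at 3 mm
depth = advance_until_contact(lambda d: max(0.0, (d - 3.0) * 2.0))
print(f"contact at {depth:.1f} mm")
```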

At that point, STAR was ready to start sewing.

Surgery

STAR started its surgical career by working on rubbery pads with small protrusions. Surgeons use them to learn to stitch together wounds or tissues.

“Since the protrusions are all one height, it simplified the problem,” Krieger said. “There was no blood, no crazy deformations. We could begin testing before we finalized the vision system.”

To prove the robot could adapt to unexpected patient movements during an operation, the engineers shifted the pads randomly during testing. STAR successfully tracked the location of its target and repositioned its arm to make the stitch.

The experiments showed that STAR had the potential to complete one stitch every seven seconds on patients—faster and more consistently than humans, Krieger said.

The engineers had thought STAR would do as well on animal cadaver tissue—until their tools broke. “We didn’t have the right cleaning procedure,” Krieger said. “The tool tip gunked up and the cables controlling the head broke because we were putting too much force on them.”

Once STAR was up and running again, its sutures began pulling out. Cadaver tissue, it turned out, was not as consistent as the practice pads. Some tissues were thick, others fatty, and others thin and flimsy. Over time, the team learned to optimize the stitching for different types of tissue.

Live tissue proved more resilient and durable than cadaver tissue and easier to work with. The force sensor enabled the team to make perfect stitches along the intestinal flaps. The cameras positioned the Endo360º accurately and tracked each twist and turn as it stitched.

STAR worked exactly as the team envisioned, though it did need some human help. Surgeons sliced the intestine and prepared it for surgery. They double-checked each stitch’s location and helped manipulate the thread. But Kim estimated that 60% of the operation was fully autonomous, and described these interventions as “minor adjustments.” He likened them to holding an infant’s hand while it is learning to walk, and said STAR could have done all the stitching autonomously.

Either way, STAR more than lived up to Kim’s high standards. It outscored surgeons with seven years of training on stitch location and stitch tension. Such consistency is important, because a surgeon who makes only one imperfect stitch out of 20 has failed his or her patient.
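
Consistency of this kind is directly measurable. A small sketch of that sort of comparison, using invented stitch positions rather than the study’s data, shows how spacing consistency might be scored:

```python
import statistics

def spacing_stats(stitch_positions_mm):
    """Mean gap and standard deviation between consecutive stitches;
    a lower deviation means more consistent spacing."""
    gaps = [b - a for a, b in zip(stitch_positions_mm, stitch_positions_mm[1:])]
    return statistics.mean(gaps), statistics.stdev(gaps)

robot_stitches = [0.0, 3.0, 6.1, 9.0, 12.0, 15.1]   # hypothetical data
human_stitches = [0.0, 2.4, 5.9, 8.1, 12.3, 14.6]   # hypothetical data

print("robot mean, stdev:", spacing_stats(robot_stitches))
print("human mean, stdev:", spacing_stats(human_stitches))
```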

STAR’s only drawback was time. It took nearly an hour to complete the surgery. This was simply a matter of making sure each stitch was perfect, Leonard said: “This had never been tried on a real-life animal, and we played it very safe.” At full speed, he added, the robot could have stitched five to 10 times faster than a human.

“We were able to show the potential for robots to do significantly better than a human surgeon,” Krieger said. He argues that automating the suturing that closes an operation would leave surgeons more time for higher-value tasks. Krieger wants to begin using the robot clinically and to expand its repertoire to include hysterectomies and stomach operations.

The team plans to integrate additional sensors onto their robot to give surgeons better surgical information. Using a combination of force sensors and sophisticated multispectral cameras that see more than visible light, future robots might advise surgeons about tissue health, thickness, strength, and blood circulation. This would quantify knowledge that surgeons now learn only through experience.

STAR will help the team explore better ways for surgeons to collaborate with machines to provide safer and more consistent outcomes than either could alone, said Ryan Decker, a senior engineer on the team. Humans, he explained, are very good at segmenting objects, such as identifying the edge of a severed colon that has changed shape and is obscured by blood. They also excel at identifying new features in an environment, and generating hypotheses about the world around them.

Robots are a long way from emulating those intellectual skills, but they surpass humans in their ability to do precise work quickly and repeatably. For highly defined tasks, like measuring the distance between stitches or the force on a knot, robots leave humans in the dust.

“Our goal is to create a framework for human-machine collaboration that achieves something better than the sum of those parts,” Decker said.

STAR is only the start.

Source: Alliance of Advanced BioMedical Engineering (AABME)
