The fleet of TUG autonomous mobile robots running at UCSF was covered by Becker’s Hospital Review.

See the original article on the Becker’s Hospital Review website.

Big hearts of steel: Can robots reduce work burden in hospitals?

Photos courtesy Susan Merrell/UCSF

UCSF Medical Center at Mission Bay, part of the University of California, San Francisco, is filled every day with 4,000 employees, staff, students and patients, along with 25 robots.

Most robots in hospitals have many arms, don’t speak and are fixed to the floor. Surgical robots are fairly common — they purportedly reduce recovery times and provide greater accuracy in delicate procedures. Automatic pharmacy dispensing cabinets are becoming more common as well, giving hospital pharmacies tighter control over controlled substances.

However, robots are beginning to make their way into the hallways of hospitals, equipped with trays to carry food and storage containers to carry perishable blood samples, organs and medications. For UCSF Medical Center at Mission Bay, the robots are one element in a fleet of technological innovations meant to encourage patient engagement and improve efficiency.

The 289-bed UCSF Medical Center at Mission Bay, which opened in a new $1.52 billion complex in 2014, incorporates technology in a number of ways to smooth the patient experience and increase workflow efficiency. Suzanne Leigh, a senior public information representative, identified three major technological components introduced to enhance clinical care: technology for patient communications, technology for clinicians and staff, and the technology that drives the robot fleet.

Patient inclusion
The patient communication tools are the most visible part of the three. The media walls, screens mounted to walls in patient rooms, allow patients to view imaging, like X-rays or MRIs, and track their blood counts. Physicians essentially share their workstation with patients, allowing them to have an active role in their own care, according to Pamela Hudson, UCSF executive director of clinical systems and the Mission Bay hospital’s transition.

It is still too early to gauge how successfully the hospital is engaging patients through technology, but clinicians and patients have reported positive experiences, Ms. Hudson says.

“Clinicians at UCSF Medical Center receive training on how to incorporate technology, such as the media walls or electronic health records, to complement — not replace — their interactions with patients,” Ms. Hudson says.

Patients can email physicians to ask questions and download a video prescribed by their physician to explain care after discharge. Each patient room has equipment enabling them to Skype into their workplace or school, order a meal or snack, update social media accounts and pick from a bank of movies, Ms. Leigh says in an email.

New software also makes UCSF Medical Center at Mission Bay a quieter hospital. Staff members can communicate via secure text messages instead of traditional landlines. The same software interfaces with patient monitors and issues alerts when vital signs change, sending them straight to the clinical team’s smartphones rather than over the public address system.
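
The article does not describe UCSF’s actual alerting software, but the general idea of checking monitored vital signs against limits and routing a secure message to the care team can be sketched roughly as follows. The thresholds, the VitalSigns record and the send_secure_message function here are all hypothetical placeholders, not the hospital’s system:

```python
from dataclasses import dataclass

# Hypothetical vital-sign limits; real systems use clinician-configured,
# patient-specific thresholds rather than fixed constants like these.
THRESHOLDS = {
    "heart_rate": (40, 130),   # beats per minute
    "spo2": (92, 100),         # oxygen saturation, percent
    "systolic_bp": (90, 180),  # mmHg
}

@dataclass
class VitalSigns:
    patient_id: str
    heart_rate: float
    spo2: float
    systolic_bp: float

def send_secure_message(team: str, text: str) -> None:
    """Stand-in for a secure-messaging API that pushes to clinicians' smartphones."""
    print(f"[secure message to {team}] {text}")

def check_vitals(vitals: VitalSigns, care_team: str) -> None:
    """Compare monitored values to thresholds and alert the care team on any breach."""
    for name, (low, high) in THRESHOLDS.items():
        value = getattr(vitals, name)
        if not low <= value <= high:
            send_secure_message(
                care_team,
                f"Patient {vitals.patient_id}: {name} = {value} outside {low}-{high}",
            )

check_vitals(VitalSigns("A-102", heart_rate=142, spo2=96, systolic_bp=118), "blue-team")
```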

“We continue to have some landlines and overhead paging for code alerts,” Ms. Hudson says. “Through the use of new technologies, we are committed to providing a peaceful hospital environment minimizing noise levels, which studies have found is conducive to healing.”

Robotic delivery people
But it’s the robots that have attracted hurricanes of media attention. Ironically, the units — called TUGs, developed by Pittsburgh, Pa.-based Aethon and leased to hospitals on a monthly subscription — are designed to be as inconspicuous as possible, shaped a little like filing cabinets on wheels. UCSF did not disclose the details of its financial contract, but other hospitals have purchased TUGs for between $75,000 and $140,000, and the company leases them to hospitals for $1,500 to $2,000 per month.

The TUGs don’t have eyes or arms. They simply ferry food, linens, medications and blood samples through the hallways of the long hospital in San Francisco’s Mission Bay area. They mostly navigate through the back halls of the hospital, using sensors and Wi-Fi to locate elevators and doors when necessary and giving way when humans need to get through. The units largely take care of themselves, pushing doors open and using fleet technology to navigate around other units and communicate to eliminate redundancy.
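
The article doesn’t detail Aethon’s fleet software, but the notion of units coordinating to “eliminate redundancy” can be illustrated with a toy dispatcher that drops duplicate requests and hands each remaining delivery to the nearest idle robot. The class names, the one-dimensional corridor model and the assignment rule below are all assumptions made for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Tug:
    name: str
    position: float  # simplified 1-D position along a corridor, in meters
    busy: bool = False

@dataclass
class Dispatcher:
    fleet: list
    pending: list = field(default_factory=list)

    def request_delivery(self, item: str, destination: float) -> None:
        # "Eliminate redundancy": ignore a request already queued for the same
        # item and destination instead of sending a second robot.
        if (item, destination) not in self.pending:
            self.pending.append((item, destination))

    def assign(self) -> None:
        # Give each pending delivery to the nearest idle robot, if any.
        for item, destination in list(self.pending):
            idle = [t for t in self.fleet if not t.busy]
            if not idle:
                break
            tug = min(idle, key=lambda t: abs(t.position - destination))
            tug.busy = True
            self.pending.remove((item, destination))
            print(f"{tug.name} dispatched with {item} to {destination} m")

fleet = Dispatcher([Tug("Wall-E", 0.0), Tug("Eve", 120.0)])
fleet.request_delivery("linens", 100.0)
fleet.request_delivery("linens", 100.0)  # duplicate request is ignored
fleet.assign()
```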

The 25 units log approximately 500 miles per day moving at a few miles per hour across the three-football-field-long facility, spread over 1,300 trips. Although the robots may sound like a mere convenience, that is 500 miles the staff does not have to travel, saving time and energy and freeing them to spend more time with patients.

TUGs are popular in healthcare. They have been used in 140 hospitals, and Aethon estimated in 2014 that the units made 50,000 deliveries per week. They are secured by fingerprint scanners and dispatched from access terminals, so authentication is required to retrieve blood samples or medication, and temperature-controlled compartments even allow organs to be delivered to surgeons more reliably.

UCSF Medical Center at Mission Bay staff have even named them. Two robots named Wall-E and Eve deliver for the pharmacy department. Bashful, Grumpy and Dopey can be seen carrying materials from department to department, according to UCSF’s website.

“The staff has been surprised by the robots’ etiquette,” Ms. Hudson says. “Their vocabulary includes phrases like, ‘Thank you,’ ‘Backing up’ and ‘Your delivery has arrived.'”

Some have expressed concern that robots will replace healthcare workers. However, the Mission Bay facility covers 600,000 square feet; using the robots simply saves time on deliveries, Ms. Hudson says.

After a while, robots may even become a normal part of the landscape, according to Henry Hexmoor, Ph.D., an associate professor of computer science at Southern Illinois University in Carbondale, Ill., who has consulted on engineering projects for NASA and the manufacturing industry. Dr. Hexmoor wrote a book called “Essential Principles for Autonomous Robotics,” published in 2013.

“After the first few interactions, [patients and staff] will not pay attention to them,” Dr. Hexmoor says. “On the first encounter, people may be taken aback by an object that moves in a personal way, but once they realize there is no threat, they will get used to them.”

Personal touch
The TUGs are a less patient-centric form of social robotics in healthcare, a trend slowly edging its way into patient rooms and bedsides. TUGs would be called service robots, which have been common in the manufacturing industry for years, Dr. Hexmoor says. However, the talking and interaction with humans gives them a social aspect.

Other social units, such as the humanoid NAO and Pepper robots from Aldebaran, an international robotics company with offices in Paris, Boston, Tokyo and Shanghai, can interpret basic human emotions through facial detection software and complex algorithms and react accordingly. Others, such as the PARO therapeutic robot developed by AIST, Japan’s National Institute of Advanced Industrial Science and Technology, provide social interaction for the elderly in nursing homes in Japan.

Humanoid robots have been used in education as well. In addition to STEM learning, Aldebaran’s NAO unit, smaller than the Pepper and first available in 2006, can be customized with software to interact with children as they receive shots and quiz them about their illness for educational purposes. The intention is for children to interact with an object that looks like a toy so they will be less frightened than if they interacted with adults.

Laura Bokobza, executive vice president and chief marketing officer for Aldebaran, says the robot has now been used in education, special education, research, healthcare, nursing care facilities, retail stores and tourism. It functions at a low level of artificial intelligence and can understand the situation around it to react appropriately. Aldebaran’s labs are working on developing a more complex AI system for its robots, Ms. Bokobza says.

“We believe robots will be the ultimate interface for machine-to-human interaction, so the better they are able to interact, respond and react to us, the more we will be able to take advantage of the benefits of using them,” Ms. Bokobza says. “One day in the near future, we are confident that robots will find their place in the household setting in addition to acting as medical aides in hospitals.”

Dr. Hexmoor says this trend of social robotics with much more human interaction has been embraced in Asian countries but is still new in the U.S. He sees robots interacting with people in a variety of contexts at the conferences he attends. Some of the hesitance in the U.S. may stem from films and books that portray robots as villains, but that is slowly changing as people interact more with technology every day, Ms. Bokobza says.

People tend to be uncomfortable with robots if they look like humans because of a concept called the “uncanny valley,” a principle in robotics that says if something looks too human but is not, humans will not be able to interact normally with it, she says.

“When you look at our robots, you can easily tell they are robots but with humanoid features that makes them better adapted to our world,” Ms. Bokobza says. “They are also cute, a bit silly, and gentle-looking, making them more attractive to people. This was very important to us because we knew if the robots weren’t visually accepted, people would not want to bring them into their homes.”

Because of their close interaction and collaboration with hospital staff, robots require a level of treatment above what other hospital technology does.

“It’s very much artificial intelligence,” Dr. Hexmoor says. “They have to deal with more or less sophisticated ways of interaction: Memory, learning, utterances, responses. It’s more than just a networking bunch of robots.”

Discussions of ethics have lagged behind development, however. If a robot can act on its own in the same space with people, how should humans treat it?

A 2007 study conducted by researchers at Carnegie Mellon University and Stanford University sought to observe how humans and autonomous robots communicated and interacted with one another. They found that humans and robots must have common ground to interact effectively. They also found that communication depends heavily on a robot’s level of autonomy.

The TUGs are technically artificially intelligent, autonomous robots, but they do not make decisions themselves. Between missions, they return to their charging stations and await instructions on how to behave. Hospital staff must check them in and out, scanning a fingerprint at the destination to access medication and signaling that the delivery has been received before the robot can return home.
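
That delivery workflow — dispatch, fingerprint authentication at the destination, confirmation of receipt, then a return to the charging dock — reads like a small state machine. The sketch below is purely illustrative; the state names and the verify_fingerprint function are assumptions, not Aethon’s actual interface:

```python
from enum import Enum, auto

class TugState(Enum):
    DOCKED = auto()
    EN_ROUTE = auto()
    AWAITING_AUTH = auto()
    RETURNING = auto()

def verify_fingerprint(scan: str, authorized: set) -> bool:
    """Stand-in for the fingerprint scanner; real units match biometric templates."""
    return scan in authorized

def delivery_cycle(authorized_staff: set, scan_at_destination: str) -> list:
    """Walk one delivery through the states described in the article."""
    states = [TugState.DOCKED, TugState.EN_ROUTE, TugState.AWAITING_AUTH]
    # The compartment stays locked until an authorized staff member authenticates
    # and acknowledges receipt; only then does the robot head back to its dock.
    if verify_fingerprint(scan_at_destination, authorized_staff):
        states += [TugState.RETURNING, TugState.DOCKED]
    return states

print(delivery_cycle({"nurse-ramirez"}, "nurse-ramirez"))
```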

The CMU and Stanford study found that with less autonomous robots — like the TUGs — humans had to have more contextual information and feedback to effectively interact. With high-autonomy robots, the humans needed more transparency about the robot’s decisions.

Ms. Bokobza says the concept of how humans interact with robots is evolving. Because NAO interacts primarily with children who may sometimes push it, Aldebaran built safety mechanisms into the units so it can detect when it is falling and brace itself to protect fragile computer parts. NAO then readjusts itself to do whatever it was doing before it fell.
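 
Without speaking to Aldebaran’s actual fall manager, the behavior described — detect a fall, protect the hardware, then get up and carry on — could be sketched roughly like this. The sensor readings, thresholds and motion functions here are placeholders for illustration, not NAO’s real API:

```python
import random

FALL_TILT_DEGREES = 45  # hypothetical tilt beyond which the robot is considered falling

def read_torso_tilt() -> float:
    """Placeholder for an inertial-sensor reading (degrees from upright)."""
    return random.uniform(0, 90)

def brace_for_impact() -> None:
    """Placeholder: lower joint stiffness and tuck arms to protect cameras and electronics."""
    print("Bracing: stiffness lowered, arms tucked")

def stand_up() -> None:
    """Placeholder for a scripted get-up motion."""
    print("Getting back up")

def run_task_with_fall_recovery(task_name: str) -> None:
    """Interrupt the current activity on a fall, recover, then resume the same task."""
    if read_torso_tilt() > FALL_TILT_DEGREES:
        brace_for_impact()
        stand_up()
    print(f"Resuming task: {task_name}")

run_task_with_fall_recovery("quiz the patient about their illness")
```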

“We would hope that people follow the Golden Rule of treat robots in a nice and respectable way,” Ms. Bokobza says. “However we also know that things happen. We have a number of functions built in to the robots in order to not only protect the robot and the person but also strengthen the relationship between humans and the robots.”

Details aside, UCSF Medical Center at Mission Bay does not intend for the robots or other technology to supplant the core value of the hospital — patient care, Ms. Hudson says. Moving the hospital forward and improving patient experiences is still the burden of the providers and staff, she says.

“Technology is simply a tool,” Ms. Hudson says. “It is contingent on people to use it in ways that stimulate engagement.”