AI in healthcare


A data-driven future is on healthcare’s horizon, with ninety percent of NHS jobs set to require digital
skills within the next twenty years.
Robotics and Artificial Intelligence (AI) could save 5.7 million hours of GPs’ time across the country,
according to a recent report led by Dr Eric Topol, founder of the Cleveland Clinic Lerner College of
Medicine and cardiologist at The Scripps Research Institute.
Artificial intelligence is a branch of computer science in which machines are programmed or
simulated to imitate human intelligence in tasks such as decision-making, speech recognition, image
manipulation and other visualisation applications.
Rehabilitative robots, dissection devices for eye surgery and Virtual Reality (VR) are among the
technologies being trialled across UK surgeries in a new, more holistic approach to healthcare. Dr
Jasmina Kapetanovic of the University of Oxford and Oxford Eye Hospital, Professor Robert
Stone of the University of Birmingham and Harriet Gridley of the start-up No Isolation spoke to KCW
Today about their developments in robotic surgical procedures, the rehabilitative
capabilities of this new technology and how it could reintroduce chronically ill children into
educational settings.
The Royal Marsden Hospital is the world’s first institution dedicated to cancer diagnosis, research
and education. It has performed over three thousand robotic procedures, with fourteen surgeons
qualified in this type of surgery.
Its Robotic Surgery Fellowship will train up to thirty multi-disciplinary surgeons in robotic surgery
over the next ten years, transforming the UK’s current surgical training programme.
Professor Vinidh Paleri, Consultant Head and Neck Surgeon at the hospital, claimed the da Vinci
robot used at their practice reduces a twelve-hour open operation on the head or neck
to two hours.
He said: “Patients can be home in three days rather than two weeks, and they are saved from
having visible scars.”
Open surgery has also seen surgeons retire early due to wrist, shoulder or back problems, so the
robot allows surgeons a longer working life, according to the hospital.
Professor Robert Stone is Chair in Interactive Multimedia Systems within the College
of Engineering and Physical Sciences at the University of Birmingham.
He is interested in using Virtual Reality (VR) to support patient recovery after a traumatic injury. VR
is an interactive, computer-generated experience that takes place in a simulated sensory environment,
delivered here through a headset paired with a cycling machine.
Virtual environments have been found to produce the same psycho-physiological effects of
enhanced wellbeing as naturally restorative environments.
In new software called ‘Virtual Wembury’, Professor Stone digitally recreates views of Devon for
patients, displayed through a headset or on a television screen. Other computer-generated environments include the
Northern Lights and mountain views.
At present, the technology is available in seven hospitals, including the Queen Elizabeth in Birmingham
and Belfast City. It is a self-contained ‘plug and play’ system that requires minimal staff training,
and is funded by the Royal Centre for Defence Medicine.

He said: “We put a patient at Torbay Hospital on the Motor-Med Virtual Beach cycling and I lost
track of how many miles he did; his smile was incredible.
“His wife said ‘thank you for making his last days worthwhile’.”
Professor Stone claimed that ‘once [VR] becomes mature and widespread, then it will become
prescribable’.
VR has also been used in palliative care by Professor Stone and his Human Interface
Technologies (HIT) Team, who collaborate with palliative care specialists to create VR locations
which can be edited easily to meet the needs of a wide range of healthcare issues. Some patients become
mentally fatigued by immersing themselves in the headsets, so Stone and clinicians adjust the
number of features in the experience.
For example, a medieval catapult is added to the Wembury VR for those recovering from gastro-
intestinal illness. This encourages them to improve their lung capacity by breathing for longer into a
spirometer, which launches virtual stones from the catapult in the headset.
Jasmina Kapetanovic, Clinical Research Fellow and Fellow in Vitreo-Retinal
Surgery, specialises in ophthalmology and is interested in how robotic systems can assist in
precision eye surgery.
She works closely with Robert MacLaren, Professor of Ophthalmology at the Oxford Eye Hospital,
who is carrying out the second trial of the Robotic Retinal Dissection Device, which acts as a
mechanical hand for precision surgery. It has been developed by Preceyes BV, a Dutch medical
robotics firm established by the University of Eindhoven. By coupling the motion controller used by
the surgeon with the microsurgical instruments, the manual and robotic elements create a hybrid system.
Kapetanovic said: “If the surgery is robot assisted, it can overcome any human error; the patient’s
movement, which you cannot predict.”
It is designed to eliminate unwanted tremors or movements in the surgeon’s hand, complementing
and assisting their skills rather than replacing them. The device ‘simplifies’ the set-up of
the operating room, proving groundbreaking for greater ease and speed in operations on the
delicate eye. It can be used for robotic surgery under local anaesthetic, and surgeons have
been trained on its usage, which was not possible in the 2016 trial.
She did, however, point out the importance of training to get everyone on board with the evolving
computer technology.
This medical trial is sponsored by the University of Oxford and funded by the NIHR Oxford
Biomedical Research Centre with support from Oxford University Hospitals NHS Foundation Trust.
No Isolation is a new start-up helping reconnect the chronically ill with their peers through its
AV1 robot. This is a ‘distance learning avatar’, meaning the child can engage with
teachers and school friends from home. It helps reintegrate children into the classroom after
a prolonged illness which has kept them out of education; cancer, anxiety and cerebral
palsy to name a few.
The robot sits in the classroom and the child can control it remotely through an app: a swipe of the
iPad turns it around, and the child can see the teacher or friends through the robot’s camera.
Emotion can be adjusted through the app and expressed in the robot’s eyes, and the child can make its
head flash if they want to ask the teacher a question.

Harriet Gridley, Head of Business Development UK at No Isolation, said: “This compassionate
technology is something which enables human contact, rather than replaces it.
“The [child’s] friends would not know how to interact with them; it is really difficult even as adults to
support someone going through a difficult illness.
“But because it’s through this friendly interface, it makes it really easy and lowers the threshold for
human impact.”
The initiative, created by the Department for Education, is on its second edition. The child user did
not want to be seen, so the start-up left the bot without a screen; in the second edition, LED eyes
were introduced because the classmates wanted to know how their friend was feeling, and the child can
pick from a variety of expressions.
Gridley also revealed how the start-up’s commitment to helping people out of loneliness means it
could in future create products for first-time parents or ‘perhaps people in the city which are more
lonely than anyone else’.
