The doctor comes to you – sort of
About a year ago, I wrote a story that appeared in this blog. Since so much time has elapsed, I’ll go ahead and give away the ending. A patient is in a doctor’s office where all of the procedures are performed remotely. After a pleasant appointment, she asks a question:
“You’ve been very helpful. But I’ve always wondered exactly WHERE you were. If you were in Los Angeles, or in Mississippi, or perhaps in India or China, or perhaps even in one of the low-cost places such as Chad. If you don’t mind my asking, exactly where ARE you?”
Drumroll please – or perhaps the scary music.
“I don’t mind answering the question,” replied the friendly voice, “and I hope you don’t take my response the wrong way, but I’m not really a person as you understand the term. I’m actually an application within the software package that runs the medical center. But my programmers want me to tell you that they’re really happy to serve you, and that Stanford sucks.” The voice paused for a moment. “I’m sorry, Edith. You have to forgive the programmers – they’re Berkeley grads.”
Naturally, it’s going to take some time before software can intelligently perform an array of diagnostic tasks. But we’re getting there, and the machines have already mastered one important skill – navigating down the hall.
With a simple push of an iPad button, [Dr. Robert] Vespa can send the robot gliding down the hall to a patient’s room. Equipped with 30 sensors that enable it to “see” when its route is blocked by a gurney or curious bystander, EVA possesses the intelligence to self-correct and plot a detour to its destination.
After the robot reaches a patient’s bedside, Vespa can examine the patient in real time. A two-way video monitor in EVA’s “face” enables the patient and doctor to see and hear each other. A 120x zoom capacity allows Vespa to magnify a single word on the patient’s chart or zero in on the patient’s eyes to check for dilated pupils.
As of now, the robot is still under the control of the doctor.
Give it time.