Sunday, September 18, 2011

Doc in a Box

This is something I have been thinking about for a while, but recent events have made now the time to put my thoughts together. In case you hadn't heard, Watson, the computer that beat the top humans at Jeopardy!, now has a job. (http://gizmodo.com/5839257/ibms-watson-gets-its-first-real-job and http://www.hpcwire.com/hpcwire/2011-09-15/can_supercomputing_help_cure_health_care_.html) IBM had said they wanted to aim the DeepQA technology behind Watson at medical diagnosis, and now they have done it. The second article above describes how this could help with problems we have in healthcare. Martin Ford, the author of The Lights in the Tunnel, has an op-ed in the Washington Post where he walks through some implications (http://www.washingtonpost.com/opinions/dr-watson-how-ibms-supercomputer-could-improve-health-care/2011/09/14/gIQAOZQzXK_story.html). His insight on the recent Supreme Court decision is really significant, but I don't think Martin goes far enough when he looks at how this could play out in the future. (Probably because if he said what I'm going to say here, people would laugh at him and the piece would not have been published.) He mentions a situation where people with lower levels of training talk patients through things and then feed the information to Watson for diagnosis. I think the humans just get in the way here, especially given what I have heard about the caliber of most medical assistants.

What I see happening instead is the creation of a "Doc in a Box" chain of diagnostic clinics. These would be similar to the diagnostic clinics popping up around the US today, with one exception: they don't have humans in them. To make this happen, I'm going to call on three things: Watson's DeepQA, Microsoft's Situated Interaction, and remote sensing.

The first demonstration of Situated Interaction was Microsoft's virtual receptionist. This technology greets you and runs the small waiting room associated with each office, directing patients toward "exam rooms" as they become available. Situated Interaction requires video, so I expect every inch of the place to be monitored all the time. In fact, the monitoring in the waiting room could feed into diagnosis as well, because it extends the baseline for observation.
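
If you want to picture the plumbing, here's a rough sketch of what the receptionist's room-assignment logic might look like. To be clear, everything in it is my invention: the class names, the `Visit` record, and the idea of attaching waiting-room observations to the visit are assumptions, not anything Microsoft has published.

```python
# Rough sketch of the virtual receptionist's core loop.
# All names and structures here are assumptions for illustration.
from collections import deque
from dataclasses import dataclass, field

@dataclass
class Visit:
    patient_id: str
    # Passive observations from the waiting-room cameras become part
    # of the diagnostic baseline once the patient reaches an exam room.
    waiting_room_observations: list = field(default_factory=list)

class VirtualReceptionist:
    def __init__(self, num_exam_rooms):
        self.free_rooms = deque(range(num_exam_rooms))
        self.waiting = deque()  # Visits, in arrival order

    def check_in(self, patient_id):
        visit = Visit(patient_id)
        self.waiting.append(visit)
        return visit

    def observe(self, visit, observation):
        # e.g. gait, posture, coughing frequency noted by the cameras
        visit.waiting_room_observations.append(observation)

    def assign_rooms(self):
        # Direct waiting patients toward exam rooms as rooms free up.
        assignments = []
        while self.free_rooms and self.waiting:
            assignments.append((self.waiting.popleft(),
                                self.free_rooms.popleft()))
        return assignments
```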

In the exam room you have a virtual doctor, a combination of Situated Interaction and DeepQA. You sit in a chair that can take blood pressure, temperature, pulse, weight, and other basic vitals. The patient and the machine talk through everything a medical assistant, a nurse, and the final provider would normally do. That could include medical history, but honestly, this system works best if medical history has already been put into a generalized database. That same database is what makes Dr. Watson extremely effective, since it will have petabytes of previous diagnoses and outcomes to mine.
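
Here's a rough sketch of how I imagine the exam-room pipeline fitting together. The `chair`, `history_db`, and `deepqa` interfaces are all hypothetical; IBM hasn't published an API for DeepQA, so treat this as a picture of the architecture rather than an implementation.

```python
# Hypothetical exam-room pipeline: vitals chair + history database + DeepQA.
# None of these interfaces are real; they are stand-ins for illustration.
from dataclasses import dataclass

@dataclass
class Candidate:
    diagnosis: str
    confidence: float  # a DeepQA-style ranked hypothesis, 0.0 to 1.0

def run_exam(chair, history_db, deepqa, patient_id, symptoms):
    # 1. The chair takes the basic vitals automatically.
    vitals = chair.read_vitals()  # e.g. {"bp": (120, 80), "temp_c": 36.8}

    # 2. Pull the record from the generalized history database, so the
    #    interview doesn't have to re-collect medical history.
    history = history_db.lookup(patient_id)

    # 3. DeepQA ranks candidate diagnoses against the prior diagnoses
    #    and outcomes, plus this visit's vitals and symptoms.
    candidates = deepqa.diagnose(vitals=vitals, history=history,
                                 symptoms=symptoms)
    return max(candidates, key=lambda c: c.confidence)
```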

In addition to the normal vitals, I see this room having a whole array of more complex sensing capabilities. Samples of breath, blood, saliva, skin swabs, and whatever else could be taken. It should also be equipped with cameras that go beyond the visible. IR is easy and will inevitably have diagnostic benefits, especially after it has been used with a few million patients. Other basic scanning technologies could be included as well. X-ray seems pretty easy, though you have to be careful about exposure, and you could probably do some other interesting things along the lines of MRI. If patients sit there for a while, there might even be something useful in detecting background radiation interacting with their bodies. Of course, high-resolution cameras could look at regions of the skin as well as into eyes, ears, nose, and mouth, with far more sensitivity than the human eye, and easily extend beyond the visible spectrum into near IR and UV.
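
The payoff of all that extra sensing is statistical: once you have readings from millions of visits, "abnormal" stops being a judgment call and becomes a measurable deviation from the population. A minimal sketch, assuming the sensor data has been reduced to numeric features and we already have population means and standard deviations (the feature names and numbers below are made up):

```python
# Minimal anomaly flagging over sensor-derived features.
# Feature names, the threshold, and all numbers are made up.

def flag_anomalies(readings, population_stats, threshold=3.0):
    """Flag any feature more than `threshold` standard deviations
    from the population mean (a plain z-score test)."""
    flags = {}
    for feature, value in readings.items():
        mean, stdev = population_stats[feature]
        if stdev > 0:
            z = (value - mean) / stdev
            if abs(z) >= threshold:
                flags[feature] = z
    return flags

# Example with invented numbers:
stats = {"forehead_ir_temp_c": (34.5, 0.4), "sclera_uv_reflectance": (0.12, 0.02)}
visit = {"forehead_ir_temp_c": 36.3, "sclera_uv_reflectance": 0.13}
print(flag_anomalies(visit, stats))  # flags only the elevated IR temperature
```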

I expect this system would do a good job on day one. Within a year or two, it will put human doctors to shame at basic diagnosis, because it will have statistics built from this advanced sensing of every single customer who has visited. To really complete the circle, outcomes should be linked in as well. If the treatment is pharmaceutical, the Doc in a Box dispenses the drugs, payment is handled electronically, and the patient is off.
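
Linking outcomes in is mostly bookkeeping plus statistics. Something like this hypothetical ledger, where every visit eventually records whether the treatment worked:

```python
# Hypothetical outcome ledger: every visit records the diagnosis, the
# treatment dispensed, and (later) whether the problem resolved.
from collections import defaultdict

class OutcomeLedger:
    def __init__(self):
        # (diagnosis, treatment) -> [successes, total visits]
        self.tallies = defaultdict(lambda: [0, 0])

    def record(self, diagnosis, treatment, resolved):
        tally = self.tallies[(diagnosis, treatment)]
        tally[0] += int(resolved)
        tally[1] += 1

    def success_rate(self, diagnosis, treatment):
        successes, total = self.tallies[(diagnosis, treatment)]
        return successes / total if total else None

ledger = OutcomeLedger()
ledger.record("strep throat", "amoxicillin", resolved=True)
ledger.record("strep throat", "amoxicillin", resolved=False)
print(ledger.success_rate("strep throat", "amoxicillin"))  # 0.5
```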

Non-pharmaceutical treatment will take longer to reach this format. Robots aren't going to put in stitches for a while, and they won't set bones either, even if they can see what needs to be done. That will come, but it adds 5-10 years. In the meantime, I can easily see a setup where most primary care is delivered with the patient as the only human involved. Of course, a patient might want to see a human practitioner, and they certainly should be allowed to. However, that should become a luxury with a price to match. This type of system could give basic healthcare a very low marginal cost, so low that you could probably provide it to everyone in the country without bankrupting the nation.
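
To put a rough number on "very low marginal cost," here's a back-of-envelope calculation. Every figure in it is an invented placeholder; the point is the shape of the arithmetic, not the values.

```python
# Back-of-envelope marginal cost per visit.
# Every number below is an assumed placeholder, not data.
booth_cost = 250_000           # capital cost of one Doc in a Box (assumed)
booth_lifetime_years = 5
visits_per_day = 60
days_per_year = 360
consumables_per_visit = 3.00   # swabs, cuff liners, cleaning (assumed)
maintenance_per_year = 20_000  # remote servicing (assumed)

visits = visits_per_day * days_per_year * booth_lifetime_years
cost_per_visit = (
    (booth_cost + maintenance_per_year * booth_lifetime_years) / visits
    + consumables_per_visit
)
print(f"~${cost_per_visit:.2f} per visit")  # roughly $6 under these assumptions
```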
