Which jobs will never be replaced by machines?

There are many different ways to define "best care." If all we care about is, objectively, what will yield the best results in a particular situation, then sure, I would agree that stripping patients of their autonomy would be a great thing, because sometimes people make stupid decisions about their own health.

However, some people (myself included) would define "best care" as whatever yields the best objective result in accordance with the patient's beliefs, values, hopes, wishes and fears. Should we force a Jehovah's Witness to receive a blood transfusion against his or her will if that would yield the best objective result? For those preoccupied with objective success, the answer would be a resounding yes. I don't know that I would agree with that sentiment.

I think your example of morphine vs. ibuprofen is a poor analogy here. In the counselor situation, one option will objectively yield better results, while the other is still reasonable: potentially worse outcomes, but more in line with the patient's own philosophy. A better analogy would be a patient choosing between two effective blood pressure medications, one slightly more effective than the other. The patient has been on medication 1 their entire life and knows it like the back of their hand. You, as the physician, know that medication 2 has been shown to produce better results, so you suggest it, but the patient is inclined to stay on medication 1. Even though they appreciate that medication 2 would probably yield better results, they would rather stay on medication 1 for whatever personal reason. Is it better to go with the slightly less optimal option and preserve the physician-patient relationship, or to deceive or force the patient into taking medication 2, jeopardizing their trust in you as a medical professional in the process?

At least in Western medicine, medical paternalism (doctor knows best; just do what he or she says without question) is on its way out for this exact reason. No one, machine or otherwise, can perfectly empathize with every single patient and know what is best for their physical, emotional and mental health all the time. This is where patient input has to come into play. If a patient understands that a robot counselor would provide them with better care, but would still rather speak to a human because of their personal values or beliefs, I don't see who any of us are to strip them of that choice.
