Putting aside the question of whether the linked material is fake, I don't necessarily fault medical workers for handling their work dryly.
Those doctors, nurses, and technicians have to deal with all manner of disgust, biohazards, sadness, and, most significantly, death every single day. I cannot in good faith demand that they treat everyone with empathy; that might as well be psychological torture for them.
This isn't to say they shouldn't be courteous, professional, and kind to their patients; that should go without saying.
I understand this point. Many doctors I see are always very social and smiling in public, even though they have seen horrors in the ER. I kind of get the callous, emotionally stable, always-cheery and positive, but not too emotionally deep demeanor they need (think of the doctor in movies who delivers bad news with a straight face) to be able to operate like a robot, ironically enough.
But maybe, instead of what this article proposes, we should do the opposite: make our doctors more empathetic and leave the robots to do the grunt work, the machinery, surgery, and so on. A lot of comments say they find the empathy useful. I'm not sure I would be able to tell if someone sent me a crafted message, but I don't like the idea of a message sent by ChatGPT that is meant to artificially create empathy; to me, it's fake empathy.
This is all theory, but I don't think robots creating fake empathy would resonate with humans once this is widespread. Maybe it creates a sort of disconnection, where people just blatantly avoid falling for text messages and paragraphs that sound empathetic.
There are some similarities with the movie Big Hero 6 and Baymax, the robot created as a care robot. Initially, Hiro is very annoyed by it because of its rote "artificial empathy" voice and messages. But its intelligence is what brings him around, once it understands things like context better.
> Maybe it creates a sort of disconnection, where people just blatantly avoid falling for text messages and paragraphs that sound empathetic.
That would be good to a degree. Right now, people are constantly falling for maliciously crafted empathetic/emotional messaging, coming from the mouths and from under the pens of journalists, salesmen, politicians, advertisers, pundits, and social media influencers.
In some sense, it's really saddening that people take issue with emotional text written by a bot, while they don't seem to see any problem with being constantly subjected to malicious emotional messages from the aforementioned ill-intentioned parties.
ER nurses doing triage are some tough mother*ers.
You walk in, bleeding and pretty sure you will die soon. The nurse takes one look at you and is not at all impressed.
"Yeah, take a number, and keep pressure on the wound while you wait."
or
"We are really busy right now. You will have to wait many hours. You don't really need a doctor. Just do ... ... ... and it will be fine."
One thing I have learned in life is that if you are at the ER and you have to wait a long time, you are lucky. It is when you are rushed into the back right away that you know that whatever has happened, it is severe, and you should be scared.