This article, showing up as it did front and center in the printed New York Times (Online version here) so soon after the death of my pet, really tweaked me. How dare “they” try to replace real furry miracles of love with a damn robot pet for the elderly! I kind of feel the same way about attempts to replace flesh-and-blood yoga teachers with iPhone yoga apps. I mean, “Hello?”
Thankfully, the article does include its own criticism:
“When something responds to us, we are built for our emotions to trigger, even when we are 110 percent certain that it is not human,” said Clifford Nass, a professor of computer science at Stanford University. “Which brings up the ethical question: Should you meet the needs of people with something that basically suckers them?”
An answer may lie in whether one signs on to be manipulated.
Signing on to be manipulated…
Someone who is no longer my boyfriend used to request, when we were together and he was worried about something, "Just tell me that it'll be alright." This was years ago, but anyway, I would say something like, “Yes, I believe that it WILL be alright; however, it looks like it would be good to” either do or pay attention to X, Y, or Z… And I remember him just really wanting the palliative response from me. When the situation was reversed, I would sometimes want to hash things out together, and he would offer, “It’s going to be fine…” Urgg. (Of course, this memory was emblazoned during the last season of our relationship.)
So anyway, sometimes people just seem to want the easy, the comfortable, the false: the activity or behavior that hides the thing that needs attention…
The article’s first caption (partially shown in the above picture) reads, “Styled after a baby seal, a robot that blinks and coos when petted is often therapeutic for patients with dementia.”
Does a person with dementia really have the choice that the excerpt above posits? I don’t think that a person with dementia can “sign on” to be manipulated. But we can, as long as our mental faculties are intact.
The caption evoked for me a horrible image, almost like a nursery: a hundred or so elderly people in crib-like beds, each holding a little robot toy. Are these "robot companions" better than a TV blaring at all hours? That has long been a way to entertain people who have lost the ability to move around freely, so perhaps a robot toy is a step up? (I'm not convinced…)
I guess the real question lies with us. Are we okay with further dehumanizing care for elders in our culture?
Having spent some time with my Grandmother in a "rehab" care facility after her hip was broken, I can report that I was shocked by what "care" during that time of life can look like. I don't know what the answer is.
Nana had prepared me for what the end of her life would look like. For many years before she actually went, she described going into the building that she eventually went to. She didn't want her family to be responsible for her care; she wanted to "free" her family from that "burden". And while it happened, I told myself that we were following her wishes, but I think that the reality of that kind of care, while considered "good" as far as these kinds of things go, was somewhat impersonal. It comforts me, though, that one of the nurses had especially bonded with my Grandmother. I just have to believe that she made the right decision for herself, while she could.
I, however, do not think that I made the right decision when I didn't make the flight to see her again, one more time, right before she left her body. Fritter taught me that. It's good to be there for those you love when they go. And actually, Nana had her "special friend" (we might call him a boyfriend, even though he was over 90) with her, holding her hand, when she left. Perhaps that was enough.
- Posted using BlogPress from my iPhone