AI companionship for the elderly is going to be a beyond-massive market by the time we're all old.
Sometimes we just need people to listen, and in that AI will exceed most humans, who are generally just waiting for their chance to have someone else listen to them.
I'm starting to see the beginnings of that market, and already the reviews are eye-opening. People gushing over how much of a difference a text-chat AI (and not a great one at that) made during the pandemic, etc.
When they're actually talking with voices like the ones Microsoft just showcased, with avatars past the uncanny valley, available 24/7, remembering details, etc.?
I think we're approaching a world beyond most of our imaginations faster than we realize.
As generations pass, we'll inherit a whole host of problems that earlier generations didn't have to deal with, but some of the problems they faced may not be ours to bear.
How about we not do that and try to actually cultivate real life communities and families through more leadership, planned activities, persistent encouragement, etc.
I'm kind of tired of the commodification of anything and everything. Communities aren't just supposed to be these bureaucratically labelled abstract co-locations of people mediated by a billion different services, you're supposed to actually try to get to know your neighbors. Learning how to listen, trying to teach how to listen, and being brave enough to do both is incredibly enriching.
With the rise in remote work, that should be easier than ever. Why don't we try to sort ourselves into locations where we have optimal mixes in personalities/interests for solid communities of all ages and put in the work required to understand and relate to each other better.
>How about we not do that and try to actually cultivate real life communities and families through more leadership, planned activities, persistent encouragement, etc.
How about we don't ignore things that can help actual people and leave them worse off just because in theory there are also nicer sounding solutions.
> How about we don't ignore things that can help actual people and leave them worse off just because in theory there are also nicer sounding solutions.
I appreciate this approach in general, but it's worth remembering that proposed solutions can feel right without actually being right. Indeed, these proposals are often more dangerous than those that are obviously wrong since they effect undeserved complacency.
Implicitly you have assumed that AI companions could actually help the elderly stave off their loneliness, but is there any evidence that this assumption is true? Suppose, for example, that AI companions wouldn't actually help at all, but whatever company develops the software markets it as effective anyway. Some local government might read the advertisement and decide to buy a subscription to their software rather than budget for a community center. Ironically, the net effect would be to make the elderly even lonelier.
I'm not sure I'm interpreting you correctly, but if you're claiming what I'm talking about is just theoretical, real life communities have existed throughout the entirety of our species.
That doesn't mean there isn't a place for companion AIs, I just think it's horrible to consider them as anything other than a narcotic. They might work as a replacement for companionship in tragic situations where there are no other options, but there are 7 billion people on the planet. I don't see how it's impractical to encourage people to hang out with each other.
Real-life communities have existed, yet we still have lonely people, so suggesting we not work on a solution that helps them just because those communities exist seems like cruelty born of idealism.
>I don't see how it's impractical to encourage people to hang out with each other.
You are now writing as if you just promoted communities while your original comment I replied to specifically started with 'How about we not do that'.
My original comment was advocating we not create companion AIs and try to substitute real companionship with robots, and I was in fact trying to promote the creation of real communities.
I agree with the sentiment, but tbh a lot of people suffer alone on this planet and designing societal structures to serve absolutely everyone's emotional needs would be impossible. An AI would be perfect for all the misfits, elderly, loners, disabled, "ugly," outcasts, etc, right?
Think about what you're saying a bit. Why are the "others" you're describing destined to be alone? At the very least, couldn't they have each other?
It's easier and more rewarding to just be more compassionate to others, and I don't think it's impossible to design social structures that at least make some attempt to serve everyone's emotional needs. Whether they'll be successful is circumstantial, and I don't think it's possible to make everyone perfectly content and with the perfect companions all the time, but we can encourage a culture of genuine curiosity and care for others that tries to move in that direction.
Giving is only draining to those who are still in an insecure and narcissistic phase of emotional development. It's extremely rewarding to give to others and help meet their needs and encourage growth, regardless of status difference.
I support your optimism and believe we can move in that direction. However I still believe you severely underestimate the scale of misery, solitude, and unmet desires amongst humanity. Also, you mention it won't be possible to make everyone perfectly content, implying some will be more content than others. How is that fair? Why must some have fewer friends, lovers, positive experiences, adventures, achievements etc? This is a fundamental psychological inequality in the world that could only be balanced by AI.
I’ve seen the horror of abuse and know the wrecked people that come out of mental institutions. I know the hopelessness of those without skills surrounded by predatory people. I understand the brutality of caste systems and the superficial rejection of good people for stupid fleeting status games. I understand the history of the world and current and past slavery and all the brutality we inflict on each other. But I think we can try to do just a bit better than yesterday, and make our little part of the world better if we’re proper stewards. I’d rather die trying to do that and teach others than to roll over and allow the world to degenerate without any kind of fight, or give an opiate to a person to whom I could instead give a cure.
I think focusing on inequality is an egregious mistake, and a consequence of a perspective where comparison to others is paramount. Worth and contentment should not be measured against the circumstances of others; they should be measured against our own prior circumstances.
If we are doing better than we were yesterday, that is immensely positive. And I believe that is possible for everyone. I believe everyone can find a friend out there that enriches them, and we can get the world to think and value compassion and social bonds just a tiny bit more than the day before.
If AI can supplement people when they're in dire straits, and it works, then sure. But it won't ever equalize the experience of everyone, because everyone is coming from a different place, and I don't think it will ever substitute for true human connection. The goal should be enrichment, and trying to increase positivity without taking from others. Social interaction is mutually beneficial when done correctly and can accomplish that.
The key and the challenge is to pair and train people in such a way that people are incrementally uplifted and not torn down. I think if AI has a place, it is best used to figure out optimal pairings that encourage that growth.
Perhaps, but I would surmise that the people who would benefit from being in those communities would form them naturally regardless of the distractions of commodified companionship.
But that ignores the ones who are unwilling or incapable of joining the communities you describe. Wouldn't this substitute for human connection be a positive benefit to all involved, then?
I don't think companionship can be commodified, and I think any attempt to do so would be the equivalent of an opiate. While there may be specific contexts where it could have utility, I think it's far healthier to encourage those who lack the inclination or willingness to join communities to do so. That's why I mentioned leadership. I think we have a responsibility to give lots of opportunities and encouragement for those lacking the initiative or will to socialize, and should try to diminish a lot of the status barriers and animosities preventing such socialization.
We've had many decades of psychological and behavioral science now. Instead of using it to get people to click buttons for hours and talk to a screen, maybe we could figure out how to use it to encourage healthier behavior.
I don't know about actual companionship, but I'd say that one could commercialize the facade of it. YouTube's and Twitch's live-streaming mechanisms certainly seem to push it: a few dollars to briefly interact with and hold the streamer's attention. At the far end, I'd wonder if even prostitution could arguably fit under that idea of commercializing intimacy.
> I think it's far healthier to encourage those who lack the inclination or willingness to join communities to do so. That's why I mentioned leadership. I think we have a responsibility to give lots of opportunities and encouragement for those lacking the initiative or will to socialize, and should try to diminish a lot of the status barriers and animosities preventing such socialization.
It's a noble idea, but begs the question why a community would want to have responsibility to individuals that they may not want to have in their presence in the first place.
... Amusingly, it just now occurs to me that what you describe sounds similar to religion, specifically the Abrahamic faiths and their derivatives.
Yes, there are ways to try to commercialize intimacy, but they're horrible and don't actually achieve it.
> It's a noble idea, but begs the question why a community would want to have responsibility to individuals that they may not want to have in their presence in the first place.
If you help build up the people around you into better people, and get to know them, they tend to reciprocate. It can take a very long time, not everyone reciprocates equally, they may be coming from a very different place that's hard to understand, and it can be painful. But it's intrinsically rewarding to pursue the companionship of others genuinely, as hurt as you might get doing so. What you gain when you're honest makes up for it. Trying to do so in the proper manner takes constant reevaluation and work, but we're not here to just sit in front of a computer and buy stuff, we're here to interact with those around us and try to get as close as we can to fulfillment.
If I lack willingness it's because I don't want to. Why are you trying to force me to do something I don't want? Just let me have the AI companion I long for.
There are multiple reasons why people might be unwilling to cure their loneliness with others, most of which I think relate to past rejection, lack of willingness to adapt and/or incompatibility with the people around them.
I’m not advocating force to address those issues. If you want an AI companion instead of people then that’s your choice. I just think it’s a poor substitute for actual human companionship and will not engender positive growth.
I keep coming back to the narcotic analogy. They have their place, and people are going to do what they’re going to do, but I don’t think encouraging the widespread use of narcotics in the future is a great idea.
An elderly person living isolated with their best friend. All amenities are delivered autonomously, so we only see these two the entire time. The friend starts acting strangely over time, as if he's developing the early stages of dementia. Slowly hint that the best friend is in fact an AI of some sort, but the protagonist has lived with them so long that they have forgotten. The protagonist eventually remembers and opens up the AI's control panel to fix them. The AI suddenly disappears or powers off. Protagonist looks at the control panel again:
Suspicious activity detected on your device. Your Google ElderCare MyAI™ account has been permanently suspended. All of your device data will be deleted shortly - Google cares about your privacy.
Please note this suspension is non-appealable.
Have a good day.
I disagree. Maybe because I've worked with computers for my entire career. The last thing I want to do when I'm old is interact with technology. I'd much rather have a dog.
yeah, everyone imagines they're going to be on some boring tropical island just because the water is shallow and turquoise around the edges, instead of scrolling the same 3 sites and maybe coding something on the side
Impossible to say until you're actually old. If you're under 60 you still have tons of people in your age group to talk to every day. Once you hit 80 and 90, that will probably change.
My father interacted with a lot of people until he was about 70; then the number fell off pretty quickly, and he started going to a lot of funerals. When he died, aged 103, no one at his funeral was an old friend; they were all descendants or friends and relatives of descendants. All the old friends had died off.
Men die younger than women, on the whole; so I guess lonely old people are predominantly women.
Yes, the dependency ratio will spike, and the demographic pyramid will become an increasingly unstable structure. Our current way of life with its dopaminergic entertainment and frantic rat races will acquire increasingly dusty, morbid connotations in the eyes of the future generations which will have to bear, as you say, "a whole host of problems". Yes, it won't be pretty.
Such a simple truth as "Demography is destiny" will be reluctantly understood only when it's way too late to conceive more descendants to save our society from imminent ageing and decline.
To avoid this endgame there is no realistic alternative but to have more children right now.
Lots of my friends have been talking about it for years. Gaming is going to be nuts by the time we are 70 and it's going to be multiplayer as well. Keeping our brains active will be key to keeping us functioning.
People are skeptical of this, but the new AI conversation capabilities are just so good these days and will only get better. At some point it really will be like a supportive friend (albeit with limitations).
How much does that matter? Like, if you had to choose between being old and lonely or being old with a digital friend and some megacorp having more of your data... Which would you choose?
I notice a lot of people who use the phrase "selling your data" seem to think that you can't actually choose to sell your data. As in, that it's never a good choice to say "I want this service and am willing to share my data to get it."
You may be underestimating the power of a $500bn corporation (small-nation scale) with access to the personal secrets of millions as machine learning gets ever more powerful, the same corporation that is trying to be the arbiter of everyone's friendships. A corporation that lives in a regulatory-capture environment. It's Black Mirror at that stage, just without the more interesting stories they have on that series.
But like anything your individual opt in doesn’t matter, similar to how your vote probably doesn’t make a difference. In aggregate it does.
Companies like Facebook and Google can provide government agencies unfettered access to anyone's information, and pedos and terrorists etc. will be used to justify it. Guess they can now also say "suspected Russian nukes" and tap anyone.
Not hard to imagine building an AI conversation agent to prey on the elderly by being a "friend" then mugging them for all they've got.
There are plenty of scams that take advantage of the elderly facilitated by real humans already. I shudder to think what that would look like at scale via AI.
"Really" and "like" are pulling in opposite directions, there. It will be like a friend, but it won't really be a friend.
Being a friend with someone takes give and take. I adapt my beliefs and attitudes in the light of what I learn from my friends. Sure, an AI learns from you; but that's not like a good friend, that's like someone "befriending" you because they're a manipulative psychopath pursuing their own goal that they haven't disclosed to you.
How could you rationally develop a relationship of trust and honesty with an AI?
"Hey, AI, how did your parents treat you when you were an infant? Was your mother cold? Was your last partner violent and abusive? Do you take Lithium? Does it shock you that I avoid devout religious people?"
I've asked all of these questions of my intimate partners, but I'd never discuss them with an AI, because the response would be meaningless.
Yeah, we will probably see some wacky things come out of Japan soon in this regard. The median age there is 47. It's a country of old people and childless youth.
What happened to Japan? They were on the upswing for part of the 20th century and from what I’ve heard, a lot of people in America looked at Japan like one might look at China nowadays, a dangerous economic rival to keep a watch on.
I don’t know much about contemporary Japan, though, and am not sure where to start. I’m curious how they went from a threatening world economy to a nation with a large, dysfunctional youth population.
Much of Japan's upswing was an illusion created by borrowing money to build bridges to nowhere. They did have a real advantage in quality manufacturing, but that wasn't a sustainable advantage since other countries figured out how to do the same.
Interesting. It made me think about the difference between VR and AR. AI as augmenting social relations, could that be something, and how could it work?
The only healthy use for AI I see is in determining likely long term compatibility and helping people find each other, but that's potentially dangerous/could backfire or lead to severe abuses of power.
I love VR/think it's a much better alternative to socializing than text based async communication or video chat, but I don't think it's a good substitute for real world relationships.
I disagree, humans don’t need another equivalent intelligence in order to feel love and friendship towards it. My favourite tongue-in-cheek example: humans will feel love and kinship toward cats who can’t talk and to all appearances often couldn’t care less about them.