Meta employee talks about AI that takes over dead people's social media
(media.scored.co)
Although on the surface this looks plausible, I think it's bogus. A key problem is that without a good enough model of a person's mind, you can't emulate them for long. For instance, if you asked a simulator of a person who's been dead for a year what they think about the Ohio train wreck, there's no way the AI could really guess the person's reaction to it. It might produce some vague mumblings, but it can't capture the person's emotions about the incident.
There are other failings with this too, but no need for me to lecture.
If Facebook's FAIR AI division does have emulators, I warrant that they will be breakable and will not truly emulate a dead person well enough to pass for long. Of course, some AI developers may have fooled themselves into thinking they can do this well, but their mechanism is going to be shallow and limited.
There is plenty of overambitious bogosity in the AI field; don't trust any goshwow media coverage.
I think it’s real. I heard about this years ago. https://singularityhub.com/2018/10/29/the-how-why-and-whether-of-digital-avatars-that-live-on-after-we-die/
Oh, there ARE such avatars, but the problem is the degree of execution. In the SF Bay Area a year ago, some guy 'cloned' his now-dead fiancée digitally and had it 'talk' to him via messaging. It was, of course, just a dead puppet. The digital avatars you mention are not based on strong underpinnings. They have no true Self inside, no soul, and no ability to be a true avatar. Just chatbot hacks. In a way, it's like an actor playing a doctor on TV: the actor has no ability whatsoever to heal the ill. He just wears a white lab coat and looks purposeful. A chatbot could talk about getting laid, but it has never had sex and cannot understand the experience.