Although on the surface this looks plausible, I think it's bogus. A key problem is that, without a good enough model of a person's mind, you can't emulate that person for long. For instance, if you were to ask a simulation of a person who's been dead for a year what they think about the Ohio train wreck, there's no way the AI can really guess that person's reaction. It might make some vague mumblings, but it can't capture the person's emotions about the incident.
There are other flaws in this too, but no need for me to lecture.
If Facebook's FAIR AI division does have emulators, I warrant that they will be breakable: they can't truly emulate a dead person well enough to pass for long. Of course, some AI developers may have fooled themselves into thinking they can do this well, but their mechanisms will be shallow and limited.
There is plenty of overambitious bogosity in the AI field; don't trust any goshwow media coverage.