Ok, so I think it's kind of cool that they're seeing if humans can retrain a programmed machine w/o code and such. But that is a terrifying concept to actually employ. Hopefully by the time machines can fully think for themselves and take over the world, we'll all be long gone.
I don’t see how they made the robot like a psychopath just because he sees the worst in things and has been exposed to violent images. I thought the clinical definition of a psychopath is more like a person who can’t empathize with others and doesn’t experience guilt or regret.
By that definition (which is probably accurate, but I'm a trainer, so IDK) all robots are psychopaths, since I don't think science has figured out the emotion matrix yet. And now that the idea has been put in my head, I'm more worried about the robot uprising and will never get a roomba.
A computer, by default, cannot empathize or experience guilt or regret.
That doesn’t really answer my question though. I’m not sure what about this specific robot is like a psychopath. "Psychopath" is in quotes in the article, so that doesn't seem to be the actual scientific goal of the project. It could be a sunshine-and-rainbows robot and still act like a psychopath. Or a violent, pessimistic robot that does not act like a psychopath.
Since it’s a clinical term, I wouldn’t expect them to just throw that out there as a blanket term for “violently inclined”. Seems irresponsible. But maybe the article is just poorly written?
Lies. It's my grampsy's name, and he was the least psycho person in my whole family.
LOL you're not exactly saying he's *not* tho
LOL! I didn't even realize that implication... He was a very kind and gentle person. Unflappable. Stark contrast to his wife and the like 70% of the rest of my family who take after his wife. (My mama takes after her daddy... mostly.)
This is the project website, you can draw your own conclusions - norman-ai.mit.edu/
Well, the homepage image is not exactly reassuring.
Also, damn, the inkblot interpretations are disturbing. Especially compared to the ones interpreted by the "normal" AI. I mean, I guess that's the point, that these AIs are products of their training environment, but man. Eeeek.