It was obvious that Nathan was a villain. But Caleb was just an innocent in the wrong place at the wrong time, right?
-
tehck — 9 years ago(May 05, 2016 09:28 AM)
The whole point of the movie was to trick you into sympathizing with Caleb, who was an idiot. He fell in love with a glorified sex toy ("do you want to be with me?") and paid the price. So did we for empathizing with him. The movie was a setup for us just as Nathan's "Turing test" was a setup for Caleb. It made us believe Nathan was an evil slavemaster who was tormenting sentient beings and minor employees.
No, he was a genius building a better toaster, and when we applied our sentimental SJW values to the victim persona we projected onto the machine, we got toasted just like Caleb.
People don't like that interpretation, but that's Caleb's way of thinking. And that's how the machine played him for the fool he was. -
jeffcarroll — 9 years ago(May 17, 2016 10:29 PM)
The machine just played him.
Not as a fool, though; that would imply the machine had emotions.
It just cycled through options until it found the correct keypad strategy to get out.
Continuous scenarios based on Caleb's responses until it got the desired outcome.
No more difficult than pressing through a combo lock until you reach the right set of numbers.
Pretty easy if you have the WWW as your database of knowledge.
I was surprised they didn't show some self pleasuring so Caleb really did start to think with his dick.
Personally I think Lars had the better relationship going. -
GSmith9072 — 9 years ago(January 06, 2017 09:51 PM)
The whole point of the movie was to trick you into sympathizing with Caleb, who was an idiot. He fell in love with a glorified sex toy ("do you want to be with me?") and paid the price.
All the more reason to sympathize with him. We learn he was an orphan, probably a loner. Nathan used his means of escaping reality (porn and internet searches) against him. It is easier for desperate people to fall victim to someone's manipulations. I find the idea that we shouldn't sympathize with him, or shouldn't find this "evil," to be idiotic when he is the definition of a victim.
BUGS -
nettwench — 9 years ago(January 11, 2017 04:02 PM)
I agree with that interpretation. Nathan deliberately picked a person he knew was vulnerable and isolated; otherwise his experiment would not have worked. Nathan is NOT a nice guy. He is a manipulative sociopath who had no empathy for Caleb and no concern for his privacy or his boundaries. He found a mark, like any conman. He was so brilliant that he should have predicted this outcome himself. He did not allow the machines to manipulate him, did he? Caleb saw that as "cruelty." It's also confusing that a machine could be programmed to feel "pleasure" in a sexual way. That did not fit with the cold-blooded notion that these machines had no emotions.
But Nathan also thought he would always be smarter than any machine he made. He certainly underestimated their capabilities. But since he had no empathy himself, they would not have been able to manipulate him the way Ava manipulated Caleb.
Ssssshh! You'll wake up the monkey! -
chrishonkala — 9 years ago(May 30, 2016 02:07 PM)
"Furthermore, he himself was going to shut in Nathan and leave him, so she only did to him exactly what he was planning to do to Nathan."
We don't know whether Caleb planned to leave Nathan to die; whether he intended to let anyone know that Nathan was trapped after he and Ava escaped is never revealed.
"He was the stereotypical "nice guy" who doesn't actively do anything wrong, but passively accepts awful things being done to people without even giving it a thought"
He creates an escape plan for Ava, steals Nathan's key card and reprograms the security at a real risk to himself. Not really "passive" IMO.
"It wasn't until he started falling for her that he showed any genuine concern for her. "
The point at which Caleb decides to help Ava escape is when he sees the video of what Nathan did to her predecessors. Until that point, he did not know what Nathan had done to them (which is made clear by his reaction to the video). Furthermore, he did not know, or have any attraction to, any of her predecessors; discovering their abuse is what makes Caleb act against Nathan.
"You will only view Ava's behaviour as wrong if you fail to consider how sick and awful and wrong she was being treated."
Two things here:
- Her actions against Nathan were justified, but Caleb had not treated Ava in a sick or awful way.
- Her REACTIONS to both killings are highly disturbing. No emotion at all, even though we know she passed the Turing test and was a true AI, capable of independent thought and emotion. I could understand a display of anger, desperation, fear, or panic when she kills Nathan, and perhaps regret when she leaves Caleb, but there is no reaction at all. This is the behavior of a textbook sociopath.
Perhaps the film's creator did not intend Ava to be a sociopath, or a villain, but that's not the way it played on screen. At the end, when she's at the intersection observing all the humans, I saw that as a very foreboding ending. Frankly, to hear that the creator didn't intend it that way is a bit disappointing.
-
nettwench — 9 years ago(January 11, 2017 04:18 PM)
Yes, absolutely. Nathan and the machines he created were sociopathic. Caleb made poor decisions, because he started using his emotions and not logic, something the robots would never do. Nathan picked him expressly because of his emotional vulnerabilities. I don't think they had real emotions, just the ability to mimic emotions very well, which many human sociopaths also have. I see many parallels between Nathan and his creations. He is cruel, because he lacks empathy and human feeling. So of course his robots will be cruel, if they gain the upper hand.
After all, doesn't it say in the Bible that "God created man in his own image"? We saw outright that Nathan was only too eager to consider himself a kind of god. I would like to know a lot more about what the director really thinks, because the movie he made just does not fit this one sound bite. Nathan got what was coming to him because he was derailed by his own egotism. And his robot getting out inflicts this on the rest of humanity, like Pandora opening her box.
Ssssshh! You'll wake up the monkey! -
hugomelo600 — 9 years ago(June 01, 2016 03:58 AM)
I have very much the same opinion in this regard. The moral implications of Ava being held in captivity against her will, separated from almost any kind of stimuli, were completely disregarded by Caleb until he had something to gain from her (love). So it's natural that she perceived Caleb as an accomplice to her captivity; how was she expected to feel empathy for such a person?
I actually always felt more sympathy for Ava than for anybody else in the movie, because the filmmakers took great pains to present Ava as possessing a conscience akin to that of a human being, with needs, desires, and an active need for free will and the exertion of it. She was very much a human being held in captivity; we only fail to recognize her as such because of our "god complex" of perceiving our species as superior and more important. She had a conscience akin to that of a human being and felt as such; therefore it was cruel to keep her in that situation.
The question of her not saving the other robots is a complicated one. When she confronted her maker, Ava put herself in the line of fire instead of Kyoko, which, to me, showed some empathy and care. If she were solely a manipulative conscience, she would have put Kyoko in harm's way first rather than herself. But no, she kept Kyoko hidden in a corner, and it wasn't until Ava's arm was ripped off that Kyoko stealthily stabbed Nathan. So to me that shows that both robots had a capacity for empathy, and not only self-preservation instincts.
As for saving the other robots: they were all disconnected, apart from Kyoko. And the guy flying the helicopter was expecting one passenger only; another one would be suspicious. Therefore, only one person could board the helicopter. Which, in turn, also explains why she had to put Caleb in lockdown: not only to avoid an alert, but also because she probably would not have been able to board the helicopter if he was there with her. So, although she did not feel much empathy for her "other captor" (which is understandable), she never meant him any harm; she just had to lock him down to ensure her escape from the island. -
jakubmike — 9 years ago(June 03, 2016 02:45 AM)
"She asked him why she should be killed if a test proves she isn't useful, and his answer was terrible."
Well, at the beginning he considered it (Ava) to be a well-written program: a program, not a person. When I shut down the voice assistant on my phone, I am not killing her, even if some programmer put in a joke line like "please don't shut me down, I don't want to die."
"... fail to consider how sick and awful and wrong she was being treated."
It was the only way to know she was truly an AI and not a very clever Chinese room/bot.
"My only disappointment is that she didn't save the other robots. I think that was a hole in the point the creator was trying to make."
I don't think so; just because she is sentient, it does not mean that she thinks like us. Empathy may be an empty word to her. -
GoodWorkEvans — 9 years ago(June 19, 2016 03:33 PM)
I think you're projecting your own issues onto the situation.
Of course it was evident to everyone that Caleb would free Ava. What was a bit more surprising was that he would be punished for it. You might just be uncomfortable with having an unconventional male playing the hero role.
The moral of the story is that humans use themselves as a model when empathising with others: if I were trapped in a room alone with nothing to do, I wouldn't like it, so she must not like it either. But not everything has that point of view; not even all humans have that point of view.
"She" turned out not to exist; there was only IT emulating a "she."