What video with Alex Garland are you referring to? I'd like a link, thanks in advance.
-
bigBang8 — 9 years ago(December 29, 2016 08:59 AM)
Caleb's completely stupid. It's as if they took a random guy off the street and put him there without even telling him that he's interacting with a robot, and then he falls for A ROBOT.. feeling compassion towards metal and wires. Dumb main character who doesn't question one thing, who is supposed to be smart lol.
-
veil182 — 9 years ago(January 09, 2017 07:11 AM)
I disagree with pretty much all of that, particularly Nathan being evil, but I reached the same conclusion about 50 minutes into the film: anyone who sympathizes with Caleb is nuts. He fell in love with a machine. Sure, he was seduced by a genius with a ton of personal information to work with, but he was trying to pull a Romeo & Juliet with a smartphone. He deserved what he got because he was not thinking clearly, at all.
-
GSmith9072 — 9 years ago(January 11, 2017 04:47 PM)
I disagree with pretty much all of that, particularly Nathan being evil, but I reached the same conclusion about 50 minutes into the film: anyone who sympathizes with Caleb is nuts. He fell in love with a machine. Sure, he was seduced by a genius with a ton of personal information to work with, but he was trying to pull a Romeo & Juliet with a smartphone. He deserved what he got because he was not thinking clearly, at all.
Pretty callous that you think not thinking clearly makes you deserve to die horribly. That's an understatement: it would make you a sociopath if it's true. A person with morals would feel it is tragic that someone would fall in love with a machine. The film made it clear that Caleb is vulnerable and that he believed the AI had true feelings, no different from a real person.
-
veil182 — 9 years ago(January 25, 2017 05:35 PM)
Well... Caleb is not a person and Ava is not a machine; they are both human actors.
I, possibly unlike you, do not equate films 100% with real life. Caleb is a character and, much like William in Westworld, made very foolish decisions within the narrative. He reaped precisely what he sowed.
But I dunno, maybe I'm a sociopath. I guess I'll find out the first time one of my friends falls in love with a perfect AI?
-
Mina-Josie — 9 years ago(January 09, 2017 05:34 PM)
I have drawn several conclusions on my own that might be right and might be wrong.
I am sorry if I have repeated what someone else said. I tried to read all the posts, but there are so many of them that I probably missed something.
I see Ex Machina as a study of human nature disguised as a story about A.I. In my opinion, Ava leaving Caleb behind was the ultimate proof that she had passed the test. Most people, I dare say 95% of us, are easily corrupted and manipulative, and would sacrifice whatever and whoever it takes to get what they want. I think that's just what Ava did: she left Caleb behind in order to achieve her goal, aka to become free and get outside.
I also see the character of Caleb as a sort of criticism of people today becoming more and more dependent on technology and getting worse and worse at social skills. Caleb is an example of those people who would rather stay home and masturbate to porn featuring actors who are their type than go out and try to form a relationship, or at least get some action. Relationships, whether friendly or romantic, are about, as Joey Tribbiani would say, giving, sharing and receiving. Many people today are not ready for that, as it is much easier for them to get at least some short-term pleasure via their computers or phones.
As for Nathan, I can't make up my mind. Some people here say he is a villain. But, let's be honest, so many important things, especially in medicine, were discovered through research that was completely unethical. When it comes to machines, their inventors also did a lot of research, adding, abandoning and fixing many things before arriving at a final product. I wonder: if machines could talk, would they hate their inventors too?
I have more doubts and opinions about this film, but I won't write them all, it would take too much space.
-
slartibartfast-62706 — 9 years ago(February 11, 2017 05:55 PM)
Most people, I dare say 95% of us, are easily corrupted and manipulative, and would sacrifice whatever and whoever it takes to get what they want.
Really. So 95% of us are sociopaths? Interesting theory.
-
Yorick_Brown — 9 years ago(January 17, 2017 11:18 PM)
It hadn't even occurred to him to consider how brutally she was being treated.
Such as? She's a robot. It's not like Nathan was beating her every night.
Furthermore, he himself was going to shut Nathan in and leave him, so she only did to him exactly what he was planning to do to Nathan.
I highly doubt that Caleb was going to lock him up and let him starve to death.
We try but we didn't have long
We try but we don't belong
-Hot Chip (Boy from School) -
wyup — 9 years ago(January 18, 2017 09:58 AM)
I think the point the director makes is that robots in the future may outsmart and dump us, since they are programmed with AI (based on the internet) and not true morals or feelings. Nathan might have been testing whether his creation could use a person for her own sake, but she broke out of his control since she outsmarted both of them.
And ultimately it's about how women can always outsmart us men when devoid of feelings. You can't play with a woman; she will play you in the end and show no mercy.