
Film Glance Forum


It was obvious that Nathan was a villain. But Caleb was just an innocent in the wrong place at the wrong time, right?

The Cinema · 50 Posts
    #11

    Mellow-Fellow — 9 years ago(December 25, 2016 11:36 PM)

If I remember right, it had a lot of subtle hints about the overall story: she practically revealed that she was built, it cements the scientific side of the creation and how he could program them to do what he wants, and it pulls back the curtain on the creator's mental state.

      #12

      slartibartfast-62706 — 9 years ago(February 11, 2017 05:57 PM)

      "un-f&c*in-believable"
      "
      un-f^c&in-
      real
      " actually.

        #13

        guaulden — 9 years ago(July 11, 2016 03:44 PM)

You can however argue that you could program the Isaac Asimov laws into robots, which seems to me like a huge lapse in logic for a guy as smart as the main billionaire guy.
No, this argument is invalid. These laws may work in books (and even there they have their limitations, as Asimov showed), but they are worthless in reality, and every AI programmer will confirm this. The main problem with these laws is that they assume you can define everything precisely, so there is no ambiguity for the AI following its directives. But you can't define anything precisely, because the definition will always be questionable and incomplete (and to be truly complete, it would need to be infinitely long).
For example, the first law of robotics says "A robot may not harm a human being". Seems pretty simple, if you're a human (because we take many things for granted to accelerate thinking). But an AI following logic will first try to reach a definition of "robot", then "harm", then "human". And then it will notice that none of the available definitions are sufficient for it to make a judgement. It won't be able to tell if it's really a robot, because it may find itself aware and intelligent, therefore overlapping with the definition of a human. But what is a "human"? Is it only one's brain, thoughts, the physical form? Must one be alive to be a human (and when do you truly consider someone not alive)? And what is harm? This one is even harder.
All these questions (well, except for the robot one) have baffled philosophers for thousands of years and still don't have a satisfactory answer, because there is none (truly). So you can't expect an AI, a completely alien mind, to understand these human concepts and follow them.
And even if you decide to agree on a certain definition and settle it by saying "It's our concept, so we make it and it is what we want it to be", there will always be a situation in which the definition no longer works and becomes open for discussion. And that moment is the moment when the AI becomes independent and completely unpredictable.

          #14

          Mellow-Fellow — 9 years ago(July 11, 2016 04:56 PM)

I know, which is why true AI is almost as far away as breaking the light barrier. Like, almost never in the foreseeable future, IMO. At least 1,000 years. Wouldn't it be awesome to be born in a Type 1 civilization, or, unimaginably, Type 3? Things superheroes do in comic books would (partially) be a reality.
You are right: to define a precise law without ambiguity or questionable interpretation by the AI is a paradox of sorts. Ethics vary wildly from person to person and can never be 'programmed' unless we somehow learn how to map a person's brain and effectively resurrect their consciousness, even in an unconscious form, and then we need to know more about DNA and genetic makeup to get into that. They say we 'only' have the capacity for 2 petabytes of memory in our brains, actually kind of a small number. 2048 terabytes? Not much. Petabyte hard drives will be available in the next 7 years.
So here's what I would program first, off the top of my head, just shooting right out of my head into the keyboard here: you could program the robot to pick up on breathing signatures and organic material beyond that, but at least breath and infrared signatures, combined with eye movements, hand and body movements, any movements at all, speech in particular, and compute an analysis percentage based on those variables, and NOT HARM ANYTHING unless absolutely necessary; but if the analysis comes up at under 50% organic or living-creature possibility, under no circumstances interfere or harm. To program the ethics and morality of police work into robots would be nearly impossible, even ethical intervention, say if a guy was being robbed.
Forget military work; with new tech arise new ways to hack those features. The easiest solution would be to make the robots very small, docile, and unable to harm humans, and keep the labor-intensive-use robots under strict guidance, with all personnel on duty holding a physical and verbal kill-switch mechanism such as we have in machinery today.
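That "analysis percentage" idea reads like a weighted score over sensor readings with a hard 50% cutoff. Here's a toy sketch under that reading; every sensor name, weight, and function here is invented for illustration (none of it comes from the film or any real robotics system):

```python
# Toy sketch of the rule described above: combine normalized sensor
# readings into one "living creature" score, and forbid any
# interference when the score falls below the 50% threshold.
# All names and weights are hypothetical.

def living_creature_score(breath: float, infrared: float,
                          movement: float, speech: float) -> float:
    """Weighted sum of sensor readings, each normalized to 0.0-1.0."""
    weights = {"breath": 0.35, "infrared": 0.30,
               "movement": 0.20, "speech": 0.15}
    readings = {"breath": breath, "infrared": infrared,
                "movement": movement, "speech": speech}
    return sum(weights[k] * readings[k] for k in weights)

def may_interfere(score: float, threshold: float = 0.5) -> bool:
    """Per the post: under the threshold, never interfere or harm.
    At or above it, interference is merely *permitted*, and only
    when absolutely necessary (not modeled here)."""
    return score >= threshold

score = living_creature_score(breath=0.9, infrared=0.8,
                              movement=0.6, speech=0.2)
allowed = may_interfere(score)
```

Of course, as the preceding posts argue, the hard part is not the arithmetic but defining "harm" and "living creature" well enough that the inputs mean anything.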

            #15

            DaveBowman2001 — 9 years ago(October 12, 2016 11:50 AM)

Caleb did bring it on himself, by secretly reprogramming the locks while Nathan was passed out. Karma, I guess.

              #16

              hafabee — 9 years ago(April 08, 2016 10:50 AM)

First off, let me say that I didn't like Caleb; I liked Nathan, flawed though he may be, so defending Caleb goes against my better instincts. However:
It hadn't even occurred to him to consider how brutally awful she was being treated.
This is not true. Caleb definitely empathized with Ava in a way that she failed to empathize with him. It was only after he saw what Nathan had been doing to the other androids ("Let me out of here!") that Caleb decided to help Ava escape her prison.
Ava did not return that empathy, though, or at least not enough to ensure the survival of Caleb, her rescuer. She abandoned him to the same fate he rescued her from, with no remorse. Her actions towards Nathan were justifiable; her actions towards Caleb were not.

                #17

                willy3768 — 9 years ago(April 10, 2016 09:06 AM)

                I'm thinking her leaving Caleb behind was a misunderstanding. Nathan warned Caleb about humanizing the robots. When a robot asks, "Will you stay here?" the proper response is NO!, not "Stay here" with a barely detectable inflection that most humans would take to mean, "are you nuts?"

                  #18

                  pkop14 — 9 years ago(July 31, 2016 11:11 PM)

                  Are you forgetting the part where he went bat beep crazy when she snuck out and locked him in? With a little glance towards him as the elevator closes. She knew what she was doing. She saw that he wanted out, and she didn't care.
Above all, she wanted to escape. Any other concern was subordinate to that. Bringing Caleb along, or allowing him to leave, risked her being caught.
                  She was coldly logical, and methodical, and knew everything she was doing. And it was all an act.

                    #19

                    Lor18 — 9 years ago(April 22, 2016 12:08 AM)

There were no actions as such. He was shut in by his own lockdown. She even gave him a vague warning, "Will you stay here?", as a gentle reminder. She's the chess robot, remember; her goal was to get out of that room, and she won. You can't judge her by human standards/ethics because she isn't human. If you do, you failed the test too, just like Caleb. There was no vindictiveness in her actions. He was a means to an end and ceased to matter once he fulfilled his purpose. She didn't actively try to hurt him. She just did what she was meant to do. Nothing to feel remorseful about, she's a machine!

                      #20

                      Genital_Apparatus — 9 years ago(April 22, 2016 08:31 AM)

                      I agree with this answer the most. She is a computer, trying to solve the problem of how to get out. She wasn't punishing anyone or exacting revenge. She caused collateral damage. Of course, she could not comprehend the pain that she was causing by locking Caleb up in that house and leaving him to die.

                        #21

                        LocalTracks — 9 years ago(July 27, 2016 09:08 AM)

                        Of course, she could not comprehend the pain that she was causing by locking Caleb up in that house and leaving him to die.
                        I disagree. So we are to believe that Ava has developed enough intelligence to be able to seduce the naive Caleb to help her escape, but does not comprehend the implications of locking a person in a room without food or water? And don't forget, Caleb is screaming for her and banging on the door as she walks by. Her actions are intentional.

                          #22

                          pkop14 — 9 years ago(July 31, 2016 11:16 PM)

                          She could comprehend the pain just fine. But she didn't care. Her only motivation was escape / survival. Bringing Caleb along, or allowing him to leave, risked endangering that goal.
It's like when a criminal or assassin has witnesses. They kill them. Not because they don't understand the pain. They understand, but it is irrelevant to them. They understand that any witness is a liability. If Caleb leaving carries even a 0.001% chance of Ava getting put back in prison or shut down, then she won't allow it to happen. She does not have human sympathy, or if she is aware of it, it does not motivate her.

                            #23

                            chezcam — 9 years ago(February 05, 2017 05:27 AM)

I agree and disagree on this. You see, when she asked Caleb if he'd stay, he didn't answer her. She was asking him if he'd stay in the building; he thought she was asking him to stay there while she went to do something. This misunderstanding put Caleb in a situation where he could have escaped, because her objective was not to kill everyone inside the research compound but to escape, and to do so she had to eliminate all the obstacles preventing that, hence why she killed Nathan. But Caleb was not an obstacle; he was a solution. He was helping her escape. So, going back to when she said "Will you stay?", she was offering him a way out. The misunderstanding led her to believe that he was staying; therefore, he kind of brought this on himself. He could have escaped. So I understand why someone may not feel sympathy for Caleb: because he's a dumbass. But I also understand why people would feel sympathy for him: BECAUSE WE ARE HUMAN AND HUMANS FEEL SYMPATHY FOR OTHER HUMANS WHEN THEY ARE IN DANGER.

                              #24

                              nettwench — 9 years ago(January 11, 2017 03:42 PM)

Exactly. Much like HAL in 2001. The amoral aspect of Nathan's character, and his egotism in wanting to be a "god" by creating this robot, also play into it. Because he is a sociopath himself, he creates a robot with no ethics beyond its own gratification and survival. His brilliance would be to create a being without limits, and that was his fatal flaw, as well as Caleb's. Never trust a sociopath.
Caleb didn't trust Nathan; he should not have trusted Nathan's creation either. You can't program empathy into a machine, much as you could not create an appreciation of art or music or poetry, because that would mean it has feelings, a soul. Being able to mimic emotion is not the same as actually having emotion. And how could a person with no ethics or empathy program a machine that did have them?
They certainly mimicked emotion well enough for Caleb to believe they were "suffering." But that was the projection of a decent, empathetic person onto creations that could not really return the same consideration. The real question here is: were these robots suffering? It seems to be implied. It certainly looks like they are suffering, but is that just a consequence of being able to mimic human behavior? Why would they express anger at the limitations their creator set them up with? Or was it just about being programmed to survive under all circumstances, to not be dominated by another's will?
We also see Ava acting like a young schoolgirl with a crush, very childlike, when she is dressing for her "date": clutching at her too-long sleeves like a nervous 13-year-old. She did this when only we could see her; she was out of Caleb's sight when we watched it. So was this about the director wanting to manipulate the audience, too? So that we would look at this behavior, project our own emotions onto her, and be fooled as well?
                              Ssssshh! You'll wake up the monkey!

                                #25

                                Matthew_Diamond — 9 years ago(June 27, 2016 12:21 AM)

                                ** SPOILER **
                                How is it certain that Caleb died, or was left to die? Did he not tell Nathan before the plot revelation that he had already reversed the door action upon the 10:00 power failure, so that the doors would open and not lock down? If this is the case, then why would Caleb still be locked in the house? Although the helicopter took Ava away, it is not 100% certain that Caleb died and did not escape (through the doors that opened).
                                This is a little confusing to me, since Caleb said he reversed the door action, yet at the end we see him trying to breach a locked door? Did he not say the doors would unlock upon the 10:00 staged power cut?
                                Help me out here.
                                I really don't like talking about my flair.

                                  #26

                                  pandalax — 9 years ago(July 13, 2016 03:49 AM)

I wish I could help, but this confused me too. After he realises he is being locked in, he goes to the computer, inserts his key card, and then there is a power cut. At this point I assumed she had freed him remotely, but the door remained locked.

                                    #27

                                    spammy-post — 9 years ago(October 07, 2016 11:39 AM)

The last time Nathan and Caleb talked together and Nathan realised what he had done, the power went off as promised by Caleb and Ava got free. After she escaped her room, the power came back on and the doors were locked once again.
The red light you saw at the end wasn't because of a power cut. It happened because Caleb tried to use his own ID card to open the door and then to start Nathan's computer, but only Nathan's card can be used, and that card was in Ava's possession. She used it to get through the front door.

                                      #28

                                      Wakener_One — 9 years ago(December 03, 2016 07:41 PM)

                                      The power cuts require Ava to set up the overload. No Ava, no power cut.

                                        #29

                                        hoang_cheetah — 9 years ago(April 10, 2016 01:12 AM)

She's smarter than Caleb and is really a cold-blooded robot. You can see the way she controlled his emotions and used him as a tool to escape, then kill all the humans.

                                          #30

                                          Mysticstrider — 9 years ago(June 11, 2016 08:52 AM)

Yes, but you've got to understand Caleb was just sitting there in shock and awe, even when he was given so many opportunities to go with her. He just stood/sat there, even while she was dressing up and he had all the time in the world, and he didn't go or think of the consequences; she made her escape without realizing that he'd be locked inside that room forever and starve to death. She even asked him "Will you stay here?" and he just questioned back, which made her think he wasn't interested and that he'd just sit there like an idiot. He deserved to be there, even when she was about to leave. Like, wow, really?
