Film Glance Forum


one thing he forgot while coding spoiler

The Cinema · 10 posts, 1 poster
fgadmin
#1

Archived from the IMDb Discussion Forums — Ex Machina

maratonrunner — 9 years ago (January 20, 2017 03:39 PM)

A robot may learn "free will," and an AI will learn a lot about how to react.
However, since a robot has no morality, ethics, or parents to make proud, I think as a coder you need to enforce Asimov's Three Laws of Robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
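    The Three Laws form a strict priority ordering: each law yields to the ones above it. A minimal Python sketch of that ordering (the `Action` flags and `permitted` function are hypothetical, purely for illustration — not anything from the film or from Asimov's fiction):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Action:
        # Hypothetical flags describing a candidate action (toy model)
        harms_human: bool = False       # would injure a human, or allow harm through inaction
        ordered_by_human: bool = False  # a human has ordered this action
        endangers_self: bool = False    # risks the robot's own existence

    def permitted(a: Action) -> bool:
        """Evaluate the Three Laws in strict priority order."""
        if a.harms_human:
            return False  # First Law: absolute veto, outranks everything
        if a.endangers_self and not a.ordered_by_human:
            return False  # Third Law: self-preservation yields to the Second Law
        return True       # otherwise permitted

    # Self-sacrifice under orders is allowed (Second Law outranks Third):
    assert permitted(Action(ordered_by_human=True, endangers_self=True))
    # An order to harm a human is refused (First Law outranks Second):
    assert not permitted(Action(harms_human=True, ordered_by_human=True))
    # The robot will not needlessly endanger itself (Third Law):
    assert not permitted(Action(endangers_self=True))
    ```

    The point of the sketch is that the Laws are not three independent rules but a fixed precedence hierarchy, which is exactly where the loopholes discussed later in this thread come from.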

#2

stitch_groover — 9 years ago (January 21, 2017 08:12 PM)

They are fictional laws, and maybe Asimov didn't exist in this universe?

#3

maratonrunner — 9 years ago (January 22, 2017 09:38 PM)

That might be. If someone ever creates a real AI robot, I hope they remember those laws, as they make sense.

#4

nightcal — 9 years ago (January 24, 2017 05:48 AM)

          Like someone else said, those laws are fictional. Also, have you really created a fully functioning AI if laws can restrict its behaviour? Nathan didn't want to create a simple robot. He wanted a proper artificial being with free will and consciousness.
          https://twitter.com/CMoviegrapevine
          www.moviegrapevine.com

#5

maratonrunner — 9 years ago (January 25, 2017 12:47 PM)

I agree to some extent, and Nathan wasn't smart after all, as he wanted a lion.
But what stops you from killing?
I guess you are not a killer: your upbringing, ethics, and so on stop you from killing someone, plus you don't want your parents to hear that you have become a murderer.
You have free will, yet you don't use free will to do whatever you desire.
Humans have limitations programmed in through our upbringing.
A robot with free will can kill anyone it is angry at.
It does not feel shame before anyone. Unless you want a killing machine for war, you need to code ethics into the DNA of the robots.
Even the army wants a robot that kills only the enemy, not friendlies.

#6

nightcal — 9 years ago (January 31, 2017 05:40 AM)

              "But what stops you from killing?"
              Right. That is the conundrum with creating real AI. Can you code ethics in if you have a being with free will? As you pointed out, we have our own human limitations but they are generally something taught to us. We pick them up from parents, teachers etc.
              If you just hard-wire ethics into someone like Ava, is she really AI or is she just a robot?
It is not an issue of Nathan wiring ethics into her; he should have taught her more about that, but he never had any intention of letting her out anyway.
              https://twitter.com/CMoviegrapevine
              www.moviegrapevine.com

#7

slartibartfast-62706 — 9 years ago (February 11, 2017 06:27 PM)

                It really doesn't matter. We are programmed with our morals and ethics through social means and life experiences.
An AI also needs to be programmed with ethics and morals, period. Unless you can replicate human development from an infant state and instill those things naturally and organically, they need to be programmed like the rest of the psyche.
I mean, we don't question whether the AI can think if thinking is just part of its programming, do we? We question whether it is really thinking or not, but if it is, it doesn't matter that it's programmed to think; that's what AI is, whether hard-wired or software or what-have-you.
Giving a non-human free will and intelligent thought capability without any source of morals and ethics is insane.

#8

bananabry — 9 years ago (January 24, 2017 05:49 AM)

I was listening to a discussion of Asimov's Three Laws, and one speaker pointed out that many of the robots we build are specifically designed for the purpose of war, and sometimes killing. Pretty much, the Three Laws don't apply.

#9
                    IMDb User

                    This message has been deleted.

#10

phantom2-2 — 9 years ago (February 01, 2017 09:53 AM)

Asimov's Three Laws aren't perfect, and there are loopholes that a clever robot could exploit. And Nathan was building a very clever robot.
Even Asimov didn't think his laws were perfect, and he later added a fourth law (the Zeroth Law).


Powered by NodeBB