<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[one thing he forgot while coding spoiler]]></title><description><![CDATA[<p dir="auto"><em>Archived from the IMDb Discussion Forums — Ex Machina</em></p>
<hr />
<p dir="auto"><strong>maratonrunner</strong> — <em>9 years ago(January 20, 2017 03:39 PM)</em></p>
<p dir="auto">a robot may learn "free will" and AI will learn a lot how to react.<br />
however as a robot has nearly no morality or ethics or parents to make proud I think as a coder you need to enforce Asimovs 3 laws on robotics :<br />
1.A robot may not injure a human being or, through inaction, allow a human being to come to harm.<br />
2.A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.<br />
3.A robot must protect its own existence as long as such protection does not<br />
conflict with the First or Second Laws</p>
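<p dir="auto">The three laws above form a strict priority ordering, which is roughly how you would have to code them: a lower-numbered law always overrides a higher-numbered one. Here is a toy sketch of that ordering; the class, field, and function names are all made up for illustration, not from the film or from real robotics:</p>

```python
# A minimal sketch (hypothetical model) of Asimov's Three Laws as a
# strict priority ordering over candidate actions.

from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False       # would injure a human, or allow harm through inaction
    ordered_by_human: bool = False  # a human ordered this action
    endangers_self: bool = False    # would put the robot itself at risk

def permitted(action: Action) -> bool:
    # First Law: never harm a human (highest priority, no exceptions).
    if action.harms_human:
        return False
    # Second Law: obey human orders (any order reaching this point
    # already does not conflict with the First Law).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, lowest priority.
    return not action.endangers_self

# The ordering matters: an order to harm a human is refused,
# while an order that merely endangers the robot is obeyed.
print(permitted(Action(harms_human=True, ordered_by_human=True)))     # False
print(permitted(Action(endangers_self=True, ordered_by_human=True)))  # True
```

<p dir="auto">The point of the sketch is that the laws are not three independent rules but a precedence chain, which is exactly where the loopholes discussed below come from.</p>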
]]></description><link>https://filmglance.com/discuss/topic/241482/one-thing-he-forgot-while-coding-spoiler</link><generator>RSS for Node</generator><lastBuildDate>Wed, 13 May 2026 11:50:04 GMT</lastBuildDate><atom:link href="https://filmglance.com/discuss/topic/241482.rss" rel="self" type="application/rss+xml"/><pubDate>Mon, 04 May 2026 17:28:30 GMT</pubDate><ttl>60</ttl><item><title><![CDATA[Reply to one thing he forgot while coding spoiler on Mon, 04 May 2026 17:28:42 GMT]]></title><description><![CDATA[<p dir="auto"><strong>phantom2-2</strong> — <em>9 years ago(February 01, 2017 09:53 AM)</em></p>
<p dir="auto">asimov's three laws aren't perfect, and there are loopholes that a clever robot could exploit. and nathan was building a very clever robot.<br />
even asimov didn't think his laws were perfect and added a fourth law.</p>
]]></description><link>https://filmglance.com/discuss/post/2023243</link><guid isPermaLink="true">https://filmglance.com/discuss/post/2023243</guid><dc:creator><![CDATA[fgadmin]]></dc:creator><pubDate>Mon, 04 May 2026 17:28:42 GMT</pubDate></item><item><title><![CDATA[Reply to one thing he forgot while coding spoiler on Mon, 04 May 2026 17:28:40 GMT]]></title><description><![CDATA[<p dir="auto"><strong>IMDb User</strong></p>
<p dir="auto">This message has been deleted.</p>
]]></description><link>https://filmglance.com/discuss/post/2023242</link><guid isPermaLink="true">https://filmglance.com/discuss/post/2023242</guid><dc:creator><![CDATA[fgadmin]]></dc:creator><pubDate>Mon, 04 May 2026 17:28:40 GMT</pubDate></item><item><title><![CDATA[Reply to one thing he forgot while coding spoiler on Mon, 04 May 2026 17:28:39 GMT]]></title><description><![CDATA[<p dir="auto"><strong>bananabry</strong> — <em>9 years ago(January 24, 2017 05:49 AM)</em></p>
<p dir="auto">I was listening to a discussion of Asimov's Three Laws, and one speaker pointed out that many of the robots we build are specifically designed for the purpose of war, and sometimes killing. For those, the Three Laws simply don't apply.</p>
]]></description><link>https://filmglance.com/discuss/post/2023241</link><guid isPermaLink="true">https://filmglance.com/discuss/post/2023241</guid><dc:creator><![CDATA[fgadmin]]></dc:creator><pubDate>Mon, 04 May 2026 17:28:39 GMT</pubDate></item><item><title><![CDATA[Reply to one thing he forgot while coding spoiler on Mon, 04 May 2026 17:28:38 GMT]]></title><description><![CDATA[<p dir="auto"><strong>slartibartfast-62706</strong> — <em>9 years ago(February 11, 2017 06:27 PM)</em></p>
<p dir="auto">It really doesn't matter. We are programmed with our morals and ethics through social means and life experiences.<br />
An AI also needs to be programmed with ethics and morals, period. Unless you can replicate human development from an infant state and let those things form naturally and organically, they need to be programmed like the rest of the psyche. We don't question whether an AI can think just because thinking is part of its programming; we question whether it is really thinking at all. But if it is, it doesn't matter that it was programmed to think; that's what AI is, whether hard-wired, software, or what-have-you. Giving a non-human free will and intelligent thought without any source of morals and ethics is insane.</p>
]]></description><link>https://filmglance.com/discuss/post/2023240</link><guid isPermaLink="true">https://filmglance.com/discuss/post/2023240</guid><dc:creator><![CDATA[fgadmin]]></dc:creator><pubDate>Mon, 04 May 2026 17:28:38 GMT</pubDate></item><item><title><![CDATA[Reply to one thing he forgot while coding spoiler on Mon, 04 May 2026 17:28:36 GMT]]></title><description><![CDATA[<p dir="auto"><strong>nightcal</strong> — <em>9 years ago(January 31, 2017 05:40 AM)</em></p>
<p dir="auto">"But what stops you from killing?"<br />
Right. That is the conundrum of creating real AI: can you code ethics into a being with free will? As you pointed out, we have our own human limitations, but they are generally taught to us. We pick them up from parents, teachers, etc.<br />
If you just hard-wire ethics into someone like Ava, is she really an AI, or just a robot?<br />
It isn't a matter of Nathan wiring ethics into her; he should have taught her more about them, but he never had any intention of letting her out anyway.<br />
<a href="https://twitter.com/CMoviegrapevine" rel="nofollow ugc">https://twitter.com/CMoviegrapevine</a><br />
<a href="http://www.moviegrapevine.com" rel="nofollow ugc">www.moviegrapevine.com</a></p>
]]></description><link>https://filmglance.com/discuss/post/2023239</link><guid isPermaLink="true">https://filmglance.com/discuss/post/2023239</guid><dc:creator><![CDATA[fgadmin]]></dc:creator><pubDate>Mon, 04 May 2026 17:28:36 GMT</pubDate></item><item><title><![CDATA[Reply to one thing he forgot while coding spoiler on Mon, 04 May 2026 17:28:35 GMT]]></title><description><![CDATA[<p dir="auto"><strong>maratonrunner</strong> — <em>9 years ago(January 25, 2017 12:47 PM)</em></p>
<p dir="auto">I agree to some extent, and Nathan wasn't so smart after all, as he wanted a lion.<br />
But what stops you from killing?<br />
I guess you are not a killer: your upbringing, your ethics, and so on stop you from killing someone, and you don't want your parents to hear that you have become a murderer.<br />
You have free will, yet you don't use it to do whatever you desire.<br />
Humans have limitations programmed in through our upbringing.<br />
A robot with free will can kill anyone it is angry at.<br />
It feels no shame before anyone. Unless you want a killing machine for war, you need to code ethics into the robot's DNA.<br />
Even the army wants a robot that only kills the enemy, with no friendly kills.</p>
]]></description><link>https://filmglance.com/discuss/post/2023238</link><guid isPermaLink="true">https://filmglance.com/discuss/post/2023238</guid><dc:creator><![CDATA[fgadmin]]></dc:creator><pubDate>Mon, 04 May 2026 17:28:35 GMT</pubDate></item><item><title><![CDATA[Reply to one thing he forgot while coding spoiler on Mon, 04 May 2026 17:28:34 GMT]]></title><description><![CDATA[<p dir="auto"><strong>nightcal</strong> — <em>9 years ago(January 24, 2017 05:48 AM)</em></p>
<p dir="auto">Like someone else said, those laws are fictional. Also, have you really created a fully functioning AI if laws can restrict its behaviour? Nathan didn't want to create a simple robot; he wanted a proper artificial being with free will and consciousness.</p>
]]></description><link>https://filmglance.com/discuss/post/2023237</link><guid isPermaLink="true">https://filmglance.com/discuss/post/2023237</guid><dc:creator><![CDATA[fgadmin]]></dc:creator><pubDate>Mon, 04 May 2026 17:28:34 GMT</pubDate></item><item><title><![CDATA[Reply to one thing he forgot while coding spoiler on Mon, 04 May 2026 17:28:32 GMT]]></title><description><![CDATA[<p dir="auto"><strong>maratonrunner</strong> — <em>9 years ago(January 22, 2017 09:38 PM)</em></p>
<p dir="auto">That might be. If someone ever creates a real AI robot, I hope they remember those laws, as they make sense.</p>
]]></description><link>https://filmglance.com/discuss/post/2023236</link><guid isPermaLink="true">https://filmglance.com/discuss/post/2023236</guid><dc:creator><![CDATA[fgadmin]]></dc:creator><pubDate>Mon, 04 May 2026 17:28:32 GMT</pubDate></item><item><title><![CDATA[Reply to one thing he forgot while coding spoiler on Mon, 04 May 2026 17:28:31 GMT]]></title><description><![CDATA[<p dir="auto"><strong>stitch_groover</strong> — <em>9 years ago(January 21, 2017 08:12 PM)</em></p>
<p dir="auto">They are fictional laws and maybe Asimov didn't exist in this universe?</p>
]]></description><link>https://filmglance.com/discuss/post/2023235</link><guid isPermaLink="true">https://filmglance.com/discuss/post/2023235</guid><dc:creator><![CDATA[fgadmin]]></dc:creator><pubDate>Mon, 04 May 2026 17:28:31 GMT</pubDate></item></channel></rss>