Technology in movies.
-
mandrake62 — 17 years ago(June 11, 2008 08:25 AM)
In the late 80s I was sourcing CD-ROM players for a proposal. At that time it was an emerging technology, and it was difficult to find player units for under $1,000, even in quantity. CD-ROM was a highly competitive technology, as depicted in the film.
Even though some of the technology depicted was over the top, such as the virtual reality corridor, the film did capture the heady atmosphere of that era, when everything seemed possible. A VR corridor with that resolution and interactivity is orders of magnitude beyond the technology of a CD-ROM unit, so showing them as contemporary top-of-the-line technologies is a stretch. But it wasn't just gratuitous, since it became an integral part of the plot and made for a good scene. -
Bush_Pilot — 13 years ago(August 05, 2012 12:29 AM)
Hey, my oldest brother was a project engineer for Digital Equipment Corp in the 80s and 90s, and his bosses bet everything on mainframes and only a little on their own proprietary PC, precluding any possibility of being compatible with the personal computer revolution until it was too late. One of the largest computer companies is now just a memory: a prime example of how fast technology evolves. I do remember playing the first-ever Dungeons and Dragons game on a DEC computer, the text-only one written at MIT in the late seventies or early eighties, I think. I still have the program.
http://sloanreview.mit.edu/improvisations/2011/02/17/lessons-from-ken-olsen-and-digital-equipment-corp/#.UB4f9aPo7Is -
trvolk — 19 years ago(November 20, 2006 11:40 PM)
A 'guest' logs into a 'demo' of a 'prototype' 'user interface' and has full and immediate access to sensitive company records. No, not even in 1994 would such a glaring security hole have been allowable, not even by Meredith in her cost-cutting ways. Perhaps in the early 80s, when virtually every computer geek knew the backdoor to DEC mainframes was user: field pass: service, but not by the mid-90s.
Crichton constantly amazes me like this: he gets the minor details right, those little bits that flesh out the story but have nothing to do with the plot, yet he misses the big picture and makes huge, glaring mistakes.
I can think of numerous ways for the Manager of Manufacturing to hack into a system, the most obvious of which would be guessing Meredith's password. A simple explanation that audiences in 1994 would have swallowed without question.
(True, having 2 Merediths in the system would be problematic, but just think along the lines of a catfight with the Angel as referee.) -
nyc1298 — 19 years ago(January 09, 2007 05:28 AM)
I'm surrounded by technology every day, at work and at home. I'm a network engineer, in fact, so I can't escape it. But for some reason, absurd technology in movies doesn't bother me at all. Ten years ago, when I was younger, it did bother me, but now that I realize that everything in a movie is a fabrication by someone who isn't super technical, who cares? I guess I look at it as what a layman would envision as the next big thing or some cool technology.
Joe / NYC -
soruht — 17 years ago(May 16, 2008 11:40 AM)
"Dad you got this E-MAIL! You really should stop what you're doing and come read this E-MAIL on the COMP-YOOO-TER! It's an E-MAIL because it's ELECTRONIC MAIL! Read the E-MAIL dad! It's this new thing called E-MAIL! It's really cool, dad, this E-MAIL! E-MAIIIIL!"
-
Peter-5000 — 17 years ago(August 23, 2008 08:40 AM)
These terms were used in quite a few 'quasi' programming languages. I remember that in the 80s I worked with a realtime language (RTL/2) which also used them.
Besides that, I remember at the time I thought it was quite a clever movie, in particular the non-technical part. Don't get stuck in the technical part; solve the problem.
-
joey_pesci — 14 years ago(August 14, 2011 04:06 PM)
Yep, being in IT, I like seeing these sorts of things, in a geeky way.

Funny thing is, sometimes it's in a bid to advertise the latest products of the time. I think the virtual reality part was an attempt to market the big VR machines that used to be in arcades, like the Virtuality machines.
http://www.youtube.com/watch?v=SP8wSw4bBuA
http://www.vrtifacts.com/hmds/all-brawn-virtuality-1000cs-hmd/
And the e-mail thing, yeah, was stupid.
Also, even today in modern films, you'll notice they almost always use the keyboard and not the mouse, even though we know most of what they are doing would require a mouse. It's because the mouse isn't cinematic enough: moving it is just too silent and lacks movement. Hence we see them all typing away.
The computer tech in CSI really annoys me though, as it's bollocks. -
FourKay — 14 years ago(September 26, 2011 05:04 AM)
I actually like how they showed technology in the old movies.
Remember in Hackers how the guy is visually browsing the network? LOL!
As with most things, if they showed how it's really done, it would look very boring. -
avortac — 13 years ago(October 23, 2012 09:50 AM)
"As with most things, if they showed how it's really done, it would look very boring. "
You don't think movies show most things how they are really done? I mean, come on - there are many movie-specific clichés and shortcuts that don't exist in real life, but if "most things" were not shown the way they are really done, movies would be almost complete fantasy (and I would actually like that). But movies range from complete fantasy to extremely depressing 'realism', so I don't think your claim holds water.
As for it looking really boring - I don't know about that. Look at WarGames - it shows things pretty close to how they would have been in real life back then (except for the A.I. stuff and the whole NORAD scene) - and it wasn't boring at all; on the contrary, it was very interesting.
If you do it interestingly, you can make any computer usage look entertaining and exciting. If you can't make using a computer look interesting, it's not because you are doing it too realistically - it's because you don't know how to show things in an interesting way.
You can show a really exciting thing in a very boring way in a movie, or a relatively boring thing in an interesting way. Too many people seem to think that computer realism somehow automatically equals boredom, and use this lazy rationalization as an excuse to accept the stupidity and illogical nature of movie technology.
It would not look boring to me to watch someone do it 'how it's really done'. Why would it? To me those Russian hacker videos, where they show how to hack a server and get the root password, are very interesting, although they are indeed showing 'how it's really done'.
With slight modifications, it would be just as interesting to the general audience, if done right.
The problem is that the mainstream is so stupid and technologically dumb that they wouldn't understand what's going on unless someone was constantly explaining what they are doing. So computers in movies are doomed to be dumbed-down, visually exploded fantasy machines that flash a cat-sized red "ACCESS DENIED" in the middle of a black screen whenever a hacker (or whoever) can't log in, and a dog-sized green "ACCESS GRANTED" or something similar when he succeeds.
The problem is thus not that it would look very boring; the problem is that the masses are stupid. This problem has deep and wide-reaching consequences that are not limited to movies and computers, though... but what can you do? -
englisher101 — 13 years ago(July 30, 2012 09:56 AM)
I think what makes a lot of movies based on computer technology so ridiculous is that they tried to choose a middle ground between total fantasy and the real world without doing proper research.
This movie didn't bother me nearly as much as movies like The Net, because it was more on the fantasy side of the spectrum. The operating systems and software presented weren't real. When they used real terminology to describe hardware, it was actually okay; there were a few very minor goofs, like using 'fields per second' to describe the video streaming rate of a CD-ROM drive instead of frames per second (fields per second would typically apply to display devices, not data streaming rates). It was only ridiculous because it was made impractical in order to be more visually exciting.
Meanwhile, movies like The Net show a systems analyst being targeted by a hacker/phreaker, using America Online to check her e-mail and perform traceroutes. She ends up defeating this software/hardware/phone-system genius by putting a virus on a floppy disk. It's so ridiculous because it's built on the most absurd scenarios involving real technology and real software, which would not and could not ever be used that way. Anyone who knows anything about operating systems, networks, software, and/or programming would want to pull their hair out over the extremely awkward technical inaccuracies.
Movies like The Net are a lot worse, IMO, for that reason. They're so horrible because they're based on real technology but exhibit a complete lack of understanding of how it works. Meanwhile, movies like Disclosure are only awkward because of the impracticality of the fantasy technology being presented.
To some degree, this impracticality will most likely always be present in films. It's like how futuristic depictions of computers often involve voice recognition: voice recognition itself is not very practical for executing commands, if only because dictating commands out loud is slow and awkward compared with typing, and a fluent typist can easily sustain 80+ words per minute. It's just not very exciting to watch people typing as opposed to speaking. -
junktom — 13 years ago(December 14, 2012 12:36 AM)
I don't mind technology being wrongly presented in a movie, such as goofs in spoken dialogue or in technological terms. What bugs me is when technology is unnecessary but gets tacked on just to make the film fancy, such as the virtual reality system. I remember back then VR was a big thing, but I don't see any sense in making a virtual file library where users still have to walk around, open cabinets, and browse through page after page of documents.
Classics are names that everyone has heard, yet most have never seen!! -
alcockell — 12 years ago(September 12, 2013 03:15 AM)
At the time the book was written, virtual reality was meant to be the Next Big Advance in human-computer interaction. Also at the time, trust-based access was much less granular, so it was perfectly feasible for an NFS mount to grant much greater access. There was less perceived need for security between different hosts; it was assumed that if you were logged in on a trusted box, you were OK. Remote root rights were granted at the time.
It took several hacks before that was tightened up.
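For anyone who never saw that trust model up close, here's a rough sketch of the kind of NFS export that made it possible. This uses the later Linux /etc/exports syntax for familiarity, and the hostname is made up:

```
# /etc/exports on the server: hand /home to a "trusted" workstation
# read-write, and keep remote root as root (no root squashing).
# no_root_squash is exactly the blanket host-level trust described above:
# root on the client is root over the server's exported files.
/home    trusted-ws(rw,no_root_squash)
```

Once a box was on the trusted list, anyone with root there effectively had root over the server's files too, which is why it took those hacks before defaults like root squashing became standard.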
1990-1991 was the main period when Sun was pushing the Network Computer, which we now effectively have with tablets. -
Darkfalz1979 — 12 years ago(July 06, 2013 11:48 AM)
Technology is incremental. You could have conceived of a smartphone 10 years ago with more processing power than the fastest home computers of the time, but you couldn't have built it. It takes years of R&D, funded by revenue from product cycle after product cycle, each cycle building on the advances of the last.
The VR file system thing was admittedly stupid, but CD-ROM drives sold for hundreds of dollars then and were the current big thing in multimedia. -
bigal_a — 10 years ago(August 11, 2015 10:05 PM)
In the first edition of William Gibson's cyberpunk novel, "Neuromancer", the protagonist is trying to fence "8 megabytes of hot RAM" to pay off a large debt.
You get more RAM in a packet of breakfast cereal these days, particularly if you think that in those days, RAM was 8-bit.
I recently got the Kindle edition - the 8 MB is gone.
The restitution of life is no great feat. A variety of deaths may well enter into your punishment.