Why do I get the feeling that mankind is slowly but surely engineering its own downfall? I can see the practical use of some robots, but others seem like giant experiments designed just to creep people out. And why do they creep us out? Look at the movies and then answer that question. For one, it’s the fear of the unknown that humanity still harbors, even though that same fear pushes us forward and inspires us to do bigger and better things. The ‘better’ part is very subjective at this point, since I don’t see how a robot that walks like an animal and seems to serve no purpose beyond being intelligent enough to ride the elevator counts as a huge leap forward. But then I’m just a writer, not a scientist who thinks robotics is the way of the future.
They may very well be, but is it a future we really want? Machines already run our lives in so many different ways that the future we see in film seems to be approaching, or could already be here. Despite the dire warnings our entertainment keeps offering, we still seem content to engineer ways for the creations born in labs and on workbenches to get better, smarter, quicker, and in some ways a little more dominant over our waking lives. Don’t get me wrong: I enjoy my smartphone, my tablet, the computer I’m working on at this moment, and the other gadgets that make life a bit easier. But I also enjoy the fact that they’re easy to turn on and off.
What happens when an AI decides to override its ON/OFF switch? There’s something creepy about the way the thing in the clip moves, even about how it raises one limb to push the button for the elevator. It seems irrational to fear something with a well-defined program and a purpose installed by its creator, but once such things are up and running on their own, with no need for humans to maintain them, what’s to stop them from doing their own thing? Historically, when movies give machines free will and a purpose of their own, things don’t go well. Some films are a little more positive than others, but a lot of them are just flat-out scary.
It’s easy to shrug this off and say that humans are in control, because at this moment they are. No AI has taken over and started running its own programs yet, so far as anyone knows, and the systems guarding the launch codes and other information vital to humanity are still in human hands. The simple fact is that humans seem driven to create no matter where it takes them. The moral and ethical side of it, however, blurs quite often and takes a back seat to the idea that it CAN be done.
But should it?