Eyenixon said:
No, no. There is NO excuse.
It made no sense. There wasn't even deliberation; it wasn't some long, elaborate spiel as with the Master. It was practically ten fucking lines of dialogue, so incredibly vague that it made absolutely no sense.
Okay, this is rose-tinted glasses. I have been replaying the last two games over the past couple of days. I just finished Fallout again, and I specifically loaded up a save right before Eden so I could contrast his dialogue with the Master's. You get three lines about the Master's plan; then you say there's a problem with his plan; then you tell him mutants are sterile. He says the equivalent of "bullshit"; you counter; he says the equivalent of "where's your evidence"; you supply it; he says "you're lying"; you say he's denying the truth; and you finally confirm that his race will die in a generation. It wasn't as long or elaborate as you remember. In fact, it's probably ten lines of dialogue.
The conversations are of similar length. The Master is really not some font of exposition; a lot of his dialogue is repetition in his various voices. Yes, his dialogue is well written, and it goes a long way toward establishing his character (given that it's pretty much all you experience of his character), but it is not reams and reams of text. Eden's dialogue is undeniably similar in length, but it's more direct: he states the facts as he sees them, short and to the point, and far more grounded than the Master.
And regardless of people's personal opinions on the 'logic trap' in Eden's shutdown, the argument that it's vague just doesn't hold up. He argues that the decisions and plans he has made are right because he is infallible, 'unlike humans'. And how does he know he is infallible? Because he has been programmed to be infallible. The only warrant for his infallibility is a claim produced by the very system whose reliability is in question. Forcing him to acknowledge the circularity of this reasoning makes him realize that yes, his logic is in error.
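If it helps to see the shape of the trap, here's a toy sketch of that circular justification. The claim strings and the justified() helper are my own paraphrase for illustration, not anything from the game's script:

```python
def justified(claim, grounds, seen=None):
    """A claim is justified only if its chain of grounds eventually
    bottoms out in something independent of the chain itself."""
    seen = set() if seen is None else seen
    if claim in seen:
        return False   # we've looped back around: circular reasoning
    seen.add(claim)
    basis = grounds.get(claim)
    if basis is None:
        return True    # an independent ground; accept it
    return justified(basis, grounds, seen)

# Eden's position, paraphrased: his plans are right because he is
# infallible, and he knows he is infallible because his programming
# says so: programming he trusts because he is infallible.
edens_logic = {
    "Eden's plan is right": "Eden is infallible",
    "Eden is infallible": "Eden's programming says he is infallible",
    "Eden's programming says he is infallible": "Eden is infallible",
}

print(justified("Eden's plan is right", edens_logic))  # False
```

The chain never reaches independent ground; it just cycles back to the claim it was supposed to support, which is exactly what you force Eden to confront.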
k9wazere said:
I have two problems with Eden.
1. AIs that become "self-aware" are silly. You either design a machine with sentience or it never has any, I'm afraid.
2. No human would ever follow the orders of a machine. The chain of command only applies to humans following human orders.
No military would ever, ever put a machine in charge. No officer would ever agree to be the subordinate of an AI. Especially if it's running Windows 2077.
As a side note, what's with people bitching about how you need to build a sentient computer to have a sentient computer? I mean, seriously? A supercomputer becoming self-aware is not exactly a groundbreaking idea in science fiction. The Terminator franchise is 24 years old at this point, and that's only the most mainstream example. AM, the supercomputer in I Have No Mouth And I Must Scream, developed sentience independently, and that story was published 41 years ago. As for the argument above that people wouldn't follow a computer... why not? Both of the stories I just mentioned involve humans putting an AI in charge of militaries and nuclear weapons. Are those stories absurd now? You cannot rationally argue that this is an absurd occurrence in a sci-fi setting.