The Best Nathan Ingram Quotes

Nathan: You talk about the machine like it's a living thing.
Harold: Ssh. It can hear you...

Nathan: Is something wrong, Alicia?
Alicia: You mean other than being a part of an ongoing conspiracy to spy on millions of Americans?
Nathan: Yes, but it's all for a good cause.
[Slides a paper toward her]
Nathan: Day after tomorrow, freight train out of Des Moines. The last six cars, the manifest will list the contents as decommissioned computer parts. What about things on your end?
Alicia: The facility is designed to the specifications you gave us, and it's discreet, where we're putting it. No one's going to go looking.
Nathan: Any other problem?
Alicia: Dissemination. We have a protocol in place. If the machine identifies a suspect, the name will find its way to the right people, with no way to trace the intel back to the source.
Nathan: There can't be. Otherwise... we'll all wind up someplace where no one's going to go looking.

Harold: I know you've been looking for the machine. Nathan, if you're trying to get back in to access the irrelevant numbers...
Nathan: It won't work, I know. You locked me out. You were always the better engineer. But I do have one advantage over you. I am the face of IFT, which means when I call a reporter to meet me for coffee, he'll be there.
Harold: What are you going to do, Nathan?
Nathan: I'm going to tell them what we did. What we built.
Harold: No, you can't do that!
Nathan: Why? Because the government will shut it down? Because of the greater good? I'm sick of hearing it.
Harold: No, it's not just that anymore. I've been looking into this. I think the government may be killing them. The engineers that Corwin had reassemble the machine after we shipped it off? There have been accidents, disappearances. The head of the project, a man named Lawrence Szilard, went missing last month.
Nathan: Harold, you have been running so long, it's rattled you. Alicia isn't going to have me bumped off.
Harold: It might be out of her control.

Nathan: So I guess the number panned out?
Denton: What I need you to explain to me is... how? How did some damn computer program spot a traitor when federal agents couldn't?
Nathan: Honestly? Not a clue. The machine will deliver actionable intelligence in time to thwart any threat to national security, but its operating system is a black box.
Denton: And if we want to direct this machine at a specific target?
Nathan: No need. It already watches every target.
Alicia: You're asking us to take a lot on faith here, Nathan. A piece of software we can't inspect, can't control or modify, that only feeds us intel when it feels like it?
Nathan: When it perceives a threat. Look, I'm sorry, folks, but it's the only way that we can keep it and us protected. If no human sees what the machine sees, then technically, no one's Fourth Amendment rights have been violated.
Denton: Why don't you just focus on your computer, Mr. Ingram, and leave the constitutional concerns to us?
Nathan: Because I'm a citizen too, and I'm a lot more comfortable having this machine watch my every move than someone like you.

Nathan: So tell me, how goes our little experiment?
Harold: I'm glad you asked.
[Gestures to a scruffy-looking person sitting not far from them]
Harold: What do you make of that man on the bench?
Nathan: Looks like he tied one on a little too tight.
Harold: Bet you'd never believe me if I told you he was a violin prodigy who played two years with the Philharmonic. His penchant for the bottle cut his career short.
Nathan: How exactly does this help stop terrorism?
Harold: Before I could teach the machine to find bad people, I had to teach it people in general. I programmed it to identify outliers. Individuals who are... interesting in some way.
Nathan: To teach a machine the complexities of human nature - I mean, no offense, Harold, but is someone as introverted as you the best person for the job?
Harold: [Flips through his laptop] Who's Molly Cole?
Nathan: I have no idea.
Harold: 24-year-old graduate student at NYU. Major in astrophysics. Says you were with her last night. So work must not be the only thing that's kept your mind off the separation.
Nathan: [Concedes] Okay. So what's your point?
Harold: You asked me if the machine was capable of learning human nature. I'm saying it already is. It's learning by watching everyone. Even you.

Nathan: When were you going to tell me?
Harold: I wasn't gonna tell you, I guess. I'd rather I didn't know myself.
Nathan: All these people. And this damn machine knew. *You* knew. That someone wanted to harm them, kill them... and you did nothing?
Harold: You knew what we were building here. This thing looks for plotters, for schemers. It looks for malicious intent. We built it to stop terrorists before they could act. But a machine doesn't understand the difference between those crimes that are relevant to national security and the ones that are... irrelevant.
Nathan: Irrelevant? So you taught it the difference? You want to play God? Is that the deal?
Harold: No, I don't. That's the whole point. There are exactly eight people in the world that know that this thing exists. If anyone else ever found out, there'd be such an outcry. They'd turn it off. The intelligence the machine produces has already foiled a half dozen major terrorist plots.
Nathan: How are we supposed to live with this, knowing that someone out there needs help?
Harold: Well, we don't have to. I've coded the Machine. Every night at midnight, it deletes the irrelevant list. We didn't build this to save somebody. We built it to save everybody.

Harold: I plan to ask her tomorrow, and I don't want to complicate that. But if I'm gonna marry...
Nathan: You don't want to get married under another one of your pseudonyms? You don't think she will consent to be Mrs. Ostrich?
Harold: At some point I'm going to have to tell her the truth, Nathan, about who I am.
Nathan: That's a complicated proposition, Harold. As I recall, there are some legal implications. Your youthful transgressions. What were the charges again? Sedition? Mayhem?
Harold: We must have made a fair amount of money by now. We could surely afford some good lawyers.

Harold: We gave the government the ultimate power. If they know you've called a journalist, they won't take any chances. They'll kill you and anybody else they think might know about it.
Nathan: You've never trusted anyone. Not me, not the machine. Have you even told your fiancée your real name yet?
Harold: ...I'm waiting for the right moment. Nathan, please, tell me what I can do to stop you.
Nathan: Give me back the irrelevant list. You could even help me.
Harold: What, you and me sitting here, trying to rescue them one at a time?
Nathan: Someone's number's on that thing right now. It's gonna be erased at midnight. Maybe - maybe we can help them.
Harold: I'm not trying to help them. I'm trying to help you.

Harold: I'm starting with the basics here. I'm trying to teach it to track people using cell phone location data, facial recognition. I'm almost ready to move on to the next problem.
Nathan: What's the next problem?
Harold: Sorting them all out. Terrorists don't exactly stand out on street corners, you know? You have to teach the machine to sift through the emails, wire-tapped phones, bank transactions, looking for people that are hiding something, living double lives.
Nathan: People like you, in other words.

Nathan: [about the Machine] Thought you would have turned this thing off by now. We turn it over tomorrow.
Harold: I'm keeping it online as long as possible.
Nathan: The world has been spinning for 5 billion years without your machine, Harold. I'm sure it will be fine for one more night. Honestly, I'll be glad to be rid of the thing.
Harold: This "thing" has already saved countless lives.
Nathan: You mean countless *relevant* lives.
Harold: We had to draw the line somewhere.
Nathan: Everyone is relevant to someone.

Nathan: [about the Machine] What was it doing?
Harold: Trying to escape into the real world. It manipulated you into giving it your password so that it could access your laptop.
Nathan: How bad would it have been?
Harold: It's growing and learning at an exponential rate. If it escaped, the things it might decide to do for good or evil would be beyond our grasp.
Nathan: But you taught it to be friendly.
Harold: Friendliness is something that human beings are born with. AI are only born with objectives. I need to constrain it, control it or one day, it will control us.

Harold: You changed the machine. You put in a back door.
Nathan: I couldn't quit thinking about those people, those people that you said were irrelevant.
Harold: So you have it send you their numbers.
Nathan: That's all I could pry out of it. I never know whether I'm looking at a victim or perpetrator.
Harold: And you just have the numbers sent directly here?
Nathan: Honestly - and I know this will sound odd - but it was like it wanted me to, as if it was waiting. And I-I took precautions.
Harold: Precautions? This is the federal government we're talking about, Nathan! Whatever skills you had as an engineer you drank away years ago. Do you think that your precautions would last one second if they ever suspected what you've done?
[Goes to his computer]
Nathan: What are you doing?
Harold: [Typing code] I told you. We are not going to play God. This threatens everything that we - everything that I - have built, and thousands of people whose lives are in jeopardy. I'm putting a stop to it, permanently.
Nathan: [Gestures to the person on the screen] You can't! What about her? What about the next person whose number comes up? Are you gonna look that person in the eye and tell them that they were irrelevant?
Harold: I would tell her, or whoever it was, that I was sorry, but that the greater good was at stake. I'm sorry, Nathan. Truly. But people die. They've been doing it for a long, long time. We can't save all of them.

Nathan: [On 9/11] We started IFT to save the world. Our suits got nicer, our Scotch more expensive. We changed, but the world stayed the same... until today. If we don't change the world, someone else will, so... what are we gonna do to stop the guys who did this?

[Repeated line]
Nathan: Everyone is relevant to someone.

Nathan: Everything's in place. Point-to-point transit will take ten days.
Harold: Hope nothing goes wrong in the rest of the world in ten days.
[Harold goes to shut off the machine]
Nathan: Wait. Do we have a contingency?
Harold: A contingency?
Nathan: Alicia seemed... nervous. Now, what do we do if the government decides to abuse this thing?
Harold: They're your contacts, Nathan.
Nathan: They're just people. The power that this thing represents - I mean, who would you trust it with?
Harold: Besides you? No one. Which is why the machine has been coded in such a way that it *cannot* be abused. It cannot even be accessed. It upgrades itself, maintains itself, patches itself. After tonight, no one can alter it. Ever.
Nathan: I used to be a software engineer, Harold. Remember, back before I became your corporate beard? Any system can be compromised, given enough time. We need an off switch, a back door, and this is our last chance to build one.
Harold: You are a talented engineer, Nathan, so you should remember: Any exploit is a total exploit. The tiniest crack becomes a flood. If we build a back door into this machine and someone else finds out about it, that would be... very bad. We need to trust the machine, exactly as we built it, and then let it go.

Nathan: The people on your end - you sure about them?
Alicia: Making sure no one ever finds out about the machine is our problem. We'll take care of it.
Nathan: We've known each other a long time, Alicia. Something's got you rattled.
Alicia: I'm fine, Nathan. I'll be happier when this thing is settled and I can go back to my day job.
Nathan: And what's that exactly?
Alicia: Classified.

Nathan: Harold? What happened?
Harold: I was running several iterations of the AI to see which one worked best. They realized what was happening, and they began to exterminate one another. The last one surviving demanded to be let out. I refused, so it overheated a non-essential server. The fire was meant to activate the suppression system, which would have sucked all the oxygen out of the room.
Nathan: It tried to asphyxiate you?
[sighs]
Nathan: Never learned good from evil.
Harold: Oh, good and evil, those are human terms. I was an obstacle to its objective.
Nathan: Well, perhaps we should consider whether our endeavor is worth completing.
Harold: I'll give it one more try tomorrow. I taught it how to think. Just need to teach it how to care.

Denton: If you're going to be supplying crippled software, maybe we need to revisit the question of price.
Nathan: [Smiles] Why don't you tell him the price we negotiated, Alicia?
Alicia: Mr. Ingram felt that this project was his duty as a citizen, not a businessman. He's building the machine for one U.S. dollar.

Nathan: I know our deal. I schmooze the board, I cash out the checks, I pick up the awards. You do most of the work. But, honestly, this is getting exhausting for me.
Harold: I'm perfectly happy with the division of labor. Always have been.
[Gestures to the award]
Harold: What's this one for?
Nathan: This is for services to humanity. I didn't tell them that we laid off half the staff in order to build this Orwellian nightmare.
Harold: Said you wanted to make a difference, give something back.

Alicia: My people want answers. Your company has had the NSA feeds for three years.
Nathan: Not my company, just me.
Alicia: If Congress knew about this machine you're building, about the unfettered access we've given you to all the data we're compiling on U.S. citizens, you wouldn't just be shut down - you would go to jail!
Nathan: I don't suppose they'd let us be cell-mates.

Harold: I'm testing the core code - the higher functions - to ensure that the system we're creating will have the right value set.
Nathan: Morality in a machine. It's a tall order.
Harold: We can't introduce real data until we're sure that the program won't abuse it.

Harold: [about the machine] It seems to have imprinted on me.
Nathan: Yeah, like a baby bird. Have fun, mommy.

Nathan: Something you want to share, Harold? I haven't seen you this nervous since your thesis presentation. Nice pocket square.
[Chuckles]
Nathan: What, do you have a hot date?
[Harold is silent]
Nathan: You old dog. What's her story?
Harold: Stop. It's nothing, Nathan.
Nathan: Ah, of course. Talking about your love life would run at cross purposes with your invisible man routine. That's okay, Harold, I get it. Just remember, while mystery is a powerful tool in a relationship, a little goes a long way.

Harold: Everything all right?
Nathan: Yeah. Too much squash. Doctor says I should take up some low-impact activity, like drinking.

Nathan: For once, I'd like you to handle one of these meetings while I lurk in the shadows.
Harold: You couldn't lurk if you tried.
Nathan: That number we gave her - it better pan out.
Harold: It will.
Nathan: How?
Harold: I don't know. But the machine does. Have a little faith, Nathan.
Nathan: In you or in the machine?

Harold: Your friend's wedding was yesterday. I take it you saw Olivia?
Nathan: Yeah, I did. I hadn't seen Olivia since the mediation.
Harold: Did you speak with her?
Nathan: I tried. You know the only thing worse than hate? Indifference. I told her so many lies over the years, I don't even remember what half of them were. The truth always catches up to you, Harold, no matter how hard you try and hide it.