Here's the exchange from which I took the quote. They are discussing an apparent computer failure at Central Air Command that may have inadvertently ordered a nuclear strike. Mr. Knapp is the electronics contractor, Mr. Swenson is the Secretary of Defense, Prof. Groeteschell is a hawkish government advisor, and Colonel Cascio is an Air Force commander who suspects sabotage.
KNAPP: The more complex an electronic system gets, the more accident-prone it is. Sooner or later it breaks down.
SWENSON: What breaks down?
KNAPP: A transistor blows... a condenser burns out... sometimes they just get tired, like people.
GROETESCHELL: Mr. Knapp overlooks one factor. The machines are supervised by humans. Even if the machine fails, the human being can always correct the mistake.
KNAPP: I wish you were right. But the fact is, the machines work so fast, they are so intricate, the mistakes they make are so subtle, that very often a human being can't know whether the machine is lying or telling the truth.
CASCIO: Maybe this time there wasn't any failure. Maybe the Russians have come up with a way to mask the real position of Group Six.
Personally, I like the quote because it brings to mind principles of systems design and risk management still relevant in modern computer programming: complexity breeds failure, and automated systems can fail in ways too fast and too subtle for a human supervisor to catch.