A system of morality may judge an action by whether its outcome is favorable or unfavorable.
Suppose you were placed at the control panel of a nuclear bomb about to explode and kill millions, and the only way to stop it was to flick a switch off. Does that make flicking the switch a moral action, because the outcome was favorable in that instance?
Now suppose that although you believed flicking the switch off would stop the bomb, it actually caused the explosion. Is your action still moral?
