If you're familiar with Star Trek, you know that a young Star Fleet cadet named James T. Kirk had an innovative approach to a training exercise that no one had ever beaten. (I'm going to go out on a limb and assume that most Dilbert Blog readers are familiar with Star Trek.)

That Star Fleet training exercise essentially asked young Kirk, "What would you do if this happened to you?" In my post from earlier this week, I asked readers if it was moral to kill a guy who was 99% likely to kill you in a year. The most common response was something along the lines of "You can't calculate the odds of that sort of thing."

This is a fascinating response, and it's the sort of response I often get when asking a hypothetical question on any topic. It leaves me wondering if the person is unclear on the concept of hypothetical questions, or if he's pulling a James T. Kirk maneuver to avoid exposing some flaw in his reasoning.

Do any of you James T. Kirks want to try answering the hypothetical question again, this time without cheating?

If it makes it easier, I will stipulate that in the real world, people are notoriously bad at predicting the future. You could never have 99% certainty that some guy was going to kill you within a year. But in a hypothetical world where you COULD know that the odds were 99%, is it moral to kill that guy in order to probably save yourself?

 

Comments

Jun 5, 2009
I hope no one said this before: I believe it is self-defense if I crush his head with a shovel.
If there is a 99% chance that he will kill me within a year, that is the same probability as if he were attacking me right now with a dagger in his hand. So I am entitled to react to this attack; the bastard is probably planning the details of his wrongdoing by now.
 
 
Jun 5, 2009
It isn't a question of morality; it's a question of practicality. Given that our only purpose is to fulfill our role in natural selection, whether you should kill him or not depends on his likelihood of further reproduction versus yours.

 
 
Jun 5, 2009
You can't have your cake and sit on the fence too.
Here are some unstructured random thoughts:

- Using the morality of this universe to judge an action in some other hypothetical universe doesn't work. In that hypothetical world I'd wait until the probability was 100% and then kill the other guy, but while I was waiting I'd probably work out some way to reduce the probability.

- Our morality is informed by the consequences of our actions. In a hypothetical universe where the probabilities are better defined, morality would be different, so again the question doesn't work.

- Following on from one of the earlier comments: in this hypothetical universe, any time the probability of someone killing me goes above zero, that raises my probability of killing that person first to at least the same level, which triggers them to increase their probability of killing me, and we keep cycling until one of the probabilities hits 100%. (So every time I cut someone off in traffic, it ends in a gunfight; a toy version of this feedback loop is sketched in the code after this comment.)

I think the question becomes easier if you phrase it along the lines of "if you know for sure (100%) that someone is actively planning to kill you, do you kill them first?"
In this case my answer is yes.

But I wouldn't act until I knew for sure that the assassination attempt / nuclear strike etc. was underway.

On the other hand, if someone who is likely to kill me goes out and buys a sniper rifle, I'd probably try to take away the gun, but not kill him until I knew for sure that he planned to use it (and I'd probably try to stop him getting his hands on the sniper rifle in the first place).
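A minimal sketch of that escalation loop in Python, with made-up numbers (the starting threat and the 0.05 escalation margin are invented assumptions for illustration only, not anything from the comment):

```python
# Toy model of the escalation cycle described in the comment above:
# each side raises its probability of striking first to match the
# other's threat plus a small margin. All numbers are invented.

p_you, p_me = 0.01, 0.0   # tiny initial threat: you cut me off in traffic

rounds = 0
while max(p_you, p_me) < 1.0:
    p_me = min(1.0, p_you + 0.05)   # I match your threat, plus a margin
    p_you = min(1.0, p_me + 0.05)   # you respond in kind
    rounds += 1

print(f"Mutual certainty reached after {rounds} rounds.")
# Any nonzero starting threat ratchets both sides up to 100%:
# every traffic slight ends in a gunfight.
```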
 
 
Jun 5, 2009
hmm...

If there is a 99% chance that the person will kill me, then there is a 100% chance that I will (attempt to) kill them first. Which means (assuming all stats are known to all parties) that, as a result of knowing I'm 100% going to kill him, he will now make a 100% attempt at killing me (bear with me here). So no: by trying to kill him, I increase the chance of being killed. In actual fact, I will just reprogram the stats to give me a 99% chance of him saving my life.

Beam me up, Scotty.

God doesn't need to save England, because I reprogrammed the moist robot to do it for him!
 
 
Jun 5, 2009
Yes. I would kill a man who had a definite 99% probability of killing me within a year.
 
 
Jun 5, 2009
It may not be moral to kill him, but it's certainly moral to try - assuming your chances of success are 99% or less. (Can you calculate THOSE odds in this hypothetical world?)
 
 
Jun 5, 2009
According to our human standards it is immoral to kill. End of story.
The problem is with defining morality. For example, lions don't seem to think it's immoral when the new leader of the pride kills the offspring of the old leader.
It's called survival of the fittest.
Now humans have figured out a way to escape the natural law of survival of the fittest. It's called morality and law. The weak can hide behind them.
Since defining some actions as moral or immoral seems to be based on cheating the laws of nature, I would say your point is... pointless.
 
 
Jun 5, 2009
I don't agree that morality is binary. It seems safe to say that most people would feel that killing someone under your hypothetical circumstances would be somewhat more moral than killing someone who was not likely to kill you. I doubt it's possible or even meaningful to assign numbers to the degree of morality, and I'd guess that people would differ greatly on the details, but most people seem to have a more-or-less shared sense of when something is more moral than something else.

I think this is the trap in your question. It might be fairer and more meaningful to ask people to compare two situations and decide which they thought would be the more moral action and why. Asking them to label the killing as either moral or immoral (by which I assume you mean morally justifiable or not) is a lot harder. This is also partly because our sense of morality is not based on logic. We tend to use logic to justify our moral conclusions, but apparently not to reach them in the first place. A lot of people have said that the motives of the would-be killer are important, and I think this is a very important point: our judgment on morals requires information like that. We care about motivations and fairness and whether people are being treated as objects. These and a whole bunch of other considerations are an important part of how we come to moral conclusions.

If you make the question *too* hypothetical, we just don't have enough information to make a moral decision, and asking people to pick whether an action is either moral or not ignores the way they perceive morality as being distributed across a spectrum, which is much more useful information.
 
 
Jun 5, 2009
It depends...

If there is another way to avoid getting killed, or at least to drastically reduce the chances (e.g. running away and hiding for a year), then it is not moral to kill the guy.

If the ONLY way to avoid being killed is to kill him, then it is OK.

If the chance is 5%, then it is not OK.

Of course, the question then arises: if it's not OK at 5% but is OK at 99%, at what percentage does it change from not OK to OK? I don't know, but it's somewhere between 5% and 99%...
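One way to make that missing threshold concrete, purely as an illustration, is to treat it as an expected-value comparison. The two cost weights below are invented assumptions, not anything from the post:

```python
# Toy expected-value version of the "at what percentage?" question.
# The cost weights are made up: they encode how bad each outcome
# is judged to be, normalized so that my own death costs 1.0.

COST_MY_DEATH = 1.0    # how bad it is if he kills me
COST_HIS_DEATH = 0.8   # how bad I judge killing him to be

def kill_first(p_he_kills_me: float) -> bool:
    """Strike first only when the expected cost of waiting
    exceeds the fixed cost of killing him."""
    return p_he_kills_me * COST_MY_DEATH > COST_HIS_DEATH

for p in (0.05, 0.50, 0.79, 0.81, 0.99):
    print(f"p = {p:.2f}: kill first? {kill_first(p)}")

# With these made-up weights the answer flips at p = 0.8, which is
# the commenter's point: the threshold falls somewhere between 5%
# and 99%, and exactly where depends on how you weigh the outcomes.
```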
 
 
Jun 5, 2009
The thing is, you COULD be 99% sure this guy would kill me within a year because you'd told me beforehand that this man intends to kill me within that time frame. This is the Death Race 2000 / Running Man sort of alternate universe, then. And you know how those always turn out... the people HOSTING the game are the ones who wind up hunted down. Better to let sleeping dogs lie.
 
 
Jun 5, 2009
I think the problem with your question is that "morals" are largely a product of societal consensus, rather than springing into being within each person (even though people may imagine they thought of them themselves). Note how what is acceptable has changed throughout history. Given that, who is to say what a society that could predict the future would settle on as moral?

Such an ability (precognition) would so profoundly affect all aspects of a society that it would be essentially unrecognisable. For example, the ability to predict the future would presumably give one infinite computing power, and likely superhuman artificial intelligence, the outcome of which is, by definition, unknowable.

However, I can say that if I knew someone would kill me with 99% certainty, I would do them in first if I could, and sleep like a baby! That's nothing to do with morals though.


 
 
Jun 5, 2009
One of the problems is that, say I do in fact kill this man based on the idea that there is a 99% chance he will kill me within a year, I have actually changed the numbers around, haven't I? The other man, now being dead, has a 0% likelihood of killing me, but I, on the other hand, have risen to a full 100% chance of killing him. Unless you define exactly what the percentages equal in real life, the numbers are pointless. Was the 1% of him killing me equal to my 100% of killing him? For in that case the 99% is pointless, isn't it?
 
 
Jun 4, 2009
The funniest thing about your blog today was this one-liner to those who said you couldn't live in a world where there could be predictions at 99% accuracy:

"It leaves me wondering if the person is unclear on the concept of hypothetical questions"

Ha ha! I know how you feel. I get the same response when I try to pose hypothetical questions to my wife. We just can't get past the part where she's supposed to understand that it's hypothetical! Frustrating.

In answer to your question: if I knew there was a 99% chance that somebody would kill me later, then to NOT do something about it would mean I valued his/her life more than my own, and I would be a meek little lamb to accept that fate. It's right to fight for your own survival even if it means your enemy must die.
 
 
Jun 4, 2009
I'm shocked at the conviction of your readers, Scott.

Do any of you even know on what basis you make your judgments? Philosophers have discussed right and wrong for millennia, and for good reason. Mathematics began with civilizations that didn't even have zero as a number. Philosophy, like mathematics, has developed through the ages by deep thought and consideration from some of humanity's greatest intellects.

But no, Joe Blow has morality all figured out. Whatever sounds good, man. Whatever feels right.

Most living moral theories today support preemptively killing someone in self defense. Self defense doesn't require absolute knowledge in order to be justified.
 
 
Jun 4, 2009
No, not moral. It can't be moral to kill someone on the chance they may kill you.

I think I would do it, however. If I thought it likely you were going to kill me and I couldn't run, hide, or gain protection, I would have few options. I have no desire to die, and a real or imagined likely threat would most likely result in me attempting to protect myself by whatever means necessary.
 
 
Jun 4, 2009
Yes. It's justifiable self defense.
 
 
Jun 4, 2009
Honest response...

I'm not killing the guy, for one or both of these reasons:

1 - I'm not good at coping with guilt. Being dead would be preferable to the guilt.
2 - I'm too lazy to kill people.

 
 
Jun 4, 2009
You just won't let this thought die the quiet death it deserves, will you?
In this hypothetical world of yours I'll be flying on pink elephants so no one has a chance to kill me.
Oh, and it's not fair to use your argument to justify Israel attacking Iran and then fall back on "it's a hypothetical argument" when everyone and his dog has punched holes in it.
 
 
Jun 4, 2009
You don't kill the guy; you just put him in the red shirt and beam him down. The guy in red always dies. There is no exception to this rule. (It is even followed in the new movie.)
 
 
Jun 4, 2009
Sorry, Spock, half Vulcan, not Dr. Benjamin Spock the pediatrician.
 
 
 