Kill 1 person. I feel it would be cowardly to pass the buck and risk killing 2.
Lest they do the same and kill 4, etc.
But what happens when you get to, say, the 34th person, and there are 2^33 (about 8.6 billion) people tied up, more than there are living humans in the world? Pass the buck, break the simulation, save the world.
I’ve never been much of a gambler. I’ll stick with my one kill.
Double it. Then the other guy will double it, and so on. Infinite loop = no deaths.
Or straight up 8 billion
Can’t kill 8 billion when half of them are tied to the “no kill” tracks.
Instead of killing one, you’re saving half of humanity! Double it!
Eventually everyone is tied to the tracks and there’s no one left to change the trolley’s course.
Who said there was a limit?
And then there’s some psycho on round 34 who kills all 8 billion people alive on earth.
This will create an incentive for people who have 2, 4, 8, maybe even more people on the tracks not to double, making the idea even worse.
That’s on them
I think everyone here is missing the real answer. If you look at the picture, you will notice a third option: there are two track switches. You can bypass the people tied to the track, then kill the monster forcing you to kill for no reason.
If I must kill 1 person or cause even more death, I suppose I’d kill the person responsible for this scenario.
Also interesting: What would you choose here if you were an evil psychopath? (Asking for an acquaintance.)
Switch the track from the bottom to the top as the train is halfway over the switch, causing the train to drift across both rails, hitting all three tied-up people and the second switch operator.
MULTI TRACK DRIFTING??!!!
Depends on whether you’re happy with someone else killing a lot more people, or whether you want to kill someone yourself.
Assuming this goes to infinity, the reasonable thing to do is to kill one person to prevent someone else killing a lot of people. But that would make you directly responsible for killing that person.
Isn’t redirecting making you directly responsible for a minimum of 2 deaths?
No, that’s someone else’s choice to kill.
Not the person you were talking to, but I don’t think I would feel responsible if the conductor after me decided to kill a bunch of people with a trolley. And I wouldn’t be responsible if I saved people by not getting them gored by some trolley.
You’d be indirectly responsible for those deaths.
Fair enough
Electrifying the tracks.
You would need a crazy low probability of a lunatic or a mass murderer being down the line to justify not killing one person.
Edit: Sum(2^n (1−p)^(n−1) p) ≈ Sum(2^n p) for small p. So you’d need a p = 1/(2×2^32 − 2) ≈ 1/(8.6 billion) chance of catching a psycho for the expected values to be equal. I.e. at most a single person alive, tops, would decide to kill everyone on Earth.
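A quick numeric sanity check of that threshold (just a sketch of the model above: it assumes each of the 32 people after you independently chooses “kill” with the same probability p, and that the chain is capped where the stakes would exceed the world population):

```python
# Expected deaths if you choose "double", under the model sketched above.
def expected_deaths_if_doubled(p, rounds=32):
    # If person n (n = 1..rounds) is the first to choose "kill",
    # 2**n people die; everyone before them chose "double".
    return sum(2**n * (1 - p)**(n - 1) * p for n in range(1, rounds + 1))

# Break-even: doubling beats killing one person now only while the
# expected deaths stay below 1, i.e. roughly p < 1/(2*2**32 - 2).
p_break_even = 1 / (2 * 2**32 - 2)   # ~1.16e-10, about 1 in 8.6 billion
print(expected_deaths_if_doubled(p_break_even))  # ~1.0
```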
Well, what about the fact that after 34 people the entire population is tied to the tracks? What are the chances that one person out of 35 wants to destroy humanity?
Also, tying the entire human population to the tracks is going to cause some major logistical problems. How are you going to feed them all?
I just calculated the sum from n=0 to 32 (because 2^33 > the current global population). And that calculation implies that the chance of catching someone willing to kill all of humanity would have to be lower than about 1 in 8.6 billion for doubling it to have a better expected value than just killing one person.
Yeah I think I was in a stupor when I commented. I don’t think I even tried to understand your comment. My apologies. But now that I am trying, I am struggling to understand the notation.
Oh come on. A trolley is not going to have the momentum to kill that many people nor would the machinery make it through. The gears and whatnot would be totally gummed up after like 20 or so people.
You don’t even need a lunatic or mass murderer. As you say, the logical choice is to kill one person. For the next person, the logical choice is to kill two people, and so on.
It does create the funny paradox where, up to a certain point, a rational utilitarian would choose to kill and a rational mass murderer trying to maximise deaths would choose to double it.
It’s always “double it.” If anyone after round 34 flips the kill-all-humans switch, that’s their fault, not yours.
Why do you care whose fault it is? You’d want to minimise human deaths, not win a blame game.
Everyone choosing to double forever minimizes human deaths.
Unless someone decides to hit kill. In that case, it’s them doing it. I’m invalidating the argument that pre-empting imaginary future mass murderers justifies killing one person today.
Idk which moral system you operate under, but I’m concerned with minimising human suffering. That implies hitting kill, because the chances of a mass murderer down the line are too high not to. You also don’t follow traffic laws to a T, but exercise caution, because you don’t really care whose fault it ends up being; you want to avoid bad outcomes (in this case the extinction of humankind).
My moral system somehow does not choose to kill people through action against an imagined threat, and is therefore objectively superior, as it is not susceptible to hostile memetic manipulation (Moloch, Pascal’s wager, Pascal’s mugging, basilisks, social hysteria, etc.) and is capable of escaping false choices and other contrived scenarios, breaking the premise and the rules of the game as needed to obtain the desired outcome.
So how does that killing thing work? Do you do it yourself, or do you just think it and the person dies?
I think in this scenario it’s indirectly caused by you. Either you ‘press a button,’ directly resulting in the death of a specific individual, or another person is given the same scenario, but the button directly causes double the number of deaths if they press it.
Guess the kill-one-person thing isn’t that bad then. There are quite a few people doing major bullshit right now…
Oh, 100%. Fuck the next generation, I mean person.
They simply have to choose not to kill anyone.
Nobody in this situation ever has to die. It is not some difficult choice that you are burdening the next person with. The choice is obvious.
The loop continues until the entire human population is tied to the track and there’s nobody left to pass the switch to. Kill the scapegoat on round one and be done.
I’d try to talk to the person on the track to see if they were an asshole and decide from there.
Welcome to climate policy.
That implies that if nobody tries to stop climate change, it’ll never destroy the world.
Perhaps it roughly analogizes to Zeno’s Paradox.
Seems like exactly what politicians are doing. Pass the problems along to the next one.
The Boomer Method
Can I move the rails to kill them all and then circle around and hit me?
The zoomer way
For legal purposes this is a joke
I would pull the lever after the first set of wheels was past; then the trolley would tumble and kill everyone but me.
Throw the switch to pass, then sprint ahead 32 spots so I can kill 4 billion people like Thanos.