Dice rolls and how they relate to moral truth (sort of?)

Isn't utilitarianism making the choice that produces the most happiness? This is even to the point of discounting your own feelings to make the best choice, right? While that sounds nice, it seems impractical. I know there are two schools of thought here. There's short-term thinking, where harvesting organs from one person would be fine as long as it saved multiple lives — that's Act Utilitarianism. Then there's Rule Utilitarianism, which keeps people from doing shit like that by making them think more long-term about their actions.

A flaw I see in Utilitarianism is bias. All people are naturally egocentric to varying degrees; it's part of being human. Given that, you'd have to let the least egocentric people make all the decisions, all while knowing we could never entirely stop egocentrism from influencing them. At that point, how would we decide who is the least egocentric? What metric could you use in the real world to measure this? Is it even possible, really?

I suppose you were suggesting the best way to reach a subjective, well-informed opinion, while knowing it's not entirely objective. In that sense utilitarianism works for forming an opinion, but people can simply point out that you're deciding what counts as "optimal happiness" from a position of bias, which could potentially nullify your argument.

I don't think Kant works for me here either. The whole notion of a universal moral constant is a little too much for me.

An example of why I don't like it: Crash Course video with a time stamp

/r/askphilosophy Thread Parent