Morality

Discussion for all topics (until the forum becomes large enough to justify splitting things up this will be where all topics go)
chel_of_the_sea
Posts: 27
Joined: Mon May 08, 2017 9:39 am

Re: Morality

Post by chel_of_the_sea » Thu Jun 01, 2017 5:59 pm

Raininginsanity wrote:
Thu Jun 01, 2017 2:38 pm
It seems pretty easy to justify Machiavellian policies based on utilitarianism. I don't think it's only at the fringes that the philosophy leads to strange results. But even the fact that it ever leads to strange results means we should probably define our utility function better.
The problem is that those policies have, historically, ended up setting up tyrants sooner or later. Even if we're fairly confident that won't happen - well, so were the folks in the past, and we should bake in a strong unknown-unknown term to our utility.
:arrow: TL;DR: 27, trans woman, M.A. in math, Seattle area. Tutor by current trade, but in a bit of professional limbo (if you know anyone hiring, let me know!).

archon
Posts: 59
Joined: Thu May 25, 2017 11:02 am

Re: Morality

Post by archon » Fri Jun 02, 2017 1:35 pm

chel_of_the_sea wrote:
Thu Jun 01, 2017 5:59 pm
Raininginsanity wrote:
Thu Jun 01, 2017 2:38 pm
It seems pretty easy to justify Machiavellian policies based on utilitarianism. I don't think it's only at the fringes that the philosophy leads to strange results. But even the fact that it ever leads to strange results means we should probably define our utility function better.
The problem is that those policies have, historically, ended up setting up tyrants sooner or later. Even if we're fairly confident that won't happen - well, so were the folks in the past, and we should bake in a strong unknown-unknown term to our utility.
Yeah, I think this kind of thing is an important point. Also, in a weird twisty kind of way, I think there is a lot of utility in trying to maintain hard-and-fast rules in some circumstances, especially where reputation or temptation are on the line.

That said, are there other ethical systems which allow for nuance and trade-offs within them? All of the other systems I have heard of seem prone to creating fixed rules to follow, and just give an obligation to follow them (with the differences being in how you get those rules, and what they are). This seems inherently unworkable to me, so I assume I've missed something; I've never been systematically educated in ethics.
"Don't be silly -- if we were meant to evolve naturally, why would God have given us subdermal implants?"

chel_of_the_sea
Posts: 27
Joined: Mon May 08, 2017 9:39 am

Re: Morality

Post by chel_of_the_sea » Fri Jun 02, 2017 9:37 pm

Oh, I'm by no means saying utilitarianism is bad.

On a practical level I tend to use virtue ethics to get through the day and build good habits, but utilitarianism is always the tiebreaker when I'm unsure of my principles.
:arrow: TL;DR: 27, trans woman, M.A. in math, Seattle area. Tutor by current trade, but in a bit of professional limbo (if you know anyone hiring, let me know!).

theojones
Site Admin
Posts: 14
Joined: Mon May 01, 2017 2:18 am

Re: Morality

Post by theojones » Mon Jun 05, 2017 1:15 am

For me, probably some kind of non-utilitarian consequentialism, with a lot of sympathy for preference utilitarianism, but not quite fully on board there.

I think there are some things that have intrinsic value beyond utility, whether defined in terms of hedonism or preference satisfaction, but I have a lot of the same intuitions as utilitarianism.

I don't find hedonistic utilitarianism very plausible, because of the wireheading problem and because there are more things that have value than pure pleasure versus pain. Preference utilitarianism, in my opinion, goes too far in the other direction, as you can imagine some bizarre preferences that would still be given strong weight (think of a non-powerful paperclip optimiser). I also have some qualms about agent neutrality and the over-demandingness problem. As far as more formal philosophy I've read, I'm sympathetic to variants of perfectionist consequentialism, along with the variant of consequentialism proposed by Philip Pettit that treats liberty as intrinsically valuable (at least at the larger scales of political institutions and society as a whole), and the attempts by Ian King and others to mix consequentialism and virtue ethics.

theojones
Site Admin
Posts: 14
Joined: Mon May 01, 2017 2:18 am

Re: Morality

Post by theojones » Mon Jun 05, 2017 1:33 am

chel_of_the_sea wrote:
Thu Jun 01, 2017 5:59 pm
Raininginsanity wrote:
Thu Jun 01, 2017 2:38 pm
It seems pretty easy to justify Machiavellian policies based on utilitarianism. I don't think it's only at the fringes that the philosophy leads to strange results. But even the fact that it ever leads to strange results means we should probably define our utility function better.
The problem is that those policies have, historically, ended up setting up tyrants sooner or later. Even if we're fairly confident that won't happen - well, so were the folks in the past, and we should bake in a strong unknown-unknown term to our utility.
Yeah. And it's important to create political and social norms that cannot easily be abused by bad people, because the rest of the world is not perfect in its ability and willingness to do good. When you engage in Machiavellian behavior, you run the risk of breaking down some of the social trust society requires to function, and risk moving norms in a direction that is susceptible to abuse.

theojones
Site Admin
Posts: 14
Joined: Mon May 01, 2017 2:18 am

Re: Morality

Post by theojones » Mon Jun 05, 2017 1:36 am

dylygs wrote:
Wed May 31, 2017 1:23 am
Interestingly, what comes to mind as a good analogy for what I'm thinking is Eliezer's coherent extrapolated volition, about which I've only read Nick Bostrom's summary near the end of Superintelligence. True morality seems more like a hazy, democratically-defined thing that we can derive from saying that it's what we would all want for ourselves and each other if we were smarter, wiser, more empathetic, etc., as those qualities get arbitrarily high. This gives a more "go with your gut" sort of attitude, which simultaneously fits my personality pretty well and clearly isn't as immediately useful as a solid theory like utilitarianism or something.
My intuition is that it's too ill-defined for my tastes. Just referring to what people want to think doesn't really do much to resolve the issue, because I want something that's a guideline to what I should do, what values I should have, and so on.

CEV would in part be a reflection of what people currently think and what they currently value, not really a metric of what they should value.

archon
Posts: 59
Joined: Thu May 25, 2017 11:02 am

Re: Morality

Post by archon » Mon Jun 05, 2017 2:09 am

theojones wrote:
Mon Jun 05, 2017 1:36 am
dylygs wrote:
Wed May 31, 2017 1:23 am
Interestingly, what comes to mind as a good analogy for what I'm thinking is Eliezer's coherent extrapolated volition, about which I've only read Nick Bostrom's summary near the end of Superintelligence. True morality seems more like a hazy, democratically-defined thing that we can derive from saying that it's what we would all want for ourselves and each other if we were smarter, wiser, more empathetic, etc., as those qualities get arbitrarily high. This gives a more "go with your gut" sort of attitude, which simultaneously fits my personality pretty well and clearly isn't as immediately useful as a solid theory like utilitarianism or something.
My intuition is that it's too ill-defined for my tastes. Just referring to what people want to think doesn't really do much to resolve the issue, because I want something that's a guideline to what I should do, what values I should have, and so on.

CEV would in part be a reflection of what people currently think and what they currently value, not really a metric of what they should value.
See now, this seems to be one of the basic problems with arguing about morality. There seems to be no point if your theory is just going to spit out "go with your gut" - this provides no new insights or thoughts. On the other hand, nobody seems quite so angry about a new system of morality as when it fails to match their intuitions in every possible case.

I'm not sure what to do about that.
"Don't be silly -- if we were meant to evolve naturally, why would God have given us subdermal implants?"

dylygs
Posts: 7
Joined: Thu May 25, 2017 5:18 pm

Re: Morality

Post by dylygs » Mon Jun 05, 2017 3:12 am

chel_of_the_sea wrote:
Fri Jun 02, 2017 9:37 pm
On a practical level I tend to use virtue ethics to get through the day and build good habits, but utilitarianism is always the tiebreaker when I'm unsure of my principles.
I think, practically, this is probably the answer, or at least really close to it if you want to be maximally ethical. I don't think we can create a full ideology that satisfies all of our intuitions and also gives us pointers to where we don't have intuitions, and I don't think spending time looking for one is practically productive.
