
Re: Editorial: ''De Blasio's Subway Follies''

Posted by Nilet on Wed Jul 26 14:26:44 2017, in response to Re: Editorial: ''De Blasio's Subway Follies'', posted by New Flyer #857 on Wed Jul 26 12:45:49 2017.

Proof, please. Or revise to something like "That has not been substantiated or shown."

Linguistic nitpick. It's an absurd claim with no evidence; "that has not been substantiated or shown" is just a euphemistic way of saying: "It's factually untrue."

Agreed. But why is it that constant universal desires should automatically be "blessed" via rights?

Because everyone wants them. If every single member of a group reaches a unanimous decision on something that doesn't affect any outsiders, why shouldn't the group follow the unanimous decision?

In effect, they're not. Nobody wants to go to prison, and yet some go. There are benefits to others when one goes to prison.

Until every human on the planet has perfectly compatible desires, there will always be people who don't get their way. Morality seeks to minimise that, knowing it can never be eliminated.

So nobody wants to be shot... you still have not indicated why one should not be shot, objectively speaking

What does that mean? If everyone decides they don't want to be shot, that unanimous agreement is an objective fact. If a no-shooting rule guarantees everyone's desire to not be shot will be fulfilled, then that's an objective fact. If every rational person would agree to that rule, knowing that they forfeit their right to shoot others and perceiving that loss as worthwhile to attain the benefit of not being shot, then that agreement is an objective fact and a moral rule is cemented.

since there almost always stands someone to benefit, however indirectly, from the loss of another's life.

And again, morality doesn't mean everyone always gets what they want. It means that in general, the extent to which people don't get what they want is at minimum. Shootings are invariably negative sum; the benefit to the shooter is always smaller than the cost to the victim. As such, forfeiting the right to shoot in exchange for the right to not be shot is a good deal.

Yes, if what you want is indeed good for you, ok (there's no guarantee of that by the way).

As I noted before, getting what you want can make you miserable. However, that's not a moral concern— morality isn't about an abstract "greater good," it's about traffic control— who gets their way when people's desires conflict? Morality only considers what you want; whether getting that is good for you is your own problem.

But that still doesn't tell me that it's good for a select population to gain. And if we don't know that, then we don't know if overall "gain" is even a good thing because it's defined by you or whoever labels it gain.

Again, morality is about traffic control. The traffic control system is concerned with preventing crashes and minimising the number of times a car must yield; where the cars are ultimately going and whether they'll enjoy being there is not really within its purview. Similarly, morality is about minimising the extent to which a person finds their desires subordinated to those of another person— and, by extension, maximising the extent to which you get your way. Whether getting your way will make you happy is your own problem; morality takes your expressed desires into consideration but has no need to ask why you express those particular desires.

You don't want felons to gain. Or those who hurt society in not-so-obvious ways.

Well, actually...

One of the big concerns for morality is cheating. A moral system can only stand if people obey it en masse, and people can only be counted on to obey it if the benefit of doing so is greater than the cost. As such, moral systems are vulnerable to cheaters; people who break the rules while benefitting from other people's compliance. If cheaters can claim a free ride, the entire system will collapse, so any moral system must impose a heavy cost on cheating.

That said, violating the rules of morality doesn't necessarily preclude you from gaining. (I'm assuming that's what you meant by "felons or those who hurt society.") It does mean, however, that you'll need to face a penalty sufficient to discourage cheating by other people.

Or resist it, or manipulate it, out of self-interest. Selfism is always a dead end.

On the contrary. Selfishness is a given. Any moral system predicated on the assumption that people will act purely selflessly is a dead end. That's just human nature; we can't change it, so morality must adapt around it.

Luckily, altruism is selfish. Most "selfless" actions are actually within the self-interest of the person performing them. Morality arises out of selfishness.

So then there are billions of different moralities, really. Now if you use the term "basic morality" to describe what those 8 billion agree to, you will not come up with much. Maybe you can squeak in the idea of no shooting for the sake of not being shot, but there are instances in which one actually may need to shoot in order to avoid being shot. Now what?

There are billions of different sets of desires. There is one optimal moral system that guarantees desires are filled to the maximum extent possible. There are countless inferior moral systems, some closer to the optimal one than others.

Morality is not limited to the desires that are themselves purely universal; its goal is to minimise the extent to which one person's desires must yield to another's. If the optimum system were made available to us, we'd all know it would deny us our wishes at least some of the time— but we'd also know that no other system would deny us our wishes less often.

Since the optimal system isn't available, we're stuck with an inferior version, groping towards the optimal form but never actually reaching it. However, the same principle applies.

You're looking to make the world "better." (If not, why bother with this whole enterprise?)

I'm looking to get what I want.

And the best way to do that is to impose a set of rules such that people get what they want to the maximum extent possible. No other system could get me closer to my goal of getting what I want.

That means you're looking to approach some "goodness" mark, or some type of perfection. What would that be?

"Perfection" is the state in which the number of times someone has to yield to someone else's desires is as low as it can be given the conflicting nature of human desires.

You can't go by majority happiness (since minority unhappiness may outweigh it), you can't even go on greater overall happiness at a given time because human happiness at any given moment is not reflective of whether future human unhappiness will outweigh that, or even if other species' happiness should be taken into account.

"Happiness" isn't the relevant criterion— getting your way is. After all, it is your responsibility to choose your goals such that achieving them will make you happy.

Different people weight different desires differently, but pretty much everyone places the greatest value on autonomy, which largely precludes the utilitarian minority problem— we all agree that respect for our autonomy is vital, so oppression is always a negative-sum game and thus precluded as not in the aggregate interest.

Since past and future desires can never come into conflict, morality will never be called upon to resolve such a conflict.

As for the desires of other species—

I noted above that cheaters can quickly destroy a moral system. This is because a moral system can only function if the cost of obeying its rules is outweighed by the benefit of other people's obedience; a free rider who gains the benefit without paying the cost creates an incentive for others to ride for free, which eliminates any benefit, which destroys the incentive to participate, which destroys the system.

A moral system can only function if it excludes free riders.

As such, while moral consideration is not specific to humans per se, it is limited to people who can understand, accept, and obey its rules. We have not yet found any other species who can do so, though some might be worth partial consideration.

Again, not necessarily. Those people may hurt others by challenging the system to help themselves. That doesn't necessarily make it bad for them, and it can be a quite rational decision for them to preserve their own happiness (since selfism is key in your philosophy anyway) even at the expense of others' happiness. Humanity has been doing this all through history... we may say it's wrong, but your system has not explained why or how.

All through history, humans have tried to get ahead at others' expense.

And all through history, humans have gone out of their way to punish cheaters who profit at the expense of others. With limited success, admittedly, but the drive is practically instinctive.

If you're proposing those "harmed" by a system will try to cheat it, see my examples above; moral systems need to be resilient against cheaters. If you're proposing they might tear down the system entirely, then that's irrational— however much they're "harmed," they're still guaranteed to be worse off with anything else.

Ok, but then how do you decide who goes first, or who gets the longer "green" phase of the cycle?

Again, the goal is to minimise yielding and eliminate crashes. Which ruleset does that is a practical problem; a question of fact, with one answer that is objectively true for everyone.

Yes, you want a "fair" cycle so that it is enforceable and most people will agree to it, but this tells me nothing about how rights transcend governments.

If Ruleset A produces zero crashes and 12 yields, and it is logically impossible for any ruleset to produce zero crashes and 11 or fewer yields, then Ruleset A is optimal and every rational driver will support it.

That's a fact and will be true no matter what any government decrees.

Since Ruleset A and the process that establishes it is the morality of our simple traffic-based analogy system, that proves morality transcends government.

Real-life morality is more complicated of course, but the same principle applies.
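To make the ruleset comparison concrete, here's a toy sketch of the selection principle: eliminate crashes first, then minimise yields. The candidate rulesets and their outcome numbers are invented for illustration (only Ruleset A's "zero crashes, 12 yields" comes from the argument above):

```python
# Candidate rulesets mapped to their (crashes, yields) outcomes.
# "A" matches the example above; "B" and "C" are hypothetical.
candidates = {
    "A": (0, 12),   # zero crashes, 12 yields
    "B": (0, 15),   # also safe, but drivers yield more often
    "C": (1, 8),    # fewer yields, but someone crashes
}

# Tuples compare lexicographically, so any number of yields is
# preferable to a single crash; min() picks the optimal ruleset.
optimal = min(candidates, key=lambda name: candidates[name])

print(optimal)  # -> A
```

The lexicographic ordering encodes the premise that avoiding crashes strictly outranks avoiding yields; under that premise, Ruleset A is optimal regardless of what any authority decrees.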

It just tells me that governments should look at what people will agree to in their respective territories and then decide what rights are good just based off that so they won't be constant lawbreakers -- in which case those rights can vary over time and do not transcend the government.

That's really just a fancy way of saying that governments should look to morality to establish which rights should be enshrined in law— or, in other words, that morality transcends government.

So do rights come from the government or do they transcend it? If in a given territory circumstances exist such that a right recognized elsewhere is not seen as practical (by an overwhelming majority of the people), does that mean that that government should not recognize that right?

Morality transcends government. Rights derive from morality. As such, rights transcend government as well.

However, while morality is by its nature universal, rights are a tool to facilitate morality, which means they may change slightly with circumstances. Most fundamental rights won't change much under most circumstances, but extreme circumstances may create exceptions.

Just for example— the right to privacy is pretty fundamental. Humans really can't function without some degree of privacy, so we all want it.

But suppose a hurricane sweeps through and a large crowd find themselves in a shelter where privacy is functionally unattainable. In that case, the right to privacy is temporarily suspended simply because it's impossible to fulfill. Remember, morality is about resolving conflicts between the desires of different people; it can't provide a remedy for someone whose desires are rendered physically impossible to achieve. Even a universal desire of the sort that gives rise to a fundamental right must sometimes yield to intractable physical reality.

However, no government suspension of the moral right to privacy is valid— if the government removes the legal right to privacy under circumstances that violate the moral right to it, then the government is in violation of morality and should be corrected.
