Something I’ve been thinking about a ton recently:

People invested in social change often use arguments of the form “you should do this or else you’re a bad person,” presumably because they think that this will motivate people to do it more effectively than arguments like “you should do this because here are all the ways it would make the world better,” or “you should do this because you would benefit from it yourself,” or others.

But in my experience as a writer and educator, this actually gets in the way because it paralyzes people. The thought of being A Bad Person looms so prominently in their minds that rather than reevaluating their opinions or behavior, they protest and get angry and resentful. The idea of being judged A Bad Person by someone they don’t even know makes them indignant, and that’s not a situation conducive to learning or social change.

That’s why I’ve largely been moving away from labeling people Good or Bad, both out loud and in my own thinking. Yes, sometimes it’s unavoidable to think that someone is A Bad Person, but I find that these thoughts *get in the way*. A Bad Person can be written off forever. A Bad Person can also be abused and treated cruelly with no ethical qualms, and you can probably see how this easily leads in a really bad direction. 

Feminism dares to expect more from men. Feminism expects a man to be ethical, emotionally present, and accountable to his values in his actions with women—as well as with other men. Feminism loves men enough to expect them to act more virtuously and actually believes them capable of doing so. Feminism is a vision that expects men to go from being ‘just guys,’ accepting whatever they might happen to do, to being just guys—capable of autonomy and authenticity, inspired by justice.
—  Michael Kimmel, ‘Guyland’

Brain scans reveal how people ‘justify’ killing

A new study has thrown light on how people can become killers in certain situations, showing how brain activity varies according to whether or not killing is seen as justified.

The study, led by Monash researcher Dr Pascal Molenberghs, School of Psychological Sciences, is published today in the journal Social Cognitive and Affective Neuroscience.

Participants in the study played video games in which they imagined themselves to be shooting innocent civilians (unjustified violence) or enemy soldiers (justified violence). Their brain activity was recorded via functional magnetic resonance imaging (fMRI) while they played.

Dr Molenberghs said the results provided important insights into how people in certain situations, such as war, are able to commit extreme violence against others.

“When participants imagined themselves shooting civilians compared to soldiers, greater activation was found in the lateral orbitofrontal cortex (OFC), an important brain area involved in making moral decisions,” Dr Molenberghs said.

“The more guilt participants felt about shooting civilians, the greater the response in the lateral OFC. When shooting enemy soldiers, no activation was seen in lateral OFC.”

The results show that the neural mechanisms typically implicated in harming others become less active when violence against a particular group is seen as justified.

“The findings show that when a person is responsible for what they see as justified or unjustified violence, they will have different feelings of guilt associated with that – for the first time we can see how this guilt relates to specific brain activation,” Dr Molenberghs said.

The researchers hope to further investigate how people become desensitised to violence and how personality and group membership of both perpetrator and victim influence these processes.

Like anybody can tell you, I am not a very nice man. I don’t know the word. I have
always admired the villain, the outlaw, the son of a bitch. I don’t like the clean-shaven
boy with the necktie and the good job. I like desperate men, men with broken teeth
and broken minds and broken ways. They interest me. They are full of surprises and
explosions. I also like vile women, drunk cursing bitches with loose stockings and
sloppy mascara faces. I’m more interested in perverts than saints. I can relax with
bums because I am a bum. I don’t like laws, morals, religions, rules. I don’t like to be
shaped by society.
—  Charles Bukowski, ‘South of No North’

The amorality of technology

Sparing a woman from street harassment by charging five dollars to walk with her is pretty questionable, but this video parody of the sharing economy brings to mind the ethical issues that a new tech-fueled economy might present.

Eric Giannella, a PhD candidate in sociology at UC Berkeley, is exploring these very issues. His recent essay asks what the idea of progress really is and how morality may be compromised by it:

The progress narrative has a strong hold on Silicon Valley for business and cultural reasons. It makes business easier because companies can claim to be contributing to progress while skirting the moral views of the various groups affected by their products and services. Most investors would rather not see their firms get mired in the fraught issue of defining what is morally better according to various groups; they prefer objective benefits, measured via return on investment (ROI) or other metrics. Yet, the fact that business goals and cultural sentiments go hand in hand so well ought to give us pause.

Read the full essay.

People who think that our morality is all about big grand principles rather than emotions should try playing a video game. Most people I’ve talked to about this have a hard time with things like killing “innocent” people in games, and feel compelled to do “good” things in the game. I certainly do. I recognize that this is fundamentally irrational, but it’s also a really useful reminder that my sense of right and wrong probably comes from my automatic emotional responses to things, and if those responses aren’t in accordance with my higher-order values, I’ll have to intentionally work on that.

This is a totally random rant but I am so against circumcising babies. If a twenty-something-year-old wants to get himself circumcised for religious or “health” reasons, go for it. But cutting an infant in a way that will affect him for the rest of his life is so freaking wrong. Allow your child to make that sort of decision for himself.

anonymous asked:

Hi, I've been wondering for a while about how different types develop their sense of morality/ethics. I know the difference between Fi and Fe, but would it make a difference where the feeling function fell in someone's functional stack? Like, would an EXFJ have a different process of developing their morality from an IXTP? Or an EXTJ from an IXFP? I realize this could maybe turn into a long post, sorry. If you wanna save this for later or refer me to someone else, that's fine. :-) Thank you!

No problem!

As far as average health morals go:

Fe: Focus on other people, morals revolving around other people on a case by case basis
Fi: Focus on oneself and how one feels, universal, one size fits all morals

SFJ: A lot of ambiguity; if twisting the morals is what seems right at the moment and helps these people, it is right.

NFJ: A lot of ambiguity, but different. Any means justify this end. Long term.

IxTP: Morals are irrelevant. I like understanding but I don’t want to interact much. I prefer to avoid talking about this whenever possible.

ExTP: Morals aren’t important, but entertaining, sometimes I like twisting and bending morals to suit what I want.

NFP: Always stay true to oneself. No matter what. Don’t let anything change how you feel about this, it’s either right or wrong.

SFP: Make sure to always do what you personally believe is right, at that moment. If it is what is best for you right now, it is right.

NTJ: Morals are not usually necessary to achieve goals, but sometimes you need morals to even realize what you want in life. Morals when you can, but logic and the plan takes precedence.

STJ: Morals are usually necessary to uphold what has been done, but sometimes morals are part of the foundation of things and need to be taken into consideration. Morals when it applies, but you have to ignore morals when trying to uphold what is important.

Question about sacred values...

Can anyone explain to me why they’re bad? With an example that doesn’t involve a supposedly sacred value that’s causally connected to the supposedly secular one?

Like, the standard example is something like money vs. human lives but… if the actual trade were “some quantity X of money with which I am absolutely not allowed to save any lives” vs. “saving one life,” I will always choose the life. Like, lives feel like a sacred value to me, one that is only tradeable for secular values because there is a causal chain that can trade one for the other, and once that chain is severed (however artificially), it suddenly doesn’t feel intuitive at all that they’re commensurable.