A while back, a far right sub got closed. People were kind of lost as always happens when moderators take a sub private.
Others offered up a space (this happened during the heat of the 2016 election): a thread on a definitely-not-far-right sub.
The gist was, "have a beer, tell us about it." No judgment, just a place to talk past the frustration.
The chatter that night was interesting. Relations were permanently improved, and some of the "far" got rolled off too. (I was there, ran that little experiment, and followed the users and discussion for a while afterward.)
Aaron was passionate about this effect. Reddit, in general, used to be. When people talk broadly, things improve.
When they get contained, things amplify.
Reddit sees people with interests, things to say. Others, including some moderators at Reddit, see groups of people.
That clash is not talked about as much as it should be.
Humanizing people different from us is something Reddit has done well, and when that happens, conflict eases. But Reddit has also contained discussion and amplified the toxic. Ugly stuff.
We all run the same basic way, and we all have more in common than we often recognize, or admit.
One other thing:
A very toxic shitpost sub got to hating on a community I moderated. Two of us decided to make some friends.
For a month, we would rate their toxicity, employ good humor, laugh with them, and so on.
Amazingly, this worked far better than we expected. Soon, there were people in that toxic mess rooting for us, and it became increasingly difficult for the sub to be toxic and believe it had impact.
As the tipping point approached, we got banned, and the shitposts to follow fell flat. Everyone saw how ugly it was and our humor was missed.
They quit after that. The friends remained, some left the toxic haven.
Now, there are people like you and me who have the same bad taste for "far lefties" and, in general, "those other people."
What we community managers, app builders, and data scientists should be doing is finding ways to exploit the effects I've described here.
And we should be making tools to foster better communication.
Those tools are software on one hand, and on the other, giving people options and making them aware of their own agency in dialogue.
The former on its own has only limited success, as many communities will show. Blocks, bans, and the rest...
But combine software with empowering people to take charge of their interactions and weigh what others say, and it is a whole new game.
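To make the "weigh what others say" idea concrete, here is a minimal sketch of a client-side filter that down-ranks rather than removes. Everything in it is hypothetical illustration: the toxicity scores stand in for any classifier's output, and the sensitivity knob is the reader's own agency, not any real Reddit feature.

```python
# Hypothetical sketch: instead of a hard block, let each reader choose
# how much weight a comment's toxicity score carries in their own feed.

def weigh_comments(comments, sensitivity=1.0):
    """Down-rank rather than remove: each comment keeps a weight in [0, 1].

    comments: list of (text, toxicity) pairs, toxicity in [0, 1].
    sensitivity: 0 shows everything at full weight; higher values
                 discount toxic comments more steeply.
    """
    weighted = []
    for text, toxicity in comments:
        weight = max(0.0, 1.0 - sensitivity * toxicity)
        weighted.append((text, round(weight, 2)))
    # Sort so low-toxicity voices surface first, but nothing is silenced.
    weighted.sort(key=lambda pair: pair[1], reverse=True)
    return weighted

feed = [
    ("Thoughtful disagreement", 0.1),
    ("Borderline snark", 0.5),
    ("Obvious troll bait", 0.9),
]
for text, weight in weigh_comments(feed, sensitivity=1.0):
    print(f"{weight:.2f}  {text}")
```

The design choice is the point: the troll bait stays visible at low weight, so the user, not a moderator, decides how loud it gets.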
I have applied these ideas to communities that ended up very troll-resistant and able to communicate across broad swaths of humanity.
Righteous indignation is not the only way to respond to toxic speech. In fact, it is often the very worst, yet indignation remains the dominant exchange, largely because the prevailing expectations encourage it.
Some rando says something toxic, and when it is weighed properly, it is laughable, pathetic. It certainly is not meaningful.
Trolling works on righteous indignation.
Free hugs, humor, empathy, and more all snuff it out.
The same goes for very different worldviews. Make some friends, have a common human basis and those differences begin to melt.
Frankly, I see 3 billion being used to sell shit and harvest user data.
A shame that some of it is so unlikely to be used to improve communication and add some real value along the way.
Edit: Reddit is an amazing platform for running social dynamics experiments. Lots to be learned there for those who go looking.
They will cry foul and get plenty of attention.