For me, this whole fracas solidified a decision I made a few months ago: treat Meta platforms as hostile zones and engage with them only briefly and as strictly necessary. No Facebook app, ever; account left deactivated except for a monthly or quarterly check, or reactivated to follow a specific event and then deactivated again afterward; Messenger used, but strongly discouraged as a means of contact.
This is because the whole thing left me feeling no different about FB's management. Sure, they have some nebulous "free speech" commitment now, which really is no such thing; the porn comment is exactly right, and porn/obscenity/extremism will always be the thin edge of the wedge against all content. I got zucced ten million times like everyone else and managed to post again by giving them fake driver's license printouts that I photographed with my phone while shaking it around to make the image nice and blurry, which was apparently enough to satisfy whatever overworked person looked at it for 250 microseconds before clicking the nihil obstat button. That state of affairs will continue, as far as I can tell from everything above; it's just that there are some new stupid inside rules that satisfy the groups crowd that they aren't being shadowbanned for calling you a mentally ill gay. This differs from the last 15 years only in that I never saw the old leaked rules, which likely sucked exactly as much.
The commitment to agency that I would like is: "All users get access to an AI agent that lets them be extremely aggressive with content classification, and when I block someone, it's as if I shunted them and every post they ever made into Universe B. Otherwise, you can propagandize for ISIS all you want; we're the internet phone company, and if the FBI wants to get you, we'll give them the records."
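To make concrete what I mean by that kind of agency, here's a minimal sketch of a per-user feed filter where the user, not the platform, sets the aggressiveness of classification, and where blocking an author hides their entire post history ("Universe B"). Everything here is hypothetical — the class names and the classifier score bear no relation to any actual Meta system:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    # Score in [0, 1] from some per-user classifier; stubbed out here.
    objectionable_score: float

class UserFeedFilter:
    def __init__(self, aggressiveness: float):
        # 0.0 = show everything; closer to 1.0 = hide anything remotely flagged.
        self.aggressiveness = aggressiveness
        self.blocked: set[str] = set()

    def block(self, author: str) -> None:
        # "Universe B": the author and every post they ever made vanish for me.
        self.blocked.add(author)

    def visible(self, feed: list[Post]) -> list[Post]:
        # A post survives only if its author isn't blocked and its score
        # falls below this user's personal tolerance threshold.
        threshold = 1.0 - self.aggressiveness
        return [
            p for p in feed
            if p.author not in self.blocked
            and p.objectionable_score < threshold
        ]
```

The point of the sketch is that both knobs live entirely on the user's side: the platform just carries everything, and my agent decides what reaches my eyes.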
When I worked at Facebook I knew a guy who managed a team that worked on automatic removal of inappropriate images. One thing he said that stuck in my head was that some of the issues his team was responsible for are "policy debates" with high visibility outside the company: the definition of what counts as porn, leaked private pictures, etc.
But the vast majority of issues, measured by what the team has to spend time on and the ways people try to break the rules, are totally different. Like automatically taking down posts offering guns for sale, which really isn't too controversial as a policy but is very visible when it fails (as when the press reports that a criminal illegally bought their gun on Facebook).
No particular point, just context for interpretation: these teams have a huge amount of legitimate censorship work to do, and the "political" parts of it, from some people's point of view, are actually just a small part of the problem.
Write-up strikes me as pretty naive.
The wrong questions are being asked about fact-checkers, for one. For all the griping about them being left-wing or biased in some way, research shows that an overwhelming amount of the misinformation and outright lies on public platforms comes disproportionately from far-right radical political parties and movements: https://journals.sagepub.com/doi/10.1177/19401612241311886
The issue there is not fact-checkers being politically biased in some way, but why there aren't more right-wing fact-checkers. Why isn't the right holding itself to account when its political figures lie about or muddy the facts of something? This strikes me as a failure of the right to set some sort of standard for itself.
The bit about 'genocide' and misinformation is treated like a joke, but there's verifiable evidence that Facebook, prior to implementing any content standards, directly fed ethnic cleansing and genocide in Myanmar and India. Some detail on the former here: https://faineg.com/facebook-destroys-everything-part-1/
There's a balance to be struck between content standards that prevent mob attacks on minorities and outright censorship. I don't see why Facebook shouldn't strike that balance.
I think the broader issue I don't grasp here is why fact-checking and content moderation are supposed to be incompatible with free speech. You can still say whatever you want; but if you're spouting some clown shit or are just plain wrong, there should be a little tag saying so. Seems smart to me to ensure people get dialogue, debate, et cetera. Maybe I'm not overthinking it enough...
On the 'genocide' bit: if you let people stir up hatred against a group and give them a forum to organize lynchings, they'll do it. See Facebook in Myanmar circa 2015. Thinking otherwise is childlike. Whether we should incentivize or permit that is the debate, and I know where I fall on that question.
I mean, the US government directly feeds genocide too, in plenty of instances. Let's start by rooting that out before we put a social media company to the torch.
Wow, really? I find that surprising. Do you know of any resources where I can learn more about these instances?
I don't think the wrong questions are necessarily being asked. Your post strikes me as coming from someone who hears a lot about the lies of the right and doesn't take into account the lies widely accepted and spread by the left. Each party has different followers, and different types of lies resonate with each. When I point out outright false information being spread on the left, I get a lot of "Well, Trump is a bad guy, here are 18 reasons why, so there's no reason for you to point out one potentially wrong thing being said about him; put your focus elsewhere." Nobody takes responsibility for their "team" in politics.
COVID was a great example of that. Through my eyes, the left "won" the facts argument (mostly due to widespread head-in-the-sand denial and completely baseless claims made by widely followed right-wingers), but not without spreading massive amounts of false information themselves. And the left lost on the censorship front. Deleting posts that talked about deaths due to vaccines was irresponsible: even if the right was greatly exaggerating the dangers, the left was acting like there was zero danger, when the facts showed there were small but real risks, offset for almost every demographic by the benefits of the jab. Deleting posts about masks (especially when "wear any type of mask, it complies" is such a factually incorrect statement to begin with!) and about isolation and spread, where the left's case was far weaker, was irresponsible and wrong.
Each party would do well to look inward and start fact-checking itself. I'm not so naive as to believe that will ever happen, but politics right now is a damned sports-cheerleading contest (except that sports teams criticize themselves more than these parties' fans ever do).
"[R]esearch shows an overwhelming amount of misinformation and outright lies on public platforms are disproportionately made by far-right radical political parties and movements."
Oh, the fact checkers have assured us that the misinformation comes from the far right? Well, I guess that settles that.
It is not total chaos at Meta.
This strengthens my belief that I don't ALWAYS want free speech. If I want absolute free speech I can always go to 4chan. But I don't want every website to be 4chan.
My go-to example: if I'm running a forum for cute pictures of puppies, it is the right of the community to ban anyone posting kittens.
I feel like for a while there was an uneasy consensus about what speech is allowed on which platform. That consensus is getting lost, and most platforms feel a bit more chaotic than they used to.
@Noahpinion predicted the fragmentation of the internet, for roughly these reasons, into insular bubbles with their own norms and customs. I don't want that to fully happen either, but I'm sure a single platform will never be the solution, because we want to go into different spaces for different experiences.
I think your support of the changes blinds you a bit to Zuck's true intentions here. He didn't make any sort of ideology obvious on the podcast. He's a pure opportunist, and always has been; we all know the origin story of Facebook already.
He really struck me as a character from the show Silicon Valley on that podcast, though. You heard the bow-hunting exchange, right? He's so obviously trying to appeal to the incoming administration and the winds of change.
I think your interpretation of the prior fact-checking, the changes, and what will happen is pretty accurate; I just think it fails to appreciate what a snake Zuck truly is (though that doesn't matter all that much on this topic).
The policy contradiction on abuse of women, gays, etc. is indeed rather insane; I expect a lot of negative sentiment will be in reaction to that, and will be misattributed to the broader policy changes.
Fundamentally if a 4chan-like culture emerges on Facebook a lot of people are going to hate it. That's largely distinct from the problems of fact checking, etc. but the discourse will lump it all into the same political taxon.