
The future of the internet is not looking good at the moment

Wired: https://www.wired.com/story/the-internets-future-is-looking-bleaker-by-the-day/

TikTok and Meta: The Danger of Ending Fact-Checking in the Era of January 6

I usually come across it as a comment on a TikTok video of a teenager doing something that others find embarrassing, like singing off-key or obsessing over an interest that makes them cringe. You will see that comment, word for word, on videos from people with disabilities. The commenters say this because they believe it’s easier to be nasty to people on Instagram and other Meta platforms. The subtext is that if these users posted to Reels instead of TikTok, they’d receive the harassment the commenters believe they deserve.

And yet it’s TikTok staring down the barrel of a nationwide ban this week, while Mark Zuckerberg has decided to make his platforms more dangerous in order to appease the incoming president.

As you’ve likely already heard, Meta announced on Tuesday that it would be ending its third-party fact-checking program, replacing it with X-style Community Notes, and “restoring free expression” to its platforms. To accomplish the latter part, the company will relocate its trust and safety operation from California to Texas, purportedly to avoid liberal bias (but not conservative bias, I guess?) and focus its moderation filters on illegal content like terrorism and child sexual abuse material rather than what it calls “lower-severity violations.”

Kate Knibbs reported this week on an update to Meta’s Hateful Conduct policy that allows users to make blatantly homophobic, transphobic, sexist, and racist posts without consequences. Platformer reported that, among other changes, a sentence was removed from Meta’s guidelines. Doing this immediately after the anniversary of January 6 is something to marvel at.

So why gut protections for Meta’s most vulnerable users? In a video statement on Tuesday, Zuckerberg explained that the policies were “out of touch with mainstream discourse” and that “recent elections also feel like a cultural tipping point towards once again prioritizing speech.” (Zuckerberg, no one’s idea of a political theorist, didn’t really explain why fact-checking, itself the sort of speech that free-speech activists have long held is the appropriate reaction to bad speech, isn’t worth prioritizing, nor did he explain what he has against the many forms of speech that Meta will still suppress. Free expression, it seems, is identical with whatever Meta happens not to be banning at a given moment.)

Source: The Internet’s Future Is Looking [Bleaker by the Day](https://tech.newsweekshowcase.com/the-internets-future-looks-dimmer-by-the-day/)

TikTok, Facebook, the Biden administration, and a lie of omission: Facebook is not the problem for the Trump White House

The Supreme Court will take up TikTok’s lawsuit against the US government and its attempt to ban the app nationwide. The court doesn’t have much time to save the app: the deadline for a sale or extension is only two weeks away.

Rogan then asks who, specifically, was pressuring Facebook, and Zuckerberg has no real answer: “It was people in the Biden administration,” he says, adding that he thinks he was involved in some of the conversations but didn’t direct them.

According to the letter, the decision on whether or not to remove content was the company’s own. Emphasis mine: “Like I said to our teams at the time, I feel strongly that we should not compromise our content standards due to pressure from any Administration in either direction – and we’re ready to push back if something like this happens again.”

But the biggest lie of all is a lie of omission: Zuckerberg doesn’t mention the relentless pressure conservatives have placed on the company for years  — which has now clearly paid off. The Republican congressman Jim Jordan has released internal communications from Zuckerberg which show him being full of shit.

For his part, Rogan serves Zuckerberg a series of softballs, setting the tone by referring to content moderation as “censorship.” A running theme throughout the interview is the idea that the government tried to restrict information about covid and covid-vaccine issues, Hunter Biden’s laptop, and the election. The man who was rebuked by San Francisco for putting his name on a hospital while he spread misinformation concedes that “on balance, the vaccines are better.” Whew!

On the Rogan show, Zuckerberg went further in describing the fact-checking program he’d implemented: “It’s something out of like 1984.” He says the fact-checkers were “too biased,” though he doesn’t say exactly how.

Facebook and other social media networks took a harsher stance on fake news after the 2016 election, making it harder for Macedonian teenagers to make money off of Trump supporters. In Zuckerberg’s telling, this intervention gave too much deference to people in the media who insisted there was no way Trump could have been elected except through misinformation.

Every lawyer I know foams at the mouth when someone brings up “fire in a crowded theater.” It is not the law, and it never has been. And, obviously, you can yell “fire” in a crowded theater — especially if, you know, the theater is on fire. Rogan says nothing in response to this, and Zuckerberg knows he’s got a willing mark. You can get away with the small bullshit if you do it right.

Zuckerberg, CEO of Facebook’s parent company Meta, sets the tone at the very beginning: “I think at some level you only start one of these companies if you believe in giving people a voice, right?”

Unfortunately I wasn’t born yesterday, and I remember Zuckerberg’s first attempt at getting rich: FaceMash, a clone of HotOrNot where he uploaded photos of his fellow female students to be rated — without their consent. One way of describing that is giving people a voice. Personally, I’d call it “creep shit.”

But Zuckerberg wants us to believe this isn’t about politics at all. Getting Rogan’s listeners riled up about Zuckerberg’s enemies and handing Republicans a new tech-company target is just a coincidence, as are the changes allowing more hate speech on his platforms, which happen to be arriving now and happen to pacify Republicans. None of this has anything to do with the incoming administration, Zuckerberg tells Rogan. Some people, he says, see it as a purely political move because the timing right after the election looks convenient; really, he insists, Meta just tries to have policies that are in tune with mainstream discourse.

Here he is, at length: “They found a theory they wanted to look into. They were really trying hard, right? To like, to find, to look for, to know, but it is not known. It just, it kind of, like, throughout the, the, the, the, the party and the government, there was just sort of, I don’t know if it’s, I don’t know how this stuff works. I have never been in government. I don’t know if it’s like a directive or it’s just like a quiet consensus that like, we don’t like these guys. They aren’t doing what we want. We’re going to punish them. It is difficult to be at the other end of that.”

This is a compelling demonstration that jujitsu and MMA training will not help you act tough if you are constitutionally bitch made. Blaming a witch hunt on the Consumer Financial Protection Bureau, when it is Republicans who have spent years targeting Facebook, is really something! That’s what this whole performance is about: getting Trump, Vance, Jordan, and the rest of the Republican party to lay off. The Cambridge Analytica scandal cost Facebook just $5 billion. If Zuckerberg plays ball, his next privacy whoopsie could be even cheaper.

In fact, Zuckerberg even offers Republicans another target: Apple. Among his complaints: the way Apple makes money is by squeezing people.

There is an actual antitrust case against Apple, so at least some of these issues matter. But that isn’t what’s on Zuckerberg’s mind; the last complaint is the important one, from his perspective. He has a longstanding grudge against Apple dating to the company’s rollout of anti-tracking features in its default browser, Safari. Facebook made its displeasure with the changes known in newspaper ads. The policy cost social media companies over $10 billion, according to the Financial Times. It turns out that most people don’t want to be tracked, and that’s bad for Facebook’s business.

Did any of this work? Here’s the social media mogul talking about how social media needed more masculinity to win over the boys. Well, Barstool’s Dave Portnoy isn’t fooled by this shit.

Fact-checking can help convince people that the information they are reading is true and trustworthy, according to a social psychologist at the University of Cambridge. Studies show very reliable evidence that fact-checking can reduce belief in false claims.

The company said that the move was to make sure fact-checkers didn’t have political bias. “Experts, like everyone else, have their own biases and perspectives. This showed up in the choices people made about how to fact check.”

For example, a 2019 meta-analysis of the effectiveness of fact-checking, covering more than 20,000 people, found a “significantly positive overall influence on political beliefs”.

Ideally, people would not be exposed to misinformation in the first place. “But if we have to work with the fact that people are already exposed, then reducing it is almost as good as it’s going to get.”

Fact-checking is less effective when an issue is polarized, says Jay Van Bavel, a psychologist at New York University. “If you’re fact-checking something around Brexit in the UK or the election in the United States, that’s where fact-checks don’t work very well,” he says. “In part that’s because people who are partisans don’t want to believe things that make their party look bad.”

On Facebook, articles and posts deemed false by fact-checkers are currently flagged with a warning. They are also shown to fewer users by the platform’s recommendation algorithms, Mantzarlis says, and people are more likely to ignore flagged content than to read and share it.

Flagging posts as problematic could also have knock-on effects on other users that are not captured by studies of the effectiveness of fact-checks, says Kate Starbird, a computer scientist at the University of Washington in Seattle. “Measuring the direct effect of labels on user beliefs and actions is different from measuring the broader effects of having those fact-checks in the information ecosystem,” she adds.

He also says that conservatives spread more misinformation than liberals do. When one party is spreading most of the misinformation, fact-checking is going to look biased against that party even when it isn’t.
