Will The Kyle Rittenhouse Verdict Change How Facebook Treats Gun Violence—Again?


KENOSHA, WISCONSIN – NOVEMBER 17: Kyle Rittenhouse stands back as attorneys discuss a motion for mistrial filed by his defense during his trial at the Kenosha County Courthouse on November 17, 2021, in Kenosha. Rittenhouse, of Antioch, Illinois, was 17 years old when he was accused of killing two demonstrators and wounding a third during unrest in Kenosha that followed the police shooting of Jacob Blake, who was shot seven times in the back. The charges against Rittenhouse included felony homicide and felony attempted homicide. Photo by Sean Krajacic/Pool/Getty Images

When Kyle Rittenhouse shot and killed two men and wounded a third in August 2020, Facebook took relatively swift action. A day after the incident in Kenosha, Wisconsin, it removed his Facebook and Instagram accounts, began prohibiting posts praising him and blocked his name from the apps’ search function.

The moves came as part of a new Facebook policy on violence and mass shootings that debuted that same week, though it’s unclear whether the policy arrived before or after Rittenhouse shot the men. And as part of its decision to reduce Rittenhouse’s presence on the platform, the company officially designated him a “mass shooter.”

The steps were immediately criticized inside Facebook. In a post to Facebook’s internal Workplace message board days later, one employee wrote: “If Kyle Rittenhouse had killed 1 person instead of 2, would it still qualify as a mass shooting? Can we really consistently and objectively differentiate between support (not allowed) and discussion of whether he is being treated justly (allowed)?”

The Workplace post went on: “Can we really handle this at scale in an objective way without making critical errors in both under and over enforcement?”

The comment hits the mark. Facebook has spent years deliberating over what content it should regulate and how. The company has been criticized by liberals and conservatives alike, leaving it pulled in two directions and rarely pleasing either side.

Recently, pressure has mounted on the company to take a stronger stance against content that incites violence. That might seem like something that could attract universal support. It hasn’t. The matter grew thornier for Facebook on Friday, when a jury found Rittenhouse not guilty, sparking outcries among right-wing pundits who claimed Facebook had unfairly penalized him. (His lawyers successfully argued that he had acted in self-defense that August evening in Kenosha, a city then gripped by protests over the police shooting of Jacob Blake.)

Facebook has long been reluctant to make judgment calls about what belongs on its site, and when it has prohibited material such as violent content, it has not always succeeded in keeping that material off its platform. The most shocking example was the March 2019 mass shooting in Christchurch, New Zealand, where the gunman livestreamed the attack on Facebook and copies quickly spread to YouTube. No one reported the video to Facebook until 29 minutes after the broadcast began, and nothing in the footage triggered Facebook’s automated moderation software, according to an internal Facebook report on the Christchurch shooting. Facebook eventually shut the stream down, but the company had to remove 1.5 million copies of the video over the following day. In response, Facebook changed several of its live-video policies, including speeding up automated review: a broadcast could previously run for five minutes before the software examined it, but after Christchurch that window shrank to around 20 seconds.

Like many of Facebook’s policy changes, these were reactive. In recent years, Facebook has often seemed to be playing catch-up with events on its own platform. Facebook CEO Mark Zuckerberg acknowledged that the company had failed to remove a Facebook page encouraging the formation of militias in the same Wisconsin city where Rittenhouse shot the men; users had reported the militia group 455 times before Facebook took it down. And in January, Facebook took measures against posts related to the U.S. Capitol riot only in the aftermath of the insurrection, even though sentiment delegitimizing the election had blossomed on Facebook in the months after Joe Biden’s victory, another internal Facebook report shows.

Rittenhouse’s verdict raises many questions. When should the “mass shooter” label be affixed to someone: before a trial, after one, or ever? How should Facebook go about reducing the reach of related posts? Is it time to scrap the mass-shooter policy?

Over the weekend, Facebook, which didn’t return requests for comment, was backtracking again. It lifted its block on searching for “Kyle Rittenhouse” on Facebook and Instagram, helping posts about Rittenhouse from right-wing media personalities like Ben Shapiro and Dan Bongino attract tens of thousands of comments, reshares and reaction emojis, the signals that then boost those posts further up users’ Facebook feeds. One Facebook group, American Patriots Appeal, is offering a $27.99 T-shirt that shows Rittenhouse in a crouched G.I. Joe pose with a semiautomatic rifle, emblazoned with the phrase: “Kyle Rittenhouse Did Nothing Wrong.”

The internal Facebook documents cited here come from materials that Facebook whistle-blower Frances Haugen gave to the SEC. Redacted versions were provided to Congress and to a consortium of news organizations, including SME. They’re popularly known as The Facebook Papers.


