But that alternative is not Gab.ai, which is mostly home to white supremacists (AFAICT).
Twitter is so bad, though, that I doubt it can continue to grow and stay viable. It's too partisan and hideously biased against conservatives.
It's a joke, frankly.
But like I said, it's the place for politics on social media until a genuine alternative emerges.
At WSJ, "Inside Twitter’s Long, Slow Struggle to Police Bad Actors":
Twitter soon plans "to start showing users a picture of a tombstone in the place of a tweet that has been taken down as a way to signal that a user has violated a company policy" https://t.co/nBwoDH5TrS
— Hadas Gold (@Hadas_Gold) September 3, 2018
When Twitter Inc. Chief Executive Jack Dorsey testifies before Congress this week, he’ll likely be asked about an issue that has been hovering over the company: Just who decides whether a user gets kicked off the site?
To some Twitter users—and even some employees—it is a mystery.
In policing content on the site and punishing bad actors, Twitter relies primarily on its users to report abuses and has a consistent set of policies so that decisions aren’t made by just one person, its executives say.
Yet, in some cases, Mr. Dorsey has weighed in on content decisions at the last minute or after they were made, sometimes resulting in changes and frustrating other executives and employees, according to people familiar with the matter.
Understanding Mr. Dorsey’s role in content decisions is crucial as Twitter tries to become more transparent, to its 335 million users as well as to lawmakers, about how it polices toxic content on its site.
In a hearing Wednesday morning before the Senate Intelligence Committee, Mr. Dorsey will appear alongside Facebook Inc. Chief Operating Officer Sheryl Sandberg to discuss how foreign actors can use the social-media platforms to spread misinformation and propaganda. Later in the day, the House Commerce Committee will question Mr. Dorsey individually in a Republican-led look at how Twitter treats conservative voices.
The latter hearing “is about pulling back the curtain on Twitter’s algorithms, how the company makes decisions about content, and how those decisions impact Americans,” said Rep. Greg Walden (R., Ore.), the chairman of the House Commerce Committee.
Twitter and rival Facebook are increasingly caught in a Catch-22 situation—criticized by some users for allowing hateful posts, but blasted by others for removing content because it curtails free speech.
Twitter has taken a different approach from Facebook, which has hired thousands of content reviewers over the past couple of years to review posts and built out technology to flag inappropriate content. Twitter has a far smaller staff and typically investigates only harassment and abuse that users have reported.
Last month, after Twitter’s controversial decision to allow far-right conspiracy theorist Alex Jones to remain on its platform, Mr. Dorsey told one person that he had overruled a decision by his staff to kick Mr. Jones off, according to a person familiar with the discussion. Twitter disputes that account and says Mr. Dorsey wasn’t involved in those discussions.
Twitter’s initial inaction on Mr. Jones, after several other major tech companies banned or limited his content, drew fierce backlash from the public and Twitter’s own employees, some of whom tweeted in protest.
A similar chain of events unfolded in November 2016, when the firm’s trust and safety team kicked alt-right provocateur Richard Spencer off the platform, saying he was operating too many accounts. Mr. Dorsey, who wasn’t involved in the initial discussions, told his team that Mr. Spencer should be allowed to keep one account and stay on the site, according to a person directly involved in the discussions.
Twitter says Mr. Dorsey doesn’t overrule staffers on content issues. The company declined to make Mr. Dorsey available...