Mark Zuckerberg would like you to know that despite a scathing report in The New York Times, which depicts Facebook as a ruthless, self-concerned corporate behemoth, things are getting better—at least, the way he sees it.
In a lengthy call with reporters Thursday, and an equally lengthy “note” published on Facebook, the company’s CEO laid out a litany of changes Facebook is making, designed to curb toxic content on the platform and provide more transparency into the decisions Facebook makes on content. But perhaps the most consequential update is that the Facebook News Feed algorithm will now try to limit the spread of sensationalist content on the platform, which represents a major change from how the company has traditionally approached moderation. All of it is in service of restoring trust in a company whose public reputation, along with the reputation of its leaders, has taken near-constant body blows over the past two years.
“When you have setbacks like we’ve had this year that’s a big issue, and it does erode trust, and it takes time to build that back,” Zuckerberg said on the call. “Certainly our job is not only to have this stuff at a good level and to continually improve, but to be ahead of new issues. I think over the last couple of years that’s been one of the areas where we’ve been most behind, especially around the election issues.”
Zuckerberg’s words come a day after the Times published a damning report that portrays Facebook as not merely behind on issues of election interference, as Zuckerberg suggests, but actively working to downplay what it knew about that interference. It suggests that Facebook’s executives, wary of picking sides in a partisan battle over Russian interference in the 2016 election, aimed to minimize Russia’s role in spreading propaganda on the platform. The story states that Facebook’s former head of cybersecurity, Alex Stamos, was chastised by the company’s chief operating officer, Sheryl Sandberg, for investigating Russian actions without the company’s approval and berated again for divulging too much information about it to members of Facebook’s board.
In his remarks, Zuckerberg flatly denied this allegation. “We’ve certainly stumbled along the way, but to suggest that we weren’t interested in knowing the truth or that we wanted to hide what we knew or that we tried to prevent investigations is simply untrue,” he said. (Stamos, for his part, tweeted earlier on Thursday that he was “never told by Mark, Sheryl or any other executives not to investigate.”)
The Times story also alleges that Facebook waged a smear campaign against its competitors through an opposition research firm called Definers Public Relations. The firm repeatedly worked to tie Facebook’s detractors, including groups like the Open Markets Institute and Freedom from Facebook, to billionaire George Soros. Critics say that in doing so, Facebook engaged with the same anti-Semitic tropes that have been used by white nationalists and other hate groups that regularly villainize Soros.
Zuckerberg denied having any personal knowledge of Definers’ work with Facebook, and said he and Sandberg only heard about the relationship yesterday. That’s despite the fact that Definers often coordinated large-scale calls with the press on behalf of Facebook and its employees and, in at least one case, sat in on meetings between Facebook and the media.
Zuckerberg says that after he read the story in the Times, Facebook promptly ended its relationship with the firm. “This type of firm might be normal in Washington, but it’s not the type of thing I want Facebook associated with, which is why we’re no longer going to be working with them.”
But while Zuckerberg said he had no knowledge of Definers’ work or its messaging, he defended Facebook’s criticism of activist groups like Freedom from Facebook. He said the intention was not to attack Soros, for whom Zuckerberg said he has “tremendous respect,” but show that Freedom from Facebook “was not a spontaneous grassroots effort.”
Zuckerberg declined to assign blame for the tactics allegedly employed by Definers, or to comment on broader personnel issues within Facebook itself. He said only that Sandberg, who has been overseeing Facebook’s lobbying efforts and who is portrayed unfavorably throughout the Times story, is “doing great work for the company.” “She’s been an important partner to me and continues to be and will continue to be,” Zuckerberg said. (Sandberg was not on the call.)
For the umpteenth time this year, Zuckerberg found himself working overtime to clean up Facebook’s mess, even as he wanted desperately to tout the progress the company’s been making. And it has made important progress. In Myanmar, where fake news on Facebook has animated a brutal ethnic cleansing campaign against the Rohingya people, the company has hired 100 Burmese speakers to moderate content there and is now automatically identifying 63 percent of the hate speech it takes down, up from just 13 percent at the end of last year. Facebook has expanded its safety and security team to 30,000 people globally, more than the 20,000 people the company initially set out to hire this year. It’s also changed its content takedown process, allowing people to appeal the company’s decisions about content they post or report. On Thursday, Facebook announced that within the next year, it will create an independent oversight body to handle content appeals.
But by far the biggest news to come out of Thursday’s announcements is the change coming to Facebook’s News Feed algorithm. Zuckerberg acknowledged what most observers already know to be one of Facebook’s most fundamental problems: that sensationalist, provocative content, even content that doesn’t explicitly violate Facebook’s policies, tends to get the most engagement on the platform. “As content gets closer to the line of what is prohibited by our community standards, we see people tend to engage with it more,” he said. “This seems to be true regardless of where we set our policy lines.”
This issue arguably undergirds most of Facebook’s problems of the past few years. It’s why divisive political propaganda was so successful during the 2016 campaign and why fake news has been able to flourish. Until now, Facebook has operated in a black-and-white environment, where content either violates the rules or it doesn’t, and if it doesn’t, it’s free to amass millions of clicks, even if the poster’s intention is to mislead and stoke outrage. Now Facebook is saying that even content that doesn’t explicitly violate its rules might see its reach reduced. According to Zuckerberg’s post, that includes, among other things, “photos close to the line of nudity” and “posts that don’t come within our definition of hate speech but are still offensive.”
Zuckerberg called the shift “a big part of the solution for making sure polarizing or sensational content isn’t spreading in the system, and we’re having a positive effect on the world.”
With this move, Facebook is taking a risk. Curbing engagement on the most popular content will likely cost the company money. And such a dramatic change no doubt opens Facebook up to even more accusations of censorship, at a time when the company is fending off constant criticism from all angles.
But Facebook is betting big on the upside. If outrage is no longer rewarded with ever more clicks, the thinking goes, maybe people will be better behaved. That Facebook is prepared to take such a chance says a lot about the public pressure that’s been placed on the company these last two years. After all of that, what does Facebook have to lose?
First up, Telegram is pushing the upper limit of groups from 1,000 people to 5,000 people — this comes just four months after the company increased the limit from 200 people to create the thousand-strong so-called “supergroups.” These groups are distinct from normal groups — once your group reaches 200 people you can now elect to upgrade it to supergroup status which optimizes it for larger communities of people. For example, new members will be able to see the whole message history when they join, and when someone deletes a message it will be deleted for everyone in the group. Also, because supergroups can be particularly large, notifications are muted by default to prevent your phone from buzzing itself into oblivion.
In addition to larger groups, Telegram now lets users push supergroups to the public using a shareable short link, meaning anybody can view the group’s conversation history — but they’ll need to join before they can post messages. Group admins will also be given extra controls to thwart spammers, including blocking and reporting tools. Public groups are already live in Europe and the U.S., and will be rolling out gradually to other countries, though interestingly Telegram said that “several countries in Asia” don’t yet have the feature due to a history of “significant spam activity.”
Elsewhere, supergroup admins can now pin important news to the top of a chat, meaning everyone who joins for the first time or opens the app after some time away will see the message. This is similar to features in other messaging apps and social networks, such as Twitter, which lets you pin a tweet to the top of your timeline.
Founded in 2013 by Pavel Durov (creator of Russian social networking giant VK) and his brother Nikolai, Telegram has emerged as a major player in the increasingly competitive chat app realm. This is due in part to the company’s focus on encryption; the app also offers a secret chat feature that makes it easy to delete messages or schedule a time for them to self-destruct.
A few weeks back, Telegram announced it had passed 100 million monthly active users (MAUs), representing a 60 percent rise in just nine months. While this is still some way behind competitors like Facebook Messenger and WhatsApp, which claim almost two billion MAUs between them, it’s still a sizable user base. And by focusing on building not only the size of groups but also their visibility, the company is hoping it can maintain its recent growth spurt.