With Misinformation About Coronavirus Rampant, Facebook Officials Are Taking More Aggressive Steps

Elijah Labby

Screenshot of Facebook's Keren Goldshlager and Josh Mabry and the NAB's John Clark from the webcast

May 14, 2020 — Facebook is taking extra steps to stop the rampant spread of coronavirus misinformation, company officials said at a National Association of Broadcasters event Thursday.

Keren Goldshlager, a member of Facebook’s Integrity Partnerships team, said that the platform’s approach to content moderation could be summarized in three words: remove, reduce and reform.

For Facebook to remove a post, she said, it must be inauthentic, unsafe, privacy-invasive, or degrading to others.

However, Facebook officials said that the company deliberately does not always delete false news.

“We think it’s really important to strike a balance between enabling free expression and discussion on the platform, but also making sure that the most harmful misinformation isn’t going viral,” she said.

Goldshlager said that Facebook partners with over 60 fact-checking organizations worldwide to verify the authenticity of the most widely spread content on the platform. If content is deemed untruthful, Facebook does not remove it, but instead adds a note explaining the degree to which the post is false and links users to additional information.

Local News Partnerships Lead Josh Mabry detailed the platform’s $100 million commitment to journalism. The fund, which goes to local news organizations, aims to support businesses fighting against false information.

“To date, we’ve announced around 600 grants in total through this program,” Mabry said. “Recently, we announced 200 grants, totaling around $16 million, going to various kinds of publishers … from broadcast to newspapers to digital.”

Mabry said that the fund largely consists of relief funding for journalistic organizations that are low on resources, adding that something as simple as an email newsletter, though often out of reach for a small outlet, can make all the difference.

“That seems like such a thing that is not new … but it is this format that people really gravitate towards,” Mabry said. “People will sign up for a newsletter, they like having the news come directly to them… and it’s one other way to form that direct relationship with your audience.”

Elijah Labby was a Reporter with Broadband Breakfast. He was born in Pittsburgh, Pennsylvania and now resides in Orlando, Florida. He studies political science at Seminole State College, and enjoys reading and writing fiction (but not for Broadband Breakfast).
