Social Media Self-Regulation Has Failed, Say New America Panelists

Elijah Labby

Photo of Amnesty International researcher Joe Westby by ITU Pictures, used with permission

July 7, 2020 — Social networks have failed to self-regulate, said Joe Westby, technology and human rights researcher at Amnesty International, in a Tuesday webinar hosted by New America.

The event, titled “How Advertising Algorithms Drive the Internet’s Favorite Business Model,” saw participants discuss how algorithms impact user experiences on social media platforms.

The internet has seen substantial changes over the past two decades, Westby said, and think tanks in the early 2000s warned about the kind of advertising-based business model that is prevalent across social media platforms like Facebook today.

“We’ve got to the point where we just accept [that] this is the way the internet works, but we need to remember that… the internet wasn’t always reliant on this business model, and there are alternatives,” he said.

Such advertising requires mass attention on the platforms, said Nathalie Marechal, a senior policy analyst at Ranking Digital Rights. Websites need to know who users are and what they do, as well as what interests them, so that they can capture their attention for as long as possible.

“That creates a set of powerful incentives that have led to the social platforms that we know and love — or love to hate, or hate to love — and how we engage with them,” she said.

Such algorithmic ad-based revenue streams can create problems, said Morgan Williams, general counsel at the National Fair Housing Alliance.

Lawsuits “can play a very important role in changing the practice of individual operators, but also in informing the market more broadly about where prospective liability may lie,” he said.

The legal process often assists users by clarifying rules and making grey areas easier to navigate, Williams added.

New social media lawsuits “may help to sort of clarify some of those legal questions in ways that we think would expand the scope of civil rights protections,” he said.

Elijah Labby was a Reporter with Broadband Breakfast. He was born in Pittsburgh, Pennsylvania and now resides in Orlando, Florida. He studies political science at Seminole State College, and enjoys reading and writing fiction (but not for Broadband Breakfast).
