
Big Tech

Democrats Seek #OneMoreVote for Net Neutrality and Launch Congressional Review Act Against Trump FCC


WASHINGTON, February 28, 2018 – Advocates of net neutrality are pushing for congressional action to overturn the Federal Communications Commission’s December vote repealing the rules. But they will have an uphill battle if they expect President Trump to sign off on gutting an action by his own appointee to lead the FCC.

In his first year in office, Trump signed 15 congressional joint resolutions meant to repeal regulations passed in the waning days of the Obama administration. Now, House and Senate Democrats – plus one Senate Republican, Sen. Susan Collins of Maine – support what would be a 16th joint congressional resolution.

The regulations formerly known as net neutrality rules

The regulations formerly known as net neutrality rules prohibited broadband providers like Comcast and Verizon from interfering with users’ internet traffic or prioritizing some traffic over others. The FCC under Obama-appointed Chairman Tom Wheeler did this by classifying broadband internet access services as common carriers under Title II of the Communications Act.

The FCC overturned those rules in a party-line vote on December 14. The repeal was published in the Federal Register on February 22 and will therefore take effect 60 days later, on April 23, 2018.

Supporters of the Obama-era rules have responded to the FCC’s decision to abandon them by filing a federal lawsuit against the agency, in hopes that a court will order Chairman Ajit Pai and his colleagues to return to enforcing common carrier regulations on broadband providers.

Using the Congressional Review Act to reverse agency actions

But other longtime network neutrality proponents, like Sen. Ed Markey, D-Mass., hope to show that turnabout is much more than “fair play” by reinstating the Title II open internet rules with the same strategy congressional Republicans and President Trump used to dismantle Barack Obama’s legacy: Congressional Review Act resolutions overturning regulations on the environment, gun safety, financial consumer protection, broadband privacy, and workplace safety.

Markey and his 49 Senate co-sponsors hope the 60-day window will give them enough time to avoid a 50-50 tie, which Vice President Mike Pence would break against them. For that, Markey and Collins will need one more vote, presumably from a Republican.

Meanwhile, House Democrats will need to hustle to line up enough GOP members to secure a bare majority in the House. The passage of such a resolution – if signed by Trump – would “disapprove” Pai’s rules and prevent the FCC from ever repealing the Title II reclassification unless Congress specifically gives it the authority in a separate bill.

Statements from a Capitol Hill media event to promote the CRA resolution

“The grassroots movement to reinstate net neutrality is growing by the day, and we will get that one more vote needed to pass my CRA resolution,” Markey said while speaking at a Capitol Hill media event to promote his resolution and bring attention to what activists called a “day of action” on network neutrality.

Markey also urged his Republican colleagues to join the “overwhelming majority of Americans” who support a free and open internet. “The internet is for all – the students, teachers, innovators, hard-working families, small businesses, and activists, not just Verizon, Charter, AT&T, and Comcast and corporate interests.”

Markey’s resolution presents an opportunity for Republicans “to right this administration’s wrong and reinstate the FCC’s Open Internet Order,” said Senate Minority Leader Chuck Schumer, D-New York.

“It’s time the Republicans show the American people whose side they’re on: big ISPs and major corporations or consumers, entrepreneurs, and small business owners.”

House Democratic Leader Nancy Pelosi, D-California, called on her Republican colleagues to “stop the FCC assault on consumer choice and consumer protections” by supporting the House version of Markey’s bill, even as it has failed to gain a single GOP co-sponsor since being introduced by Rep. Mike Doyle, D-Penn.

“The Trump Administration’s attacks on net neutrality deliver a disastrous blow to consumers, small businesses and the American entrepreneurship that is the envy of the world,” Pelosi said. “It gives me great pride to stand with Democrats and millions of Americans to defend the promise of a free, open Internet.”

Social media action using the #OneMoreVote hashtag

The “day of action” had a strong online presence, as activists used the #OneMoreVote hashtag to raise awareness on various social media platforms.

The offline protests, which were organized by Free Press, Fight for the Future and Demand Progress, the advocacy group founded by the late internet activist Aaron Swartz, went beyond the Capitol Hill rally at which Markey spoke.

These activists coordinated protests outside the district offices of eight Republican senators: Cory Gardner, Rob Portman, Jerry Moran, Orrin Hatch, Lisa Murkowski, and Marco Rubio, as well as Dean Heller of Nevada – whose seat is often considered a potential Democratic pickup – and John Kennedy of Louisiana.

Markey’s resolution also gained praise from Chris Lane, a vice president at Public Knowledge, a consumer advocacy group that has long advocated strong open internet protections. Lane said his group applauds Markey, Schumer, and their colleagues “for their leadership introducing this resolution to overturn the FCC’s net neutrality repeal.”

“Without the FCC protecting consumers, the prices of broadband continue to rise, privacy breaches online stack up, and communities are given sub-standard internet connections through redlining in urban areas and neglect in rural ones,” Lane said.

“The CRA provides the fastest way to restore strong net neutrality rules that are wildly popular, working to produce billions of dollars in investment and innovation, and were upheld in court twice. Only in Washington, where high-paid lobbyists hold sway, is this a controversial set of rules.”

In the end, the Trump White House vows it will not support a CRA action against the FCC

But even if Markey’s Senate bill finds a second Republican vote and Doyle’s House bill garners such overwhelming bipartisan support that House Speaker Paul Ryan, R-Wisconsin, allows it to reach the floor, White House Deputy Press Secretary Hogan Gidley poured cold water on Democrats’ hopes for using the CRA.

“The Trump Administration supports the FCC’s efforts to roll back burdensome, monopoly-era regulations,” Gidley said, when asked whether Trump would sign a resolution to effectively reinstate the Obama-era open internet rules his own FCC chairman voted to overturn.

(Image from Fight for the Future used with permission.)

Andrew Feinberg is the White House Correspondent and Managing Editor for Breakfast Media. He rejoined BroadbandBreakfast.com in late 2016 after working as a staff writer at The Hill and as a freelance writer. He worked at BroadbandBreakfast.com from its founding in 2008 to 2010, first as a Reporter and then as Deputy Editor. He also covered the White House for Russia's Sputnik News from the beginning of the Trump Administration until he was let go for refusing to use White House press briefings to promote conspiracy theories, and later documented the experience in a story which set off a chain of events leading to Sputnik being forced to register under the Foreign Agents Registration Act. Andrew's work has appeared in such publications as The Hill, Politico, Communications Daily, Washington Internet Daily, Washington Business Journal, The Sentinel Newspapers, FastCompany.TV, Mashable, and Silicon Angle.

Section 230

Experts Warn Against Total Repeal of Section 230

Panelists note shifting definition of offensive content.


WASHINGTON, November 22, 2021 – Communications experts say action by Congress to essentially gut Section 230 would not truly solve any problems with social media.

Experts emphasized that it is not possible for platforms to remove from their site all content that people may believe to be dangerous. They argue that Section 230 of the Communications Decency Act, which shields platforms from legal liability with respect to what their users post, is necessary in at least some capacity.

Speaking at Broadband Breakfast’s Live Online Event on Wednesday, Alex Feerst, co-founder of the Digital Trust and Safety Partnership and a former content moderator, said it is impossible, to a certain extent, for platforms to moderate “dangerous” speech because every person has a different opinion about what speech is dangerous. It is this ambiguity, he said, that Section 230 protects companies from.

Still, Feerst believes platforms should bear some degree of liability for the content on their sites, since mitigating the harm of dangerous speech is necessary where possible. Platforms’ use of artificial intelligence, he said, makes some degree of liability even more essential.

Given the sheer volume of online speech that moderators must review in the internet age, Feerst said clear-cut moderation standards are too messy and expensive to be viable options.

Matt Gerst, vice president for legal and policy affairs at the Internet Association, and Shane Tews, nonresident senior fellow at the American Enterprise Institute, also said that while content moderation is complex, it is necessary. Scott McCollough, attorney at McCollough Law Firm, said large social media companies like Facebook are not the cause of all the problems with social media now in the national spotlight; rather, he argued, features of today’s society, such as the extreme prevalence of conflict, are to blame for the focus on social media.

Proposals for change

Rick Lane, CEO of Iggy Ventures, proposed that Section 230 reform should include a requirement for social media platforms to make very clear what content is and is not allowed on their sites. McCollough echoed this concern, saying that many moderation actions platforms currently take do not seem to be consistent with those platforms’ stated terms and conditions, and that individual states should be able to examine such instances on a case-by-case basis to determine whether platforms apply their terms and conditions fairly.

Feerst highlighted the nuance of the issue, saying that people’s definitions of “consistent” are naturally subjective, but he agreed with McCollough that users who have content removed should be notified, along with the reasoning for the moderators’ action.

Lane also believes Section 230 reform should rightfully include a requirement that platforms demonstrate a reasonable standard of care and moderate illegal and other extremely dangerous content on their sites. Tews generally agreed with Lane that such content moderation is complex, as she sees a separation between freedom of speech and illegal activity.

Gerst highlighted concerns from companies the Internet Association represents that government regulation stemming from Section 230 reform would require widely varied platforms to standardize their operating approaches, diminishing innovation on the internet.


Big Tech

Experts Caution Against One-Size-Fits-All Approach to Content Moderation

Cost of moderation is another reason some experts say standardized content moderation policies may not work for all.


Former President Donald Trump sued Facebook, Twitter and Google earlier this year

WASHINGTON, November 10, 2021 – Some experts say they are concerned about a lack of diversity in content moderation practices across the technology industry because some companies may not be well-served – and could be negatively affected – by uniform policies.

Many say following what other influential platforms do, like banning accounts, could do more harm than good when it comes to protecting free speech on the internet.

Since former President Donald Trump was banned from Twitter and Facebook for allegedly stoking the January Capitol riot, debate has raged about what Big Tech platforms should do when certain accounts cross the generally protected free speech line into promoting violence, disobedience, or other illegal behavior.

But at a Knight Foundation event on November 2, panelists said standardized content moderation policies imply a one-size-fits-all approach that would work across the tech spectrum. In fact, experts say, it won’t.

Lawmakers have been calling for commitments from social media companies to agree to content and platform policies, including increasing protections for minors online. But representatives from Snapchat, TikTok, and YouTube who sat before members of the Senate Commerce Subcommittee on Consumer Protection last month did not commit to that.

Facebook itself has an Oversight Board that is independent of the company; the Board earlier this year upheld Trump’s ban from the platform but recommended the company set a standard for the penalty (Trump was banned indefinitely).

Among the solutions proposed for many platforms is a move toward decentralized content regulation, with more moderation delegated to individuals who are not employed by the platforms. There are even suggestions of offering immunity from certain antitrust regulation as an incentive for platforms to implement decentralized structures.

Costs of content moderation

At an Information Technology and Innovation Foundation event on Tuesday, experts suggested a level of decentralization built around user tools, as opposed to plowing money into employing content moderators.

Experts also noted the expense of hiring content moderators. Global social media platforms must hire employees able to moderate content in every language and dialect they serve, and the accumulation of these hiring costs could prove fatal to many platforms.


Social Media

Social Media Companies Noncommittal on Bipartisan Calls for Changes to Content Regulation

Platform representatives did not commit to legislation that would increase online protections for kids.


Sen. Richard Blumenthal, D-Connecticut

WASHINGTON, October 28, 2021 – Members of the Senate Commerce Subcommittee on Consumer Protection on Tuesday lobbed concerns at representatives from Snapchat, TikTok and YouTube about what their platforms put in front of kids. The platforms did not commit to changes proposed by lawmakers, who are winding down a month that included revelations of the negative impact social media can have on the mental health of kids.

During the hearing, subcommittee chairman Sen. Richard Blumenthal, D-Connecticut, said his staff had created a TikTok account; while at first they were shown videos of dance trends popularized on the app, it took only one week for the app’s algorithm to place videos encouraging suicidal ideation in their feed. Blumenthal also noted that, after viewing fitness-related videos geared toward a male audience on social media, it took only one minute to find posts promoting illegal steroids.

Blumenthal also raised other concerning videos his staff found, including a stunt whereby kids are encouraged to hold their breath until they lose consciousness.

In response, Michael Beckerman, TikTok’s head of public policy, stated that TikTok has “not been able to find any evidence of a blackout challenge on TikTok at all.” In response to Beckerman, Blumenthal said that his office had been able to find “pass out videos” and that he found Beckerman’s statements on the matter to be unreliable.

Tuesday’s hearing comes mere weeks after a Facebook whistleblower testified that the company does not act on its own internal research showing that its photo-sharing app Instagram has a negative impact on kids’ health, because doing so would conflict with its profit motives. The testimony came after the whistleblower, Frances Haugen, leaked the research to the Wall Street Journal and the Securities and Exchange Commission. Since then, Facebook has halted development of an Instagram app for kids.

The hearing pressed tech platform representatives on social media policies that lawmakers say have led to the sale of illegal drugs to minors online, the exposure of minors to content that promotes self-harm, and sexual predators’ access to children.

Senators also criticized the social media platforms’ lack of data privacy policies and contended that they often refuse to cooperate with law enforcement investigations as well as display indifference toward keeping children from using their platforms. Both Snapchat and TikTok’s representatives committed to providing access to the algorithms used in their apps after Senators asked whether they would.

However, the representatives would not all commit their companies to supporting proposed legislation such as the Children and Teens’ Online Privacy Protection Act, written by subcommittee member Sen. Ed Markey, D-Massachusetts, which would prohibit the collection of personal information from kids ages 13 to 15 without consent, ban targeted advertising directed at kids, and let kids and teens erase any personal information collected on them at any point with an erase button.

The representatives also did not commit to supporting the EARN IT Act of 2020, which would amend Section 230 and allow social media platforms to be held liable in cases where they are suspected of having caused harm to children. Throughout the hearing, the social media representatives tended to emphasize the importance of parents taking an active role in controlling what their children view on social media.
