Big Tech

FCC Chief Appoints Two Key Staffers

WASHINGTON, September 30, 2010 – Federal Communications Commission Chairman Julius Genachowski has named senior agency staff in the Media and Wireless Telecommunications bureaus: Michelle Carey as a Media Bureau deputy, and Michael McKenzie as a Wireless Telecommunications Bureau deputy and senior advisor on new technology.

Carey will assist in shaping the bureau’s policies designed to facilitate competition in the multichannel video programming marketplace. Prior to this appointment, she was a senior advisor to Assistant Secretary Larry Strickling at the National Telecommunications and Information Administration, where she assisted in implementing the $4.4 billion Broadband Technology Opportunities Program.

McKenzie will oversee the Mobility and the Spectrum Management Resources & Technologies (SMaRT) divisions within WTB and help provide direction for the bureau’s activities, including its role in advancing the goals and recommendations of the National Broadband Plan. McKenzie will focus on the FCC’s consideration of the implications of emerging technologies and contribute to the commission’s work on America’s innovation agenda. Prior to joining the FCC, Mr. McKenzie worked in senior-level legal, business development, and strategy positions. Most recently, he served as a general manager at Microsoft, where he supported the company’s worldwide enterprise cloud computing services.

Section 230

Experts Warn Against Total Repeal of Section 230

Panelists note shifting definition of offensive content.

WASHINGTON, November 22, 2021 – Communications experts say action by Congress to essentially gut Section 230 would not truly solve any problems with social media.

Experts emphasized that it is not possible for platforms to remove from their site all content that people may believe to be dangerous. They argue that Section 230 of the Communications Decency Act, which shields platforms from legal liability with respect to what their users post, is necessary in at least some capacity.

During a discussion among these experts at Broadband Breakfast’s Live Online Event on Wednesday, Alex Feerst, co-founder of the Digital Trust and Safety Partnership and a former content moderator, said it is to a certain extent impossible for platforms to moderate “dangerous” speech because every person has a different opinion about what speech counts as dangerous. It is this ambiguity, he said, that Section 230 protects companies from.

Still, Feerst believes platforms should bear some degree of liability for the content on their sites, since mitigating the harm of dangerous speech is necessary where possible. He believes platforms’ use of artificial intelligence makes some degree of liability even more essential.

Particularly given the volume of online speech moderators must review in the internet age, Feerst said clear-cut moderation standards are too messy and expensive to be viable.

Matt Gerst, vice president for legal and policy affairs at the Internet Association, and Shane Tews, nonresident senior fellow at the American Enterprise Institute, also said that while content moderation is complex, it is necessary. Scott McCollough, attorney at McCollough Law Firm, said large social media companies like Facebook are not the cause of all the problems now in the national spotlight; rather, features of today’s society, such as the extreme prevalence of conflict, are driving the focus on social media.

Proposals for change

Rick Lane, CEO of Iggy Ventures, proposed that any reform of Section 230 include a requirement that social media platforms make clear what content is and is not allowed on their sites. McCollough echoed this concern, saying many moderation actions platforms currently take do not appear consistent with their stated terms and conditions, and that individual states should be able to examine such instances case by case to determine whether platforms apply their terms and conditions fairly.

Feerst highlighted the nuance of the issue, noting that people’s definitions of “consistent” are naturally subjective, but he agreed with McCollough that users whose content is removed should be notified, along with the reasoning for the moderators’ action.

Lane also believes that Section 230 reform should rightly include a requirement that platforms demonstrate a reasonable standard of care and moderate illegal and other extremely dangerous content on their sites. Tews generally agreed with Lane that such content moderation is complex, as she sees a separation between freedom of speech and illegal activity.

Gerst highlighted concerns from companies the Internet Association represents that government regulation stemming from Section 230 reform would force widely varied platforms to standardize how they operate, diminishing innovation on the internet.

Big Tech

Experts Caution Against One Size Fits All Approach to Content Moderation

The cost of moderation is another reason some experts say standardized content moderation policies may not work for all.

Former President Donald Trump sued Facebook, Twitter and Google earlier this year

WASHINGTON, November 10, 2021 – Some experts say they are concerned about a lack of diversity in content moderation practices across the technology industry because some companies may not be well-served – and could be negatively affected – by uniform policies.

Many say following what other influential platforms do, like banning accounts, could do more harm than good when it comes to protecting free speech on the internet.

Since former President Donald Trump was banned from Twitter and Facebook for allegedly stoking the January Capitol riot, debate has raged about what Big Tech platforms should do when certain accounts cross the generally protected free speech line into promoting violence, disobedience, or other illegal behavior.

But experts at a Knight Foundation event on November 2 said standardized content moderation policies assume a one-size-fits-all approach would work across the tech spectrum. In fact, they say, it won’t.

Lawmakers have been calling for commitments from social media companies to agree to content and platform policies, including increasing protections for minors online. But representatives from Snapchat, TikTok, and YouTube who sat before members of the Senate Commerce Subcommittee on Consumer Protection last month did not commit to that.

Facebook itself has an Oversight Board that is independent of the company; the Board earlier this year upheld Trump’s ban from the platform but recommended the company set a standard for the penalty (Trump was banned indefinitely).

Among the proposed solutions for many platforms is a move toward decentralized content regulation, with more moderation delegated to individuals who are not employed by the platforms. Some have even suggested offering immunity from certain antitrust regulation as an incentive for platforms to implement decentralized structures.

Costs of content moderation

At an Information Technology and Innovation Foundation event on Tuesday, experts suggested a level of decentralization that would rely on user tools rather than pouring money into employing content moderators.

Experts noted the expense of hiring content moderators. Global social media platforms must hire employees able to moderate content in every language and dialect they serve, and the accumulation of these hiring costs could be fatal to many platforms.

Social Media

Social Media Companies Noncommittal on Bipartisan Calls for Changes to Content Regulation

Platform representatives did not commit to legislation that would increase online protections for kids.

Sen. Richard Blumenthal, D-Connecticut

WASHINGTON, October 28, 2021 – Members of the Senate Commerce Subcommittee on Consumer Protection on Tuesday lobbed concerns at representatives from Snapchat, TikTok and YouTube about what their platforms put in front of kids. The platforms did not commit to changes proposed by lawmakers, who are winding down a month that included revelations about the negative impact social media can have on kids’ mental health.

During the hearing, subcommittee chairman Sen. Richard Blumenthal, D-Connecticut, said his staff had created a TikTok account; while at first they were shown videos of dance trends popularized on the app, it took only one week for the app’s algorithm to place videos encouraging suicidal ideation in their feed. Blumenthal also noted that, after viewing fitness-related videos geared toward a male audience on social media, it took only one minute to find posts promoting illegal steroids.

Blumenthal also raised other concerning videos his staff found, including a stunt whereby kids are encouraged to hold their breath until they lose consciousness.

In response, Michael Beckerman, TikTok’s head of public policy, stated that TikTok has “not been able to find any evidence of a blackout challenge on TikTok at all.” In response to Beckerman, Blumenthal said that his office had been able to find “pass out videos” and that he found Beckerman’s statements on the matter to be unreliable.

Tuesday’s hearing came mere weeks after a Facebook whistleblower testified that the company does not act on its own internal research showing that its photo-sharing app Instagram has a negative impact on kids’ health, because doing so conflicts with its profit-driven motives. The testimony came after the whistleblower, Frances Haugen, leaked the research to the Wall Street Journal and the Securities and Exchange Commission. Since then, Facebook has halted development of an Instagram app for kids.

The hearing pressed tech platform representatives on social media policies that lawmakers say have led to the sale of illegal drugs to minors online, the exposure of minors to content that promotes self-harm, and sexual predators gaining access to children.

Senators also criticized the social media platforms’ lack of data privacy policies, contending that the companies often refuse to cooperate with law enforcement investigations and show indifference toward keeping children off their platforms. Both Snapchat’s and TikTok’s representatives committed to providing access to the algorithms used in their apps after senators asked whether they would.

However, the representatives would not all commit their companies to supporting proposed legislation such as the Children and Teens’ Online Privacy Protection Act, written by subcommittee member Sen. Ed Markey, D-Massachusetts, which would prohibit the collection of personal information from kids ages 13 to 15 without consent, ban targeted advertising directed at kids, and let kids and teens erase any personal information collected on them at any point with an erase button.

The representatives also did not commit to supporting the EARN IT Act of 2020, which would amend Section 230 and allow social media platforms to be held liable in cases where they are suspected of having caused harm to children. Throughout the hearing, the social media representatives tended to emphasize the importance of parents taking an active role in controlling what their children view on social media.
