Privacy

FTC Settlement with YouTube Has Creators Upset and Worried About FTC Approach to Children’s Privacy

Photo of COPPA expert at TechFreedom panel by Adrienne Patton

WASHINGTON, January 14, 2020 – Google’s decision to require all content creators for YouTube to designate whether their videos are intended for children is negatively impacting online free speech, a bevy of influencers said at a TechFreedom event on Monday.

The market-oriented non-profit organized panels on Monday of YouTube content creators to discuss their fears that the Children’s Online Privacy Protection Act has spurred changes – in the name of protecting children – that are detrimental to children.

The changes in YouTube’s video tagging policy are apparently the result of Google’s $170 million settlement with the Federal Trade Commission and the New York attorney general for alleged violations of COPPA. That 1998 law severely limits companies’ ability to collect data about children under 13, and YouTube was accused of violating it.

YouTube, in turn, is requiring its content creators to affirmatively comply with COPPA: “If you fail to set your audience accurately, you may face compliance issues with the FTC or other authorities, and we may take action on your YouTube account,” according to YouTube’s help-center document on the topic.

The content creators speaking at the TechFreedom event all agreed that YouTube’s changes have significantly impacted their businesses as they try to adapt to vague new restrictions.

Harry Jho, the creator of Mother Goose Club with six million subscribers and billions of views on YouTube, said he had to alter Mother Goose Club’s business plan and significantly reduce potential content.

Jho’s primary audience is children under the age of five. However, parents’ and grandparents’ accounts show up in the viewership data because most children that age do not have a personal device or account.

Mother Goose Club only receives data for viewers ages 13 and up; the company cannot gather viewer information or send fans private emails. The fans in the comment section initiate communication.

Lawyer and vlogger Jeremy Johnston said that audience interaction is what makes YouTube different from television. Johnston, who runs J House Vlog with his wife, has two million subscribers; the couple have built a community by responding to fans’ comments and questions.

Johnston was working on J House Junior, a vlog made specifically for YouTube Kids, but said he had to shut down production because of YouTube’s new policy.

COPPA was intended to put parents in charge, Johnston said. But in practice, he continued, the law circumvents parents and makes creators liable. Creators, he said, don’t have control of Google’s platform. Instead, COPPA is protecting children from their parents’ choices, said Johnston.

The FTC’s last major regulatory change to COPPA was in 2013. In July 2019, the agency issued a proposed regulation with additional changes regarding content that is deemed to be targeting children.

But Angela Campbell, a law professor at Georgetown University, made clear that it was not the FTC that called for YouTube comments or playlists to be turned off — it was YouTube.

Campbell said that content creators should move in the direction of data minimization, limiting the personal information that organizations can use or acquire.

To Jackie, the creator of NerdECrafter, COPPA has detrimental consequences for female entrepreneurs. Because female creators often use tools like dolls or watercolors, she said, those items can look like child-oriented interests and are therefore flagged as child-directed content. But Jackie said she has data showing that 85 percent of her audience are adults.

Jackie believes that creators need a concise definition of “child content” because there are crossovers in the crafting community.

As a result of the new YouTube policy, Jackie is taking steps to make her content seem more “adult”-related, rather than child-directed.

Forrest, or KreekCraft, as he is known online to his thousands of subscribers, said the internet isn’t as “black and white” anymore. In response to the moderator asking what he would say to the FTC, he pleaded with the agency to collaborate with content creators before engaging in activities – like the YouTube settlement – that significantly affect them.

Because kids watch YouTube more than they watch television, they will not abandon the platform, said Forrest. But, he said, they may start watching non-child videos because that will be the only accessible content.

Privacy

Online Protections for Children Bill Passes Committee Despite Concern over FTC Authority

Opposition to a reformed COPPA includes concern over the FTC’s ability to enact broad rule-making.

Photo of Senator Edward Markey, D-Mass.

WASHINGTON, July 28, 2022 – The Senate Committee on Commerce, Science and Transportation approved two online privacy protection bills in a Wednesday markup, including an update to legislation that will increase the age for online protection for children.

An update to the Children and Teens’ Online Privacy Protection Act (S.1628) – which builds on the original 1998 law, with amendments proposed last May – would raise the age of protection from under 13 to under 16, meaning large internet companies would be prohibited from collecting the personal information of anyone under 16 without consent and banned from targeted marketing to those minors. The bill passed via voice vote.

Other provisions in the bill include a mandate to create an online “eraser button” that will allow users to eliminate the personal information of a child or teen; implement a “Digital Marketing Bill of Rights for Minors” that limits the collection of personal information from young users; and establish a first-of-its-kind Youth Privacy and Marketing Division at the FTC, according to a summary of the bill’s key components.

“The Senate Commerce Committee this morning took a historic step towards stopping Big Tech’s predatory behavior from harming kids every day,” Senator Edward Markey, D-Mass., who introduced the amendments, said Wednesday.

The other bill, the Kids Online Safety Act (S.3663), would give parents enhanced control over their children’s online activities to “better protect their health and well-being.” The bill, introduced by Senator Richard Blumenthal, D-Conn., and Senator Marsha Blackburn, R-Tenn., passed 28-0.

The bill would put in place additional safeguards and tools, such as platforms giving minors options to protect their personal information and to disable recommendations.

“I don’t think we’ve ever had a piece of legislation that has had such strong support across groups across the country,” Senator Blumenthal said during Wednesday’s hearing. “Parents want a tool kit to protect their children online.”

The bills now move to the Senate floor.

Concern about FTC authority under new COPPA

Under COPPA 2.0, the FTC’s authority includes determining what constitute “unfair or deceptive acts” in marketing practices and enforcing violations. In May, the agency put out a policy statement specifying its focus on enforcing the existing version of the law.

Some senators voted against passing COPPA 2.0 over concern that it would give the Federal Trade Commission too much rule-making authority.

Senator Blackburn said there should be more restrictions on the FTC’s rule-making ability to prevent overreach.

Similarly, Senator Mike Lee, R-Utah, said he was not able to support the bill during markup because he is concerned about “giving a blanket ruling power to the FTC.”

“We are at our best when we carefully consider legislation and don’t rush through it,” Lee said.

Cybersecurity

Rep. Swalwell Says App Preference Bill Will Harm National Security

‘I just want to limit the ability for any bad actor to get into your device.’

Photo of Representative Eric Swalwell, D-Calif.

July 27, 2022 – Antitrust legislation that would restrict the preferential treatment of certain apps on platforms would harm national security by making apps from hostile nations more visible, claimed Representative Eric Swalwell, D-Calif., at a Punchbowl News event Wednesday.

The American Innovation and Choice Online Act is currently under review by the Senate and, if passed, would prohibit certain online platforms from unfairly preferencing products, limiting another business’ ability to operate on a platform, or discriminating against competing products and services.

The legislation would ban Apple and Google from preferencing their own first-party apps on their app stores, which would make it easier for apps disseminated from hostile nations to be seen on the online stores, Swalwell said.

“[Russia and China] could flood the app store with apps that can vacuum up consumer data and send it back to China,” said Swalwell, adding that disinformation regarding American elections would spread. “Until these security concerns are addressed, we should really pump the brakes on this.”

Swalwell called for the House Judiciary Committee to conduct a hearing with National Security Agency, Federal Bureau of Investigation, and Homeland Security officials to lay out what the bill would mean for national security.

“I just want to limit the ability for any bad actor to get into your device, whether you’re an individual or small business,” said Swalwell.

Lawmakers have become increasingly concerned about China’s access to American data through popular video-sharing apps, such as TikTok. Last month, Federal Communications Commissioner Brendan Carr called for Apple and Google to remove the app on the grounds that the app’s parent company, ByteDance, is “beholden” to the Communist government in China and required to comply with “surveillance demands.”

The comments follow debate over how the bill, introduced in the Senate on May 2 by Sen. Amy Klobuchar, D-Minn., would affect small businesses and American competitiveness globally.

Cybersecurity

Government Should Incentivize Information Sharing for Ransomware Attacks, Experts Say

‘Information sharing between the government and the private sector, while integral to tackling ransomware, is inconsistent.’

Screenshot of Trent Teyema of GeoTech Center

WASHINGTON, July 27, 2022 – The federal government should incentivize the reporting of cyberattacks through safe harbor and shield laws, said experts at an Atlantic Council event Tuesday, as a recent law requiring companies in critical infrastructure sectors to report such attacks to the federal government is limited in scope and currently unclear about whom exactly it covers.

The Cyber Incident Reporting for Critical Infrastructure Act, passed in March, does not cover private companies that do not operate in the critical infrastructure sectors, nor does it include safe harbor and shield provisions that would encourage private companies to engage in the process.

Companies often avoid interacting with law enforcement because of the stigma associated with being a victim of a cyberattack and out of fear of being held liable by regulators and investors, said Trent Teyema, senior fellow at GeoTech Center, a technology policy university collaborative.

Teyema called for a safe harbor framework, a law that provides protection against legal liability when certain conditions are met. Such a provision would decrease the risk of companies being held liable by regulators, investors, and the public for cyberattacks.

He also called for shield laws that would protect against revealing certain information to the government as a requirement for receiving law enforcement assistance.

The government needs to make it easy for the private sector to share information with law enforcement, said Teyema.

“Information sharing between the government and the private sector, while integral to tackling ransomware, is inconsistent,” read a report written by Teyema and David Bray, fellow at GeoTech Center. Information sharing across sectors allows cybersecurity experts in both sectors to learn about new vulnerabilities in software and new attack vectors. It strengthens collective resiliency and can influence the processes used to anticipate and respond to threats, continued the report.

Ransomware on the rise

Ransomware attacks, in which bad actors demand money to release encrypted data, are increasing dramatically, the White House reported last year. Ransomware incidents often disrupt critical services, such as banks, hospitals and schools, that require constant access to data. In 2021, there was approximately $20 billion in damages from ransomware attacks in the United States, up from $11 billion in 2020 and $5 billion the year before, said Bray.

This follows on the heels of the 2021 Colonial Pipeline hack, which targeted the billing system and led to the shutdown of the largest fuel pipeline in the United States. The Russian-speaking cybercrime group responsible, DarkSide, received $4.4 million in ransom from Colonial, part of which was later recovered by U.S. law enforcement.

Research firm Cybersecurity Ventures predicts that there will be a ransomware attack every two seconds by the year 2031 with global costs exceeding $265 billion.
