
Social Media

House Democrats Grill Facebook Witness, Tech Officials on Social Media Disinformation


Photo of witness table at social media disinformation hearing by Adrienne Patton

WASHINGTON, January 8, 2020 – House Democrats on Wednesday pressed Facebook and other technology observers on why tech companies aren’t doing more to prevent the spread of “deepfakes” and other forms of digital manipulation online.

At an Energy and Commerce subcommittee hearing on “Manipulation and Deception in the Digital Age,” Chairwoman Jan Schakowsky, D-Illinois, set the stage by claiming that Congress had taken a “laissez-faire” approach to online protection.

Americans can be harmed as easily online as in the physical world, Schakowsky said, arguing that consumer protections common in in-person commerce are widely lacking in the virtual realm.

Full committee Chairman Frank Pallone, D-N.J., said that the danger of subtle manipulation now means we can no longer trust our eyes.

But Rep. Cathy McMorris Rodgers, R-Washington, stressed innovation over government regulation. She said that Congress needs to be careful not to harm practices people enjoy.

The four witnesses and committee members turned their attention to the danger of deepfakes, cheap fakes, and other deceptive online practices that are difficult to detect and affect millions of people globally.

Monika Bickert, vice president of global policy management at Facebook, said that Facebook had improved its relationship with third party fact-checkers.

Under questioning, Bickert said Facebook would label videos that were false, and emphasized its more active role in content moderation: Whereas Facebook removed only one network in 2016, last year it removed more than 50 networks from its platform.

Working alongside academics, professionals, and fact-checkers, she said, Facebook expects 2020 to be a good year for going after misinformation.

Joan Donovan, research director at Harvard’s Kennedy School, testified that the multi-million-dollar deception industry was a threat to national security. She said the country hasn’t quantified the cost of misinformation, but it needs regulatory guardrails. Otherwise, she said, the “future is forgery.”

Justin Hurwitz, a law professor at the University of Nebraska College of Law, said design is powerful because people are predictable and programmable, and that dark patterns harm consumers.

Tristan Harris, executive director for the Center for Humane Technology, said the country has a “dark infrastructure,” and that it needed to protect its digital borders as much as it protects its physical borders.

Rather than form new federal agencies to deal with misinformation challenges, Harris said Congress needed to give existing agencies a digital update.

Rep. Kathy Castor, D-Florida, asked Harris to expound upon the possible harm that children face from addictive algorithms. Harris said the autoplay feature on YouTube automatically plays videos that lean toward extremes. For example, he said, if you watch a video on 9/11, autoplay will queue videos on 9/11 conspiracy theories to play immediately afterward.


Adrienne Patton was a Reporter for Broadband Breakfast. She studied English rhetoric and writing at Brigham Young University in Provo, Utah. She grew up in a household of journalists in South Florida. Her father, the late Robes Patton, was a sports writer for the Sun-Sentinel who covered the Miami Heat, and is for whom the press lounge in the American Airlines Arena is named.

Section 230

Repealing Section 230 Would be Harmful to the Internet As We Know It, Experts Agree

While some advocate for a tightening of language, other experts believe Section 230 should not be touched.


Rep. Ken Buck, R-Colo., speaking on the floor of the House

WASHINGTON, September 17, 2021—Republican representative from Colorado Ken Buck advocated for legislators to “tighten up” the language of Section 230 while preserving the “spirit of the internet” and enhancing competition.

There is common ground in supporting efforts to minimize speech advocating for imminent harm, said Buck, even though he noted that Republican and Democratic critics tend to approach the issue of changing Section 230 from vastly different directions.

“Nobody wants a terrorist organization recruiting on the internet or an organization that is calling for violent actions to have access to Facebook,” Buck said. He followed up that statement, however, by stating that the most effective way to combat “bad speech is with good speech” and not by censoring “what one person considers bad speech.”

Antitrust not necessarily the best means to improve competition policy

For companies that are not technically in violation of antitrust policies, improving competition through other means would have to be the answer, said Buck. He pointed to Parler as a social media platform that is an appropriate alternative to Twitter.

Though some Twitter users did flock to Parler, particularly during and around the 2020 election, the newer social media company has a reputation for allowing objectionable content that would otherwise be unable to thrive on social media.

Buck also set himself apart from some of his fellow Republicans—including Donald Trump—by clarifying that he does not want to repeal Section 230.

“I think that repealing Section 230 is a mistake,” he said. “If you repeal Section 230, there will be a slew of lawsuits.” Buck explained that without the protections afforded by Section 230, big companies will likely find a way to sufficiently address these lawsuits, and the only entities that will be harmed will be the alternative platforms that were meant to serve as competition.

More content moderation needed

Daphne Keller of the Stanford Cyber Policy Center argued that it is in the best interest of social media platforms to enact various forms of content moderation, and address speech that may be legal but objectionable.

“If platforms just hosted everything that users wanted to say online, or even everything that’s legal to say—everything that the First Amendment permits—you would get this sort of cesspool or mosh pit of online speech that most people don’t actually want to see,” she said. “Users would run away and advertisers would run away and we wouldn’t have functioning platforms for civic discourse.”

Even companies like Parler and Gab—which pride themselves on being unyielding bastions of free speech—have begun to engage in content moderation.

“There’s not really a left-right divide on whether that’s a good idea, because nobody actually wants nothing but porn and bullying and pro-anorexia content and other dangerous or garbage content all the time on the internet,” Keller said.

She explained that this is a double-edged sword, because while consumers seem to value some level of moderation, companies moderating their platforms have a huge amount of influence over what their consumers see and say.

What problems do critics of Section 230 want addressed?

Internet Association President and CEO Dane Snowden stated that most of the problems surrounding the Section 230 discussion boil down to a fundamental disagreement over the problems that legislators are trying to solve.

Changing the language of Section 230 would impact not just the tech industry: “[Section 230] impacts ISPs, libraries, and universities,” he said. “Things like self-publishing, crowdsourcing, Wikipedia, how-to videos—all those things are impacted by any kind of significant neutering of Section 230.”

Section 230 was created to give users the ability and security to create content online without fear of legal reprisals, he said.

Another significant supporter of the status quo was Chamber of Progress CEO Adam Kovacevich.

“I don’t think Section 230 needs to be fixed. I think it needs [a better] publicist.” Kovacevich stated that policymakers need to gain a better appreciation for Section 230: “If you took away 230, you would give companies two bad options: either turn into Disneyland or turn into a wasteland.”

“Either turn into a very highly curated experience where only certain people have the ability to post content, or turn into a wasteland where essentially anything goes because a company fears legal liability,” Kovacevich said.


Social Media

Members of Congress Request Facebook Halt ‘Instagram For Kids’ Plan Following Mental Health Research Report

Letter follows Wall Street Journal story reporting that Facebook knew about the mental health damage Instagram inflicts on teens.


WASHINGTON, September 15, 2021 – Members of Congress sent a letter Wednesday to Facebook CEO Mark Zuckerberg urging the company to stop its plan to launch a new platform for kids, following a report by the Wall Street Journal citing company documents that reportedly show the company knows its platforms harm the mental health of teens.

The letter, signed by Edward Markey, D-Massachusetts, Kathy Castor, D-Florida, and Lori Trahan, D-Massachusetts, also asks Facebook to provide answers by October 6 to several questions: whether the company has reviewed the mental health research cited in the Journal report, and who reviewed it; whether the company will agree to abandon plans to launch a new platform for children or teens; and when the company will begin studying its platforms’ impact on kids’ mental health.

The letter also demands an update on the company’s plans for new products targeting children or teens, asks for copies of internal research regarding the mental health of this demographic, and copies of any external research the company has commissioned or accessed related to this matter.

The letter cites the Journal’s September 14 story, which reports that the company has spent the past three years conducting studies into how photo-sharing app Instagram, which Facebook owns, affects millions of young users, and found that the app is “harmful for a sizable percentage of them, most notably teenage girls.” The article recounts the experience of a teen who had to see a therapist for an eating disorder brought on by exposure to images of other users’ bodies.

The story also cites a presentation that said teens were blaming Instagram for anxiety, depression, and the desire to kill themselves.

The head of Instagram, Adam Mosseri, told the Journal that research on mental health was valuable and that Facebook was late to realizing the drawbacks of connecting large swaths of people, according to the story. But he added that there’s “a lot of good that comes with what we do.”

Facebook told Congress it was planning ‘Instagram for kids’

Back in March, during a congressional hearing about Big Tech’s influence, Zuckerberg said Instagram was in the planning stages of building an “Instagram for kids.” Instagram itself does not allow kids under 13 to use the app.

On April 5, Markey, Castor and Trahan penned their names on another letter to Zuckerberg, which expressed concerns about the plan. “Children are a uniquely vulnerable population online, and images of kids are highly sensitive data,” the April letter said. “Facebook has an obligation to ensure that any new platforms or projects targeting children put those users’ welfare first, and we are skeptical that Facebook is prepared to fulfil this obligation.”

The plan was also met with opposition from the Campaign for a Commercial-Free Childhood, the Center for Humane Technology, Common Sense Media, and the Center for Digital Democracy, who said the app “preys on their fear of missing out as their ravenous desire for approval by peers exploits their developmental growth.

“The platform’s relentless focus on appearance, self-presentation, and branding presents challenges to adolescents’ privacy and well-being,” the opponents said. “Younger children are even less developmentally equipped to deal with these challenges, as they are learning to navigate social interactions, friendships, and their inner sense of strengths during this crucial window of development.”

At the March hearing, however, Zuckerberg claimed that social apps that connect people can have positive mental health benefits.

Then in August, Sens. Richard Blumenthal, D-Connecticut, and Marsha Blackburn, R-Tennessee, sent a letter to Zuckerberg asking for the company’s research on mental health. Facebook responded without providing the research, but said there are challenges with doing such research, the Journal said. “We are not aware of a consensus among studies or experts about how much screen time is ‘too much,’” according to the Journal, citing the response letter to the senators.


China

Experts Raise Alarm About China’s App Data Aggregation Potential

The Communist government has access to a vast trove of data from Chinese-made apps.


Former Commerce aide and professor at Texas A&M University, Margaret Peterlin

WASHINGTON, September 2, 2021 – Social media app TikTok’s rise as one of the world’s most downloaded apps is concerning experts who say the aggregate data collected across a number of Chinese-made apps will allow the Communist government to get ahead of any federal action to stem the data flow.

In June, President Joe Biden signed an executive order that revoked a Trump-era directive seeking to ban TikTok and replaced it with criteria for the Commerce Department to evaluate the risks of apps connected to foreign adversaries. The Trump administration had even pressured TikTok to sell its U.S. business, but that sale never materialized.

On a webinar hosted by the Federalist Society on Thursday, panelists said the U.S. government may already be behind in the race to contain data collection and prevent it from reaching the Chinese government, which is using the data to create advanced artificial intelligence.

Margaret Peterlin, a lawyer, former Commerce Department aide and professor at the school of public service at Texas A&M University, said her concern with Biden’s executive order is whether it’s “strategically responsive” to what the Chinese government intends to do with all these sources of data – WeChat, TikTok, AliExpress, and its massive influence in telecommunications with Huawei and ZTE.

She noted that the Communist government has been very clear about its direction – that it wants to dominate data aggregation and use that prowess to develop advanced artificial intelligence technologies. She illustrated this with the example of how the government uses advanced identification technologies and surveillance to monitor the Uyghur minority.

Peterlin also raised the issue of Chinese telecommunications companies like Huawei and ZTE, which have been the subject of restrictions from the Biden administration and the Federal Communications Commission in recent months. But she noted that Huawei is still involved with regional carriers in the United States.

The FCC has addressed this concern by offering to compensate carriers to “rip and replace” that risky equipment. (Part of Huawei’s attraction is its relatively low cost compared to its European rivals.)

She noted that 5G “isn’t just another G” because there are many more connection and data points. Due to the promised lower latency, critical infrastructure like power grids and dams, and even medical devices, can be controlled over the next-generation networks. Peterlin said these points of connection cannot be something the Chinese government can access.

For Jamil Jaffer, founder and executive director of the National Security Institute, his concern is the pace at which the Chinese government is moving. “I worry that it’s very late in the process to be getting to this point, and they’re well ahead of us,” he said, speaking on China’s growing influence in the technology space.

Jennifer Hay, senior director for national security programs at DataRobot, a company that develops AI software, said Biden’s executive order should be able to expand to other platforms and empower the Commerce Department to look into what is going on behind the scenes on these applications.

She said the government needs to be able to make educated decisions about who’s using Americans’ data and what that data is being used for.

Hay even suggested that Congress step in and draft legislation on this kind of data collection, but Jaffer disagreed on the grounds that not only would a slow-moving government not be able to keep up with the rapid movement of technology, but legislation may impede business. He said this type of work is best left to the private sector to figure out.

