Free Speech
Social Media Conspiracies Fuel Extremism, Says GWU Panel
January 22, 2021—Social media is an accelerant that spreads conspiracy theories and feeds extremist ideology, and it was ultimately the cause of the U.S. Capitol riot on January 6, said experts on a George Washington University panel on Thursday.
Although the rioters were a hodgepodge of different extremist groups, nearly all of them originally organized on social media platforms, said Seamus Hughes, deputy director of the GWU program on extremism. The groups included the Proud Boys, Oath Keepers, and other militia types, mixed in with conspiracy theorists and those who believed the election was rigged against Donald Trump, he said.

Two components gave rise to what happened on Capitol Hill, said Rollie Lal, professor at GWU. First came the use of conspiracy theories as political strategy, then came the proliferation of those conspiracies on social media platforms, where they flourish among extremist groups.
These conspiracies include the fringe right-wing QAnon theory, which alleges that a Satan-worshipping cabal of pedophiles runs a global sex-trafficking ring. Over time, conspiracies gain traction by attaching themselves to current events, such as the beliefs that COVID-19 is a hoax or that the 2020 election wasn't valid, which led to the insurrection at the Capitol, Lal said.
Determining the responsibility of social media companies like Twitter and Facebook is a challenge because their data is essentially a black box that no one outside the companies can access, said Rebekah Tromble, director of the Institute for Data, Democracy, and Politics at GWU. These companies need better regulatory frameworks, she said, suggesting policy similar to the European Union's recently proposed Digital Services Act.
In response to the Capitol unrest, Twitter permanently suspended Donald Trump's account and Facebook suspended it indefinitely, alleging that he directly incited the violence. Tromble said Twitter should have suspended his account long ago.
Section 230 of the Communications Decency Act, part of the 1996 Telecommunications Act, which shields platforms from liability for content posted by their users, is a dream for social media companies, said Christopher Kojm, professor at GWU, addressing the balance between the decision-making power of these platforms and the free speech rights of users.
“These companies are the sole voice of decision,” Kojm said, “but if you have government regulate them, how do you address First Amendment issues?”
If a company argued that Trump incited violence through social media, and social media were federally regulated, the company would need to show that his speech constituted “imminent lawless action” under the Brandenburg incitement test for it to lose free speech protection, he said, referencing the 1969 Supreme Court decision in Brandenburg v. Ohio.
The panel also compared domestic extremists to foreign terrorist groups, drawing many similarities to ISIS, especially in their use of social media. ISIS was far more effective at using social media for recruiting and spreading false information than al-Qaida had been, said Mary McCord, legal director of the Institute for Constitutional Advocacy and Protection at GWU.
ISIS was severely crippled as companies and governments worked to remove the group from their platforms, Hughes said, explaining that, from a practical perspective, deplatforming individuals and groups on social media is very effective at reducing their influence on followers.
Tromble agreed, saying that Twitter's ban of Trump will probably reduce his social impact; however, she also expressed concern that a single company had the absolute power to silence someone's voice.