Big Tech

Ethical Technology Still in Early Stages, But Here to Stay, Say Axios Panelists

Elijah Labby


Photo of Salesforce Chief Ethical and Human Use Officer Paula Goldman courtesy of Salesforce

August 6, 2020 — Although many ethical technologies are still in early developmental stages, they are here to stay, said participants in an Axios webinar Thursday.

The webinar, titled “Ethical Tech in a Time of Crisis,” saw participants discuss the strengths and shortcomings of the ways in which technology companies have adapted to the coronavirus.

Paula Goldman, chief ethical and human use officer at Salesforce, said that her studies in the field have led her to believe that movements toward tech responsibility are here to stay.

“I got a PhD and studied how unorthodox ideas become mainstream and what I would say is that this is definitely a movement,” she said. “But we’re in the early inning.”

Goldman suggested that people look to the past for examples of how such changes take place.

“I am optimistic that this work will keep scaling and that we will emerge,” she said. “Just like, for example, 20 [or] 30 years ago we had a security crisis and we emerged with new protocols and tech. It’s the early days for ethical use and that’s where we’re heading, but we all have to keep working at it.”

However, Human Rights Watch Executive Director Kenneth Roth said that places like South Korea have not used technology responsibly in their approach to monitoring the coronavirus.

“They had a very open process, which gave almost no attention to privacy,” he said. “They collected credit card data, they collected location data, they used facial recognition software and video monitoring and they made all of this public, and it was an absolute disaster. And people were ostracized because they were identified as coronavirus infectors.”

Former U.S. Chief Data Scientist DJ Patil said that tech companies have a special responsibility with great stakes, and that tech platforms have to clarify their policies.

“It’s no small statement to say this is life or death,” he said. “The platforms have responsibility right now to figure out what is the right level of action. At a bare minimum, it is creating stricter standards for how and what is allowed on a platform.”

In order to make progress, Patil said, tech companies should look to time-tested solutions.

“We need to stop focusing on the super sexy technologies like machine learning and rather focus on the bare basics,” he said.

Elijah Labby was a Reporter with Broadband Breakfast. He was born in Pittsburgh, Pennsylvania and now resides in Orlando, Florida. He studies political science at Seminole State College, and enjoys reading and writing fiction (but not for Broadband Breakfast).

Big Tech

Government’s Reactive Nature Hobbling Tech Regulation, Expert Says

Congress may need another big tech breach to move earnestly on regulation, says consultant.

Samuel Triginelli


Screenshot of Steve Haro at FiscalNote event

April 12, 2021 – The reactive nature of Congress when it comes to data crises means another breach of citizens’ privacy may be needed to spur the next big legislative move, said a former congressional chief of staff.

“We still have questions to answer about how to deal with technology dominance. We are not there yet because, unfortunately, Congress, for the most part, tends to act in response to crisis,” said Steve Haro, a government affairs consultant and former assistant secretary of commerce.

During a discussion sponsored by FiscalNote and CQ Roll Call, experts joined in a conversation on the current state of public policy for the tech industry and how influential Congress and the Biden-Harris administration will be in dealing with big tech.

Among the issues discussed was how the government will deal with the intermediary liability provision, Section 230.

Lawmakers have wondered whether the provision — which protects platforms from legal liability for posts by their users — offers too much protection to social networks when it comes to content moderation and disinformation. This central premise has spurred calls for a reform of Section 230; a number of Democrats have proposed their own bill that would keep much of the protections except for paid posts.

“I do not believe 230 needs change, but that doesn’t mean I don’t have concerns,” Haro said. “I believe there is collective agreement this is still a necessary law, and it has worked. It has allowed the internet to build into what it has become, good or bad.”

Haro pointed to the congressional hearings into Facebook’s handling of the Cambridge Analytica scandal three years ago, in which data from millions of user accounts was scraped without their consent. Those hearings did not result in substantial progress on regulation. “We might need another crisis to spur Congress into action,” Haro said.

Michael Drobac, principal at the law firm Dentons, agreed: “We are not there, and I would say the thing that has been most present and clear is that in most of these hearings,” the members of Congress are still trying to understand the technology well enough to make a meaningful impact.

“The reality is that Section 230 is as important today as it was when it was passed,” Haro said.


Big Tech

Regulatory Commission Needed To Monitor Big Tech Collection Of Consumer Data, Professor Says

Derek Shumway


Screenshot of Robin Gaster from Henry George School of Social Science

April 8, 2021 — There needs to be a digital regulatory commission created to ensure big tech cannot run wild with consumer data, said Robin Gaster, a George Washington University public policy scholar.

Gaster, who is also president of Incumetrics, a data and program evaluation consultancy, published a book this month about Amazon’s rise from an online bookstore to everything else.

Gaster sat down with Broadband Breakfast on Wednesday and talked about the e-commerce giant’s reach into industries like healthcare and its rapid collection of ever more consumer data. The solution, he proposes, is a “new digital deal,” which would create a sort of digital Federal Communications Commission — an entity with the resources and personnel to match Amazon’s growing force.

Amazon’s reach into health care needs to be met with proper oversight and ethics to ensure it really will protect consumer privacy, he said.

The e-commerce behemoth acquired PillPack, a prescription delivery company, developed the Amazon Halo, a competitor device to Fitbit, and launched Amazon Care, a telehealth app service. Add Amazon’s own Alexa AI platform into the mix and it has a stream of access to valuable data.

“I would absolutely imagine that five years from now, if you sprain your knee, you probably will not go on the Internet and look for things and try to figure it out. You will say, ‘Alexa, I sprained my knee. What should I do?’” said Gaster.

Amazon’s breakneck growth into healthcare is concerning because no one knows exactly what it could do, or intends to do, with all the data it possesses, Gaster said. With so much aggregated data across its products and services, Amazon needs to be held accountable for its actions so that if something goes wrong, there are open and trustworthy ways to fix it.

Gaster said governments and companies alike are playing “privacy theater” — they talk about protecting privacy, but it is a mere performance put on to make it seem like they care about it.

Alexa takes in all sorts of data from voice commands and people’s Amazon accounts. It may well become a virtual doctor someday, but people don’t know how, or whether, they can control the data Alexa records, Gaster said.

The notion that people can control their data is ridiculous, said Gaster. “We are walking across the digital plane naked. We have no clothes!” he said, adding no one can wade through the legalese in the terms and conditions and privacy statements.

Gaster’s book is entitled Behemoth – Amazon Rising: Power and Seduction in the Age of Amazon.


Courts

Supreme Court Declares Trump First Amendment Case Moot, But Legal Issues For Social Media Coming

Benjamin Kahn


Photo of Justice Clarence Thomas in April 2017 by Preston Keres in the public domain

April 5, 2021 — Although the Supreme Court accepted a petition that allowed it to avoid deliberating on whether a president can block social media users, Justice Clarence Thomas on Monday issued a volley that may foreshadow future legal issues surrounding social media in the United States.

On Monday, the Supreme Court ruled moot, and sent back to a lower court, a lawsuit over whether former President Donald Trump could block followers on Twitter, after accepting a petition by the federal government to end the case because Trump is no longer president.

The case dates back to March 2018, when the Knight First Amendment Institute and others brought a case against former President Trump in the Southern District of New York for blocking users based on their political views, arguing the practice was a violation of the First Amendment.

The lower court judge agreed, and the decision was upheld by the U.S. Court of Appeals for the Second Circuit.

In accepting the petition by the government, Justice Thomas stated that adjudicating legal issues surrounding digital platforms is uniquely difficult. “Applying old doctrines to new digital platforms is rarely straightforward,” he wrote. The case in question hinged on the constitutionality of then-President Trump banning people from interacting with his Twitter account, which the plaintiff argued was a protected public forum.

Thomas stated that while today’s case could be disposed of by vacatur, similar cases likely could not be in the future. He went on to note the “concentrated control of so much speech in the hands of a few private parties” that digital platforms exercise.

He continued: “We will soon have no choice but to address how our legal doctrines apply to highly concentrated, privately owned information infrastructure such as digital platforms.”

Even though Facebook and Google were not the platforms in question in this case, Thomas pointed to them as “dominant digital platforms” and stated that they have “enormous control over speech.” He stated that Google, Facebook, and Twitter have the capabilities to suppress information and speech at will, and referenced the “cataclysmic consequences” for authors that Amazon disagrees with.

Thomas also rejected the notion that other options exist.

“A person always could choose to avoid the toll bridge or train and instead swim the Charles River or hike the Oregon Trail. But in assessing whether a company exercises substantial market power, what matters is whether the alternatives are comparable.”

