FCC Proposes $867K Lumen Fine, Middle East Misinformation, SEC Chair Warns on AI in Finance
The regulator is proposing a fine for failed transmission of 911 calls.
Hanna Agro
October 17, 2023 – The Federal Communications Commission proposed Tuesday a fine of $867,000 against Lumen Technologies for allegedly failing to transmit 911 calls to public safety centers following two service outages in February of 2022.
The first outage took place in South Dakota on February 17 and lasted nearly five hours, while the second outage took place in North Dakota and lasted almost seven hours. During both outages, hundreds of 911 calls failed to reach their respective public safety centers, the FCC said.
Following the first outage, the FCC alleges Lumen did not notify two affected public safety centers until days after the incident, while after the second outage Lumen only reported the problem in a timely manner to two of the eleven public safety centers that were not receiving calls.
“If you call 911 for help, your call should reach first responders,” said FCC Chairwoman Jessica Rosenworcel. She added that if “an outage prevents calls from reaching 911, public safety officials should be informed as soon as possible so they can tell the public of alternate ways to reach 911.”
Lumen will be given an opportunity to respond to the allegations, after which the commission will review any evidence or legal arguments the company provides.
Tech companies asked to address alleged misinformation on Middle East conflict
In a letter Tuesday, Sen. Michael Bennet, D-Colorado, urged social media companies including X and TikTok to proactively address misinformation about the ongoing Israel-Hamas war.
Bennet wrote to these tech giants citing content reported by the Associated Press, including claims that Ukraine supplied weapons to Hamas and that an Israeli general had been captured.
Bennet asked these platforms to provide information about what content is being removed and to enforce more stringent moderation standards to reduce the spread of “deceptive content online.”
“According to numerous reports, deceptive content has ricocheted across social media sites since the conflict began, sometimes receiving millions of views,” said Bennet. He added that the algorithms these platforms rely on appear to be boosting such content.
Congressman Frank Pallone, D-N.J., issued a similar statement to social media companies Thursday, calling for better moderation of what he deemed misinformation and violent content.
Gary Gensler warns about artificial intelligence use in financial markets
In an interview with the Financial Times, Gary Gensler, chairperson of the Securities and Exchange Commission, warned that without regulation, the use of artificial intelligence in financial markets could lead to a crash.
Gensler explained that regulating AI will be difficult because several institutions may be relying on similar AI interfaces to gather data and make financial decisions.
The piece explained that regulatory rules normally apply to individual institutions such as banks or money market funds; AI, however, could be used in a “horizontal fashion,” with several institutions all relying on the same kind of data model to make financial decisions.
The article explained that Gensler’s concern is that this kind of horizontal data use could promote “herd behavior,” which, according to Investopedia, is when people mimic the decisions of others on the belief that those others know what they are doing.
Investopedia explained that when herd behavior takes place in financial markets, it can create “asset bubbles” or “market crashes” because buying and selling behavior swings in one direction.
While Gensler and other experts and advocates are urging governing bodies to enact AI regulatory legislation, nothing concrete has been put into place.