Innovation

Secretary of State Blinken Says Research and Development Critical For Competition Against China

Blinken was touring the University of Maryland laboratories last week.

Secretary of State Antony Blinken

August 16, 2021 – Secretary of State Antony Blinken said earlier this month that a key to competing with China is pouring investment into research and development, as China continues to do the same and exports inexpensive technologies around the world.


“The Chinese and Russian governments, among others, are making the argument in public and private that the United States is in decline,” said Blinken at the University of Maryland on August 9, as he was touring the institution’s laboratories.


“Nothing would put to rest faster their specious argument about America’s best days being behind us than if the United States made serious investments in our domestic renewal right now.”


The United States, which previously ranked first in the world for research and development spending relative to the size of its economy, has fallen to ninth place, according to a recent UNESCO report. China, by contrast, has risen to second, according to the same report.


Last week, the Senate passed a $1 trillion infrastructure bill that would lay the groundwork for the future, including a timely $65 billion for broadband to help close the remote learning and work gap that emerged during the pandemic.


Targeting China


President Joe Biden’s administration has been coupling economic spending and sanctions against Chinese companies to combat the emerging economic superpower and protect national security. Chinese companies carry great influence on the world stage, few more prominent than leading telecom equipment maker Huawei, which has been the subject of bans in the United States and other parts of the world.


For example, while Huawei’s greatest 5G equipment competitors are Finland’s Nokia and Sweden’s Ericsson – with Samsung emerging as a potent player in its own right – the Chinese company’s equipment is widely reported to sell for less, making it the more economical choice for countries in Africa. The Chinese government, meanwhile, has been accused of spying through such equipment.


A Huawei official has previously noted that the company has little to worry about from the Biden administration’s moves to cut Chinese companies off from American chips, because it will simply make its own. Huawei has also challenged – unsuccessfully – the Federal Communications Commission’s authority to wade into national security matters, after the agency proposed rules in June to prohibit future equipment authorizations for companies posing potential threats.


Blinken’s aims


Blinken strongly advocated for the bipartisan U.S. Innovation and Competition Act, a bill introduced in April that would boost federal funding for U.S.-based semiconductor manufacturing and provide $52 billion over five years for research initiatives. 


Blinken said he believes that investing in infrastructure is the most essential thing the U.S. can do to advance its foreign policy, because it will make the country more competitive for foreign trade and investment.

Reporter Mike Ogunji is from Columbus, Ohio, and studied public relations and information technology at the University of Cincinnati. He has been involved in the Model United Nations and We The People. Mike enjoys books, basketball, broadband and exploring the backwoods.

Drones

Aron Solomon: The New Horizon of Drones and Your Privacy

We have yet to wrap our minds around the impact of drones in our own lives and in society.

The author of this Expert Opinion is Aron Solomon, head of digital strategy for Esquire Digital

While many more of us understand what a drone is today than we did even two years ago, we have yet to wrap our collective minds around the impact of drones on our own lives and on the inner workings of our society. Like everything else that is new and strange at first, yet soon becomes commonplace, there is going to be a massive drone adjustment period for people – and we may be in it now.

For those of you who might be living on a remote island – wait, there are even drone-flying YouTube celebrities there. Okay, for any of you who genuinely don’t know what a drone is and what it can do: a drone is also known as a UAV, or unmanned aerial vehicle. While drones themselves are obviously a technology, what matters most about them are the other technologies a drone can house.

A drone can carry GPS, lasers and many other technologies, controlled by a user or users on the ground through ground control systems (GCS). In short, a drone can pack whatever the latest technology is. Think of James Bond’s spy shoes, except they fly, look cooler than a pair of brogues, and can easily surveil or even kill you.

Drone usage started small and is getting big. Back in 2016, there were bold predictions that drone usage would triple by 2020; reality has exceeded that number. A report from June shows the commercial drone market growing rapidly, with no signs it will slow down:

“The drone manufacturing industry is maturing – and so are drone customers.  As the capabilities of drones increase, they are used for more sophisticated and specific applications.”

While almost anyone could buy and fly a drone a few short years ago (though obviously not close to an airport or a takeoff or landing path), there are far more rules today than there have ever been:

  • New FAA rules require all drones to be registered unless they weigh less than 0.55 pounds and are flown recreationally. There are two types of registration in the United States: Part 107 and recreational.
  • You must now physically mark your drone with its registration number.
  • For business use of a drone, the FAA suggests you keep a flight log; the agency can request that information if it chooses to investigate an incident.
  • It is now illegal to shoot down a drone, even one flying over your own property that you suspect of recording you, because drones are protected as aircraft under federal law.
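The registration rules above amount to a simple decision procedure. As a rough sketch, here is how they might be encoded in Python – the function name and return labels are hypothetical illustrations drawn from the list above, not from any official FAA tool or API:

```python
def registration_type(weight_lbs: float, recreational: bool) -> str:
    """Classify a U.S. drone registration requirement.

    Hypothetical helper: the 0.55-pound threshold and the two
    registration categories come from the rules summarized above.
    """
    if recreational and weight_lbs < 0.55:
        return "exempt"        # light recreational drones need no registration
    if recreational:
        return "recreational"  # heavier recreational drones register recreationally
    return "part_107"          # any business use falls under Part 107


# Examples:
print(registration_type(0.3, recreational=True))    # exempt
print(registration_type(1.2, recreational=True))    # recreational
print(registration_type(0.3, recreational=False))   # part_107
```

Note that under this reading, commercial (Part 107) use requires registration regardless of weight; only light recreational drones escape the requirement.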

Tim George, an Erie, Pennsylvania lawyer, cautions us against believing we are still in the Wild, Wild West of drone flight:

“Anyone choosing to operate a drone needs to follow all registration and licensing requirements where they live. It’s important for every drone operator to remember that there might be municipal law they need to follow, as well as state and federal law. Being unaware of applicable drone laws will be no defense to criminal infractions or potential civil claims.”

But how well are people following the law?

Not very well, as an iPhone picture I took while writing this story highlights. It was taken at the observatory on top of a mountain in a large North American city, where an athlete and his team were using a (pretty intrusive) drone to film him running down a set of stairs.

It is worth noting that I had exactly as much permission to take that picture as the person in it and his team had to capture my image as the drone circled above and around me. In other words: absolutely none.

Ricky Leighton, a Maine-based certified drone pilot and video expert, cautions that this type of poor behavior will lead to tougher regulation:

“There are two things to consider here. The first is that drone pilots need to closely observe any rules and legislation where they choose to operate their drone. The second is a bit more nuanced in that there has to be common courtesy as to where, when, and how we operate our drones. The less courtesy we give, the stricter the regulations will eventually be.”

And don’t think that drones are or will be limited to consumer use. While relaxing in the park and having someone send their drone to hover ten inches from your face is pretty annoying, more serious drones for enterprise use are dramatically on the rise.

A year ago, Skydio announced that it had raised an additional $100 million financing round to continue what many feel is controversial work with governments and private enterprise. Put more simply, some fear that this rockstar ex-MIT and GoogleX team is building mass-surveillance drones and striking less-than-savory deals.

With one competitor, DJI, owning nearly 80 percent of the commercial drone market, multiple aggressive startups flush with cash are seeking to shake loose some of that market share as they grow their market caps.

As drones become more prevalent in our daily lives, our initial pushback against them may be dulled by their ubiquity. As with any other new technology, even one that can be pretty scary when we consider all of its dimensions, time usually makes us comfortable with things we expected would forever stretch our comfort zone.

Aron Solomon is the head of digital strategy for Esquire Digital and has taught entrepreneurship at McGill University and the University of Pennsylvania. Since earning his law degree, Solomon has spent the last two decades advising law firms and attorneys. He founded LegalX, the world’s first legal technology accelerator and was elected to Fastcase 50, recognizing the world’s leading legal innovators. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.


Artificial Intelligence

Int’l Ethical Framework for Auto Drones Needed Before Widescale Implementation

Observers say the risks inherent in letting autonomous drones roam require an ethical framework.

Timothy Clement-Jones was a member of the U.K. Parliament's committee on artificial intelligence

July 19, 2021 — Autonomous drones could potentially serve as a replacement for military dogs in future warfare, said GeoTech Center Director David Bray during a panel discussion hosted by the Atlantic Council last month, but ethical concerns have observers clamoring for a framework for their use.

Military dogs, trained to assist soldiers on the battlefield, are currently a great asset to the military. AI-enabled autonomous systems, such as drones, are developing capabilities that would allow them to assist in the same way — for example, inspecting inaccessible areas and detecting fires and leaks early to minimize the chance of on-the-job injuries.

However, concerns have been raised about these systems’ potential to impact human lives, including a recent report that an autonomous drone may have hunted down humans in asymmetric warfare and anti-terrorist operations.

As artificial intelligence continues to develop at a rapid rate, society must determine what, if any, limitations should be implemented on a global scale. “If nobody starts raising the questions now, then it’s something that will be a missed opportunity,” Bray said.

Sally Grant, vice president at Lucd AI, agreed with Bray’s concerns, pointing out the controversies surrounding the uncharted territory of autonomous drones. Panelists proposed the possibility of an international limitation agreement with regards to AI-enabled autonomous systems that can exercise lethal force.

Timothy Clement-Jones, who was a member of the U.K. Parliament’s committee on artificial intelligence, called for international ethical guidelines, saying, “I want to see a development of an ethical risk-based approach to AI development and application.”

Many panelists emphasized the immense risk involved if this technology falls into the wrong hands, offering examples stretching from terrorist groups to the paparazzi, and the power such actors could wield with that much access.

Training is vital, Grant said: soldiers need to feel comfortable with this machinery without becoming over-reliant on it. The idea behind deploying AI-enabled autonomous systems on missions, including during natural disasters, is that soldiers can use them as guidance to make the most informed decisions.

“AI needs to be our servant, not our master,” Clement-Jones agreed, emphasizing that soldiers should use AI as a tool to help them, not as guidance to follow blindly. He compared AI technology to phone navigation, pointing to the importance of keeping a map in the glove compartment in case the technology fails.

The panelists emphasized the importance of remaining transparent and developing an international agreement with an ethical risk-based approach to AI development and application in these technologies, especially if they might enter the battlefield as a reliable companion someday.


Artificial Intelligence

Deepfakes Could Pose A Threat to National Security, But Experts Are Split On How To Handle It

Experts disagree on the right response to video manipulation — is more tech or a societal shift the right solution?

Rep. Anthony Gonzalez, R-Ohio

June 3, 2021 – The emerging and growing phenomenon of video manipulation known as deepfakes could pose a threat to the country’s national security, policymakers and technology experts said at an online conference Wednesday, but how best to address it divided the panel.

A deepfake is a highly technical method of generating synthetic media in which a person’s likeness is inserted into a photograph or video to create the illusion that they were actually there. A well-done deepfake can make a person appear to do things they never did and say things they never said.

“The way the technology has evolved, it is literally impossible for a human to actually detect that something is a deepfake,” said Ashish Jaiman, the director of technology operations at Microsoft, at an online event hosted by the Information Technology and Innovation Foundation.

Experts are wary of the associated implications of this technology being increasingly offered to the general population, but how best to address the brewing dilemma has them split. Some believe better technology aimed at detecting deepfakes is the answer, while others say that a shift in social perspective is necessary. Others argue that such a societal shift would be dangerous, and that the solution actually lies in the hands of journalists.

Deepfakes pose a threat to democracy

Such technology posed no problem when only Hollywood had the means to produce such impressive special effects, said Rep. Anthony Gonzalez, R-Ohio, but it has progressed to the point that almost anybody can get their hands on it. With the spread of disinformation, and the challenge that poses to establishing a well-informed public, he said, deepfakes could be weaponized to spread lies and affect elections.

As of yet, no evidence exists that deepfakes have been used for this purpose, according to Daniel Kimmage, the acting coordinator for the Global Engagement Center of the Department of State. But he, along with the other panelists, agreed that the technology could be used to influence elections and deepen already growing mistrust of the information media. They believe it’s best to act preemptively and solve the problem before it becomes a crisis.

“Once people realize they can’t trust the images and videos they’re seeing, not only will they not believe the lies, they aren’t going to believe the truth,” said Dana Rao, executive vice president of software company Adobe.

New technology as a solution

Jaiman says Microsoft has been developing sophisticated technologies aimed at detecting deepfakes for over two years now. Deborah Johnson, emeritus technology professor at the University of Virginia School of Engineering, refers to this method as an “arms race,” in which we must develop technology that detects deepfakes at a faster rate than the deepfake technology progresses.

But Jaiman was the first to admit that, despite Microsoft’s hard work, detecting deepfakes remains a grueling challenge; it is much harder to detect a deepfake than to create one, he said. He believes a societal response is necessary, and that technology alone will be insufficient to address the problem.

Societal shift as a solution

Jaiman argues that people need to be skeptical consumers of information. Until the technology catches up, so that deepfakes can be detected more easily and misinformation snuffed out, he believes people need to approach online information with the understanding that they could easily be deceived.

But critics believe this approach of encouraging skepticism could be problematic. Gabriela Ivens, the head of open source research at Human Rights Watch, says that “it becomes very problematic if people’s first reactions are not to believe anything.” Ivens’ job revolves around researching and exposing human rights violations, but she says the growing mistrust of media outlets will make it harder for her to gain the necessary public support.

She believes that a “zero-trust society” must be resisted.

Vint Cerf, the vice president and chief internet evangelist at Google, says it is up to journalists to prevent the growing spread of distrust. He accused journalists not of deliberately lying, but of oftentimes misleading the public. He believes the true risk of deepfakes lies in their ability to corrode America’s trust in truth, and that it is up to journalists to restore that trust by being completely transparent and honest in their reporting.

