Panel Analyzes Benefits and Challenges of Cloud Computing for Government Agencies

Broadband's Impact, Congress, Cybersecurity, House of Representatives, Privacy | July 12th, 2013

Reporter, Broadband Census News

WASHINGTON, July 12, 2013 – A panel of experts discussed the potential for the use of cloud computing by federal agencies at an event held by the Information Technology and Innovation Foundation on Wednesday morning.

The security of cloud computing was a primary concern for many of the panelists. Matt Wood, General Manager of Data Science for Amazon Web Services, described the cooperative approach that Amazon takes to security on its cloud services. He said Amazon secures the underlying infrastructure itself, but customers are responsible for securing their own systems that utilize cloud computing.

Terry Halvorsen, Chief Information Officer of the Department of the Navy, also suggested careful consideration of what data to put in cloud storage as another solution to security concerns. Data that is accessible to the public under the Freedom of Information Act can be placed on public cloud storage without fear, he noted.

Frank Baitman, Chief Information Officer for the Department of Health and Human Services, and Joseph Klimavicz, Chief Information Officer of the National Oceanic and Atmospheric Administration, both agreed that continuous monitoring was key to secure cloud computing. With such monitoring in place, federal agencies would be able to instantly identify and track any breaches of security.

Baitman also noted that the transition to cloud computing brings new security challenges that are not encountered with internal storage of data. Consequently, Halvorsen recommended a careful evaluation of the tradeoffs between those new concerns and the lower cost of cloud storage.

These lower costs were also a highlight of the discussion. Baitman pointed out that the frequent hardware and software updates that agencies currently must undergo would be eliminated by utilizing third-party cloud storage.

According to Wood, Amazon’s goal is to reduce the cost of cloud computing to such a degree that customers do not even think about the cost, much like utilities. Such low costs would allow agencies to switch from a capital expenditure model, which can be costly and unpredictable, to an operational expenditure model, which is relatively stable and cheap.

Consequently, these agencies would be more able to try new approaches with their programs, spurring innovation.

“As you move into this operational model, the cost of experimentation is much lower,” Wood said.

Cloud computing also opens the door to other innovations through practices such as open data and collaboration, as David Robinson, Chief Innovation Officer for SAP Public Sector, asserted.

“What’s really powerful is the whole new range of outcomes,” he said.

However, Halvorsen also argued that no single approach could guarantee the best next-generation government services. For example, he disputed a complaint from Rep. Gerry Connolly, D-Va., that too many agencies were failing to close an adequate number of data centers.

Halvorsen countered that the total cost of data storage, rather than the number of data centers, should be the metric for measuring the efficiency of data storage. Because many data centers are part of larger facilities, he argued that such closures would do little to cut costs. Instead, the government should look to various strategies that have proven successful in commercial industries.

“There will not be one single answer,” Halvorsen said.
