Expert Opinion
Geoff Mulligan: A ‘Dumb’ Way to Build Smart Cities

In every corner of the country and around the world, leaders are trying to make their cities “smarter.” Some of these projects respond to specific and ongoing demands — such as parking, overcrowding, noise, and pollution — while others address broader goals — such as reducing energy consumption, improving traffic flow, or sustainability. But as is often the case with grand ideas, many are taking the wrong approach. It’s simply impossible, in one sweep, to build a Smart City. Just as the internet and the web didn’t spring forth fully formed, as if from some “master plan,” Smart Cities must be built as organic, independent, and yet connected pieces. As Stewart Brand cogently argued, even buildings have to learn in steps.
A Smart City roadmap is invaluable, laying out a direction to help set expectations. However, it shouldn’t define specific cross-system technologies and implementation details, nor plan for all projects to launch or complete simultaneously. Projects must instead be created as separate solutions to individual problems, each built as an independent service and stitched together by open standards and open application programming interfaces (APIs). That’s how they must grow if we want them to succeed — learning by iteration.
Today’s problem
In the rush to “capture the market,” companies are selling “complete visions” — though incomplete solutions – of how their systems can solve the ills that plague the modern city. City planners, managers, and officials get sold the idea that these companies have some kind of silver bullet that, in a single solution, integrates all city functions and enhances their capabilities, thus making them work together efficiently. But this belies the true nature of the problem: none of us are smart enough to fully appreciate or understand the complexity of managing all the functions that go into making a city work. The sheer diversity of the systems ensures that no single technology can be applied as “the” solution. In addition, the timeframe for implementing these disparate programs can vary widely, meaning that technology selected at the start of one project will likely be obsolete by the start of another.
Worse yet, these companies are also selling and deploying products that are based on closed, proprietary systems. They include proprietary radios, single-purpose hardware, proprietary software and protocols, and closed web applications and portals. These designs constrain innovation and interfere with interoperability between newer and older systems, often saddling the new with the constraints of the past. This is a Trojan horse: a solution that requires all future systems to use these proprietary components, thereby locking the city into that particular vendor for the rest of its days, limiting design and technology choices and stifling innovation and the adoption of newer technology.
It’s not all gloom and doom. With the application of open systems and implementation of a service-oriented architecture, future technology can be built that’ll integrate more seamlessly with previous technology investments.
Choose a different path
We’ve learned from the lean-agile community to build success in small, incremental steps rather than one grand leap. But with the different needs, design patterns, and timeframes, how is it possible to accomplish building a Smart City in small steps? It’s done by leveraging the nature of the internet itself, complete with open standards and open APIs. By decoupling every system and eliminating hidden interfaces, we can relieve the pressures of time and technology interdependencies, thereby allowing greater innovation in each separate project while “future proofing” the design decisions.
We use different materials and architecture to construct buildings with differing purposes (hospitals vs. homes vs. high-rises), but there’s a consistency even within these varying buildings for standard electrical and plumbing connections. Smart City projects can adopt this same design pattern. This means that for a parking project, the city can pick the most appropriate communication technology but require that the system be built on open standard protocols that underlie the internet (for example, HTTP, IP, TCP, and MQTT), use data formats such as JSON or XML, and have open APIs.
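To make this design pattern concrete, here is a minimal sketch (in Python, using the open-source paho-mqtt library) of a parking reading published as plain JSON over MQTT, one of the open protocols named above. The broker address, topic name, and payload fields are illustrative assumptions, not a description of any particular city system.

```python
# A minimal sketch of the pattern described above: a parking sensor's backhaul
# service publishes a small JSON reading over MQTT, an open standard protocol.
# The broker address, topic, and payload fields are hypothetical.
import json
import time

import paho.mqtt.client as mqtt  # paho-mqtt 1.x style client;
                                 # 2.x releases also take a callback_api_version argument

BROKER = "mqtt.example-city.gov"      # hypothetical city-run broker
TOPIC = "parking/zone-12/space-0042"  # hypothetical topic naming scheme

client = mqtt.Client()
client.connect(BROKER, 1883)

reading = {
    "space_id": "zone-12/space-0042",
    "occupied": True,
    "battery_v": 3.1,
    "ts": int(time.time()),
}

# Publish the reading as plain JSON so any other system that speaks
# MQTT + JSON can consume it without a vendor-specific gateway.
client.publish(TOPIC, json.dumps(reading), qos=1)
client.disconnect()
```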
Greater than the sum of the parts
Instead of a complete Smart City that’s decades in the making, city managers can look for the “low-hanging fruit” or the “greatest pain point” and more quickly build a point solution, knowing that it can simply be connected to any future systems in a scalable and secure manner. A smart parking system for city streets or a parking garage built using LoRa today can be connected to a city traffic management system built using NB-IoT next year, as long as both use open APIs and avoid closed, proprietary solutions, including “walled garden” cloud solutions.
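As a rough illustration of what connecting such systems “through open APIs” could look like, the sketch below exposes the parking system’s occupancy data over plain HTTP and JSON using Flask. The endpoint path and data fields are hypothetical; the point is that a later traffic management system, whatever radio technology it uses, could consume this without knowing anything about LoRa.

```python
# A hedged sketch of the parking system's open API: a tiny HTTP service that
# publishes current occupancy as JSON so a future traffic management system
# (NB-IoT or otherwise) can query it without caring how the parking sensors
# themselves communicate. The endpoint path and data shape are assumptions.
from flask import Flask, jsonify

app = Flask(__name__)

# In a real deployment this would be read from the parking system's database.
OCCUPANCY = {"garage": "5th-and-main", "total_spaces": 420, "occupied": 311}

@app.route("/api/v1/parking/occupancy")
def occupancy():
    # Plain JSON over HTTP: any standards-based client can consume this.
    return jsonify(OCCUPANCY)

if __name__ == "__main__":
    app.run(port=8080)
```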
The next city improvement project — a smart street light system, for example — might require a completely different communication technology from the previous parking system. Streetlights are up high and more distributed than parking meters or parking spaces in a garage. Streetlights have power, whereas a parking sensor will likely be battery-operated. These different requirements would necessitate the use of different communication technologies, but both systems can be interconnected through common protocols and APIs. Through open APIs, this interconnectivity doesn’t need to be designed in from the beginning but can be added after each of the separate systems is installed.
For example, the streetlight system that’s installed today could be connected to traffic flow sensors installed tomorrow. The two systems may use completely different communication technologies and protocols. This new combination — streetlight and traffic flow sensors connected through open APIs — could offer an innovative solution for reducing streetlight energy usage by dimming lights when there are no cars, but increasing the brightness prior to the cars’ arrival based on messages from the traffic flow system.
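The loose coupling described here can be sketched in a few lines: a small service subscribes to the traffic flow system’s events and calls the streetlight system’s open API, even though neither system was designed around the other. The topic name, API URL, and brightness scale below are assumptions made purely for illustration.

```python
# A sketch of the streetlight/traffic-flow combination described above:
# listen for traffic events over MQTT and call the streetlight system's
# open HTTP API to raise or lower brightness. Topic, URL, and brightness
# scale are hypothetical.
import json

import paho.mqtt.client as mqtt
import requests

TRAFFIC_TOPIC = "traffic/segment-7/flow"                    # hypothetical topic
STREETLIGHT_API = "https://lights.example-city.gov/api/v1"  # hypothetical URL

def on_message(client, userdata, msg):
    event = json.loads(msg.payload)
    # Dim the lights when no vehicles are approaching; brighten ahead of arrivals.
    brightness = 100 if event.get("vehicles_approaching", 0) > 0 else 30
    requests.post(
        f"{STREETLIGHT_API}/segments/7/brightness",
        json={"level": brightness},
        timeout=5,
    )

client = mqtt.Client()
client.on_message = on_message
client.connect("mqtt.example-city.gov", 1883)  # hypothetical broker
client.subscribe(TRAFFIC_TOPIC)
client.loop_forever()
```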
The use of and adherence to open APIs and microservices brings another benefit — decoupled velocity. This means that even concurrent projects can be built at different speeds and rolled out at different times, yet combined once each is completed and functional. As in the example above, the smart streetlight project might end up taking longer to deploy because of the sheer number of devices, whereas the traffic flow sensors might be installed sooner. Open APIs free each system from interdependencies of timing and implementation speed.
Vendor lock-in and future-proofing
Another benefit of open standards and APIs is the elimination of vendor lock-in, which occurs when a vendor wins all future business because it alone holds the keys to the design and the data. Vendor lock-in squelches innovation: you’re only as innovative as the vendor wants to be or lets you be. If a city needs a design or solution that isn’t in the vendor’s current portfolio, the city’s choices are to wait, to pay more to have the vendor add it to its roadmap, or to go outside the ecosystem and use some sort of gateway (but gateways are evil; see below) to translate protocols and data and interconnect the systems.
Instead, open standards and APIs bring the ability to incorporate and evolve with newer technologies and systems. But, much like vendor lock-in, you can run afoul of technology lock-in. Imagine having built a Smart City project requiring the use of videotape and now not being able to adopt streaming technologies because they’re incompatible. Technology changes rapidly; in just a few years, we’ve moved from 2G to 3G and now to 5G in the cellular environment. By using open standards to decouple the higher-layer protocols from the lower layers, technology can evolve and systems using older tech can easily interconnect. In this way a system deployed using 4G today can interoperate with 5G systems tomorrow and 6G and 7G systems in a few years.
The underpinnings of innovation
Avoiding vendor and technology lock-in is critical to allow for innovation. Nothing will be more detrimental to a city’s infrastructure and future than being bound to a vendor and having to ask for permission to enhance or extend a system’s functionality. As new technology comes to market and new services are brought out to solve other city issues, the ability to quickly test and connect them to existing solutions is necessary for offering evolving solutions and bringing more opportunities for innovation and cost reduction. When you embark on your next project, ask your vendors: “Do you use open standard protocols?” and “How are your APIs and data published?”
Avoid these traps — the ‘evil’ gateway and ‘private clouds’
One tool that many vendors attempt to leverage to show openness and interoperability is the “gateway.” They claim that they provide, or can build, a gateway to connect to other systems. Gateways are a never-ending trap on so many levels:
- they’re a single point of failure;
- they’re a single point of attack for hackers;
- they require complex coordination between systems;
- maintenance and updates are costly or non-existent;
- updates need to be managed;
- they add extra costs for hardware and power; and
- they’re closed and proprietary.
The second trap is private clouds and walled gardens. The vendor will claim that they use “all of the open internet standards,” listing protocol after protocol, but they use these protocols only to send the data (your data) into a closed, proprietary cloud system — locking it away so that only they have the keys. This is akin to building a road that leads to a cul-de-sac, which is blocked by a locked gate that only lets traffic in. Then, new systems must be built to connect through this cloud, likely via closed and proprietary interfaces. In the end, only other systems in this closed ecosystem can be used for future projects, thereby limiting innovation and increasing time and costs. Sending data to the cloud isn’t a panacea, as many vendors would like to suggest.
Who owns the data — that is, your data
In Smart City projects, improving city services or infrastructure is the leading driver for implementation, but the greatest benefits will come from the availability of the data gathered by these projects and new systems. Unfortunately, many of the Smart City systems being proffered today lock access to the data away in walled gardens, as mentioned before. It’s imperative that the data be sent to city-owned and managed servers or the city’s data lake, or be made available without license through open APIs. Only in this way will the city and future Smart City projects be able to use and leverage the wealth of information that is the real underlying value of these types of projects.
A related concern surrounding data ownership is the right to use and sell the data created by the Smart City project — a valuable commodity. Throughout the life of the project, it should be clear that the city owns all rights to the data. The vendor may not access, distribute, or sell any of the data, whether in raw or aggregated form, without the explicit permission of the city. Only in this way will you be able to protect the rights and privacy of the city and its citizens.
Choosing the right project
By adopting open standards and APIs, you’re now able to embark on a Smart City project without having to solve all other city problems at the same time or constrain them with the choices made today. But choosing the “right” project is important. In some cases, it’s prudent to choose a small, fast, low-cost project. This allows you to get your feet wet, test vendors, accomplish a project in a short time, and hopefully succeed; but if you fail, fail fast, learn, and move on. There is sometimes a problem with these projects, though: they may have little impact, and they may cause others to look upon them as “ho hum.”
An alternative is to choose a project that’s a large “pain point” for the city. By definition, these projects have great visibility and impact, but may have far greater risk and take much longer to complete. They don’t generally meet the rules for lean-agile, but the small “safe” projects may not show off the true benefits that a Smart City can bring. Solve this by using divide and conquer. Rather than implementing smart parking across the entire city, choose to focus on a particularly congested city section or single parking structure.
Building success
When a city sets out to become smarter by investing in a Smart City project, use this checklist to evaluate the project:
- Does it start small and scale well? This is better than a monolithic solution that requires a gigantic investment.
- Is it locking the city into technologies, or, even worse, vendors? Does it exclude other vendors?
- Is it open? What protocols are used? Are the APIs published and open?
- Did the vendor mention or require (evil) gateways?
- Does it solve a problem for the city quickly, even if it’s only a small problem?
- What will the city be able to learn from taking on this project?
- Who owns the data?
Through the strict application and requirement of openness, your Smart City project can be delivered in a way that’s quick, beneficial, evolvable, and scalable. Our cities can and will become smarter and better places to live through small steps and open standards — open APIs and microservices are the foundational stepping stones to that future.
Geoff Mulligan is IoT Practice Lead at Skylight Digital and CTO for IIoT at Jabil. He is a past founder and chairman of the LoRa Alliance and the IPSO Alliance, a former White House Presidential Innovation Fellow on IoT, and the creator of 6LoWPAN. This article originally appeared on the author’s website and is reposted with permission.
BroadbandBreakfast.com accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@broadbandcensus.com. The views reflected in Expert Opinion pieces do not necessarily reflect the views of BroadbandBreakfast.com and Breakfast Media LLC.
Broadband's Impact
Lindsay Mark Lewis: As Inflation Spiked, Broadband is ‘The Dog That Didn’t Bark’
Why have internet prices remained constant while demand surges? It all boils down to investment.

There are many lessons to be learned from last year’s midterms, but Democrats should not take the results as some broad endorsement of the economic status quo. Midterm voters identified inflation as the most important issue driving their votes. And while the latest Labor Department data shows the producer price index decreasing by 0.1% in February, prices remain 4.6% higher than a year ago, which means lawmakers still have work to do to bring inflation under control.
And as they search for ideas, they may want to examine the dog that didn’t bark – in particular, the one sector of the economy that has been an interesting counternarrative to the otherwise troubling inflation story.
Home internet service is one of the few major living costs that isn’t skyrocketing. In fact, the most popular broadband speed tier one year ago actually costs 15% less today, on average.
This success story – and the bipartisan policies behind it – offers important lessons.
Remarkably, broadband prices are declining even as demand surges. The pandemic made home internet service more essential than ever for education, job opportunities and health care – all driving internet traffic 25% to 50% above pre-pandemic levels.
So why have internet prices remained constant – even declined by some measures – while demand surges? In short, it all boils down to investment.
When the pandemic cratered economic activity in the spring of 2020, executives in many industries – from lumber to oil refineries to computer chips – made the snap decision to pull back on long-term investments in new factories and manufacturing capacity. When the economy roared back, those industries couldn’t meet demand, sending prices soaring.
In the broadband industry, conversely, providers responded by investing $86 billion into their network infrastructure in 2021 – the biggest one-year total in nearly 20 years. These investments are fueling faster speeds – fixed broadband speeds are up 35% nationwide in the past 12 months – while making sure networks have the capacity to handle growing traffic needs.
This teaches us three things.
First, we should observe a Hippocratic oath and “do no harm.” America’s broadband system has thrived under a decades-long bipartisan consensus for light-touch, pro-investment policies. Nearly $2 trillion in private capital built the networks that now deliver American consumers higher speeds at lower per-megabit prices than consumers enjoy in Europe, despite having to cover greater distances and more difficult terrain.
This further tells us that it’s precisely the wrong time to abandon this successful model in favor of price controls and utility-style regulation, as some House and Senate progressives have proposed. Even Democratic policy experts acknowledge that approach would be toxic for private investment.
Second, policymakers need to recognize that broadband isn’t immune from the supply chain crunches plaguing so many other sectors of the economy. Broadband buildouts are already getting delayed by shortages in fiber cable, network hardware and skilled labor. And that’s before $42 billion in federal infrastructure funding goes out the door starting next year, which will only intensify demand for these scarce supplies.
That means rural buildout projects funded by federal dollars are likely to see greater inflationary pressures — and take longer to complete — than Congress expected when it passed the infrastructure bill in 2021. That will put pressure on state broadband offices to be even more diligent about waste, and to emphasize reliable supply chains with experienced network builders. Bidders will also need the flexibility to buy fiber from wherever they can manage to source it, even if that means relaxing the program’s strict “Buy American” rules. This requires regulators with the ability to make smart tweaks to the rules to expedite buildouts cost-effectively.
Third, we need to help more financially struggling households get connected. Thanks to President Joe Biden’s Affordable Connectivity Program – and an agreement with 20 broadband companies – 48 million households can now get home internet service for free.
But more than a year later, just over a third of eligible households have signed up. Investing in enrollment campaigns and digital literacy training programs is the fastest way we can crank up the dial on enrollment. Relatively small investments here could pay huge dividends in bringing millions more Americans into the digital economy.
Even with these remaining challenges, the overall contours of American broadband policy – encouraging investment, competition and affordability – are working well. And as the saying goes: “If it ain’t broke, don’t fix it.” In an inflation-roiled economy that defies easy answers, we should learn from – not mess with – this all-too-rare success story.
Lindsay Mark Lewis is executive director of the Progressive Policy Institute. Contact him at llewis@ppionline.org. This piece was originally published in the Richmond Times on March 24, 2023, and is reprinted with permission.
Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views reflected in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.
Expert Opinion
David Strauss: How Will State Broadband Offices Score BEAD Applications?
Fiber, coax and fixed wireless network plans dependent on BEAD funding demand scrutiny.

Given the vital ways in which access to broadband enables America, adequate Internet for all is a necessary and overdue undertaking. To help close the digital divide, the Infrastructure Investment and Jobs Act includes $42.5 billion in Broadband Equity, Access and Deployment funding for the last mile. Add to this the estimated level of subgrantee matching funds and the total last-mile figure rises to $64 billion, according to the BEAD Funding Allocation and Project Award Framework from ACA Connects and Cartesian.
The federal funds will be disbursed by the Department of Commerce’s National Telecommunications and Information Administration to the State Broadband Offices, which will then award subgrants to service providers. On June 30, each state will find out its allocation amount. By 2024, the states will establish a competitive subgrantee process to start selecting applicants and distributing funds.
A critical element of the selection process is the methodology for scoring the technical merits of each subgrantee and its proposal. Specific assessment criteria to be used by each state are not yet set. However, the subgrantee’s network must be built to meet these key performance and technical requirements (a simple sketch of how they might be checked follows the list):
- Speeds of at least 100 Megabits per second (Mbps) download and 20 Mbps upload
- Latency low enough for “reasonably foreseeable, real-time interactive applications”
- No more than 48 hours of outage a year
- Regular conduit access points for fiber projects
- Begin providing service within four years of subgrant date
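To illustrate, here is a minimal sketch, in Python, of how a scoring tool might check a proposal against the headline thresholds above. The speed, outage, and timeline figures come from the list; the numeric latency ceiling is an assumption, since the stated requirement is qualitative, and the data structure is purely hypothetical.

```python
# A rough sketch of checking a proposal against the headline BEAD technical
# thresholds listed above. The latency requirement is qualitative, so the
# numeric ceiling used here is an assumption for illustration only.
from dataclasses import dataclass

@dataclass
class NetworkProposal:
    download_mbps: float
    upload_mbps: float
    latency_ms: float
    projected_outage_hours_per_year: float
    years_to_service: float

def meets_bead_minimums(p: NetworkProposal, latency_ceiling_ms: float = 100.0) -> list[str]:
    """Return a list of unmet requirements (an empty list means every check passes)."""
    failures = []
    if p.download_mbps < 100 or p.upload_mbps < 20:
        failures.append("speed below 100/20 Mbps")
    if p.latency_ms > latency_ceiling_ms:  # assumed numeric proxy for "real-time capable"
        failures.append("latency too high for real-time applications")
    if p.projected_outage_hours_per_year > 48:
        failures.append("more than 48 hours of outage per year")
    if p.years_to_service > 4:
        failures.append("service not available within four years of subgrant")
    return failures

# Example: a plan promising 150/25 Mbps, 40 ms latency, 30 outage hours a
# year, and service in 3 years passes every check.
print(meets_bead_minimums(NetworkProposal(150, 25, 40, 30, 3)))  # -> []
```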
What level of scrutiny will each state apply in evaluating the technical merits of the applicants and their plans?
Based on our conversations with a number of state broadband leaders, the answers could be as varied as the number of states. For example, some states intend to rigorously judge each applicant’s technical capability, network design and project readiness. In contrast, another state believes that a deep upfront assessment is not needed because the service provider will not receive funds until certain operational milestones are met. Upon completion, an audit of the network’s performance could be implemented.
We, at Broadband Success Partners, are a bit biased about the level of technical scrutiny we think the states should apply. Having assessed over 50 operating and planned networks for private sector clients, we appreciate the importance of a thorough technical assessment. Our network analyses, management interviews and physical inspections have yielded a number of valuable dos and don’ts. By category, below are some of the critical issues we’ve identified.
Network Planning & Design
- Inadequate architecture, lacking needed redundancy
- Insufficient network as-built diagrams and documentation
- Limited available fiber with many segments lacking spares
Network Construction
- Unprotected, single leased circuit connecting cities to network backbone
- Limited daisy-chained bandwidth paths on backhaul network
- Lack of aerial slack storage, increasing repair time and complexity
Network Management & Performance
- Significant optical ground wire plant, increasing potential maintenance cost
- Internet circuit nearing capacity
- Insufficient IPv4 address inventory for planned growth
Equipment
- Obsolete passive optical network equipment
- Risky use of indoor optical network terminals in outdoor enclosures
- Sloppy, untraceable wiring
Technical Service / Network Operations Center
- Technical staff too lean
- High labor rate for fiber placement
- Insufficient NOC functionality
While the problems we uncover do not always rise to the level of a red flag, they do so often enough to justify this exercise. Our clients who invest their own capital in these networks certainly think so. The same should hold true for networks funded with taxpayer money. Fiber, coax and fixed wireless network plans dependent on BEAD funding demand serious scrutiny.
David Strauss is a Principal and Co-founder of Broadband Success Partners, the leading broadband consulting firm focused exclusively on network evaluation and technical due diligence. This piece is exclusive to Broadband Breakfast.
Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views reflected in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.
Expert Opinion
Raul Katz: Can Investments in Robust Broadband Help States Limit the Downside of Recession?
If managed effectively, the BEAD program could play a key role in allowing our economy to weather the storms ahead.

The United States economy is still contending with persistent inflation, high interest rates, and stock market volatility. According to a Wall Street Journal survey conducted in January, economists put the probability of a recession at 61 percent.
Simultaneously, we are also on the eve of the largest federal broadband funding distribution in American history. All 50 U.S. states have begun formulating plans to help connect their communities through the $42.5 billion Broadband Equity, Access and Deployment Program, and its funds are expected to be distributed within months. That, coupled with the Affordable Connectivity Program and other initiatives designed to subsidize broadband access, will play a critical role in connecting every American to the internet. This once-in-a-generation investment in building more robust and resilient broadband networks can help states weather the coming economic storm. To learn how, we simply need to look back to March 2020.
When the COVID-19 pandemic initially cratered the economy, states that had a higher rate of fixed broadband penetration were more insulated from its disruptive effects. Simply put, better-connected states had more resilient economies, according to a study I authored for Network:On. In a separate study, using an economic growth model that accounts for the role fixed broadband plays in mitigating the societal losses resulting from the pandemic, I also found that more connected societies exhibit higher economic resiliency during a pandemic-induced disruption.
In the study conducted for Network:On, we documented that U.S. states with higher broadband adoption rates were able to counteract a larger portion of the economic losses caused by the pandemic than states with lower broadband adoption rates. The states most adversely affected by the pandemic, such as Arkansas and Mississippi, were those exhibiting lower broadband penetration rates. Conversely, states with higher broadband penetration, such as Delaware and New Jersey, were able to mitigate a large portion of losses, as connectivity levels allowed for important parts of the economy to continue functioning during lockdowns.
Nationally, if the entire U.S. had penetration figures equal to those of the more connected states during the pandemic, GDP would have contracted only one percent — a much softer recession than the actual 2.2 percent. These findings show that investments in closing the digital divide and ensuring everyone can access a high-speed Internet connection are critical to building economic resilience.
Today, wide penetration rate disparities exist between states — such as Delaware’s rate of 91.4 percent compared to Arkansas’ rate of 39.7 percent. Because of this, public authorities should focus on creating policy frameworks that allow operators to spur infrastructure deployments and find the optimal technological mixes to deliver the highest performance to users.
Broadband access matters. It doesn’t exist in a vacuum and is crucial to an area’s economic health. As state broadband offices around the country prepare to deploy BEAD funding, they must remember that broadband access and adoption are imperative to building economic resiliency.
Beyond my own study, a review of the research examining the economic impact of digital technologies over the past two decades confirms that telecommunications and broadband positively impact economic growth, employment, and productivity. This reinforces how consequential these government investments in broadband infrastructure and adoption are to protecting America’s economic health.
The BEAD program still has its challenges, but if managed effectively, it could play a key role in allowing our economy to weather the storms ahead.
Dr. Raul Katz is the president at Telecom Advisory Services LLC and author of the study: The Role of Robust Broadband Infrastructure in Building Economic Resiliency During the COVID-19 Pandemic. This piece is exclusive to Broadband Breakfast.
Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views reflected in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.