Connectbase CEO Predicts Edge Technology is What’s Next for AI Inference
“Edge is real and exciting today,” CEO Ben Edmond said. “It's been talked about for almost a decade now, but the reality is we're now seeing, because of AI, real use cases.”
Eric Urbach
WASHINGTON, April 16, 2026 – Edge technology is at the forefront of AI inference growth, according to Connectbase CEO and founder Ben Edmond.
In a webinar on Wednesday with INCOMPAS CEO Chip Pickering, Edmond noted that while massive data centers and hyperscalers may be driving the major conversations around AI inference, some of the most interesting developments are happening at the edge. Connectbase, headquartered in Westborough, Mass., is a SaaS platform for buying and selling connectivity globally.
Edge inference is a type of AI deployment on localized devices that allows for real-time data processing and analysis without reliance on cloud computing or an internet connection. Rooted in the edge computing framework, edge inference processes data locally for applications that require analysis within milliseconds.
“Edge is real and exciting today,” Edmond said. “It's been talked about for almost a decade now, but the reality is we're now seeing, because of AI, real use cases.”
Prior to founding Connectbase, Edmond served as the chief revenue officer at Global Capacity, where he was responsible for all aspects of the company’s customer go‑to‑market strategy and execution.
Some of these use cases, according to Edmond, involve autonomous drones or self-driving cars, where fast response times are essential. Edmond noted that because edge systems are smaller in size and scale, concerns around resource depletion, energy consumption, or the general NIMBYism directed at data centers are muted, allowing for faster innovation and more creativity.
“You need proximity, and you need proximity that's intelligent, so putting the edge data centers in and around the control points needed to interact with the autonomous world on drones, vehicles, whatever it is, I think is a major demand driver and that that world is moving in that direction,” Edmond said.
Edge inference also has more practical applications, such as AI-powered radiology, according to Edmond. He noted that low-latency processing is essential for running models that assist doctors at local facilities, delivering analysis instantly and at lower cost. He said this type of use case is now possible with edge AI, where it was only a pipe dream just a few years ago.
“It boils down to a core belief that, one, AI is real and it is impacting us today,” Edmond said. “[We] need to prepare for AI as an infrastructure market shift, just like we prepared when the iPhone was released and what implications it had for the cellular network.”