Three key tech trends shaping connectivity in 2026
By Lily Bennett | 6 January 2026
As enterprise IT continues to evolve, networking is becoming a strategic lever – one that directly impacts agility, resilience and the ability to compete. In a recent podcast episode, our CTO Paul Gampe shares the technologies he thinks will matter most for the foreseeable future.
Three themes stood out: the rise of Network-as-a-Service (NaaS), the shift toward transmission on-demand at layer one and the rapid evolution of Model Context Protocol (MCP). Each represents a meaningful change in how networks are built, consumed and automated.
Let’s break them down.
1. Network-as-a-Service moves from “nice to have” to table stakes
For years, NaaS was discussed as a "nice to have" option, but that's beginning to change. As more production workloads move into the cloud, connectivity has had to keep pace. With sustained double-digit growth across hyperscalers like AWS, Google Cloud, Microsoft and others, the network has become a core part of the economic engine that underpins digital services.
"Network-as-a-Service is the tollbooth of connectivity to cloud; it's become core to the economic engine of network providers," Paul explains.
What's changed recently is not just adoption, but mindset. NaaS is now seen as essential to staying relevant in enterprise connectivity, with 90% of network providers recognising NaaS and API adoption as strategically critical. The open question is whether they are truly API-ready to support it.
The good news is that clarity is emerging. Over the past year, roles across standards bodies have become more clearly defined:
- GSMA and CAMARA focusing on programmable mobile and radio access networks.
- MEF LSO (Sonata APIs) establishing itself as the standard for east-west connectivity.
- TM Forum addressing north-south integration across OSS, BSS and vendor infrastructure.
This growing alignment reduces fragmentation and gives providers and enterprises a clearer roadmap. As we enter 2026, NaaS won't be about experimentation; it will be the baseline expectation.
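To make "API-ready" a little more concrete, here is a rough sketch of what a buyer's automation placing a Sonata-style quote request against a provider's NaaS API could look like. The base URL, path and payload fields below are simplified placeholders for illustration, not the actual MEF LSO schema.

```python
# Rough illustration only: a buyer's automation requesting a connectivity quote
# from a provider's NaaS API in a Sonata-style flow. The base URL, path and
# payload fields are simplified placeholders, not the actual MEF LSO schema.
import requests

PROVIDER_API = "https://api.example-carrier.net/naas/quoteManagement/v1"  # hypothetical

quote_request = {
    "quoteItem": [
        {
            "id": "1",
            "action": "add",
            # The product itself (e.g. an Ethernet E-Line between two sites)
            # would be described here using the provider's product schema.
            "requestedTerm": {"amount": 12, "units": "calendarMonths"},
        }
    ],
}

response = requests.post(f"{PROVIDER_API}/quote", json=quote_request, timeout=30)
response.raise_for_status()
print("Quote state:", response.json().get("state"))
```

The point is less the specific payload than the pattern: when quoting and ordering are exposed as well-documented APIs, connectivity can be consumed programmatically rather than negotiated ticket by ticket.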
2. Transmission on-demand brings software thinking to layer one
Another major shift is happening lower down the stack. Traditionally, layer one connectivity, especially subsea and long-haul infrastructure, has been slow to procure and even slower to provision. But that model is being challenged by transmission on-demand: deterministic, high-capacity connectivity that can be provisioned in an automated, software-like way.
Combining trusted physical infrastructure with streamlined digital workflows dramatically shortens the quote-to-cash cycle. What once took weeks or months can now happen with a few clicks.
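As a purely hypothetical illustration, an on-demand transmission order might be reduced to a single API call and a short wait; the endpoint, site codes and fields below are invented for the example and don't reflect any specific provider's catalogue.

```python
# Purely hypothetical sketch of an on-demand transmission order: request
# capacity between two sites, then poll until it is live. The endpoint, site
# codes and fields are invented for illustration, not a real provider API.
import time
import requests

API = "https://api.example-carrier.net/transmission/v1"  # hypothetical

order = {
    "aEnd": "SYD1",          # placeholder data-centre / landing-station codes
    "zEnd": "SIN2",
    "bandwidthGbps": 400,
    "termMonths": 1,
}

service = requests.post(f"{API}/services", json=order, timeout=30).json()

# In an on-demand model the wait is minutes, not the weeks or months of a
# traditional layer-one procurement cycle.
while service.get("status") != "ACTIVE":
    time.sleep(30)
    service = requests.get(f"{API}/services/{service['id']}", timeout=30).json()

print("Capacity live:", service["id"])
```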
That speed matters for resilience. Subsea cables, like all infrastructure, experience outages. When they do, the ability to stand up new capacity quickly isn't just convenient; it's critical.
Paul explains that the most exciting driver here is AI. As AI workloads grow, so does the need to move massive training datasets, synchronise data across clouds and support federated learning models. These use cases demand deterministic connectivity – no jitter, no congestion, no unpredictable latency – qualities that packet-switched networks can't always deliver.
"That deterministic nature of a transmission product has an entirely new evolving use case in AI connectivity," says Paul.
We’re already seeing innovation at the LAN level with technologies like RDMA over Converged Ethernet, Nvidia Spectrum-X and rail-based architectures. The next frontier is the WAN. As AI workflows increasingly span multiple locations, models and clouds, wide-area transmission on demand becomes a foundational capability, not a niche one.
3. Model Context Protocol turns intent into action
If NaaS is about consuming networks differently, MCP is about interacting with them differently.
Since its emergence in late 2024, MCP has gained rapid attention – and for good reason. It addresses a long-standing gap between conversational AI and real operational systems. Understanding intent is one thing. Acting on it is another.
What makes MCP compelling is how effectively it connects the agent to the action. With well-structured APIs and high-quality documentation, MCP allows AI agents to move beyond insight and into execution.
In practice, that means an AI agent can:
- Query available data centres in a specific location.
- Request and compare connectivity quotes.
- Place an order.
- And trigger provisioning — all through natural language interaction.
What once required manual workflows across multiple systems can now be orchestrated end-to-end.
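To ground this, here is a minimal sketch of how such workflows could be exposed to an agent as MCP tools, using the official Python MCP SDK's FastMCP helper. The tool names, parameters and stubbed responses are illustrative assumptions, not a real provider integration.

```python
# Minimal sketch: exposing connectivity workflows to an AI agent as MCP tools.
# Tool names, parameters and stubbed responses are illustrative assumptions,
# not a real provider integration.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("connectivity")

@mcp.tool()
def list_data_centres(city: str) -> list[str]:
    """Return data centres available in a given city."""
    # A real server would call the provider's inventory API here.
    return ["SYD1", "SYD2"] if city.lower() == "sydney" else []

@mcp.tool()
def get_quote(a_end: str, z_end: str, bandwidth_gbps: int) -> dict:
    """Request an indicative connectivity quote between two data centres."""
    # Placeholder pricing; a real implementation would call a quoting API.
    return {
        "aEnd": a_end,
        "zEnd": z_end,
        "bandwidthGbps": bandwidth_gbps,
        "monthlyPrice": 100 * bandwidth_gbps,
    }

@mcp.tool()
def place_order(quote_id: str) -> str:
    """Place an order against an existing quote and trigger provisioning."""
    # Placeholder order reference.
    return f"ORDER-{quote_id}"

if __name__ == "__main__":
    mcp.run()  # serves the tools over stdio so an MCP-capable agent can call them
```

An MCP-capable assistant connected to a server like this can chain the tools itself: list data centres, fetch a quote and place the order, all from a single natural-language request.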
"If we go from this agentic world where you have a relationship with some chatbot or a language model to inform your context, the natural next step is robotic process automation," explains Paul.
Breaking this down, adoption is likely to follow a clear path. First comes context: using large language models to understand needs. Next comes action: robotic process automation driven directly by that context. MCP sits squarely at that intersection.
The key enabler? Standards and documentation. The more consistent and machine-readable the interfaces, the faster MCP-driven automation will scale across the enterprise.
Preparing for what’s next
These trends are already taking shape and will continue to define how networks are bought, built and operated. Organisations that embrace these shifts early will be better positioned to scale, adapt and innovate as demands continue to grow.