
Comcast is working with NVIDIA to test next-generation AI applications running directly at the edge of its network, bringing artificial intelligence processing closer to end users to reduce latency.
The US operator said the field trials will use NVIDIA GPUs deployed within Comcast’s distributed network infrastructure to evaluate how AI workloads perform when processed in regional facilities located nearer to homes and businesses.
Comcast’s network currently reaches around 65 million homes and businesses across the United States. The company believes its distributed architecture can provide a platform for real-time AI inference by placing computing resources closer to customers rather than relying on distant data centres.
Elad Nafshi, chief network officer at Comcast, said the initiative reflects the growing shift toward distributed AI infrastructure.
“The industry is shifting towards a more distributed AI infrastructure and Comcast operates a network that supports it today,” he said. “By bringing NVIDIA GPUs directly into our edge cloud, we can explore what becomes possible when AI inference happens only milliseconds from end users.”
The collaboration is designed to test how edge-based AI processing could enable more responsive digital services for both consumers and businesses.
Initial use cases include a personalised advertising engine that can customise video advertising at the household level using AI video models from Decart. Comcast said this could tailor ads based on factors such as language or viewing preferences.
A second application is an AI-powered concierge tool for small businesses using Personal AI’s small language model platform running on HPE ProLiant servers. The system could manage customer enquiries, appointments and day-to-day interactions.
The companies are also testing the impact of edge-based GPU computing on online gaming, where reducing latency can improve responsiveness and gameplay performance. The work builds on low-latency technologies previously deployed by Comcast to support services such as NVIDIA GeForce NOW.
Ronnie Vasishta, senior vice-president of AI and telecoms at NVIDIA, said distributed AI infrastructure represents a major opportunity for network operators.
“By bringing intelligent AI inference to the network edge, Comcast can unlock cost efficiencies while delivering low-latency experiences for customers at massively concurrent scale,” he said.
Comcast said the trials will measure latency improvements, power and cost efficiency, resilience and scalability across its network footprint, as well as overall user experience.
The companies will also explore further opportunities including AI-enhanced advertising, new services for small businesses, premium low-latency gaming tiers and potential third-party edge compute services. Comcast is expected to provide further updates at NVIDIA’s GTC conference in San Jose.