Akamai Technologies introduced the Akamai Inference Cloud, a platform designed to run AI inference closer to users and devices. The launch combines Akamai's distributed architecture with NVIDIA Blackwell, providing the acceleration that AI computing requires. The system aims to deliver intelligent, agentic AI processing that meets the demands of low-latency inference and global scalability.
The Inference Cloud is positioned to enhance enterprise AI applications by processing inference tasks close to users, enabling personalized digital experiences and real-time decision-making. Akamai's CEO emphasized that this proximity to users matters in the evolving AI landscape, supporting faster and smarter decisions. The platform's architecture targets applications in sectors such as smart commerce, financial insights, and physical AI systems, where real-time responses and interaction with the physical world are essential.
The platform also streamlines complex AI workloads by automatically routing tasks to optimal locations, shortening time to value and simplifying management through a unified interface. Akamai is currently rolling out the service in 20 global regions, with further expansion planned, underscoring its commitment to advancing AI capabilities across industries.
👉 Read the original: CIO Magazine