
Cloudflare Now Has NVIDIA GPUs Deployed Globally

Category: CDN Hosting
Published: October 2, 2023

News Summary

Cloudflare is deploying NVIDIA GPUs across its global network, enhancing its CDN with the NVIDIA Triton Inference Server and giving businesses worldwide access to efficient AI inference.




Thanks to Cloudflare's partnership with NVIDIA, businesses can now leverage Cloudflare's global CDN and data center network to power cutting-edge AI inference applications anywhere, bringing inference compute closer to users all over the world. The deployment also includes NVIDIA's full-stack inference software, which significantly increases the performance of artificial intelligence applications such as large language models.

The software deployment includes NVIDIA TensorRT-LLM and the NVIDIA Triton Inference Server. AI inference is how the end user interacts with AI, and according to Cloudflare it is poised to become the dominant kind of AI workload. Demand from businesses for GPUs is currently substantial. With data centers in more than 300 locations around the globe, Cloudflare says it can provide users with low-latency experiences while still adhering to compliance standards worldwide.

Cloudflare will make it feasible for any business in the world to begin deploying AI models powered by NVIDIA GPUs, networking, and inference software, without having to maintain, scale, optimize, or secure the installations themselves. This makes the platform accessible to organizations of all sizes and in all regions of the world.

"AI inference on a network is going to be the sweet spot for many businesses: private data stays close to wherever users physically are, while still being extremely cost-effective to run because it's nearby," said Matthew Prince, CEO and co-founder of Cloudflare. "Thanks to the cutting-edge GPU technology provided by NVIDIA and housed on our global network, we are able to make AI inference, which was previously out of reach for many of our clients, accessible and inexpensive around the globe."

"NVIDIA's inference platform is critical to powering the next wave of generative AI applications," said Ian Buck, Vice President of Hyperscale and HPC at NVIDIA. "With NVIDIA GPUs and NVIDIA AI software now available on Cloudflare, businesses will be able to create responsive new customer experiences and drive innovation across every industry."

Low-Latency Generative AI Services

Cloudflare is now making it possible for organizations across the world to run generative AI inference without paying any upfront fees. By adding NVIDIA GPUs to its global edge network, Cloudflare can deliver the following:

  • Low-latency generative AI experiences for every end user, with NVIDIA GPUs available for inference tasks in over 100 cities by the end of 2023 and in nearly every location that Cloudflare's network reaches by the end of 2024.
  • Access to compute capacity close to where users' data is stored, helping customers anticipate any compliance and regulatory obligations that are likely to develop.
  • Pay-as-you-go compute power at scale, so any company can access the latest AI innovation without committing significant money up front to reserve GPUs that may go unused.
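In practice, pay-as-you-go serverless inference of this kind is typically exposed as an authenticated HTTP API: no GPUs are reserved, and each request is billed individually. The sketch below assumes a Cloudflare Workers AI-style REST endpoint; the account ID, API token, and model slug are placeholders for illustration, not values taken from this article.

```python
# Hypothetical sketch of a single pay-as-you-go inference call over HTTP.
# ACCOUNT_ID, API_TOKEN, and MODEL are placeholders, not real credentials.
import json
import urllib.request

ACCOUNT_ID = "YOUR_ACCOUNT_ID"            # placeholder account identifier
API_TOKEN = "YOUR_API_TOKEN"              # placeholder bearer token
MODEL = "@cf/meta/llama-2-7b-chat-int8"   # example model slug

def build_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) the HTTP request for one inference call."""
    url = (f"https://api.cloudflare.com/client/v4/accounts/"
           f"{ACCOUNT_ID}/ai/run/{MODEL}")
    body = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Summarize the benefits of edge inference.")
print(req.full_url)
# Sending is a single call: urllib.request.urlopen(req) with a valid token.
```

Because the endpoint is the same regardless of which data center serves the request, the network routes each call to nearby GPU capacity without the caller managing, scaling, or securing any installation.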