DeepSeek R1 is now available on Intel® Tiber™ AI Cloud
Run DeepSeek-R1-Distill-Llama-70B using text-generation-inference with Intel® Data Center GPU Max Series (see the client sketch below).

Intel® Tiber™ AI Cloud offers AI startups and enterprises cost-effective and scalable AI-optimized computing. We support the open-source software ecosystem in partnership with universities and research institutions.
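As a minimal client-side sketch, assuming a text-generation-inference (TGI) container is already serving DeepSeek-R1-Distill-Llama-70B on a Max Series GPU instance and listening on port 8080 (the URL, port, prompt, and generation parameters below are illustrative placeholders, not platform defaults), a Python script could query the endpoint like this:

```python
# Minimal client sketch for a running text-generation-inference (TGI) endpoint.
# Assumes DeepSeek-R1-Distill-Llama-70B is already being served by TGI on this
# host; the URL, port, and generation parameters are illustrative placeholders.
import requests

TGI_URL = "http://localhost:8080/generate"  # placeholder endpoint

payload = {
    "inputs": "Explain the difference between model distillation and quantization.",
    "parameters": {
        "max_new_tokens": 256,  # cap the length of the completion
        "temperature": 0.6,     # sampling temperature
    },
}

response = requests.post(TGI_URL, json=payload, timeout=120)
response.raise_for_status()
print(response.json()["generated_text"])
```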
AI-optimized Compute
Launch single nodes or clusters featuring Intel Gaudi AI Processors, Intel Max GPUs, and Xeon-based CPUs.

Enterprise-ready
Secure and compliant, with private cloud options for fully isolated and customized cluster deployments.

Expert Support
24/7 support and assistance for onboarding and LLM optimization to streamline and scale AI deployments.

Serving a Wide Range of AI Use Cases
Our cloud platform supports a wide range of AI use cases.

Pricing
Intel Tiber AI Cloud offers a wide variety of AI-optimized computing instances, from GPU-enabled VMs to large-scale clusters with AI processors.

AI Processor Instances
Gaudi 2 AI Processor
Starting at $1.30 / hr / card
- Bare metal on Xeon Gen 6
- Memory: 2TB
- Disk: 17TB
- Cards: 8 AI processors
GPU Instances
Max Series GPU
Starting at $0.39 / hr / card
- Bare metal on Xeon Gen 5
- Memory: 2TB
- Disk: 1TB
- Cards: 8 GPUs
CPU
Xeon Gen 4, 5 or 6
Starting at $0.45 / hr
- Bare metal on Xeon Scalable
- Memory: 1TB
- Disk: 1TB
Custom
Customized pricing for Gaudi clusters and discounted reserved pricing can be provided on request.

Clusters and more
Introducing the Gaudi AI processor supercomputer
Optimized for developers
Take advantage of software tools and developer resources to get up to speed quickly.

Support for new and existing models
Customize reference models, start fresh, or migrate existing models using open-source tools, including resources from Hugging Face.

Integrated with PyTorch
Keep working with the library your team already knows.

Easy migration of GPU-based models
Quickly port your existing solutions using our purpose-built software tools.
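As an illustrative sketch only, and not an official porting guide, migration often comes down to swapping the device string of a CUDA script: Gaudi processors register an "hpu" device through the habana_frameworks PyTorch bridge, and Max Series GPUs appear as an "xpu" device with Intel Extension for PyTorch or a recent PyTorch build; which of these packages is present depends on the instance image, so the probes below are guarded.

```python
# Device-selection sketch for porting a CUDA-targeted PyTorch workload to
# Intel hardware. The package names are real, but their availability depends
# on the instance image, so every probe is guarded and CPU is the fallback.
import torch

def pick_device() -> torch.device:
    # Gaudi AI processors register an "hpu" device via the Habana bridge.
    try:
        import habana_frameworks.torch.core  # noqa: F401  (registers torch.hpu)
        if hasattr(torch, "hpu") and torch.hpu.is_available():
            return torch.device("hpu")
    except ImportError:
        pass
    # Max Series GPUs appear as an "xpu" device with Intel Extension for
    # PyTorch or a recent PyTorch build that includes XPU support.
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return torch.device("xpu")
    return torch.device("cpu")

device = pick_device()
model = torch.nn.Linear(1024, 1024).to(device)   # stand-in for a real model
x = torch.randn(8, 1024, device=device)
with torch.no_grad():
    y = model(x)
print(f"Forward pass ran on: {y.device}")
```

Apart from the device string, the forward pass is unchanged relative to a CUDA version of the same script.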
R&D and partnerships

In partnership with Intel Labs' leading AI researchers, we validate the latest LLMs and work closely with enterprise customers to optimize, fine-tune, and deploy AI models on Intel Tiber AI Cloud. We partner with leading AI startups to offer trusted AI solutions powered by the latest Intel accelerated computing.
Enterprise-trusted AI
We offer access to partner AI services hosted on Intel Tiber AI Cloud via the Software catalog.

Featured solutions: Seekrflow

AI startup accelerator
Intel® Liftoff for Startups is an accelerator program empowering AI startups with Intel’s expertise, technology, and market reach.

Ecosystem and research
Intel Tiber AI Cloud supports code validation and upstreaming for the open-source community and AI education.

Be part of our startup program
Resources for your AI journey
Enjoy the flexibility and control you need to innovate, from learning resources to deployment tools.
Use cases and success stories
