Expanding Our Global Reach: Zilliz Cloud Launches in Azure Central India

We’re excited to announce that Zilliz Cloud is now available in Azure Central India. This new region expands our global footprint and gives teams across India and neighboring regions the ability to deploy AI and vector workloads closer to where their data lives, helping meet regulatory, latency, and cost-efficiency goals.
As AI adoption accelerates across India, we’re seeing teams run into a familiar set of challenges: strict data localization rules, latency-sensitive user experiences, and rising infrastructure costs. Our new region helps address all of that, while keeping Zilliz Cloud’s hallmark performance and flexibility intact.
Powering Global AI with Local Precision
Zilliz Cloud now spans 26 cloud regions across major cloud providers, including AWS, Google Cloud, and Microsoft Azure, enabling customers to deploy where it makes the most sense for their users, data, and budget.
Zilliz Cloud is available across major cloud providers:
- AWS: US East (N. Virginia), US East (Ohio), US West (Oregon), Canada (Central), Germany (Frankfurt), Singapore, Japan (Tokyo)
- Google Cloud: US West (Oregon), US East (N. Virginia), US Central (Iowa), Germany (Frankfurt), Singapore
- Azure: US East (Virginia), US East 2 (Virginia), US Central (Iowa), Germany West Central (Frankfurt), Central India (new!)
Deploying in-region helps reduce latency and egress costs while meeting growing demands for data sovereignty and localized processing. And with Zilliz Cloud, scaling AI search globally doesn't mean taking on complex infrastructure management.
Innovation with Cost Control: Meet Cardinal
Zilliz Cloud eliminates the operational overhead traditionally associated with running vector databases. Our Cardinal search engine and intelligent orchestration tools are designed for performance, efficiency, and scale.
Cardinal is our high-performance vector search engine, built from the ground up to deliver lightning-fast, accurate results across billions of embeddings. It uses a combination of intelligent indexing strategies and compute-aware scheduling to ensure blazing search speeds with optimized resource consumption.
As a result, Zilliz Cloud can reduce total cost of ownership by up to 70%, while delivering better performance than DIY vector stacks or general-purpose databases.
Build Without Limits: Zilliz Cloud Features at a Glance
Whether you're powering RAG, AI Agents, recommendation systems, semantic search, or multi-modal GenAI applications, Zilliz Cloud has the flexibility to scale with your needs.
- Scale As Needed: Zilliz Cloud uses a horizontally scalable architecture that lets teams scale out with ease. No overprovisioning. No manual reconfiguration.
- Tailored Compute: Choose from multiple Compute Unit (CU) types based on your workload. Each CU type offers a different mix of CPU, memory, and storage, so you only pay for what your workload actually needs.
- Hybrid Search + Full-Text Search: Zilliz Cloud supports hybrid search, which combines vector search (for semantic similarity) with full-text search (for keyword matching). That's powerful for applications where relevance depends on both meaning and specific terms (e.g., "must include the word 'urgent'" and "find similar support tickets"); a minimal sketch follows this list.
- Natural Language as Code: Zilliz Cloud offers an MCP server, letting you manage the database and handle complex data tasks through natural language.
- Enterprise-Grade Security: Zilliz Cloud includes role-based access control (RBAC), encryption at rest and in transit, audit logging, and SOC 2 compliance, so your data is secure and compliant from day one.
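To make the hybrid search item above concrete, here is a minimal sketch using pymilvus. It assumes a collection named "support_tickets" that already has a dense embedding field ("dense") and a BM25-backed sparse field ("sparse"); the collection, field names, and the embed() helper are illustrative placeholders, not part of this announcement.

```python
from pymilvus import MilvusClient, AnnSearchRequest, RRFRanker

# Endpoint and API key are placeholders from your cluster's details page.
client = MilvusClient(
    uri="https://<your-cluster-endpoint>",
    token="<your-api-key>",
)

query_text = "customer cannot log in, marked urgent"
query_vector = embed(query_text)  # hypothetical embedding call

# Semantic request against the dense vector field.
dense_req = AnnSearchRequest(
    data=[query_vector],
    anns_field="dense",
    param={"metric_type": "IP"},
    limit=20,
)

# Keyword request against the BM25 sparse field; raw query text goes in,
# and the collection's BM25 function handles tokenization and scoring.
sparse_req = AnnSearchRequest(
    data=[query_text],
    anns_field="sparse",
    param={"metric_type": "BM25"},
    limit=20,
)

# Fuse both ranked lists and return the top results.
results = client.hybrid_search(
    collection_name="support_tickets",
    reqs=[dense_req, sparse_req],
    ranker=RRFRanker(),
    limit=5,
    output_fields=["title"],
)
```

RRFRanker fuses the two ranked lists without any tuned weights; if one signal should dominate, a WeightedRanker can be swapped in instead.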
Built for Developers, Trusted by Enterprises
Zilliz Cloud is trusted by fast-moving startups and Fortune 500 enterprises alike to power search, recommendations, chatbots, AI agents, and more. What they all have in common: the need for scalable, low-latency infrastructure that doesn't require babysitting.
"We needed a system that could handle real-time retrieval over millions of knowledge vectors without breaking under load. Zilliz gave us that. It freed up engineering cycles and let us focus on improving reasoning on the model side, not managing infrastructure," says Dr. Pratyush Kumar, Co-Founder of Sarvam.
Nowhere is this trust more evident than in India, where we're seeing our fastest customer growth. Indian companies are building some of the most demanding AI applications we've encountered—from Sarvam's multilingual large language models that serve India's diverse linguistic landscape, to Verbaflo.ai's conversational AI platforms revolutionizing real estate interactions, and many others pioneering the next generation of AI experiences.
With our Azure Central India region now live, these innovative companies can deploy infrastructure that matches their ambition—closer to their users, with the reduced latency and enhanced data sovereignty that regional deployment provides.
From RAG pipelines to vector-enhanced product search, Zilliz Cloud is built to accelerate your GenAI roadmap—without draining your infrastructure budget.
Ready to Launch in India?
If you’re building AI applications in or for India, our new Azure Central India region is now live and ready to support your deployments.
Log in to Zilliz Cloud and select "Central India" when provisioning a new cluster.
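Once the cluster is running, connecting from an application takes only a few lines. Here is a minimal sketch with pymilvus; the endpoint and API key are placeholders to copy from your cluster's details page:

```python
from pymilvus import MilvusClient

# Replace with the endpoint and API key shown for your Central India cluster.
client = MilvusClient(
    uri="https://<your-cluster-endpoint>",
    token="<your-api-key>",
)

# Quick sanity check that the connection works.
print(client.list_collections())
```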
New to Zilliz Cloud? Explore our quick-start guide or connect with our technical team for help.
We’re excited to support the next wave of innovation coming out of India—and beyond.