Global Fintech Leader Scales AI with Milvus

5–10x faster
batch ingestion than competitors
Minimal development
needed to support multiple use cases
Instant scalability
from millions to tens of billions of vectors
When it comes to vector databases, Milvus has impressed us with its performance and scalability, meeting our stringent criteria for handling our AI use case backlog.
Team Lead
About the Company
This global fintech company specializes in digital payments, enabling transactions across more than 200 countries and in 25+ currencies. With a portfolio that spans consumer and merchant payment products, it processes tens of billions of transactions annually—from individual peer-to-peer payments to large-scale enterprise solutions. The company is known for its developer-first APIs, modern user experience, and multi-brand ecosystem.
Within this organization, the AI, ML, and Platform Solutions team plays a central role in driving innovation. Their mission: apply cutting-edge machine learning and AI to improve customer experience, automate operations, and open new revenue streams. This includes delivering horizontal AI/ML infrastructure, supporting real-time event streaming, and enabling new capabilities like GenAI across the company’s suite of payment products.
Challenges: Scaling AI Across a Complex Global Infrastructure
In 2023, the company prioritized rolling out a consumer-facing recommender system powered by GenAI. The system launched through one of the fintech's consumer brands and provides tailored product recommendations at checkout, based on merchant inventory and purchase context.
But executing on this goal was not simple. Two primary challenges stood in the way:
Massive Data Volumes
The organization handles billions of transactions annually. Existing systems—both commercial and in-house—struggled to scale for the data volumes involved. In fact, the team had previously built a custom graph database because no vendor solution could meet their performance and scale requirements.
Immature Vector Database Landscape
Vector search was central to powering personalized recommendations, but the available tools were still relatively new. The team needed a reliable, high-performance system that could scale to production workloads and meet their stringent latency and ingestion requirements.
After evaluating multiple solutions, including Weaviate and AlloyDB, the team chose Milvus.
Why Milvus: Performance, Scalability, and Ease of Use
"Milvus impressed us with its performance and scalability," said the Team Lead for AI, ML, and Platform Solutions. From early trials, Milvus showed exceptional capabilities in data ingestion, query performance, and operational flexibility. Documentation was clear and developer-friendly, and the system handled billions of vectors without extensive tuning.
Batch ingestion performance was especially critical. Inventory data needed to be updated frequently, sometimes hourly. In testing, Milvus ingested full collection dumps 5–10x faster than alternatives. A job that took competitors over 8 hours was completed by Milvus in under 1 hour.
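For context, a minimal sketch of what hourly batch ingestion into Milvus can look like with the pymilvus client is shown below. The collection name, schema fields, file path, and batch size are illustrative assumptions, not details from the team's deployment.

```python
# Minimal sketch: batch-ingesting an inventory embedding dump into Milvus
# with pymilvus. All names and paths below are assumptions for illustration.
import numpy as np
from pymilvus import Collection, connections

connections.connect(host="localhost", port="19530")
collection = Collection("inventory_embeddings")   # assumed, pre-created collection

BATCH_SIZE = 10_000
embeddings = np.load("inventory_dump.npy")        # assumed pre-computed embedding dump
item_ids = list(range(len(embeddings)))           # assumed primary keys

for start in range(0, len(embeddings), BATCH_SIZE):
    end = start + BATCH_SIZE
    # insert() takes column-ordered data matching the collection schema:
    # here a primary-key column followed by the vector column.
    collection.insert([item_ids[start:end], embeddings[start:end].tolist()])

collection.flush()  # seal the pending segments so the new batch becomes searchable
```

Chunked inserts of this kind let a full collection refresh run as a straightforward loop, which is what makes frequent (even hourly) inventory updates practical at scale.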
What also stood out was Milvus’s flexibility. The team had a long backlog of AI use cases, from recommender systems to chatbots. Milvus met the needs of many of them with minimal development effort, saving valuable engineering time.
Despite initial hesitation about adopting an open-source tool maintained by a startup (Zilliz), the team found that Milvus had the maturity, ecosystem support, and real-world deployments to meet enterprise-grade requirements.
From Recommenders to Chatbots—What’s Next
Following the successful rollout of the recommender system, the team’s next initiative is an AI-powered customer service chatbot. This multilingual bot will support thousands of service agents globally by answering routine questions using vector search and retrieval techniques.
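As an illustration of the retrieval step such a chatbot relies on, here is a minimal vector-search sketch with pymilvus. The collection, field names, embedding dimension, and the embed_question() helper are assumptions for the example, not the team's actual implementation.

```python
# Minimal retrieval sketch for a support chatbot. Names, fields, and the
# embedding helper are placeholders, not the team's production setup.
import numpy as np
from pymilvus import Collection, connections

def embed_question(text: str) -> list[float]:
    # Placeholder: in practice this would call the team's embedding model.
    return np.random.rand(768).tolist()  # 768 dims is an assumption

connections.connect(host="localhost", port="19530")
kb = Collection("support_articles")   # assumed knowledge-base collection
kb.load()                             # load the collection into memory for search

query_vector = embed_question("How do I issue a refund?")

results = kb.search(
    data=[query_vector],
    anns_field="embedding",           # assumed vector field name
    param={"metric_type": "IP", "params": {"nprobe": 16}},
    limit=5,
    output_fields=["title", "url"],
)
for hit in results[0]:
    print(hit.distance, hit.entity.get("title"))
```

The retrieved passages would then be passed to a language model to draft an answer for the agent, with vector search doing the heavy lifting of finding relevant articles across languages.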
As the team continues expanding its AI footprint, it’s evaluating a move to Zilliz Cloud—the fully managed Milvus service. Running and scaling Milvus internally has been effective, but offloading infrastructure management would allow the team to focus on higher-value initiatives.
During batch ingestion tests, Milvus demonstrated that it could complete an entire collection dump into the database at speeds 5–10 times faster than competitors.
Team Lead