AI-Powered Call Center Intelligence with MongoDB
Transform customer service with MongoDB Atlas Vector Search and RAG. Convert call recordings into searchable insights for faster, more accurate responses.
Use cases: Analytics, Gen AI, Modernization, Personalization
Industries: Insurance, Financial Services, Healthcare, Retail, Telecommunications
Products: MongoDB Atlas, Atlas Search, Atlas Vector Search
Partners: AWS, Cohere, LangChain
Solution Overview
Customer satisfaction is critical for insurance companies. Studies have shown that companies with superior customer experiences consistently outperform their peers. In fact, McKinsey found that life and property/casualty insurers with superior customer experiences saw a significant 20% and 65% increase in total shareholder return, respectively, over five years.
Satisfied customers are loyal customers: they are 80% more likely to renew their policies, directly contributing to sustainable growth. However, a major challenge for many insurance companies is the inefficiency of their call centers. Agents often struggle to quickly locate and deliver accurate information to customers, leading to frustration and dissatisfaction.
This solution illustrates how MongoDB can transform call center operations. By converting call recordings into searchable vectors (numerical representations of data points in a multi-dimensional space), businesses can quickly access relevant information and improve customer service. We'll dig into how integrating Amazon Transcribe, Cohere, and MongoDB Atlas Vector Search achieves this transformation.
Customer service interactions are goldmines of valuable insights. By analyzing call recordings, we can identify successful resolution strategies and uncover frequently asked questions. In turn, making this information, which is often buried in audio files, accessible to agents enables them to give customers faster and more accurate assistance.
However, the vast volume and unstructured nature of these audio files make it challenging to extract actionable information efficiently.
To address this challenge, we propose a pipeline that leverages AI and analytics to transform raw audio recordings into vectors, as shown in Figure 1 (a minimal code sketch follows the list):
Storage of raw audio files: Past call recordings are stored in their original audio format.
Processing of the audio files with AI and analytics services (such as Amazon Transcribe Call Analytics): speech-to-text conversion, summarization of content, and vectorization.
Storage of vectors and metadata: The generated vectors and associated metadata (e.g., call timestamps, agent information) are stored in an operational data store.
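The sketch below illustrates the tail end of this pipeline in Python. It assumes call transcripts and summaries have already been produced by Amazon Transcribe Call Analytics and that Cohere's Embed v3 model is available through Amazon Bedrock; the connection string, database, collection, and field names are illustrative placeholders rather than part of the reference implementation.

```python
# Sketch: vectorize a call summary with Cohere on Amazon Bedrock and store it,
# together with its metadata, in MongoDB Atlas.
import json

import boto3
from pymongo import MongoClient

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
collection = MongoClient("<ATLAS_CONNECTION_STRING>")["call_center"]["call_insights"]

def embed(texts, input_type="search_document"):
    """Return one embedding per input text using Cohere Embed v3 on Bedrock."""
    response = bedrock.invoke_model(
        modelId="cohere.embed-english-v3",
        body=json.dumps({"texts": texts, "input_type": input_type}),
    )
    return json.loads(response["body"].read())["embeddings"]

def store_call_insight(call_id, agent_id, timestamp, summary):
    """Persist the vectorized summary alongside its call metadata."""
    [vector] = embed([summary])
    collection.insert_one({
        "call_id": call_id,
        "agent_id": agent_id,
        "timestamp": timestamp,
        "summary": summary,
        "embedding": vector,  # later queried through an Atlas Vector Search index
    })
```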
Once the data is stored in vector format within the operational data store, it becomes accessible for real-time applications. This data can be consumed directly through vector search or integrated into a retrieval-augmented generation (RAG) architecture, a technique that combines the capabilities of large language models (LLMs) with external knowledge sources to generate more accurate and informative outputs.
Figure 1. Customer service call insight extraction and vectorization flow
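As a rough illustration of the RAG path, the sketch below reuses the embed helper and collection from the previous snippet: it embeds an agent's question as a search query, retrieves the closest call insights with Atlas Vector Search, and passes them as context to an LLM on Amazon Bedrock. The index name "vector_index", the projected fields, and the placeholder model ID are assumptions, not part of the reference implementation.

```python
# Sketch: retrieval-augmented generation over the vectorized call insights.
def answer_with_rag(question, k=3):
    """Ground an LLM answer in the k most similar stored call summaries."""
    [query_vector] = embed([question], input_type="search_query")
    hits = collection.aggregate([
        {"$vectorSearch": {
            "index": "vector_index",      # assumed Atlas Vector Search index name
            "path": "embedding",
            "queryVector": query_vector,
            "numCandidates": 100,
            "limit": k,
        }},
        {"$project": {"_id": 0, "summary": 1}},
    ])
    context = "\n".join(doc["summary"] for doc in hits)
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    response = bedrock.converse(
        modelId="<TEXT_MODEL_ID>",  # any Bedrock chat model available in the account
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return response["output"]["message"]["content"][0]["text"]
```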
Reference Architectures
With MongoDB:
Now that we’ve looked at the components of the pre-processing pipeline, let’s explore the proposed real-time system architecture in detail. It comprises the following modules and functions (see Figure 2); a sketch of the resulting query path follows the figure:
Amazon Transcribe receives the audio from the customer’s phone call and converts it into text.
Cohere’s embedding model, served through Amazon Bedrock, vectorizes the text coming from Transcribe.
MongoDB Atlas Vector Search receives the query vector and returns a document that contains the most semantically similar FAQ in the database.
Figure 2. System architecture and modules
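A minimal sketch of this query path, again reusing the embed helper from earlier: the text produced by Transcribe is embedded as a search query and matched against a pre-vectorized FAQ collection. The collection, index, and field names ("faqs", "faq_vector_index", "question", "answer") are hypothetical and assume an Atlas Vector Search index already exists on the embedding field.

```python
# Sketch: return the FAQ document most semantically similar to the live utterance.
faqs = MongoClient("<ATLAS_CONNECTION_STRING>")["call_center"]["faqs"]

def most_similar_faq(transcribed_utterance):
    """Embed the transcript segment and look up the closest FAQ with $vectorSearch."""
    [query_vector] = embed([transcribed_utterance], input_type="search_query")
    results = faqs.aggregate([
        {"$vectorSearch": {
            "index": "faq_vector_index",  # assumed index on the FAQ collection
            "path": "embedding",
            "queryVector": query_vector,
            "numCandidates": 50,
            "limit": 1,
        }},
        {"$project": {"_id": 0, "question": 1, "answer": 1,
                      "score": {"$meta": "vectorSearchScore"}}},
    ])
    return next(results, None)  # the best match, surfaced to the agent in real time
```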
For complete implementation details, visit the GitHub repository.
Key Learnings
Integrating AI services (speech-to-text, vector embeddings, vector search) with MongoDB Atlas can transform traditional call centers by making historical voice data instantly searchable and actionable.
A RAG-based architecture combined with vector search provides a foundation for both immediate wins (faster agent responses) and future opportunities (chatbots, automated workflows), making it a strategic investment for customer service operations.
Implementing real-time agent assistance can directly impact key business metrics—from improved customer satisfaction and loyalty (80% higher renewal rates) to significant financial returns (20-65% increase in total shareholder return for insurers).
The proposed architecture is simple yet powerful, easy to implement, and effective. Moreover, it can serve as a foundation for more advanced use cases that require complex interactions, such as agentic workflows and iterative, multi-step processes that combine LLMs and hybrid search to complete sophisticated tasks.
This solution not only impacts human operator workflows but can also underpin chatbots and voicebots, enabling them to provide more relevant and contextual customer responses.
Technologies and Products Used
MongoDB Developer Data Platform
MongoDB Atlas, Atlas Search, Atlas Vector Search
Partner Technologies
AWS (Amazon Transcribe, Amazon Bedrock), Cohere, LangChain
Authors
Luca Napoli, MongoDB
Sebastian Rojas Arbulu, MongoDB