INDIA, BANGALORE — September 17, 2025 — Redis, the world’s fastest data platform, today announced a major expansion of its AI strategy at Redis Released 2025. During his maiden visit to India as CEO of Redis, Rowan Trollope outlined the company’s AI strategy, announced new tools and capabilities for the Redis platform along with the strategic acquisition of Decodable, and underscored India’s growing role in Redis’ global innovation roadmap. He emphasised India’s AI-led innovation and its role as a hub for engineering talent, enterprise adoption, and customer growth.
While addressing the media, Redis CEO Rowan Trollope announced the acquisition of real-time data platform Decodable, the public preview of Redis’ new LangCache service, and several other improvements to Redis for AI that make it easier for developers to build agents with reliable, persistent memory. Together, these moves accelerate Redis’ evolution from the fastest in-memory data store to an essential infrastructure layer for AI, delivering the context and memory that intelligent agents depend on.
“As AI enters its next phase, the challenge isn’t proving what language models can do; it’s giving them the context and memory to act with relevance and reliability,” said Rowan Trollope, CEO of Redis. “As technology becomes ever more reliant on LLMs, the strategic investment we made in Decodable’s platform will make it easier for developers to build and expand data pipelines and convert that data into context within Redis, so it’s fast and always available in the right place at the right time.”
“India is not only a fast-growing market for Redis; it is also helping to shape the future of AI. With one of the world’s largest startup ecosystems, and millions of developers building intelligent applications, India represents the kind of scale, ambition, and innovation where Redis thrives. As enterprises and startups here embrace AI at unprecedented speed, our focus is on giving them the context, memory, and real-time infrastructure their systems need to be more capable, responsive, and reliable,” Trollope added.
Redis also announced the public preview of LangCache, a fully managed semantic caching service that cuts latency and token usage for LLM-dependent applications by as much as 70%, along with several updates to its AI infrastructure tools, including hybrid search enhancements and integrations with the AutoGen and Cognee agent frameworks. For India, the world’s third-largest startup ecosystem and its fastest-growing developer community with over 17 million developers, where cost optimization and scalability are crucial, LangCache helps build more affordable AI-powered experiences for chatbots, agents, and enterprise applications.
LangCache public preview
LangCache is Redis’ fully managed semantic caching service that stores LLM responses and retrieves them for semantically similar calls from chatbots and agents, saving round-trip latency and drastically cutting token usage.
The performance and cost improvements are substantial:
Up to 70% reduction in LLM API costs, especially in high-traffic applications
15x faster response times for cache hits compared to live LLM inference
Improved end-user experience with lower latency and more consistent outputs
LangCache is in public preview today.
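For illustration, the minimal sketch below shows the semantic-caching pattern that LangCache manages as a service. It assumes redis-py and uses placeholder embed() and call_llm() helpers rather than any actual LangCache API; a production setup would use a real embedding model and a vector index instead of a linear scan.

    # Illustrative only: a minimal semantic-cache pattern, not the LangCache API.
    import json

    import numpy as np
    import redis

    r = redis.Redis(decode_responses=True)

    def embed(text: str) -> np.ndarray:
        # Placeholder embedding; swap in a real embedding model in practice.
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        vec = rng.standard_normal(8)
        return vec / np.linalg.norm(vec)

    def call_llm(prompt: str) -> str:
        # Placeholder for a real LLM call.
        return f"answer to: {prompt}"

    def cached_completion(prompt: str, threshold: float = 0.9) -> str:
        query_vec = embed(prompt)
        best_score, best_answer = -1.0, None
        # Linear scan of cached entries; a real deployment would use a vector index.
        for key in r.scan_iter("semcache:*"):
            entry = json.loads(r.get(key))
            score = float(np.dot(query_vec, np.array(entry["vector"])))
            if score > best_score:
                best_score, best_answer = score, entry["answer"]
        if best_score >= threshold:
            return best_answer  # cache hit: no LLM round trip, no tokens spent
        answer = call_llm(prompt)  # cache miss: call the model, then cache the result
        r.set(f"semcache:{abs(hash(prompt))}",
              json.dumps({"vector": query_vec.tolist(), "answer": answer}), ex=3600)
        return answer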
New agent integrations and agent memory
It’s now easier to use Redis with existing AI frameworks and tools. Redis’ ecosystem integrations let developers store their data the way they want, without writing custom code. New integrations with AutoGen and Cognee, plus enhancements to the LangGraph integration, expand how developers can use Redis’ scalable, persistent memory layer for agents and chatbots, as sketched after the list below.
Build with:
AutoGen as your agent framework, with Redis as the fast-data memory layer, starting from existing templates
Cognee to simplify memory management with built-in summarization, planning, and reasoning using Redis as your backbone
LangGraph with new enhancements to improve your persistent memory and make your AI agents more reliable
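As a rough illustration of the persistent memory layer these integrations build on, the sketch below stores and replays conversation turns with plain redis-py. The key naming and helper functions are hypothetical and are not part of the AutoGen, Cognee, or LangGraph integrations themselves.

    # Illustrative only: a bare-bones persistent memory layer for an agent.
    import json

    import redis

    r = redis.Redis(decode_responses=True)

    def remember(session_id: str, role: str, content: str, ttl_seconds: int = 86400) -> None:
        # Append a message to the session's history and refresh its TTL.
        key = f"agent:memory:{session_id}"
        r.rpush(key, json.dumps({"role": role, "content": content}))
        r.expire(key, ttl_seconds)

    def recall(session_id: str, last_n: int = 20) -> list[dict]:
        # Fetch the most recent messages so the agent can resume with context.
        key = f"agent:memory:{session_id}"
        return [json.loads(m) for m in r.lrange(key, -last_n, -1)]

    # Usage: persist a short exchange, then reload it on the next turn.
    remember("session-42", "user", "What is our refund policy?")
    remember("session-42", "assistant", "Refunds are available within 30 days.")
    print(recall("session-42"))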
-END-
About Redis
Redis is the world’s fastest data platform. From its open source origins in 2011 to becoming the #1 cited brand for caching solutions, Redis has helped more than 10,000 customers build, scale, and deploy the apps our world runs on. With multi-cloud and on-prem databases for caching, vector search, and more, Redis helps digital businesses set a new standard for app speed.
Located in San Francisco, Austin, London, and Tel Aviv, Redis is internationally recognized as the leader in building fast apps fast. Learn more at redis.io.
Media Contact
Suhani Lalwani
Annexure:
“Redis is reliable, resilient, and built for scale. It powers our most critical systems—from pricing to personalization—while letting us focus on delivering innovation to our customers,” said Vivek Parihal, Head of Engineering, Purplle.
Other Redis for AI improvements
With the rapid emergence of AI agents, Redis is ensuring its users can build high-quality agents with Redis for AI. Redis integrations with popular AI agent frameworks let users build agents faster and more reliably. Key agent-focused capabilities include:
Hybrid search enhancements. Redis now includes Reciprocal Rank Fusion, a method that unifies text and vector rankings into a single, more relevant result set, as illustrated in the sketch after this list.
Quantization. Redis now supports int8 quantized embeddings, compressing float vectors to 8-bit integers for up to 75% memory savings and 30% faster search speeds in large-scale AI applications.
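As a brief illustration of the Reciprocal Rank Fusion technique named above (the general method, not Redis’ internal implementation), the Python sketch below fuses a text ranking and a vector ranking into one list.

    # Illustrative only: Reciprocal Rank Fusion (RRF) over two ranked lists.
    from collections import defaultdict

    def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
        # Each document scores sum(1 / (k + rank)) across the input rankings.
        scores: dict[str, float] = defaultdict(float)
        for ranking in rankings:
            for rank, doc_id in enumerate(ranking, start=1):
                scores[doc_id] += 1.0 / (k + rank)
        return sorted(scores, key=scores.get, reverse=True)

    text_hits = ["doc3", "doc1", "doc7"]      # full-text (e.g. BM25) order
    vector_hits = ["doc1", "doc9", "doc3"]    # vector-similarity order
    print(rrf([text_hits, vector_hits]))      # doc1 and doc3 rise to the top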
Redis Cloud and Redis Open Source updates
Redis also announced multiple updates to Redis Cloud and Redis Open Source, including:
Redis 8.2 GA. A major leap in performance, with commands up to 35% faster than Redis 8.0 and a 37% smaller memory footprint, plus improvements to the Redis Query Engine. Redis 8.2 now offers 18 data structures, including vector sets, and 480+ commands, such as hash field expiration (a brief example follows this list).
Redis Data Integration (RDI) public preview on Cloud. Keep Redis caches fresh and in sync with source databases using easy-to-set-up, zero-code data pipelines that turn legacy data into real-time data in minutes.
Redis Insight available on Cloud. Act on Redis data straight from the browser: visualize data and cut debugging time from hours to minutes, without opening a terminal or context switching, to stay on top of Redis performance.
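As a brief example of the hash field expiration mentioned in the Redis 8.2 item above, the sketch below sets a TTL on a single hash field with redis-py. It uses the generic execute_command call and hypothetical key and field names.

    # Illustrative only: per-field expiration on a hash (HEXPIRE / HTTL).
    import redis

    r = redis.Redis(decode_responses=True)

    r.hset("session:42", mapping={"cart": "3 items", "csrf_token": "abc123"})
    # Expire only the csrf_token field after 300 seconds; the rest of the hash stays.
    r.execute_command("HEXPIRE", "session:42", 300, "FIELDS", 1, "csrf_token")
    print(r.execute_command("HTTL", "session:42", "FIELDS", 1, "csrf_token"))  # e.g. [300]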