Discussion about this post

Petar Ivanov

This would be a great improvement for every Agentic / LLM application. I'll have to give Redis 8 a try.

Thanks for this breakdown, Raul!

Vivek Ganesan

This is nice. Interestingly, caching LLM responses makes a lot of sense because the response won't ever go out of date (since LLMs have a knowledge cut-off date themselves).

The only exception is when you cache the responses of agents that use LLMs. In that case, we need to be intelligent enough not to cache things that are dynamic (much like in non-LLM apps).
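The caching rule described above can be sketched in a few lines. This is a minimal, hypothetical illustration (not the post's actual Redis 8 setup): a plain dict stands in for Redis, prompts are hashed into cache keys, and responses flagged as dynamic (e.g. agent outputs built from live tool data) are deliberately never cached.

```python
import hashlib

# Hypothetical sketch: cache stable, pure-LLM completions, but skip
# caching agent responses marked as dynamic. An in-memory dict stands
# in for Redis here; in production this would be a Redis GET/SET pair.
cache = {}

def cache_key(prompt: str) -> str:
    # Hash the prompt so the key is a fixed-size, store-friendly string.
    return hashlib.sha256(prompt.encode("utf-8")).hexdigest()

def get_response(prompt: str, call_llm, is_dynamic: bool = False) -> str:
    key = cache_key(prompt)
    if not is_dynamic and key in cache:
        return cache[key]          # cache hit: reuse the stored completion
    response = call_llm(prompt)
    if not is_dynamic:
        cache[key] = response      # only cache static, non-dynamic answers
    return response
```

With this split, repeated identical prompts hit the cache and skip the model call, while dynamic agent queries always go back to the LLM.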
