Try it out now: https://aditya-rag-app.streamlit.app/
As LLMs have become more ubiquitous, tools to build with them have evolved as well. In fact, they’ve improved to the point that even apps like Perplexity no longer seem daunting to build at a smaller scale.
To learn more about building with LLMs, I created a RAG (Retrieval-Augmented Generation) chatbot that integrates real-time web search information to provide up-to-date and relevant answers. This operates similarly to many popular large-scale LLM chat applications today.
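At its core, the flow is: take the user's question, retrieve fresh web results, stuff them into the prompt as context, and let the LLM answer from that context. Here is a minimal, library-free sketch of that idea; `search_web` and `call_llm` are hypothetical stand-ins for a real search API and a real model call, not the app's actual code.

```python
# Conceptual sketch of the RAG-with-web-search flow.
# `search_web` and `call_llm` are stand-ins, not real APIs.

def search_web(query: str) -> list[str]:
    # Stand-in: a real app would call a search API or data loader here.
    return [f"Snippet about {query} #1", f"Snippet about {query} #2"]

def build_prompt(question: str, snippets: list[str]) -> str:
    # Augment the user's question with the retrieved context.
    context = "\n".join(f"- {s}" for s in snippets)
    return (
        "Answer the question using the web results below.\n"
        f"Web results:\n{context}\n\n"
        f"Question: {question}"
    )

def call_llm(prompt: str) -> str:
    # Stand-in: a real app sends the prompt to an LLM and returns its reply.
    return f"(answer grounded in a {len(prompt)}-character prompt)"

def answer(question: str) -> str:
    snippets = search_web(question)
    return call_llm(build_prompt(question, snippets))
```

The "retrieval" step is what keeps answers up to date: the model sees current search snippets rather than relying only on its training data.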
The first decision for this project was which framework to use for working with LLMs. The two popular choices are LangChain and LlamaIndex. While LlamaIndex is well-known for its RAG specialization, I chose LangChain for its flexibility in areas such as workflows and data loaders, thanks to its modular design.
After finishing an initial prototype, however, I ran into issues with LangChain. The main one was that I couldn’t enforce a strict workflow on the LLM system: system prompts would often fail to register, making the app unreliable overall. Moreover, many LangChain features were being deprecated in favor of LangGraph, LangChain’s newer framework for complex, dynamic agentic workflows. This led me to rewrite the application to use LangGraph instead.
Although LangGraph’s graph-based architecture posed a slight learning curve at first, it soon became apparent how much more powerful it was. While LangChain is sequential in nature – “chaining” together runnable components – LangGraph allows for non-linear workflows with conditional logic. This made the agentic approach to the system much more reliable and scalable. LangGraph’s memory persistence via checkpointers also made memory management much more straightforward, providing a clean abstraction layer compared to LangChain’s many redundant and incompatible memory types.
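The two ideas that made the difference can be sketched without the library itself: (1) a graph of nodes with a conditional edge deciding whether to search the web, instead of a fixed chain, and (2) a checkpointer that saves state per conversation thread after every step. The node names, state shape, and `run` helper below are illustrative only – this is the concept, not LangGraph’s actual API.

```python
# Library-free sketch of conditional graph routing plus checkpointing.
# Names and state shape are illustrative, not LangGraph's API.
from typing import Callable

State = dict  # e.g. {"question": ..., "context": ..., "answer": ...}

def route(state: State) -> str:
    # Conditional edge: only search the web when the question needs it.
    return "search" if state.get("needs_search") else "generate"

def search(state: State) -> State:
    return {**state, "context": f"web results for {state['question']}"}

def generate(state: State) -> State:
    ctx = state.get("context", "no extra context")
    return {**state, "answer": f"answer to {state['question']} ({ctx})"}

NODES: dict[str, Callable[[State], State]] = {"search": search, "generate": generate}
EDGES = {"search": "generate", "generate": None}  # None = end of graph

checkpoints: dict[str, State] = {}  # thread_id -> last saved state

def run(state: State, thread_id: str) -> State:
    node = route(state)
    while node is not None:
        state = NODES[node](state)
        checkpoints[thread_id] = state  # persist state after every step
        node = EDGES[node]
    return state
```

In real LangGraph the routing function feeds a conditional edge on a compiled graph, and the checkpointer persists state keyed by a thread ID, so a conversation can be resumed exactly where it left off.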
When it came to putting together a frontend for the application, Streamlit was my first choice. It was relatively easy to build, and deployment was a smooth process!