LangGraph Cloud
LangGraph is a recent addition to the ever-expanding LangChain ecosystem. With the launch of LangGraph Cloud, a managed, hosted service is now available for deploying and hosting LangGraph applications.
Agentic Applications
It is becoming clear that agentic applications will be a standard in the near future. Agents offer numerous advantages; here are a few examples:
Complex Query Handling: Agents can manage complex, ambiguous, and implicit user queries automatically.
Dynamic Event Chains: Agents can create a chain of events on the fly based on user-assigned tasks.
LLM Integration: Agents use a large language model (LLM) as their backbone.
Task Decomposition: Upon receiving a user query, agents decompose the task into sub-tasks and execute them sequentially.
Tool Utilisation: Agents have access to various tools and decide which to use based on the provided tool descriptions.
A tool is a unit of capability that can perform tasks such as web searches, mathematical computations, API calls, and more.
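To make the tool idea concrete, here is a toy, library-free sketch: each tool pairs a callable with a description, and the agent selects a tool by matching the task against those descriptions. In a real agent the LLM performs this selection; the keyword-overlap function below is a deliberately simple stand-in, and all names are illustrative.

```python
# A minimal sketch of "tools" as units of capability, with selection by description.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str
    run: Callable[[str], str]

def calculator(expression: str) -> str:
    # eval() is acceptable for a demo; a real tool would use a safe expression parser.
    return str(eval(expression, {"__builtins__": {}}))

def fake_web_search(query: str) -> str:
    # Stand-in for a real web-search tool.
    return f"Top result for: {query}"

TOOLS = [
    Tool("calculator", "useful for mathematical computations", calculator),
    Tool("web_search", "useful for web searches", fake_web_search),
]

def pick_tool(task: str) -> Tool:
    # Stand-in for the LLM: pick the tool whose description best overlaps the task.
    def overlap(tool: Tool) -> int:
        return len(set(task.lower().split()) & set(tool.description.split()))
    return max(TOOLS, key=overlap)
```

In a production agent, the tool descriptions would be placed in the LLM prompt and the model itself would decide which tool to invoke and with what arguments.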
Limitations To Agent Adoption
Impediments and apprehensions to agent adoption include:
LLM Inference Cost: The backbone LLMs are queried multiple times during a single query, and with a large number of users, the inference costs can skyrocket.
Control and Transparency: There is a significant need for enhanced controllability, inspectability, observability, and more granular control, as there is a market concern that agents may be too autonomous.
Over-Autonomy: Agents have leapfrogged the capabilities of chatbots, but arguably by too much; some measure of control needs to be reintroduced.
Performance and Latency: For more complex agents, there is a requirement to decrease latency by running tasks in parallel and streaming not only LLM responses but also agent responses as they become available.
LangGraph
LangGraph is framework-agnostic: each node operates as a standard Python function.
It extends the core Runnable API, a unified interface for streaming, asynchronous, and batch calls, to support:
Seamless state management across multiple conversation turns or tool calls.
Flexible routing between nodes based on dynamic criteria.
Smooth transitions between LLMs and human intervention.
Persistence for long-running, multi-session applications.
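The four points above can be pictured with a toy, library-free sketch of the graph idea: a shared state dict flows through nodes, and a routing function chooses the next node dynamically. This illustrates the concept only; it is not the LangGraph API itself.

```python
from typing import Callable

# Each node is a plain function: it reads the state and returns a partial update.
def draft(state: dict) -> dict:
    return {"draft": f"answer to: {state['question']}",
            "revisions": state["revisions"] + 1}

def review(state: dict) -> dict:
    # Approve once the draft has been revised at least twice.
    return {"approved": state["revisions"] >= 2}

def route(state: dict) -> str:
    # Flexible routing: loop back to 'draft' until the reviewer approves.
    return "end" if state.get("approved") else "draft"

NODES: dict[str, Callable[[dict], dict]] = {"draft": draft, "review": review}
EDGES = {"draft": "review", "review": route}  # static edge vs. dynamic router

def run_graph(state: dict) -> dict:
    node = "draft"
    while node != "end":
        state = {**state, **NODES[node](state)}   # state carried across every step
        nxt = EDGES[node]
        node = nxt(state) if callable(nxt) else nxt
    return state

final = run_graph({"question": "What is LangGraph?", "revisions": 0})
```

Persistence in the real framework amounts to checkpointing this state between steps, so a long-running, multi-session application can resume from where it left off.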
LangGraph Cloud
Below is a basic outline of the developer workflow:
Users develop their LangGraph application within their preferred IDE.
They push their code to GitHub for version control.
LangGraph Cloud accesses the code from GitHub for deployment.
Applications deployed on LangGraph Cloud can be tested, traces can be run, interruptions can be added, and more.
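As part of this workflow, the repository typically carries a small configuration file telling LangGraph Cloud where the graph lives. The sketch below assumes the documented `langgraph.json` format; treat the exact keys and the module path as illustrative placeholders, not a definitive schema.

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./my_agent/agent.py:graph"
  },
  "env": ".env"
}
```

Here `"agent"` names the deployed graph, and the value points at the Python module and the variable holding the compiled graph.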
Below you can see the LangGraph assistant specifications, which give the OpenAPI specs for the deployed application's API.
LangGraph Studio
LangGraph Studio visualises data flows, enabling interaction through message sending.
It displays and streams steps in real-time, allowing users to revisit nodes, edit their state, and branch new paths from any point.
Users can add breakpoints to pause sequences, requiring permission to proceed, making it a dynamic tool for application development.
LangGraph Studio serves as a graphic representation of your written code, providing a visual way to gain insight into data flow.
It’s important to note that Studio is not a tool for creating or developing flows; rather, it visually represents existing code.
Within Studio, code cannot be edited or changed; instead, it functions as a tool for observation, debugging, and understanding conversation flow.
Studio serves as a powerful tracing tool, allowing users to add pauses and fork conversations to inspect different permutations.
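The pause-and-approve behaviour described above can be pictured as a generator-based runner that suspends before flagged nodes and waits for the caller's permission. This is a conceptual sketch of the breakpoint idea, not Studio's implementation.

```python
from typing import Callable, Iterator

def run_with_breakpoints(
    nodes: list[tuple[str, Callable[[dict], dict]]],
    state: dict,
    breakpoints: set[str],
) -> Iterator[tuple[str, dict]]:
    # Yield control before each breakpoint node; the caller decides whether to resume.
    for name, fn in nodes:
        if name in breakpoints:
            approved = yield ("paused_before", {"node": name, **state})
            if not approved:
                return  # permission denied: stop the sequence here
        state = {**state, **fn(state)}
        yield ("ran", {"node": name, **state})

# Usage: pause before the (hypothetical) 'send_email' node and grant permission.
steps = run_with_breakpoints(
    [("draft", lambda s: {"draft": "hi"}), ("send_email", lambda s: {"sent": True})],
    {},
    breakpoints={"send_email"},
)
events = []
try:
    ev = next(steps)
    while True:
        events.append(ev)
        # Approve whenever the runner pauses; otherwise just advance.
        ev = steps.send(True if ev[0] == "paused_before" else None)
except StopIteration:
    pass
```

Forking a conversation, in this picture, is simply re-running from a saved copy of the state at any yielded step with a different continuation.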
I’m currently the Chief Evangelist @ Kore AI. I explore & write about all things at the intersection of AI & language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces & more.