Building an AI Agent with LangGraph: Adding a Human in the Loop

In our previous tutorial, we built an AI agent capable of answering queries using the web and added persistence. However, in many cases you may want to keep a human in the loop to monitor and approve the agent's actions. This is straightforward to do with LangGraph. Let's explore how it works.
Setting up an agent
We will continue from where we left off in the last tutorial. First, set the environment variables, make the required imports, and prepare the checkpointer.
pip install langgraph==0.2.53 langgraph-checkpoint==2.0.6 langgraph-sdk==0.1.36 langchain-groq langchain-community langgraph-checkpoint-sqlite==2.0.1
import os
os.environ['TAVILY_API_KEY'] = ""
os.environ['GROQ_API_KEY'] = ""
from langgraph.graph import StateGraph, END
from typing import TypedDict, Annotated
import operator
from langchain_core.messages import AnyMessage, SystemMessage, HumanMessage, ToolMessage, AIMessage
from langchain_groq import ChatGroq
from langchain_community.tools.tavily_search import TavilySearchResults
from langgraph.checkpoint.sqlite import SqliteSaver
import sqlite3
sqlite_conn = sqlite3.connect("checkpoints.sqlite",check_same_thread=False)
memory = SqliteSaver(sqlite_conn)
# Initialize the search tool
tool = TavilySearchResults(max_results=2)
Defining the agent
class Agent:
    def __init__(self, model, tools, checkpointer, system=""):
        self.system = system
        graph = StateGraph(AgentState)
        graph.add_node("llm", self.call_openai)
        graph.add_node("action", self.take_action)
        graph.add_conditional_edges("llm", self.exists_action, {True: "action", False: END})
        graph.add_edge("action", "llm")
        graph.set_entry_point("llm")
        self.graph = graph.compile(checkpointer=checkpointer)
        self.tools = {t.name: t for t in tools}
        self.model = model.bind_tools(tools)

    def call_openai(self, state: AgentState):
        messages = state['messages']
        if self.system:
            messages = [SystemMessage(content=self.system)] + messages
        message = self.model.invoke(messages)
        return {'messages': [message]}

    def exists_action(self, state: AgentState):
        result = state['messages'][-1]
        return len(result.tool_calls) > 0

    def take_action(self, state: AgentState):
        tool_calls = state['messages'][-1].tool_calls
        results = []
        for t in tool_calls:
            print(f"Calling: {t}")
            result = self.tools[t['name']].invoke(t['args'])
            results.append(ToolMessage(tool_call_id=t['id'], name=t['name'], content=str(result)))
        print("Back to the model!")
        return {'messages': results}
Setting up the agent state
We now define the agent state with a minor modification. Earlier, the list of messages was annotated with operator.add, which appends new messages to the existing list. For human-in-the-loop workflows, we sometimes want to replace existing messages rather than always append them.
from uuid import uuid4

def reduce_messages(left: list[AnyMessage], right: list[AnyMessage]) -> list[AnyMessage]:
    # Assign IDs to messages that don't have them
    for message in right:
        if not message.id:
            message.id = str(uuid4())
    # Merge the new messages with the existing ones
    merged = left.copy()
    for message in right:
        for i, existing in enumerate(merged):
            if existing.id == message.id:
                merged[i] = message
                break
        else:
            merged.append(message)
    return merged

class AgentState(TypedDict):
    messages: Annotated[list[AnyMessage], reduce_messages]
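To see what the reducer does, here is a minimal, dependency-free sketch. The Msg dataclass is a hypothetical stand-in for LangChain's message classes (only the .id attribute matters to the reducer), used so the example runs on its own; the merge logic mirrors reduce_messages above.

```python
from dataclasses import dataclass
from typing import Optional
from uuid import uuid4

@dataclass
class Msg:
    # Hypothetical stand-in for a LangChain message; only .id is used.
    content: str
    id: Optional[str] = None

def reduce_messages(left, right):
    # Same merge logic as the reducer above, minus the type annotations.
    for message in right:
        if not message.id:
            message.id = str(uuid4())
    merged = left.copy()
    for message in right:
        for i, existing in enumerate(merged):
            if existing.id == message.id:
                merged[i] = message      # same ID: replace in place
                break
        else:
            merged.append(message)       # unseen ID: append
    return merged

state = [Msg("What's the weather in SF?", id="m1")]
# Reusing id "m1" replaces the earlier message instead of appending it.
state = reduce_messages(state, [Msg("What's the weather in LA?", id="m1")])
print(len(state))  # still one message, now with the new content
# A message without an ID gets one assigned and is appended.
state = reduce_messages(state, [Msg("It is sunny.")])
print(len(state))
```

This replace-by-ID behavior is what later lets us modify a message in the saved state before resuming execution.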
Adding a human to the loop
We introduce one additional change when compiling the graph. The interrupt_before=["action"] parameter adds an interrupt before the action node is called, ensuring manual approval before the agent executes any tools.
class Agent:
    def __init__(self, model, tools, checkpointer, system=""):
        # Everything else remains the same as before
        self.graph = graph.compile(checkpointer=checkpointer, interrupt_before=["action"])
        # Everything else remains unchanged
Running the agent
Now we initialize the agent with the same prompt, model, and checkpointer as before. When we call the agent, we pass a config containing the thread ID.
prompt = """You are a smart research assistant. Use the search engine to look up information.
You are allowed to make multiple calls (either together or in sequence).
Only look up information when you are sure of what you want.
If you need to look up some information before asking a follow up question, you are allowed to do that!
"""
model = ChatGroq(model="Llama-3.3-70b-Specdec")
abot = Agent(model, [tool], system=prompt, checkpointer=memory)
messages = [HumanMessage(content="Whats the weather in SF?")]
thread = {"configurable": {"thread_id": "1"}}
for event in abot.graph.stream({"messages": messages}, thread):
    for v in event.values():
        print(v)
The results stream back, and execution stops after the AI message that contains the tool call, because the interrupt_before parameter halts the graph before the action node runs. We can inspect the current state of the graph for this thread to see the messages it contains and which node will run next ('action' here).
abot.graph.get_state(thread)
abot.graph.get_state(thread).next
Next, we call the graph again on the same thread, passing None as the input. This streams the results back, including the tool message and the final AI message. Since there is no interrupt between the action node and the llm node, execution runs to completion.
for event in abot.graph.stream(None, thread):
    for v in event.values():
        print(v)
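The pause-and-resume behavior can be illustrated with a toy, dependency-free state machine. This is a sketch of the pattern only, not LangGraph's actual implementation; ToyGraph and its hard-coded routing are invented for illustration.

```python
class ToyGraph:
    """Toy graph that runs llm -> action -> llm -> END, pausing before 'action'."""

    def __init__(self, interrupt_before=()):
        self.interrupt_before = set(interrupt_before)
        self.next = ("llm",)           # mimics get_state(thread).next
        self.log = []                  # nodes that have actually run
        self._resuming = False

    def _route(self, node):
        # Hard-coded routing: the first llm call requests a tool,
        # the second llm call finishes; action always returns to llm.
        if node == "llm":
            return "action" if "action" not in self.log else None
        return "llm"

    def stream(self):
        while self.next:
            node = self.next[0]
            if node in self.interrupt_before and not self._resuming:
                self._resuming = True  # pause; caller must call stream() again
                return
            self._resuming = False
            self.log.append(node)
            nxt = self._route(node)
            self.next = (nxt,) if nxt else ()

g = ToyGraph(interrupt_before=["action"])
g.stream()              # stops before the action node
print(g.log, g.next)
g.stream()              # resume, like calling graph.stream(None, thread)
print(g.log, g.next)    # run to completion, nothing left to execute
```

The first call leaves 'action' as the pending next node, exactly the situation get_state(thread).next reported above; the second call drains the graph.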
Human approval
We can use a simple loop that asks the user for approval before execution continues. A new thread ID is used for this new run. If the user chooses not to continue, the agent stops.
messages = [HumanMessage("What's the weather in LA?")]
thread = {"configurable": {"thread_id": "2"}}
for event in abot.graph.stream({"messages": messages}, thread):
    for v in event.values():
        print(v)

while abot.graph.get_state(thread).next:
    print("\n", abot.graph.get_state(thread), "\n")
    _input = input("Proceed? (y/n): ")
    if _input.lower() != "y":
        print("Aborting")
        break
    for event in abot.graph.stream(None, thread):
        for v in event.values():
            print(v)
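The shape of that approval loop can be shown without any LangGraph dependencies. Here input() is replaced by a scripted list of answers so the sketch runs non-interactively; run_with_approval is a hypothetical helper invented for this illustration, not part of LangGraph.

```python
def run_with_approval(pending_steps, answers):
    # Execute steps one at a time, asking for approval before each,
    # mirroring the "Proceed? (y/n)" loop above.
    executed = []
    answers = iter(answers)
    for step in pending_steps:
        if next(answers, "n").lower() != "y":
            print("Aborting")
            break
        executed.append(step)  # stands in for graph.stream(None, thread)
    return executed

print(run_with_approval(["search: weather LA"], ["y"]))  # step runs
print(run_with_approval(["search: weather LA"], ["n"]))  # aborted, nothing runs
```

The real loop differs in one respect: it re-reads get_state(thread).next each iteration rather than walking a fixed list, so newly scheduled tool calls also require approval.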
Great! Now you know how to include a human in the loop. Next, try out different interrupts and see how the agent handles them.
Vineet Kumar is a consulting intern at MarktechPost. He is currently pursuing his BS from the Indian Institute of Technology (IIT), Kanpur. He is a machine learning enthusiast and is passionate about recent research and advancements in deep learning, computer vision, and related fields.