A Step-by-Step Guide on How to Build an AI News Summarizer Using Streamlit, Groq and Tavily

Introduction
In this lesson, we will create an AI news agent that can search the web for the latest news on a given topic and summarize the results. The agent follows a structured workflow:
- Browsing: Run appropriate search queries and collect information from the web.
- Writing: Generate concise news summaries from the collected information.
- Reflection: Critique the draft summaries for factual accuracy and suggest improvements.
- Refinement: Improve the summaries based on the critiques.
- Heading generation: Create a suitable headline for each news item.
To make the agent easier to use, we will also build a simple GUI using Streamlit. As in previous tutorials, we will use Groq for LLM-based inference and Tavily for browsing the web. You can generate free API keys on their respective websites.
Setting up the environment
We start by setting environment variables, installing the required libraries, and importing the dependencies:
Install the required libraries
pip install langgraph==0.2.53 langgraph-checkpoint==2.0.6 langgraph-sdk==0.1.36 langchain-groq langchain-community langgraph-checkpoint-sqlite==2.0.1 tavily-python streamlit
Import libraries and set up API keys
import os
import sqlite3
from langgraph.graph import StateGraph
from langchain_core.messages import SystemMessage, HumanMessage
from langchain_groq import ChatGroq
from tavily import TavilyClient
from langgraph.checkpoint.sqlite import SqliteSaver
from typing import TypedDict, List
from pydantic import BaseModel
import streamlit as st
# Set API Keys
os.environ['TAVILY_API_KEY'] = "your_tavily_key"
os.environ['GROQ_API_KEY'] = "your_groq_key"
# Initialize Database for Checkpointing
sqlite_conn = sqlite3.connect("checkpoints.sqlite", check_same_thread=False)
memory = SqliteSaver(sqlite_conn)
# Initialize Model and Tavily Client
model = ChatGroq(model="Llama-3.1-8b-instant")
tavily = TavilyClient(api_key=os.environ["TAVILY_API_KEY"])
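Before wiring everything together, it can help to verify that both clients respond. The quick check below is a minimal sketch, assuming valid API keys; the prompt and query strings are arbitrary placeholders:
# Sanity check: the LLM should return a short reply and Tavily a non-empty result list
print(model.invoke("Reply with one word: ready?").content)
print(len(tavily.search(query="latest AI news", max_results=1)["results"]))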
Defining the agent's state
The agent maintains state information throughout its workflow:
- Topic: The topic for which the user wants the latest news.
- Drafts: The first drafts of the news summaries.
- Content: The research content extracted from the Tavily search results.
- Critiques: Critiques and improvement suggestions generated in the reflection stage.
- Refined summaries: The news summaries after incorporating the critiques.
- Headings: The headline generated for each news item.
class AgentState(TypedDict):
    topic: str
    drafts: List[str]
    content: List[str]
    critiques: List[str]
    refined_summaries: List[str]
    headings: List[str]
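At runtime the state is simply a dictionary matching this schema, and LangGraph merges the dictionary each node returns back into it. For illustration (the topic is a hypothetical placeholder), an initial state might look like:
initial_state: AgentState = {
    "topic": "electric vehicles",
    "drafts": [],
    "content": [],
    "critiques": [],
    "refined_summaries": [],
    "headings": [],
}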
Defining the prompts
We define a system prompt for each stage of the agent:
BROWSING_PROMPT = """You are an AI news researcher tasked with finding the latest news articles on given topics. Generate up to 3 relevant search queries."""
WRITER_PROMPT = """You are an AI news summarizer. Write a detailed summary (1 to 2 paragraphs) based on the given content, ensuring factual correctness, clarity, and coherence."""
CRITIQUE_PROMPT = """You are a teacher reviewing draft summaries against the source content. Ensure factual correctness, identify missing or incorrect details, and suggest improvements.
----------
Content: {content}
----------"""
REFINE_PROMPT = """You are an AI news editor. Given a summary and critique, refine the summary accordingly.
-----------
Summary: {summary}"""
HEADING_GENERATION_PROMPT = """You are an AI news summarizer. Generate a short, descriptive headline for each news summary."""
Structuring queries and news
We use Pydantic to define the structure of the search queries and the news articles. Pydantic lets us specify the exact format of the LLM output. This matters here because we want the queries to be a list of strings, and the content extracted from the web will contain multiple news stories, which is why both are modeled as lists of strings.
from pydantic import BaseModel
class Queries(BaseModel):
    queries: List[str]

class News(BaseModel):
    news: List[str]
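To illustrate how with_structured_output behaves before using it inside the nodes, the snippet below is a minimal sketch (assuming valid API keys; the topic string is a placeholder). It returns a parsed Queries object instead of raw text:
structured_model = model.with_structured_output(Queries)
result = structured_model.invoke([
    SystemMessage(content=BROWSING_PROMPT),
    HumanMessage(content="renewable energy")
])
print(result.queries)  # a Python list of up to 3 search query strings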
Implementing the AI agent
1. Browsing node
This node generates the search queries and collects relevant content from the web.
def browsing_node(state: AgentState):
    queries = model.with_structured_output(Queries).invoke([
        SystemMessage(content=BROWSING_PROMPT),
        HumanMessage(content=state['topic'])
    ])
    content = state.get('content', [])
    for q in queries.queries:
        response = tavily.search(query=q, max_results=2)
        for r in response['results']:
            content.append(r['content'])
    return {"content": content}
2. Writing node
Generates the news summaries from the collected content.
def writing_node(state: AgentState):
    content = "\n\n".join(state['content'])
    news = model.with_structured_output(News).invoke([
        SystemMessage(content=WRITER_PROMPT),
        HumanMessage(content=content)
    ])
    return {"drafts": news.news}
3. Reflection node
Generates a critique of each draft summary against the source content.
def reflection_node(state: AgentState):
    content = "\n\n".join(state['content'])
    critiques = []
    for draft in state['drafts']:
        response = model.invoke([
            SystemMessage(content=CRITIQUE_PROMPT.format(content=content)),
            HumanMessage(content="draft: " + draft)
        ])
        critiques.append(response.content)
    return {"critiques": critiques}
4. Refinement node
Improves the summaries based on the critiques.
def refine_node(state: AgentState):
    refined_summaries = []
    for summary, critique in zip(state['drafts'], state['critiques']):
        response = model.invoke([
            SystemMessage(content=REFINE_PROMPT.format(summary=summary)),
            HumanMessage(content="Critique: " + critique)
        ])
        refined_summaries.append(response.content)
    return {"refined_summaries": refined_summaries}
5. Heading generation node
Generates a short headline for each news summary.
def heading_node(state: AgentState):
    headings = []
    for summary in state['refined_summaries']:
        response = model.invoke([
            SystemMessage(content=HEADING_GENERATION_PROMPT),
            HumanMessage(content=summary)
        ])
        headings.append(response.content)
    return {"headings": headings}
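Before wiring the nodes into a graph and a UI, you can sanity-check the whole pipeline by calling the node functions in sequence yourself. This is a quick test sketch, assuming valid API keys; the topic is a placeholder:
state = {"topic": "artificial intelligence"}
state.update(browsing_node(state))    # fills state["content"]
state.update(writing_node(state))     # fills state["drafts"]
state.update(reflection_node(state))  # fills state["critiques"]
state.update(refine_node(state))      # fills state["refined_summaries"]
state.update(heading_node(state))     # fills state["headings"]
for h, s in zip(state["headings"], state["refined_summaries"]):
    print(h)
    print(s)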
Building the UI with Streamlit
# Define Streamlit app
st.title("News Summarization Chatbot")
# Initialize session state
if "messages" not in st.session_state:
    st.session_state["messages"] = []
# Display past messages
for message in st.session_state["messages"]:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])
# Input field for user
user_input = st.chat_input("Ask about the latest news...")
thread = 1
if user_input:
    st.session_state["messages"].append({"role": "user", "content": user_input})
    with st.chat_message("assistant"):
        loading_text = st.empty()
        loading_text.markdown("*Thinking...*")
        builder = StateGraph(AgentState)
        builder.add_node("browser", browsing_node)
        builder.add_node("writer", writing_node)
        builder.add_node("reflect", reflection_node)
        builder.add_node("refine", refine_node)
        builder.add_node("heading", heading_node)
        builder.set_entry_point("browser")
        builder.add_edge("browser", "writer")
        builder.add_edge("writer", "reflect")
        builder.add_edge("reflect", "refine")
        builder.add_edge("refine", "heading")
        graph = builder.compile(checkpointer=memory)
        config = {"configurable": {"thread_id": f"{thread}"}}
        for s in graph.stream({"topic": user_input}, config):
            # loading_text.markdown(f"*{st.session_state['loading_message']}*")
            print(s)
        s = graph.get_state(config).values
        refined_summaries = s['refined_summaries']
        headings = s['headings']
        thread += 1
        # Display final response
        loading_text.empty()
        response_text = "\n\n".join([f"{h}\n{s}" for h, s in zip(headings, refined_summaries)])
        st.markdown(response_text)
        st.session_state["messages"].append({"role": "assistant", "content": response_text})
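Assuming all of the code above is saved in a single script, for example app.py (the file name is arbitrary), launch the chatbot from the terminal with:
streamlit run app.py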
Conclusion
This tutorial covered the entire process of building an AI-powered news summarization agent with a simple Streamlit UI. You can now play around with it and make further improvements, such as:
- A better GUI for enhanced user interaction.
- Iterative refinement to make sure the summaries are accurate and complete (see the sketch after this list).
- Maintaining context to continue a conversation about particular news items.
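As a rough starting point for the iterative-refinement idea, LangGraph's conditional edges can loop the reflect and refine nodes until a revision limit is reached. This is a sketch only, not part of the original tutorial: it assumes a revision_number counter is added to AgentState and incremented in refine_node, and that the reflection and refinement nodes are adjusted to operate on the latest refined summaries rather than the original drafts.
def should_continue(state):
    # Stop after a fixed number of revisions, otherwise go back for another critique
    if state.get("revision_number", 1) >= 3:
        return "heading"
    return "reflect"

# Instead of builder.add_edge("refine", "heading"), route conditionally:
builder.add_conditional_edges(
    "refine",
    should_continue,
    {"reflect": "reflect", "heading": "heading"}
)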
Vineet Kumar is a consulting intern at MarktechPost. He is currently pursuing his BS from the Indian Institute of Technology (IIT), Kanpur. He is a Machine Learning enthusiast and is passionate about recent research and advancements in Deep Learning, Computer Vision, and related fields.



