
A Coding Guide to Unlock Mem0 Memory for an Anthropic Claude Bot: Enabling Context-Rich Conversations

In this tutorial, we build a fully functional chatbot in Google Colab that connects Anthropic's Claude model with Mem0's persistent memory layer. Integrating LangGraph's state-machine orchestration with Mem0's memory store empowers our assistant to remember previous conversations, retrieve relevant information on demand, and maintain continuity across sessions. Whether you are building support bots, virtual assistants, or interactive demos, this guide will equip you to harness memory-driven AI.

!pip install -qU langgraph mem0ai langchain langchain-anthropic anthropic

First, we install LangGraph, the Mem0 AI client, LangChain with its Anthropic connector, and the latest Anthropic SDK, ensuring we have every library required to build a memory-enabled Claude chatbot. Running this upfront avoids dependency conflicts and streamlines the setup process.

import os
from typing import Annotated, TypedDict, List


from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langchain_core.messages import SystemMessage, HumanMessage, AIMessage
from langchain_anthropic import ChatAnthropic
from mem0 import MemoryClient

We assemble our chatbot's toolkit: importing os and Python's typing helpers, LangGraph's StateGraph and message-reducer utilities, LangChain's message classes along with the ChatAnthropic wrapper for Claude, and the Mem0 client that provides persistent memory storage.

os.environ["ANTHROPIC_API_KEY"] = "Use Your Own API Key"
MEM0_API_KEY = "Use Your Own API Key"

We place our Anthropic key in an environment variable and keep the Mem0 key in a module-level variable, so the ChatAnthropic client and the Mem0 memory store can authenticate without secrets being scattered through the code. Insert your own API keys here, keeping a clean separation between code and credentials while enabling seamless access to the Claude model and the persistent memory service.
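For notebooks you intend to share, a safer pattern is to prompt for keys at runtime instead of pasting them into a cell. Here is a minimal sketch using Python's standard getpass module; the `load_key` helper and the `sk-demo` placeholder are illustrative, not part of the original code:

```python
import os
from getpass import getpass

def load_key(name: str) -> str:
    """Return an API key from the environment, prompting (with hidden input) only if missing."""
    value = os.environ.get(name)
    if not value:
        value = getpass(f"Enter {name}: ")  # input is not echoed into the notebook
        os.environ[name] = value
    return value

os.environ["ANTHROPIC_API_KEY"] = "sk-demo"  # placeholder so the sketch runs end-to-end
print(load_key("ANTHROPIC_API_KEY"))
```

Because the environment variable is set first, the prompt never fires here; in a fresh Colab session it would ask once and cache the key for the rest of the runtime.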

llm = ChatAnthropic(
    model="claude-3-5-haiku-latest",
    temperature=0.0,
    max_tokens=1024,
    anthropic_api_key=os.environ["ANTHROPIC_API_KEY"]
)
mem0 = MemoryClient(api_key=MEM0_API_KEY)

We initialize our AI core: first, we create a ChatAnthropic instance configured to talk to Claude 3.5 Haiku at zero temperature for deterministic answers, capped at 1024 output tokens and authenticated with our Anthropic key. Then we instantiate MemoryClient with our Mem0 API key, giving our bot a vector-backed memory store for saving and retrieving previous interactions seamlessly.
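To see the shape of the two Mem0 calls the bot relies on (`search` and `add`) without hitting the API, here is a stdlib-only stand-in. The naive keyword match below replaces Mem0's real vector similarity, and the list-of-dicts-with-a-`memory`-field return shape is an assumption based on how the chatbot code consumes results:

```python
class FakeMemory:
    """Stdlib stand-in mimicking the two Mem0 MemoryClient calls used in this tutorial."""

    def __init__(self):
        self.store = {}  # user_id -> list of stored memory strings

    def add(self, text, user_id):
        self.store.setdefault(user_id, []).append(text)

    def search(self, query, user_id):
        # Naive keyword match in place of real vector similarity
        return [{"memory": m} for m in self.store.get(user_id, [])
                if any(w in m.lower() for w in query.lower().split())]

mem = FakeMemory()
mem.add("User: I am on the Pro plan\nAssistant: Noted!", user_id="customer_123")
hits = mem.search("pro plan", user_id="customer_123")
print(len(hits))  # the stored exchange matches the query
```

Swapping `FakeMemory()` for the real `MemoryClient(api_key=...)` is the only change needed to go live, since the call sites are identical.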

class State(TypedDict):
    messages: Annotated[List[HumanMessage | AIMessage], add_messages]
    mem0_user_id: str


graph = StateGraph(State)


def chatbot(state: State):
    messages = state["messages"]
    user_id = state["mem0_user_id"]


    memories = mem0.search(messages[-1].content, user_id=user_id)


    context = "\n".join(f"- {m['memory']}" for m in memories)
    system_message = SystemMessage(content=(
        "You are a helpful customer support assistant. "
        "Use the context below to personalize your answers:\n" + context
    ))


    full_msgs = [system_message] + messages
    ai_resp: AIMessage = llm.invoke(full_msgs)


    mem0.add(
        f"User: {messages[-1].content}\nAssistant: {ai_resp.content}",
        user_id=user_id
    )


    return {"messages": [ai_resp]}

This defines State as a TypedDict that carries the running message list (merged via the add_messages reducer) and the Mem0 user ID. Inside chatbot, the most recent user message is used to query Mem0 for relevant memories, which are folded into a system prompt; Claude then generates a response, and the new exchange is written back to Mem0 before the assistant's reply is returned.
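The prompt-assembly step can be seen in isolation: given search hits shaped like Mem0's results (the sample memories below are hypothetical), the node joins them into a bulleted context block appended to the system message:

```python
# Hypothetical search results in the shape the chatbot node expects (assumption)
memories = [
    {"memory": "Customer is on the Pro plan"},
    {"memory": "Customer prefers email follow-ups"},
]

# Same formatting logic as the chatbot node above
context = "\n".join(f"- {m['memory']}" for m in memories)
system_prompt = (
    "You are a helpful customer support assistant. "
    "Use the context below to personalize your answers:\n" + context
)
print(system_prompt)
```

Each retrieved memory becomes one bullet, so Claude receives the user's history as plain, readable context rather than raw embeddings.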

graph.add_node("chatbot", chatbot)
graph.add_edge(START, "chatbot")
graph.add_edge("chatbot", "chatbot")
compiled_graph = graph.compile()

We wire our chatbot function into LangGraph's workflow by registering it as the node "chatbot" and connecting the START marker to it, so each conversation begins there; a self-loop edge then routes control back into the same node for the next user message. Calling graph.compile() turns this node-and-edge specification into a compiled, runnable graph object that drives each turn of our chat session automatically.
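The routing can be pictured with a stdlib-only sketch (no LangGraph required): START hands off to the chatbot node, and the self-loop edge sends control back to the same node on each turn. The echo node here is a stand-in for the real Claude call:

```python
# Edge table mirroring the graph wiring above: START -> chatbot, chatbot -> chatbot
edges = {"START": "chatbot", "chatbot": "chatbot"}

def chatbot_node(state):
    """Stand-in for the real Claude-backed node: echo the latest message."""
    state["messages"].append(f"echo: {state['messages'][-1]}")
    return state

state = {"messages": ["hello"]}
node = edges["START"]          # first hop out of START
for _ in range(2):             # two turns through the self-loop
    state = chatbot_node(state)
    node = edges[node]         # self-loop: always routes back to "chatbot"
print(state["messages"])
```

The point of the sketch is the edge table: once control reaches "chatbot", every lookup returns "chatbot" again, which is exactly what the self-loop edge encodes in the compiled graph.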

def run_conversation(user_input: str, mem0_user_id: str):
    config = {"configurable": {"thread_id": mem0_user_id}}
    state = {"messages": [HumanMessage(content=user_input)], "mem0_user_id": mem0_user_id}
    for event in compiled_graph.stream(state, config):
        for node_output in event.values():
            if node_output.get("messages"):
                print("Assistant:", node_output["messages"][-1].content)
                return


if __name__ == "__main__":
    print("Welcome! (type 'exit' to quit)")
    mem0_user_id = "customer_123"  
    while True:
        user_in = input("You: ")
        if user_in.lower() in ["exit", "quit", "bye"]:
            print("Assistant: Goodbye!")
            break
        run_conversation(user_in, mem0_user_id)

We tie everything together with run_conversation, which wraps the user's input in LangGraph state, streams it through the compiled graph to invoke the chatbot node, and prints Claude's reply. The __main__ guard then launches a simple REPL loop that prompts us for messages, routes them through our memory-enabled graph, and exits gracefully when we type "exit".
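The event handling inside run_conversation can be exercised against hand-made events shaped like LangGraph's node-keyed stream output (the sample event below is hypothetical, constructed only to match how the loop reads them):

```python
# Hypothetical stream events: each dict maps a node name to that node's output (assumption)
events = [
    {"chatbot": {"messages": ["Hi! How can I help you today?"]}},
]

def first_assistant_reply(events):
    """Mirror run_conversation's loop: return the first node output that carries messages."""
    for event in events:
        for node_output in event.values():
            if node_output.get("messages"):
                return node_output["messages"][-1]

print("Assistant:", first_assistant_reply(events))
```

Returning on the first message-bearing output is what stops the self-looping graph after a single assistant turn, handing control back to the REPL for the next user input.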

In conclusion, we have assembled a modular AI pipeline that pairs Anthropic's cutting-edge Claude model with Mem0's persistent memory, all orchestrated through LangGraph. This architecture lets our bot recall user-specific details, adapt its answers over time, and deliver personalized support. From here, consider extensions such as richer memory-retrieval strategies, tuning Claude's prompts, or wiring additional tools into your graph.




Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent endeavor is the launch of an Artificial Intelligence media platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over two million monthly views, illustrating its popularity among readers.
