
A Coding Implementation of an MCP (Model Context Protocol)-Inspired AI Assistant on Google Colab Using LangChain, LangGraph, and Google Gemini

In this hands-on tutorial, we demonstrate the core principles of the Model Context Protocol (MCP) by building an AI assistant with LangChain, LangGraph, and Google's Gemini model. While a full MCP integration involves dedicated servers and communication protocols, this simplified version shows how the same ideas, context handling, tool provisioning, and dynamic interaction, can be realized in a single agent. The assistant can answer natural-language queries and selectively call an external tool (a custom knowledge base), mimicking how MCP clients interact with real context providers.

!pip install langchain langchain-google-genai langgraph python-dotenv
!pip install google-generativeai

First, we install the essential libraries. The first command installs LangChain, LangGraph, the Google Generative AI LangChain wrapper, and environment-variable support via python-dotenv. The second command installs Google's official generative AI client, which enables interaction with Gemini models.

import os
os.environ["GEMINI_API_KEY"] = "Your API Key"

Here, we set your Gemini API key as an environment variable so the model can authenticate without hard-coding the key into your codebase. Replace "Your API Key" with your actual key from Google AI Studio.
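Since python-dotenv is installed above, an alternative is to keep the key in a .env file instead of pasting it into the notebook. Below is a minimal, dependency-free sketch of what load_dotenv() does under the hood; the file contents and the DEMO_GEMINI_KEY name are illustrative only.

```python
import os
import tempfile

def load_env_file(path):
    """Minimal stand-in for python-dotenv's load_dotenv(): parse KEY=VALUE
    lines and place them into os.environ without overwriting existing keys."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# Demo with a temporary file; in Colab you would create a real .env instead.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("DEMO_GEMINI_KEY=demo-key-123\n")
    env_path = f.name

load_env_file(env_path)
print(os.environ["DEMO_GEMINI_KEY"])  # -> demo-key-123
```

In practice you would simply call `from dotenv import load_dotenv; load_dotenv()` once and then read the key with `os.getenv("GEMINI_API_KEY")`, keeping the .env file out of version control.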

from langchain.tools import BaseTool
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain.prompts import ChatPromptTemplate
from langchain.schema.messages import HumanMessage, AIMessage
from langgraph.prebuilt import create_react_agent
import os


model = ChatGoogleGenerativeAI(
    model="gemini-2.0-flash-lite",
    temperature=0.7,
    google_api_key=os.getenv("GEMINI_API_KEY")
)


class SimpleKnowledgeBaseTool(BaseTool):
    name: str = "simple_knowledge_base"
    description: str = "Retrieves basic information about AI concepts."


    def _run(self, query: str):
        knowledge = {
            "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic designed to connect AI assistants with external data sources, enabling real-time, context-rich interactions.",
            "RAG": "Retrieval-Augmented Generation (RAG) enhances LLM responses by dynamically retrieving relevant external documents."
        }
        return knowledge.get(query, "I don't have information on that topic.")


    async def _arun(self, query: str):
        return self._run(query)


kb_tool = SimpleKnowledgeBaseTool()
tools = [kb_tool]
graph = create_react_agent(model, tools)

In this block, we initialize the Gemini language model (gemini-2.0-flash-lite) through LangChain's ChatGoogleGenerativeAI, reading the API key securely from the environment. We then define a custom SimpleKnowledgeBaseTool that simulates an external information source by returning predefined answers for queries such as "MCP" and "RAG". This tool acts as a basic context provider, much as an MCP server would. Finally, we use LangGraph's create_react_agent to build a ReAct-style agent that can reason over prompts and dynamically decide when to invoke the tool, mimicking MCP's principle of rich, tool-mediated interaction.
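To see the shape of this decision-making without spending API calls, here is a toy, dependency-free stand-in for the ReAct routing. In the real agent, Gemini itself decides when to call the tool inside create_react_agent; the react_step function below is purely illustrative.

```python
# Mirrors the knowledge dict inside SimpleKnowledgeBaseTool._run above.
KNOWLEDGE = {
    "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic "
           "for connecting AI assistants with external data sources.",
    "RAG": "Retrieval-Augmented Generation (RAG) enhances LLM responses by "
           "retrieving relevant external documents.",
}

def simple_knowledge_base(query: str) -> str:
    # Same lookup behavior as the tool's _run method.
    return KNOWLEDGE.get(query, "I don't have information on that topic.")

def react_step(user_input: str) -> str:
    # Crude stand-in for the model's reasoning: call the tool when a known
    # topic appears in the input, otherwise answer directly.
    for topic in KNOWLEDGE:
        if topic.lower() in user_input.lower():
            return f"(used tool) {simple_knowledge_base(topic)}"
    return "(answered directly) I can discuss AI concepts such as MCP or RAG."

print(react_step("What is MCP?"))
print(react_step("Hello there!"))
```

The real agent replaces the keyword match with the LLM's own judgment, but the control flow, reason, optionally call a tool, then respond, is the same.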

import nest_asyncio
import asyncio


nest_asyncio.apply()  


async def chat_with_agent():
    inputs = {"messages": []}


    print("🤖 MCP-Like Assistant ready! Type 'exit' to quit.")
    while True:
        user_input = input("\nYou: ")
        if user_input.lower() == "exit":
            print("👋 Ending chat.")
            break


        inputs["messages"].append(HumanMessage(content=user_input))


        async for state in graph.astream(inputs, stream_mode="values"):
            last_message = state["messages"][-1]
            if isinstance(last_message, AIMessage):
                print("\nAgent:", last_message.content)


        inputs["messages"] = state["messages"]


await chat_with_agent()

Finally, we set up an asynchronous chat loop to interact with the MCP-inspired assistant. Using nest_asyncio, we enable asynchronous code to run inside the notebook's existing event loop. The chat_with_agent() function collects user input, feeds it to the ReAct agent, and streams the model's responses in real time. On each turn, the assistant reasons about whether to answer directly or call the knowledge-base tool, mirroring how an MCP client brings richer context into the conversation.
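The streaming pattern in the loop above can be illustrated without any API calls. Here, fake_astream is a stand-in for graph.astream, yielding successive agent states (first a tool call, then the final answer); the message strings are illustrative.

```python
import asyncio

async def fake_astream(messages):
    """Stand-in for graph.astream: yields intermediate states, ending with
    the final answer, just as the ReAct agent streams its turns."""
    yield {"messages": messages + ["(tool call) simple_knowledge_base"]}
    yield {"messages": messages + ["(tool call) simple_knowledge_base",
                                   "MCP is an open standard by Anthropic."]}

async def run_turn(user_input):
    state = None
    async for state in fake_astream([user_input]):
        pass  # the tutorial loop prints AIMessage contents here
    # After the stream ends, `state` holds the final agent state, which the
    # loop above carries into the next turn via inputs["messages"].
    return state["messages"][-1]

answer = asyncio.run(run_turn("What is MCP?"))
print(answer)
```

This is also why the loop reassigns `inputs["messages"] = state["messages"]` after streaming: the last yielded state contains the full conversation history needed for the next turn.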

In conclusion, this tutorial provides a practical foundation for building AI agents inspired by the MCP standard. By combining LangChain's tool interface, a LangGraph agent, and Gemini's powerful language generation, we created a working demonstration of on-demand tool use and external information retrieval. Although the setup is simplified, it captures the core of MCP's design: context awareness, tool interaction, and dynamic knowledge injection. From here, you can plug in live APIs, local documents, or custom tools, evolving this into a production-ready AI system aligned with Model Context Protocol principles.
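As one concrete extension, the knowledge base could be populated from a local JSON file rather than a hard-coded dict. The stdlib-only sketch below merges local entries into the tool's knowledge; the file contents and the LangGraph entry are illustrative.

```python
import json
import tempfile

# Start from the tool's built-in entries.
knowledge = {
    "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic.",
    "RAG": "Retrieval-Augmented Generation (RAG) retrieves external documents.",
}

# Write an example local knowledge file; in practice this file would already
# exist on disk (e.g. uploaded to the Colab session).
extra = {"LangGraph": "LangGraph builds stateful, graph-based LLM agents."}
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(extra, f)
    path = f.name

# Merge the local entries into the tool's knowledge dict.
with open(path) as f:
    knowledge.update(json.load(f))

print(sorted(knowledge))  # -> ['LangGraph', 'MCP', 'RAG']
```

Pointing SimpleKnowledgeBaseTool._run at a dict loaded this way lets you grow the assistant's context without touching the agent code.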


Here is the Colab Notebook.



Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent endeavor is the launch of the Artificial Intelligence media platform Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable. The platform draws more than two million monthly views, illustrating its popularity among readers.
