Generative AI

An introduction to building powerful AI systems with a protocol contactor model (MCP) for real-time implementation and integration of tools

In this course, we examine the Advanced Model Control System (MCP) and show how we can use it to address one of the most unique challenges in modern AI systems: enabling real-time interaction between AI models and or external tools. Traditional models work in isolation, limited to their training data, but with MCP, we create a bridge that enables models to access live resources, use specialized tools, and dynamically plan for changing situations. We walk through building the MCP server and client from scratch, showing how each component contributes to such a powerful intelligent collaboration system. Look Full codes here.

import json
import asyncio
from dataclasses import dataclass, asdict
from typing import Dict, List, Any, Optional, Callable
from datetime import datetime
import random


@dataclass
class Resource:
   uri: str
   name: str
   description: str
   mime_type: str
   content: Any = None


@dataclass
class Tool:
   name: str
   description: str
   parameters: Dict[str, Any]
   handler: Optional[Callable] = None


@dataclass
class Message:
   role: str
   content: str
   timestamp: str = None
   def __post_init__(self):
       if not self.timestamp:
           self.timestamp = datetime.now().isoformat()

We begin by defining the basic building blocks of MCP: Resources, tools, and messages. We design these data structures to represent how information flows between AI systems and their external environments in a clean and structured way. Look Full codes here.

class MCPServer:
   def __init__(self, name: str):
       self.name = name
       self.resources: Dict[str, Resource] = {}
       self.tools: Dict[str, Tool] = {}
       self.capabilities = {"resources": True, "tools": True, "prompts": True, "logging": True}
       print(f"✓ MCP Server '{name}' initialized with capabilities: {list(self.capabilities.keys())}")
   def register_resource(self, resource: Resource) -> None:
       self.resources[resource.uri] = resource
       print(f"  → Resource registered: {resource.name} ({resource.uri})")
   def register_tool(self, tool: Tool) -> None:
       self.tools[tool.name] = tool
       print(f"  → Tool registered: {tool.name}")
   async def get_resource(self, uri: str) -> Optional[Resource]:
       await asyncio.sleep(0.1)
       return self.resources.get(uri)
   async def execute_tool(self, tool_name: str, arguments: Dict[str, Any]) -> Any:
       if tool_name not in self.tools:
           raise ValueError(f"Tool '{tool_name}' not found")
       tool = self.tools[tool_name]
       if tool.handler:
           return await tool.handler(**arguments)
       return {"status": "executed", "tool": tool_name, "args": arguments}
   def list_resources(self) -> List[Dict[str, str]]:
       return [{"uri": r.uri, "name": r.name, "description": r.description} for r in self.resources.values()]
   def list_tools(self) -> List[Dict[str, Any]]:
       return [{"name": t.name, "description": t.description, "parameters": t.parameters} for t in self.tools.values()]

We use an MCP server that manages resources and tools while managing execution and retrieval operations. We ensure that it supports asynchronous interactions, making it better and more available to be considered for real-world AI applications. Look Full codes here.

class MCPClient:
   def __init__(self, client_id: str):
       self.client_id = client_id
       self.connected_servers: Dict[str, MCPServer] = {}
       self.context: List[Message] = []
       print(f"n✓ MCP Client '{client_id}' initialized")
   def connect_server(self, server: MCPServer) -> None:
       self.connected_servers[server.name] = server
       print(f"  → Connected to server: {server.name}")
   async def query_resources(self, server_name: str) -> List[Dict[str, str]]:
       if server_name not in self.connected_servers:
           raise ValueError(f"Not connected to server: {server_name}")
       return self.connected_servers[server_name].list_resources()
   async def fetch_resource(self, server_name: str, uri: str) -> Optional[Resource]:
       if server_name not in self.connected_servers:
           raise ValueError(f"Not connected to server: {server_name}")
       server = self.connected_servers[server_name]
       resource = await server.get_resource(uri)
       if resource:
           self.add_to_context(Message(role="system", content=f"Fetched resource: {resource.name}"))
       return resource
   async def call_tool(self, server_name: str, tool_name: str, **kwargs) -> Any:
       if server_name not in self.connected_servers:
           raise ValueError(f"Not connected to server: {server_name}")
       server = self.connected_servers[server_name]
       result = await server.execute_tool(tool_name, kwargs)
       self.add_to_context(Message(role="system", content=f"Tool '{tool_name}' executed"))
       return result
   def add_to_context(self, message: Message) -> None:
       self.context.append(message)
   def get_context(self) -> List[Dict[str, Any]]:
       return [asdict(msg) for msg in self.context]

We developed an MCP client that connects to the server, query resources, and render tools. We keep a memory of the content of all interactions, enabling continuous, informative communication with the server. Look Full codes here.

async def analyze_sentiment(text: str) -> Dict[str, Any]:
   await asyncio.sleep(0.2)
   sentiments = ["positive", "negative", "neutral"]
   return {"text": text, "sentiment": random.choice(sentiments), "confidence": round(random.uniform(0.7, 0.99), 2)}


async def summarize_text(text: str, max_length: int = 100) -> Dict[str, str]:
   await asyncio.sleep(0.15)
   summary = text[:max_length] + "..." if len(text) > max_length else text
   return {"original_length": len(text), "summary": summary, "compression_ratio": round(len(summary) / len(text), 2)}


async def search_knowledge(query: str, top_k: int = 3) -> List[Dict[str, Any]]:
   await asyncio.sleep(0.25)
   mock_results = [{"title": f"Result {i+1} for '{query}'", "score": round(random.uniform(0.5, 1.0), 2)} for i in range(top_k)]
   return sorted(mock_results, key=lambda x: x["score"], reverse=True)


We describe a set of asynchronous tool managers, including conceptual analysis, text summarization, and information searching. We use them to simulate how the MCP system can perform various tasks with modular, connected tools. Look Full codes here.

async def run_mcp_demo():
   print("=" * 60)
   print("MODEL CONTEXT PROTOCOL (MCP) - ADVANCED TUTORIAL")
   print("=" * 60)
   print("n[1] Setting up MCP Server...")
   server = MCPServer("knowledge-server")
   print("n[2] Registering resources...")
   server.register_resource(Resource(uri="docs://python-guide", name="Python Programming Guide", description="Comprehensive Python documentation", mime_type="text/markdown", content="# Python GuidenPython is a high-level programming language..."))
   server.register_resource(Resource(uri="data://sales-2024", name="2024 Sales Data", description="Annual sales metrics", mime_type="application/json", content={"q1": 125000, "q2": 142000, "q3": 138000, "q4": 165000}))
   print("n[3] Registering tools...")
   server.register_tool(Tool(name="analyze_sentiment", description="Analyze sentiment of text", parameters={"text": {"type": "string", "required": True}}, handler=analyze_sentiment))
   server.register_tool(Tool(name="summarize_text", description="Summarize long text", parameters={"text": {"type": "string", "required": True}, "max_length": {"type": "integer", "default": 100}}, handler=summarize_text))
   server.register_tool(Tool(name="search_knowledge", description="Search knowledge base", parameters={"query": {"type": "string", "required": True}, "top_k": {"type": "integer", "default": 3}}, handler=search_knowledge))
   client = MCPClient("demo-client")
   client.connect_server(server)
   print("n" + "=" * 60)
   print("DEMONSTRATION: MCP IN ACTION")
   print("=" * 60)
   print("n[Demo 1] Listing available resources...")
   resources = await client.query_resources("knowledge-server")
   for res in resources:
       print(f"  • {res['name']}: {res['description']}")
   print("n[Demo 2] Fetching sales data resource...")
   sales_resource = await client.fetch_resource("knowledge-server", "data://sales-2024")
   if sales_resource:
       print(f"  Data: {json.dumps(sales_resource.content, indent=2)}")
   print("n[Demo 3] Analyzing sentiment...")
   sentiment_result = await client.call_tool("knowledge-server", "analyze_sentiment", text="MCP is an amazing protocol for AI integration!")
   print(f"  Result: {json.dumps(sentiment_result, indent=2)}")
   print("n[Demo 4] Summarizing text...")
   summary_result = await client.call_tool("knowledge-server", "summarize_text", text="The Model Context Protocol enables seamless integration between AI models and external data sources...", max_length=50)
   print(f"  Summary: {summary_result['summary']}")
   print("n[Demo 5] Searching knowledge base...")
   search_result = await client.call_tool("knowledge-server", "search_knowledge", query="machine learning", top_k=3)
   print("  Top results:")
   for result in search_result:
       print(f"    - {result['title']} (score: {result['score']})")
   print("n[Demo 6] Current context window...")
   context = client.get_context()
   print(f"  Context length: {len(context)} messages")
   for i, msg in enumerate(context[-3:], 1):
       print(f"  {i}. [{msg['role']}] {msg['content']}")
   print("n" + "=" * 60)
   print("✓ MCP Tutorial Complete!")
   print("=" * 60)
   print("nKey Takeaways:")
   print("• MCP enables modular AI-to-resource connections")
   print("• Resources provide context from external sources")
   print("• Tools enable dynamic operations and actions")
   print("• Async design supports efficient I/O operations")


if __name__ == "__main__":
   import sys
   if 'ipykernel' in sys.modules or 'google.colab' in sys.modules:
       await run_mcp_demo()
   else:
       asyncio.run(run_mcp_demo())

We bring it all together in a complete display where the client communicates with the server, captures data, runs tools, and stores context. We witness the full capabilities of MCP as it seamlessly integrates with external knowledge and integration.

In conclusion, the conceptualization of the problem we solve here breaks the boundaries of static AI systems. Instead of treating models as closed boxes, we design constructs that enable them to question, think about, and work with real-world information in structured, contextual ways. This dynamic integration, achieved through the MCP framework, represents a major shift towards modular, intelligent design. By understanding and using MCP, we are positioning ourselves to build the next generation of AISTIVE AI systems that can think, learn, and connect beyond their original confines.


Look Full codes here. Feel free to take a look at ours GitHub page for tutorials, code and notebooks. Also, feel free to follow us Kind of stubborn and don't forget to join ours 100K + ML Subreddit and sign up Our newsletter. Wait! Do you telegraph? Now you can join us by telegraph.


AsifAzzaq is the CEO of MarktechPost Media Inc.. as a visionary entrepreneur and developer, Asifi is committed to harnessing the power of social intelligence for good. His latest effort is the launch of a media intelligence platform, MarktechPpost, which stands out for its deep understanding of machine learning and deep learning stories that are technically sound and easily understood by a wide audience. The platform sticks to more than two million monthly views, which shows its popularity among the audience.

Follow Marktechpost: Add us as a favorite source on Google.

Source link

Related Articles

Leave a Reply

Your email address will not be published. Required fields are marked *

Back to top button