Generative AI

Implementation of Code for Building Agent Research Agent Research Agent Assembly and Openai agents, operating tools, Handoffs, and session memory

In this lesson, we begin by showing the power of Opelai agents as a driving force after our multi-agent research system. We set up our Colob Nature with Opelai API key, including customized opera, and explain customary tools, Web_search, Analyze Again. We take three special special suppliers of Openai (Data Research, Data Antrocter, and Coordinator Coordinator), each contains clear instructions, directive and access to tools. We show how these people interact with Asynchronously and sync, maintenance memories of the continued time, and allow immediate testing about help activities. Look Full codes here.

!pip install openai-agents python-dotenv


import asyncio
import json
from datetime import datetime
from agents import Agent, Runner, function_tool, SQLiteSession
import os


os.environ['OPENAI_API_KEY'] = 'Use Your Own API Key'

We include Openai-agents and PythonV, and import ASYCIO, JSON, DETIME, and SDK SDK Primitives (Agent, runner, work_tonctor, Sqlitess). We put Openaai_api_key in the environment so that we can submit our agents at this time to chase. Look Full codes here.

@function_tool
def web_search(query: str, max_results: int = 3) -> str:
   """Simulate web search results for demonstration"""
   results = [
       f"Result 1 for '{query}': Latest findings show significant developments...",
       f"Result 2 for '{query}': Research indicates new approaches in this field...",
       f"Result 3 for '{query}': Expert analysis suggests important implications..."
   ]
   return f"Search results for '{query}':n" + "n".join(results[:max_results])


@function_tool
def analyze_data(data: str, analysis_type: str = "summary") -> str:
   """Analyze provided data with different analysis types"""
   analyses = {
       "summary": f"Summary: The data contains {len(data.split())} key points with main themes around innovation and efficiency.",
       "detailed": f"Detailed Analysis: Breaking down the {len(data)} characters of data reveals patterns in methodology and conclusions.",
       "trends": f"Trend Analysis: Current data suggests upward trajectory with 3 major inflection points identified."
   }
   return analyses.get(analysis_type, "Analysis complete: Standard evaluation performed.")


@function_tool
def save_research(title: str, content: str, category: str = "general") -> str:
   """Save research findings to a structured format"""
   timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
   research_entry = {
       "title": title,
       "content": content,
       "category": category,
       "timestamp": timestamp,
       "id": f"research_{len(content) % 1000}"
   }
   return f"āœ… Research saved: '{title}' in category '{category}' at {timestamp}"

It describes three tools for their work – Our agent: Web_search imitation of the quick results, analyzing – Data Rescue / Detailed / Tile Details. Look Full codes here.

research_agent = Agent(
   name="Research Specialist",
   instructions="""You are an expert researcher who:
   - Conducts thorough web searches on any topic
   - Analyzes information critically and objectively
   - Identifies key insights and patterns
   - Always uses tools to gather and analyze data before responding""",
   tools=[web_search, analyze_data]
)


analyst_agent = Agent(
   name="Data Analyst",
   instructions="""You are a senior data analyst who:
   - Takes research findings and performs deep analysis
   - Identifies trends, patterns, and actionable insights
   - Creates structured summaries and recommendations
   - Uses analysis tools to enhance understanding""",
   tools=[analyze_data, save_research]
)


coordinator_agent = Agent(
   name="Research Coordinator",
   instructions="""You are a research coordinator who:
   - Manages multi-step research projects
   - Delegates tasks to appropriate specialists
   - Synthesizes findings from multiple sources
   - Makes final decisions on research direction
   - Handoff to research_agent for initial data gathering
   - Handoff to analyst_agent for detailed analysis""",
   handoffs=[research_agent, analyst_agent],
   tools=[save_research]
)

It describes three of the castles in the banks: A research expert also collects information, a deeper data analyst and saves the formal results, along with the Handoffs research coordinator and last decisions. Together, we submit, analyze tools, and produce usable summaries at last-end. Look Full codes here.

async def run_advanced_research_workflow():
   """Demonstrates a complete multi-agent research workflow"""
  
   session = SQLiteSession("research_session_001")
  
   print("šŸš€ Starting Advanced Multi-Agent Research System")
   print("=" * 60)
  
   research_topic = "artificial intelligence in healthcare 2024"
  
   print(f"nšŸ“‹ PHASE 1: Initiating research on '{research_topic}'")
   result1 = await Runner.run(
       coordinator_agent,
       f"I need comprehensive research on '{research_topic}'. Please coordinate a full research workflow including data gathering, analysis, and final report generation.",
       session=session
   )
   print(f"Coordinator Response: {result1.final_output}")
  
   print(f"nšŸ“Š PHASE 2: Requesting detailed trend analysis")
   result2 = await Runner.run(
       coordinator_agent,
       "Based on the previous research, I need a detailed trend analysis focusing on emerging opportunities and potential challenges. Save the final analysis for future reference.",
       session=session
   )
   print(f"Analysis Response: {result2.final_output}")
  
   print(f"nšŸ”¬ PHASE 3: Direct specialist analysis")
   result3 = await Runner.run(
       analyst_agent,
       "Perform a detailed analysis of the healthcare AI market, focusing on regulatory challenges and market opportunities. Categorize this as 'market_analysis'.",
       session=session
   )
   print(f"Specialist Response: {result3.final_output}")
  
   print("nāœ… Research workflow completed successfully!")
   return result1, result2, result3


async def run_focused_analysis():
   """Shows focused single-agent capabilities"""
  
   print("nšŸŽÆ FOCUSED ANALYSIS DEMO")
   print("-" * 40)
  
   result = await Runner.run(
       research_agent,
       "Research in quantum computing and analyze the key breakthroughs from 2024.",
       max_turns=5
   )
  
   print(f"Focused Analysis Result: {result.final_output}")
   return result


def quick_research_sync(topic: str):
   """Synchronous research for quick queries"""
  
   print(f"n⚔ QUICK SYNC RESEARCH: {topic}")
   print("-" * 40)
  
   result = Runner.run_sync(
       research_agent,
       f"Quickly research {topic} and provide 3 key insights."
   )
  
   print(f"Quick Result: {result.final_output}")
   return result

We are fully optimized by full work movement with session memory (three phases integrated by coordinator and analyst). We are making one focus-related analysis of the cap cap, and finally, it causes a fast-faster, sirevened research assistant. Look Full codes here.

async def main():
   """Main function demonstrating all capabilities"""
  
   print("šŸ¤– OpenAI Agents SDK - Advanced Tutorial")
   print("Building a Multi-Agent Research System")
   print("=" * 60)
  
   try:
       await run_advanced_research_workflow()
      
       await run_focused_analysis()
      
       quick_research_sync("blockchain adoption in enterprise")
      
       print("nšŸŽ‰ Tutorial completed successfully!")
       print("nKey Features Demonstrated:")
       print("āœ… Multi-agent coordination with handoffs")
       print("āœ… Custom function tools")
       print("āœ… Session memory for conversation continuity")
       print("āœ… Async and sync execution patterns")
       print("āœ… Structured workflows with max_turns control")
       print("āœ… Specialized agent roles and capabilities")
      
   except Exception as e:
       print(f"āŒ Error: {e}")
       print("nTroubleshooting tips:")
       print("- Ensure OPENAI_API_KEY is set correctly")
       print("- Check internet connection")
       print("- Verify openai-agents package is installed")


if __name__ == "__main__":
   import nest_asyncio
   nest_asyncio.apply()
  
   asyncio.run(main())


def create_custom_agent(name: str, role: str, tools_list: list = None):
   """Helper function to create custom agents quickly"""
   return Agent(
       name=name,
       instructions=f"You are a {role} who provides expert assistance.",
       tools=tools_list or []
   )


custom_agent = create_custom_agent("Code Reviewer", "senior software engineer", [analyze_data])
result = Runner.run_sync(custom_agent, "Review this Python code for best practices")


print("nšŸ“š Tutorial Notes:")
print("- Modify research topics and agent instructions to explore different use cases")
print("- Add your own custom tools using the @function_tool decorator")
print("- Experiment with different agent handoff patterns")
print("- Use sessions for multi-turn conversations")
print("- Perfect for Colab - just add your OpenAI API key and run!")

It includes the Orchestrate demrate at the end of the main () multiple jobs, focused on the immediate synchronization, while dealing with the essential flaws and important features. We also provide an assistant obliging agents and shows the “Code” for immediate feedback.

In conclusion, we wrap the Open Opentai Agents by highlighting the highest strength of this frame: Multi-Alent's Collection, Changing Time Memory, and Changing Methods, and Varying Techniques, and Changing Operating methods, and changing changing methods. We encourage you to extend these bases by adding new tools, financial arts on custom agencies, and try different Handoff strategies. We emphasize that this Modar trade is enabling the ability to obtain AI research pipes driven by AI with a small boelplate.


Look Full codes here. Feel free to look our GITHUB page for tutorials, codes and letters of writing. Also, feel free to follow it Sane and don't forget to join ours 100K + ml subreddit Then sign up for Our newspaper.


Asphazzaq is a Markteach Media Inc. According to a View Business and Developer, Asifi is committed to integrating a good social intelligence. His latest attempt is launched by the launch of the chemistrylife plan for an intelligence, MarktechPost, a devastating intimate practice of a machine learning and deep learning issues that are clearly and easily understood. The platform is adhering to more than two million moon visits, indicating its popularity between the audience.

Source link

Related Articles

Leave a Reply

Your email address will not be published. Required fields are marked *

Back to top button