Build a Powerful AI Agent Using Nebius with Llama 3 and Real-Time External Tools

In this tutorial, we build an advanced AI agent on Nebius' robust ecosystem, specifically ChatNebius, NebiusEmbeddings, and NebiusRetriever. The agent uses the Llama-3.3-70B-Instruct-fast model to generate high-quality responses, and incorporates external tools such as Wikipedia search, contextual document retrieval, and safe mathematical computation. By combining structured prompt design with LangChain's modular framework, this tutorial shows how to build a multi-functional, reasoning-capable assistant that is both interactive and extensible. Whether it is scientific questions, technical explanations, or basic numerical tasks, the agent showcases Nebius as a platform for building sophisticated AI systems.
!pip install -q langchain-nebius langchain-core langchain-community wikipedia
import os
import getpass
from typing import List, Dict, Any
import wikipedia
from datetime import datetime
We begin by installing the key libraries, including langchain-nebius, langchain-core, langchain-community, and wikipedia, which are essential for building a feature-rich AI assistant. We then import the required modules, such as os, getpass, datetime, and typing utilities; the wikipedia package gives the agent access to external data.
from langchain_core.documents import Document
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough
from langchain_core.tools import tool
from langchain_nebius import ChatNebius, NebiusEmbeddings, NebiusRetriever
if "NEBIUS_API_KEY" not in os.environ:
    os.environ["NEBIUS_API_KEY"] = getpass.getpass("Enter your Nebius API key: ")
We import the core LangChain components and the Nebius integrations needed for document handling, prompt templating, output parsing, and tool use. Key classes include ChatNebius for the language model, NebiusEmbeddings for vector representations, and NebiusRetriever for semantic search. The Nebius API key is read securely with getpass so that subsequent API interactions are authenticated.
class AdvancedNebiusAgent:
    """Advanced AI Agent with retrieval, reasoning, and external tool capabilities"""

    def __init__(self):
        self.llm = ChatNebius(model="meta-llama/Llama-3.3-70B-Instruct-fast")
        self.embeddings = NebiusEmbeddings()
        self.knowledge_base = self._create_knowledge_base()
        self.retriever = NebiusRetriever(
            embeddings=self.embeddings,
            docs=self.knowledge_base,
            k=3
        )
        self.agent_prompt = ChatPromptTemplate.from_template("""
You are an advanced AI assistant with access to:
1. A knowledge base about technology and science
2. Wikipedia search capabilities
3. Mathematical calculation tools
4. Current date/time information

Context from knowledge base:
{context}

External tool results:
{tool_results}

Current date: {current_date}

User Query: {query}

Instructions:
- Use the knowledge base context when relevant
- If you need additional information, mention what external sources would help
- Be comprehensive but concise
- Show your reasoning process
- If calculations are needed, break them down step by step

Response:
""")
    def _create_knowledge_base(self) -> List[Document]:
        """Create a comprehensive knowledge base"""
        return [
            Document(
                page_content="Artificial Intelligence (AI) is transforming industries through ML, NLP, and computer vision. Key applications include autonomous vehicles, medical diagnosis, and financial trading.",
                metadata={"topic": "AI", "category": "technology"}
            ),
            Document(
                page_content="Quantum computing uses quantum mechanical phenomena like superposition and entanglement to process information. Companies like IBM, Google, and Microsoft are leading quantum research.",
                metadata={"topic": "quantum_computing", "category": "technology"}
            ),
            Document(
                page_content="Climate change is caused by greenhouse gas emissions, primarily CO2 from fossil fuels. Renewable energy sources are crucial for mitigation.",
                metadata={"topic": "climate", "category": "environment"}
            ),
            Document(
                page_content="CRISPR-Cas9 is a revolutionary gene editing technology that allows precise DNA modifications. It has applications in treating genetic diseases and improving crops.",
                metadata={"topic": "biotechnology", "category": "science"}
            ),
            Document(
                page_content="Blockchain technology enables decentralized, secure transactions without intermediaries. Beyond cryptocurrency, it has applications in supply chain, healthcare, and voting systems.",
                metadata={"topic": "blockchain", "category": "technology"}
            ),
            Document(
                page_content="Space exploration has advanced with reusable rockets, Mars rovers, and commercial space travel. SpaceX, Blue Origin, and NASA are pioneering new missions.",
                metadata={"topic": "space", "category": "science"}
            ),
            Document(
                page_content="Renewable energy costs have dropped dramatically. Solar & wind power are now cheaper than fossil fuels in many regions, driving global energy transition.",
                metadata={"topic": "renewable_energy", "category": "environment"}
            ),
            Document(
                page_content="5G networks provide ultra-fast internet speeds and low latency, enabling IoT devices, autonomous vehicles, and augmented reality applications.",
                metadata={"topic": "5G", "category": "technology"}
            )
        ]
    @tool
    def wikipedia_search(query: str) -> str:
        """Search Wikipedia for additional information"""
        try:
            search_results = wikipedia.search(query, results=3)
            if not search_results:
                return f"No Wikipedia results found for '{query}'"
            page = wikipedia.page(search_results[0])
            summary = wikipedia.summary(search_results[0], sentences=3)
            return f"Wikipedia: {page.title}\n{summary}\nURL: {page.url}"
        except Exception as e:
            return f"Wikipedia search error: {str(e)}"

    @tool
    def calculate(expression: str) -> str:
        """Perform mathematical calculations safely"""
        try:
            allowed_chars = set('0123456789+-*/.() ')
            if not all(c in allowed_chars for c in expression):
                return "Error: Only basic mathematical operations allowed"
            result = eval(expression)
            return f"Calculation: {expression} = {result}"
        except Exception as e:
            return f"Calculation error: {str(e)}"
    def _format_docs(self, docs: List[Document]) -> str:
        """Format retrieved documents for context"""
        if not docs:
            return "No relevant documents found in knowledge base."
        formatted = []
        for i, doc in enumerate(docs, 1):
            formatted.append(f"{i}. {doc.page_content}")
        return "\n".join(formatted)

    def _get_current_date(self) -> str:
        """Get current date and time"""
        return datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    def process_query(self, query: str, use_wikipedia: bool = False,
                      calculate_expr: str = None) -> str:
        """Process a user query with optional external tools"""
        relevant_docs = self.retriever.invoke(query)
        context = self._format_docs(relevant_docs)
        tool_results = []
        if use_wikipedia:
            wiki_keywords = self._extract_keywords(query)
            if wiki_keywords:
                # Functions wrapped with @tool are run via .invoke()
                wiki_result = self.wikipedia_search.invoke(wiki_keywords)
                tool_results.append(f"Wikipedia Search: {wiki_result}")
        if calculate_expr:
            calc_result = self.calculate.invoke(calculate_expr)
            tool_results.append(f"Calculation: {calc_result}")
        tool_results_str = "\n".join(tool_results) if tool_results else "No external tools used"
        chain = (
            {
                "context": lambda x: context,
                "tool_results": lambda x: tool_results_str,
                "current_date": lambda x: self._get_current_date(),
                "query": RunnablePassthrough()
            }
            | self.agent_prompt
            | self.llm
            | StrOutputParser()
        )
        return chain.invoke(query)
    def _extract_keywords(self, query: str) -> str:
        """Extract key terms for Wikipedia search"""
        important_words = []
        stop_words = {'what', 'how', 'why', 'when', 'where', 'is', 'are', 'the', 'a', 'an'}
        words = query.lower().split()
        for word in words:
            if word not in stop_words and len(word) > 3:
                important_words.append(word)
        return ' '.join(important_words[:3])
    def interactive_session(self):
        """Run an interactive session with the agent"""
        print("🤖 Advanced Nebius AI Agent Ready!")
        print("Features: Knowledge retrieval, Wikipedia search, calculations")
        print("Commands: 'wiki:' for Wikipedia, 'calc:' for math")
        print("Type 'quit' to exit\n")
        while True:
            user_input = input("You: ").strip()
            if user_input.lower() == 'quit':
                print("Goodbye!")
                break
            use_wiki = False
            calc_expr = None
            if user_input.startswith('wiki:'):
                use_wiki = True
                user_input = user_input[5:].strip()
            elif user_input.startswith('calc:'):
                parts = user_input.split(':', 1)
                if len(parts) == 2:
                    calc_expr = parts[1].strip()
                    user_input = f"Calculate {calc_expr}"
            try:
                response = self.process_query(user_input, use_wiki, calc_expr)
                print(f"\n🤖 Agent: {response}\n")
            except Exception as e:
                print(f"Error: {e}\n")
The core of the implementation lives in the AdvancedNebiusAgent class, which orchestrates prompting, retrieval, and tool integration. It initializes a high-performance LLM from Nebius (meta-llama/Llama-3.3-70B-Instruct-fast) and sets up a semantic retriever over embedded documents, forming a mini knowledge base that covers topics such as AI, quantum computing, blockchain, and more. A dynamic prompt template shapes the agent's answers by injecting the retrieved context, the results of external tools, and the current date. Two built-in tools, wikipedia_search and calculate, extend the agent's capabilities by providing access to encyclopedic external information and safe arithmetic evaluation, respectively. The process_query method ties everything together, composing a LangChain pipeline of context, tools, and prompt to produce informative, multi-source answers. An interactive_session method enables real-time conversations with the agent, recognizing special prefixes such as wiki: or calc: to activate the external tools.
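A caveat on the calculate tool: its character whitelist still lets some pathological inputs through, because `**` is just two `*` characters, so an expression like `9**9**9` passes the filter and `eval` will attempt to compute it. A stricter alternative, shown here as a hedged sketch rather than part of the article's code (the name `safe_calculate` is hypothetical), walks the expression's AST and evaluates only a whitelist of arithmetic node types:

```python
import ast
import operator

# Hypothetical safer replacement for the eval-based calculator:
# only binary/unary arithmetic on numeric literals is evaluated.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.USub: operator.neg, ast.UAdd: operator.pos,
}

def safe_calculate(expression: str) -> str:
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError("Disallowed expression")
    try:
        result = _eval(ast.parse(expression, mode="eval"))
        return f"Calculation: {expression} = {result}"
    except Exception as e:
        return f"Calculation error: {e}"
```

Note that exponentiation (`ast.Pow`) is deliberately excluded from the whitelist, so `9**9**9` is rejected instead of hanging the process.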
if __name__ == "__main__":
    agent = AdvancedNebiusAgent()
    demo_queries = [
        "What is artificial intelligence and how is it being used?",
        "Tell me about quantum computing companies",
        "How does climate change affect renewable energy adoption?"
    ]
    print("=== Nebius AI Agent Demo ===\n")
    for i, query in enumerate(demo_queries, 1):
        print(f"Demo {i}: {query}")
        response = agent.process_query(query)
        print(f"Response: {response}\n")
        print("-" * 50)

    print("\nDemo with Wikipedia:")
    response_with_wiki = agent.process_query(
        "What are the latest developments in space exploration?",
        use_wikipedia=True
    )
    print(f"Response: {response_with_wiki}\n")

    print("Demo with calculation:")
    response_with_calc = agent.process_query(
        "If solar panel efficiency improved by 25%, what would be the new efficiency if current is 20%?",
        calculate_expr="20 * 1.25"
    )
    print(f"Response: {response_with_calc}\n")
Finally, we demonstrate the agent's capabilities with a set of demo queries. The script begins by instantiating AdvancedNebiusAgent, followed by a loop that processes questions about AI, quantum computing, and climate change, showcasing the retrieval component. It then issues a Wikipedia-augmented query about space exploration, pulling in live external information to supplement the knowledge base. Lastly, it runs a math-enabled query about solar panel efficiency to exercise the calculation tool. These demos illustrate how Nebius, combined with LangChain and well-structured prompts, enables intelligent, multi-step behavior in a practical assistant.
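The keyword heuristic that drives the Wikipedia tool can be exercised on its own, with no network access or API key. The following standalone copy of the same logic (the function name `extract_keywords` is ours, not the article's) shows which search terms a given query would produce:

```python
# Standalone copy of the agent's _extract_keywords heuristic for offline testing.
STOP_WORDS = {'what', 'how', 'why', 'when', 'where', 'is', 'are', 'the', 'a', 'an'}

def extract_keywords(query: str) -> str:
    """Keep up to three non-stop-words longer than three characters."""
    words = [w for w in query.lower().split()
             if w not in STOP_WORDS and len(w) > 3]
    return ' '.join(words[:3])

print(extract_keywords("What are the latest developments in space exploration?"))
# → latest developments space
```

Running it also exposes the heuristic's limits: for "How does climate change affect renewable energy adoption?" the filler word "does" survives the filter because it is four characters long and not in the stop list.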
In conclusion, the advanced Nebius agent shows how to unite LLM-driven reasoning, semantic retrieval, and external tools in a single, coherent system. By integrating LangChain with the Nebius APIs, the agent draws on a compact knowledge base, fetches live data from Wikipedia, and performs arithmetic with safety checks. The design, with its prompt template, dynamic context injection, and tool hooks, provides a practical blueprint for developers who want to build intelligent, tool-augmented assistants on top of large language models (LLMs).
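The LCEL pipeline at the heart of process_query, a dict of input-mappers piped into a prompt, a model, and an output parser, can be illustrated with plain-Python stand-ins. This is a hedged sketch of the pattern only; `build_chain`, the stub model, and the template below are illustrative and are not part of LangChain:

```python
# Plain-Python illustration of the pipeline pattern used in process_query:
# {input mappers} | prompt template | model | output parser.
def build_chain(template, model, parser, providers):
    """providers maps template variables to functions of the raw query."""
    def invoke(query):
        variables = {name: fn(query) for name, fn in providers.items()}
        return parser(model(template.format(**variables)))
    return invoke

# Stub components standing in for the prompt, ChatNebius, and StrOutputParser.
template = "Context: {context}\nQuery: {query}\nAnswer:"
stub_model = lambda prompt: f"<llm-output for [{prompt}]>"
stub_parser = lambda output: output.strip("<>")

chain = build_chain(template, stub_model, stub_parser, {
    "context": lambda q: "No relevant documents found.",
    "query": lambda q: q,
})
print(chain("What is 5G?"))
```

Each mapper runs against the same raw query, and its result is bound to the matching template variable before the formatted prompt flows on to the model and parser, which mirrors how the lambdas in process_query feed {context}, {tool_results}, {current_date}, and {query}.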
Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of artificial intelligence for social good. His most recent endeavor is the launch of an AI media platform, Marktechpost, known for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among readers.




