A Coding Guide to Different Ways of Doing Real-Time Function Calling with AI Agents Using Gemini 2.0 Flash

Function calling lets an LLM act as a bridge between natural-language prompts and real-world code or APIs. Instead of simply producing text, the model decides when to invoke a predefined function, emits a structured JSON call with the function's name and arguments, and waits for your application to execute it and return the result. This back-and-forth can repeat, potentially chaining multiple functions in a row, which enables richer, multi-step interactions. In this tutorial, we implement a weather assistant with Gemini 2.0 Flash to show how to set up and manage the function-calling cycle, and we walk through several coding variants. By combining dialogue with real-time function execution, we turn a chat interface into something that can fetch live weather data, check order statuses, edit records, or look up details. Users no longer have to fill out rigid forms or navigate across many screens; they simply describe what they need, and the LLM orchestrates the underlying actions seamlessly. This natural-language automation makes it possible to build agentic AI assistants that can reach external data sources, perform transactions, or trigger workflows, all within a single conversation.
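Stripped of any SDK, the cycle described above can be sketched in a few lines. Everything here is an illustrative stand-in, not the actual Gemini wire format: the fake_model function, the get_time tool, and the dict shapes are all invented for this sketch.

```python
# Minimal sketch of the function-calling cycle, independent of any SDK.
# The "model" is faked; in practice Gemini decides when to emit a call.

def get_time(city: str) -> str:
    # Stand-in for a real tool the application exposes.
    return f"12:00 in {city}"

TOOLS = {"get_time": get_time}

def fake_model(prompt: str) -> dict:
    # A real model returns either plain text or a structured function call.
    return {"function_call": {"name": "get_time", "args": {"city": "Berlin"}}}

def run_turn(prompt: str) -> str:
    reply = fake_model(prompt)
    call = reply.get("function_call")
    if call:
        # The application, not the model, executes the tool.
        result = TOOLS[call["name"]](**call["args"])
        # In a real loop this result would be sent back to the model
        # so it can phrase the final answer.
        return result
    return reply["text"]
```

The key point the sketch captures is the division of labor: the model only names the function and supplies arguments; your code performs the actual call.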
Setting Up Google Gemini 2.0 Flash
!pip install "google-genai>=1.0.0" geopy requests
We install the Gemini Python SDK (google-genai ≥ 1.0.0), along with geopy to convert place names into coordinates and requests to issue HTTP calls, ensuring we have every dependency our weather assistant needs.
import os
from google import genai
GEMINI_API_KEY = "Use_Your_API_Key"
client = genai.Client(api_key=GEMINI_API_KEY)
model_id = "gemini-2.0-flash"
We import the Gemini SDK, set the API key, and create a client instance, selecting gemini-2.0-flash as the model for all subsequent calls.
res = client.models.generate_content(
    model=model_id,
    contents=["Tell me 1 good fact about Nuremberg."]
)
print(res.text)
We send a quick test prompt ("Tell me 1 good fact about Nuremberg.") to the gemini-2.0-flash model and print the reply, confirming with a basic text-generation call that the SDK is set up correctly.
Function Calling with a JSON Schema
weather_function = {
    "name": "get_weather_forecast",
    "description": "Retrieves the weather using Open-Meteo API for a given location (city) and a date (yyyy-mm-dd). Returns a list dictionary with the time and temperature for each hour.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g., San Francisco, CA"
            },
            "date": {
                "type": "string",
                "description": "the forecasting date for when to get the weather format (yyyy-mm-dd)"
            }
        },
        "required": ["location", "date"]
    }
}
Here we define the get_weather_forecast tool as a JSON schema, specifying its name, a description that tells Gemini when to use it, and the location and date parameters it requires, so the model can emit well-formed function calls.
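Before executing a call the model emits, it can be worth checking its arguments against the schema's required fields. A small helper of our own (not part of the SDK); the example_schema dict is a minimal stand-in with the same required fields as the weather schema above:

```python
def missing_required_args(schema: dict, args: dict) -> list:
    """Return the required parameter names absent from a proposed tool call."""
    required = schema.get("parameters", {}).get("required", [])
    return [name for name in required if name not in args]

# Minimal stand-in schema mirroring the required fields of weather_function:
example_schema = {"parameters": {"required": ["location", "date"]}}
```

In practice you would run this check on function_call.args before dispatching, and send the model an error result if anything is missing.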
from google.genai.types import GenerateContentConfig
config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that use tools to access and retrieve information from a weather API. Today is 2025-03-04.",
    tools=[{"function_declarations": [weather_function]}],
)
We create a GenerateContentConfig that sets a system instruction (including today's date for context) and registers the weather function under function_declarations, so the model knows it may emit structured function calls when asked about weather data.
response = client.models.generate_content(
    model=model_id,
    contents="Whats the weather in Berlin today?"
)
print(response.text)
This call sends the prompt ("Whats the weather in Berlin today?") without the tool config, so Gemini can only answer from its training data in plain text. It has no way to fetch live weather, which is exactly the gap function calling fills.
response = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)
for part in response.candidates[0].content.parts:
    print(part.function_call)
When we pass the config (which includes the JSON-schema tool), Gemini recognizes that it should call get_weather_forecast rather than answer in plain text. We loop over response.candidates[0].content.parts and print each part's function_call to inspect the function name and arguments the model produced.
from google.genai import types
from geopy.geocoders import Nominatim
import requests
geolocator = Nominatim(user_agent="weather-app")
def get_weather_forecast(location, date):
    location = geolocator.geocode(location)
    if location:
        try:
            response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={location.latitude}&longitude={location.longitude}&hourly=temperature_2m&start_date={date}&end_date={date}")
            data = response.json()
            return {time: temp for time, temp in zip(data["hourly"]["time"], data["hourly"]["temperature_2m"])}
        except Exception as e:
            return {"error": str(e)}
    else:
        return {"error": "Location not found"}

functions = {
    "get_weather_forecast": get_weather_forecast
}

def call_function(function_name, **kwargs):
    return functions[function_name](**kwargs)
def function_call_loop(prompt):
    contents = [types.Content(role="user", parts=[types.Part(text=prompt)])]
    response = client.models.generate_content(
        model=model_id,
        config=config,
        contents=contents
    )
    for part in response.candidates[0].content.parts:
        contents.append(types.Content(role="model", parts=[part]))
        if part.function_call:
            print("Tool call detected")
            function_call = part.function_call
            print(f"Calling tool: {function_call.name} with args: {function_call.args}")
            tool_result = call_function(function_call.name, **function_call.args)
            function_response_part = types.Part.from_function_response(
                name=function_call.name,
                response={"result": tool_result},
            )
            contents.append(types.Content(role="user", parts=[function_response_part]))
            print(f"Calling LLM with tool results")
            func_gen_response = client.models.generate_content(
                model=model_id, config=config, contents=contents
            )
            contents.append(func_gen_response.candidates[0].content)
    return contents[-1].parts[0].text.strip()
result = function_call_loop("Whats the weather in Berlin today?")
print(result)
This implements a simple agentic loop: it sends the prompt to Gemini, checks the response for a function call, executes get_weather_forecast locally (geocoding via geopy, then an HTTP request to the Open-Meteo API), and feeds the tool result back to the model so it can compose the final natural-language answer.
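The loop above handles one call at a time, but a model turn may in general contain several parts, each of which may or may not be a function call. A sketch of dispatching every call found in a turn; the parts are mocked as plain dicts (in the SDK they are types.Part objects), and the weather function is a stub standing in for the real implementation above:

```python
def get_weather_forecast(location: str, date: str) -> dict:
    # Stub standing in for the real Open-Meteo-backed implementation.
    return {"2025-03-04T12:00": 7.5}

FUNCTIONS = {"get_weather_forecast": get_weather_forecast}

def dispatch_calls(parts: list) -> list:
    """Execute every function call in a model turn and collect the results."""
    results = []
    for part in parts:
        call = part.get("function_call")
        if call:
            results.append(FUNCTIONS[call["name"]](**call["args"]))
    return results
```

Each collected result would then be wrapped in a function-response part and appended to the conversation, exactly as function_call_loop does for the single-call case.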
Function Calling Using Python Functions
from geopy.geocoders import Nominatim
import requests
geolocator = Nominatim(user_agent="weather-app")
def get_weather_forecast(location: str, date: str) -> str:
    """
    Retrieves the weather using Open-Meteo API for a given location (city) and a date (yyyy-mm-dd). Returns a dictionary with the time and temperature for each hour.

    Args:
        location (str): The city and state, e.g., San Francisco, CA
        date (str): The forecasting date for when to get the weather format (yyyy-mm-dd)

    Returns:
        Dict[str, float]: A dictionary with the time as key and the temperature as value
    """
    location = geolocator.geocode(location)
    if location:
        try:
            response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={location.latitude}&longitude={location.longitude}&hourly=temperature_2m&start_date={date}&end_date={date}")
            data = response.json()
            return {time: temp for time, temp in zip(data["hourly"]["time"], data["hourly"]["temperature_2m"])}
        except Exception as e:
            return {"error": str(e)}
    else:
        return {"error": "Location not found"}
get_weather_forecast uses Nominatim to geocode the city and state into coordinates, then sends an HTTP request to the Open-Meteo API for the hourly temperatures. It also handles failures gracefully, returning an error message if the location cannot be found or the API call fails.
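The response-parsing step can be exercised in isolation with a stubbed Open-Meteo-style payload, which is handy for testing without a network connection. The sample dict below is fabricated for illustration; only its shape (hourly time and temperature_2m arrays) mirrors the real API:

```python
def parse_hourly(data: dict) -> dict:
    """Zip Open-Meteo's parallel hourly arrays into a time -> temperature map."""
    return {t: temp for t, temp in zip(data["hourly"]["time"],
                                       data["hourly"]["temperature_2m"])}

# Fabricated payload with the same shape as an Open-Meteo response:
sample = {
    "hourly": {
        "time": ["2025-03-04T00:00", "2025-03-04T01:00"],
        "temperature_2m": [4.2, 3.9],
    }
}
```

Factoring the parsing out like this also makes it easy to unit-test the tool separately from the geocoding and HTTP layers.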
from google.genai.types import GenerateContentConfig
config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that can help with weather related questions. Today is 2025-03-04.",  # to give the LLM context on the current date.
    tools=[get_weather_forecast],
    automatic_function_calling={"disable": True}
)
This config registers the Python get_weather_forecast function itself as a tool; the SDK derives the JSON schema from its signature and docstring. The system instruction gives the model context (including the current date), while automatic function calling is disabled so we can inspect the raw call the model emits.
r = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)
for part in r.candidates[0].content.parts:
    print(part.function_call)
Sending the prompt with this config (the Python tool attached but automatic execution disabled) captures Gemini's raw function-call decision. Looping over the response parts and printing each .function_call lets you verify exactly which tool the model wants to invoke, and with which arguments, before anything runs.
from google.genai.types import GenerateContentConfig
config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that use tools to access and retrieve information from a weather API. Today is 2025-03-04.",  # to give the LLM context on the current date.
    tools=[get_weather_forecast],
)
r = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)
print(r.text)
With this configuration (the get_weather_forecast function attached and automatic function calling left enabled, its default), generate_content invokes the weather tool behind the scenes and returns a natural-language response. Printing r.text outputs that final answer, including the actual temperatures in Berlin for the specified date.
from google.genai.types import GenerateContentConfig
config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that use tools to access and retrieve information from a weather API.",
    tools=[get_weather_forecast],
)
prompt = f"""
Today is 2025-03-04. You are chatting with Andrew, you have access to more information about him.

User Context:
- name: Andrew
- location: Nuremberg

User: Can i wear a T-shirt later today?"""
r = client.models.generate_content(
    model=model_id,
    config=config,
    contents=prompt
)
print(r.text)
We extend the assistant with personal context: the prompt tells Gemini that the user's name is Andrew and that he is in Nuremberg, then asks whether it is T-shirt weather. The model calls the get_weather_forecast tool under the hood, and printing r.text shows its natural-language recommendation based on the day's actual conditions.
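If you would rather apply the threshold yourself instead of leaving the judgment to the model, a hypothetical post-processing helper could inspect the forecast directly. Both the helper's name and the 15 °C cutoff are our own choices, not part of the tutorial's API:

```python
def is_tshirt_weather(forecast: dict, threshold_c: float = 15.0) -> bool:
    """Return True if any hourly temperature reaches the threshold.

    `forecast` maps ISO timestamps to temperatures in degrees Celsius,
    matching the shape returned by get_weather_forecast; the 15 C default
    threshold is an arbitrary choice for illustration.
    """
    return any(temp >= threshold_c for temp in forecast.values())
```

This kind of deterministic check can complement the model's answer when you need a hard yes/no rather than conversational advice.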
In conclusion, we have seen how to declare functions (via a JSON schema or a plain Python signature), configure Gemini 2.0 Flash to detect when a tool should be used, and handle the full call-and-response cycle, either manually or through automatic function calling.
Here is the Colab Notebook.
Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent endeavor is the launch of an Artificial Intelligence media platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among audiences.
