Generative AI

Implementing an Intelligent AI Assistant with Jina Search, LangChain, and Gemini for Real-Time Information Retrieval

In this tutorial, we show how to build an intelligent AI assistant by integrating LangChain, Gemini 2.0 Flash, and the Jina Search tool. By combining a large language model (LLM) with an external search API, we create an assistant that can answer with up-to-date information. We proceed step by step: setting the API keys, installing the required libraries, binding tools to the Gemini model, and building a custom LangChain pipeline that invokes external tools whenever the model needs fresh or specific information. By the end of this tutorial, we will have a fully functional AI assistant that answers user questions with accurate, current, properly sourced responses.

%pip install --quiet -U "langchain-community>=0.2.16" langchain langchain-google-genai

We install the Python packages required for this project, including the LangChain framework for building AI workflows, the LangChain community tools (version 0.2.16 or later), and the LangChain integration for Google's Gemini models. These packages enable seamless use of Gemini models and external tools within LangChain pipelines.

import getpass
import os
import json
from typing import Dict, Any

We import the essential modules for this project: getpass allows entering API keys without echoing them to the screen, os handles environment variables and file paths, json handles JSON data structures, and typing provides type hints such as Dict and Any for clearer, more maintainable code.

if not os.environ.get("JINA_API_KEY"):
    os.environ["JINA_API_KEY"] = getpass.getpass("Enter your Jina API key: ")


if not os.environ.get("GOOGLE_API_KEY"):
    os.environ["GOOGLE_API_KEY"] = getpass.getpass("Enter your Google/Gemini API key: ")

We ensure that the required Jina and Google Gemini API keys are set as environment variables. If a key is not already defined, the script prompts the user to enter it securely via the getpass module, keeping the value hidden from view. This approach lets us access both services without hardcoding sensitive credentials in the code.
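The same guard can be wrapped in a small reusable helper. This is only a sketch of the pattern above; the helper name `ensure_env_var` is our own, not part of any library:

```python
import os
import getpass

def ensure_env_var(name: str, prompt: str) -> str:
    """Return the named secret, prompting for it only when it is unset."""
    if not os.environ.get(name):
        os.environ[name] = getpass.getpass(prompt)
    return os.environ[name]

# With JINA_API_KEY already exported, no prompt appears:
# key = ensure_env_var("JINA_API_KEY", "Enter your Jina API key: ")
```

Because the helper writes back into `os.environ`, later library code that reads the variable directly still works unchanged.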

from langchain_community.tools import JinaSearch
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableConfig, chain
from langchain_core.messages import HumanMessage, AIMessage, ToolMessage


print("🔧 Setting up tools and model...")

We import the key components and classes from the LangChain ecosystem: the JinaSearch tool for web search, the ChatGoogleGenerativeAI model for accessing Gemini, and the core pieces from LangChain (ChatPromptTemplate, RunnableConfig, the chain decorator, and the HumanMessage, AIMessage, and ToolMessage message types). Together, these parts enable the integration of external tools into the Gemini-driven retrieval workflow. The print statement confirms that the setup process has started.

search_tool = JinaSearch()
print(f"✅ Jina Search tool initialized: {search_tool.name}")


print("\n🔍 Testing Jina Search directly:")
direct_search_result = search_tool.invoke({"query": "what is langgraph"})
print(f"Direct search result preview: {direct_search_result[:200]}...")

We initialize the Jina search tool by creating a JinaSearch() instance and confirming that it is ready for use. The tool is designed to handle web search queries within the LangChain ecosystem. The script then runs a direct test query, "what is langgraph", via the tool's invoke() method and prints a preview of the result. This step verifies that the search tool works correctly before we incorporate it into the larger chain.
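JinaSearch returns its results as a single string; if yours comes back as a JSON payload, a small helper can make the preview easier to read. The payload below is hypothetical, purely to illustrate the parsing idea — inspect `direct_search_result` to confirm the real format:

```python
import json

# Hypothetical payload; the actual JinaSearch response format may differ.
sample_payload = json.dumps([
    {"title": "LangGraph docs",
     "content": "LangGraph is a library for building stateful, multi-actor LLM applications."},
    {"title": "Intro blog post",
     "content": "A gentle overview of LangGraph concepts."},
])

def preview_results(raw: str, limit: int = 60) -> list:
    """Parse a JSON list of search hits and truncate each entry's content."""
    return [f"{hit['title']}: {hit['content'][:limit]}" for hit in json.loads(raw)]

for line in preview_results(sample_payload):
    print(line)
```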

gemini_model = ChatGoogleGenerativeAI(
    model="gemini-2.0-flash",
    temperature=0.1,
    convert_system_message_to_human=True  
)
print("✅ Gemini model initialized")

We initialize the Gemini 2.0 Flash model using ChatGoogleGenerativeAI from the LangChain Google integration. The model is configured with a low temperature (0.1) for more deterministic responses, and the convert_system_message_to_human=True parameter ensures that system prompts are handled in a form Gemini accepts. The final print statement confirms that the Gemini model is ready for use.

detailed_prompt = ChatPromptTemplate.from_messages([
    ("system", """You are an intelligent assistant with access to web search capabilities.
    When users ask questions, you can use the Jina search tool to find current information.
   
    Instructions:
    1. If the question requires recent or specific information, use the search tool
    2. Provide comprehensive answers based on the search results
    3. Always cite your sources when using search results
    4. Be helpful and informative in your responses"""),
    ("human", "{user_input}"),
    ("placeholder", "{messages}"),
])

We define a prompt template using ChatPromptTemplate.from_messages() to direct the AI's behavior. It includes a system message describing the assistant's role and instructions, a human message slot for the user's question, and a messages placeholder for the tool-call transcript generated during execution. This structured prompt ensures the AI provides helpful, informative, well-sourced answers while integrating search results.
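Conceptually, the template expands into a message list: the fixed system message, the user's question, then whatever tool-call transcript fills the placeholder. A plain-Python sketch (not the LangChain API) of that expansion:

```python
def render_prompt(user_input, messages=()):
    """Emulate how the three prompt slots expand into a chat transcript."""
    system = ("You are an intelligent assistant with access to "
              "web search capabilities.")
    rendered = [("system", system), ("human", user_input)]
    rendered.extend(messages)  # the "placeholder" slot: tool-call round-trip
    return rendered

convo = render_prompt(
    "what is langgraph",
    [("ai", "<tool call requesting a search>"), ("tool", "<search results>")],
)
print([role for role, _ in convo])  # ['system', 'human', 'ai', 'tool']
```

On the first pass the placeholder is empty, so the model sees only the system and human messages; on the second pass it also sees its own tool request and the tool's results.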

gemini_with_tools = gemini_model.bind_tools([search_tool])
print("✅ Tools bound to Gemini model")


main_chain = detailed_prompt | gemini_with_tools


def format_tool_result(tool_call: Dict[str, Any], tool_result: str) -> str:
    """Format tool results for better readability"""
    return f"Search Results for '{tool_call['args']['query']}':\n{tool_result[:800]}..."

We bind the Jina search tool to the Gemini model using bind_tools(), enabling the model to request the search tool whenever it needs one. main_chain then composes the prompt template and the tool-enabled model into a single runnable pipeline, giving us a seamless workflow for handling user queries with dynamic tool use. Additionally, the format_tool_result function formats search output into a clear, readable summary, so users can easily understand what a search query returned.
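To see what the formatter produces, here it is applied to a mock tool call (the helper is reproduced so the snippet is self-contained; the payloads are invented for illustration):

```python
from typing import Dict, Any

def format_tool_result(tool_call: Dict[str, Any], tool_result: str) -> str:
    """Format tool results for better readability."""
    return f"Search Results for '{tool_call['args']['query']}':\n{tool_result[:800]}..."

# Hypothetical tool call and result, mirroring the dict shape LangChain passes around.
mock_call = {"name": "jina_search", "args": {"query": "what is langgraph"}, "id": "call_1"}
mock_result = "LangGraph is a library for building stateful agent workflows."
print(format_tool_result(mock_call, mock_result))
```

The `[:800]` slice keeps long search payloads from flooding the console while preserving enough context to be useful.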

@chain
def enhanced_search_chain(user_input: str, config: RunnableConfig):
    """
    Enhanced chain that handles tool calls and provides detailed responses
    """
    print(f"\n🤖 Processing query: '{user_input}'")
   
    input_data = {"user_input": user_input}
   
    print("📤 Sending to Gemini...")
    ai_response = main_chain.invoke(input_data, config=config)
   
    if ai_response.tool_calls:
        print(f"🛠️  AI requested {len(ai_response.tool_calls)} tool call(s)")
       
        tool_messages = []
        for i, tool_call in enumerate(ai_response.tool_calls):
            print(f"   🔍 Executing search {i+1}: {tool_call['args']['query']}")
           
            tool_result = search_tool.invoke(tool_call)
           
            tool_msg = ToolMessage(
                content=tool_result,
                tool_call_id=tool_call['id']
            )
            tool_messages.append(tool_msg)
       
        print("📥 Getting final response with search results...")
        final_input = {
            **input_data,
            "messages": [ai_response] + tool_messages
        }
        final_response = main_chain.invoke(final_input, config=config)
       
        return final_response
    else:
        print("ℹ️  No tool calls needed")
        return ai_response

This defines enhanced_search_chain using LangChain's @chain decorator, which lets us handle user queries with dynamic tool use. It takes the user's input and a RunnableConfig, passes the input through main_chain (the prompt plus Gemini with bound tools), and checks whether the AI requested any tool calls (e.g., a Jina web search). If it did, the function executes each search, wraps the results in ToolMessage objects, and re-invokes the chain with those results appended so the model can produce a final, better-grounded answer. If no tool calls were made, it returns the AI's direct response.
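The control flow — ask the model, run any requested searches, then ask again with the results attached — can be isolated from the LangChain objects. Below is a self-contained sketch with stand-in classes (FakeModel, FakeSearch, and FakeResponse are ours, purely to demonstrate the two-pass loop):

```python
from dataclasses import dataclass, field

@dataclass
class FakeResponse:
    """Stand-in for an AIMessage: text plus any requested tool calls."""
    content: str
    tool_calls: list = field(default_factory=list)

class FakeModel:
    """First pass requests a search; second pass answers using the results."""
    def invoke(self, data):
        if "messages" not in data:
            return FakeResponse("", [{"args": {"query": data["user_input"]}, "id": "c1"}])
        return FakeResponse(f"Answer based on {len(data['messages']) - 1} search result(s).")

class FakeSearch:
    def invoke(self, tool_call):
        return f"results for {tool_call['args']['query']}"

def run_chain(user_input, model, tool):
    """The same two-pass loop as enhanced_search_chain, without LangChain."""
    response = model.invoke({"user_input": user_input})
    if response.tool_calls:
        tool_msgs = [tool.invoke(tc) for tc in response.tool_calls]
        response = model.invoke({"user_input": user_input,
                                 "messages": [response] + tool_msgs})
    return response

print(run_chain("what is langgraph", FakeModel(), FakeSearch()).content)
# Answer based on 1 search result(s).
```

Swapping the fakes for the real model and tool recovers the production chain; the branching logic is identical.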

def test_search_chain():
    """Test the search chain with various queries"""
   
    test_queries = [
        "what is langgraph",
        "latest developments in AI for 2024",
        "how does langchain work with different LLMs"
    ]
   
    print("\n" + "="*60)
    print("🧪 TESTING ENHANCED SEARCH CHAIN")
    print("="*60)
   
    for i, query in enumerate(test_queries, 1):
        print(f"\n📝 Test {i}: {query}")
        print("-" * 50)
       
        try:
            response = enhanced_search_chain.invoke(query)
            print(f"✅ Response: {response.content[:300]}...")
           
            if hasattr(response, 'tool_calls') and response.tool_calls:
                print(f"🛠️  Used {len(response.tool_calls)} tool call(s)")
               
        except Exception as e:
            print(f"❌ Error: {str(e)}")
       
        print("-" * 50)

This function runs a list of varied test queries through the chain, exercising tool use, AI responses, and the LangChain integration, and prints each result along with how many tool calls were made. It verifies that the assistant can trigger a web search, process the results, and return useful information, validating the program end to end.

if __name__ == "__main__":
    print("\n🚀 Starting enhanced LangChain + Gemini + Jina Search demo...")
    test_search_chain()
   
    print("\n" + "="*60)
    print("💬 INTERACTIVE MODE - Ask me anything! (type 'quit' to exit)")
    print("="*60)
   
    while True:
        user_query = input("\n🗣️  Your question: ").strip()
        if user_query.lower() in ['quit', 'exit', 'bye']:
            print("👋 Goodbye!")
            break
       
        if user_query:
            try:
                response = enhanced_search_chain.invoke(user_query)
                print(f"\n🤖 Response:\n{response.content}")
            except Exception as e:
                print(f"❌ Error: {str(e)}")

Finally, we run the assistant as a script when the file is executed directly. It first calls test_search_chain() to validate the program against predefined queries, confirming that the setup works. It then enters an interactive mode that lets users ask their own questions and receive AI-generated answers enriched by live search results. The loop continues until the user types 'quit', 'exit', or 'bye', providing a simple, direct way to interact with the assistant.

In conclusion, we successfully created an enhanced AI assistant by combining LangChain's orchestration framework, Gemini 2.0 Flash's generative capabilities, and Jina Search's real-time web search. This hybrid approach demonstrates how AI models can extend their knowledge beyond static training data, providing users with timely, relevant information from reliable sources. You can extend this project by adding more tools, customizing the prompts, or deploying the assistant as an API or part of a broader application. This foundation opens up many opportunities for building powerful, intelligent retrieval-augmented systems.


Check out the notebook on GitHub. All credit for this research goes to the researchers of this project.


Asif Razzaq is the CEO of Marktechpost Media Inc. A visionary entrepreneur and engineer, Asif is committed to harnessing the potential of artificial intelligence for social good. His most recent endeavor is the launch of the AI media platform Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is technically sound yet easily understandable by a broad audience. The platform draws more than two million monthly views, illustrating its popularity among readers.
