A Coding Guide to Building Automated, Insight-Driven Competitive Intelligence with ScrapeGraph and Gemini AI

In this tutorial, we show how to combine ScrapeGraph's powerful scraping tools with Gemini AI to automate the collection, parsing, and analysis of competitor information. Using ScrapeGraph's SmartScraperTool and MarkdownifyTool, we can extract detailed data on product offerings, pricing, technology stacks, and market presence from competitor websites. The tutorial then uses Gemini's advanced language model to consolidate these disparate data points into structured, actionable intelligence. Throughout the process, ScrapeGraph keeps the raw extraction accurate and scalable, allowing analysts to focus on strategy rather than data wrangling.
%pip install --quiet -U langchain-scrapegraph langchain-google-genai pandas matplotlib seaborn
We install or upgrade to the latest versions of the key libraries, including langchain-scrapegraph for web scraping, langchain-google-genai for Gemini access, and pandas, matplotlib, and seaborn for data handling and visualization.
import getpass
import os
import json
import pandas as pd
from typing import List, Dict, Any
from datetime import datetime
import matplotlib.pyplot as plt
import seaborn as sns
We import the core building blocks of the pipeline: getpass and os manage credentials and environment variables, json handles serialized data, and pandas provides powerful tabular data operations. The typing module supplies type hints for more readable code, while datetime records analysis timestamps. Finally, matplotlib.pyplot and seaborn equip us with tools to visualize the results.
if not os.environ.get("SGAI_API_KEY"):
    os.environ["SGAI_API_KEY"] = getpass.getpass("ScrapeGraph AI API key:\n")
if not os.environ.get("GOOGLE_API_KEY"):
    os.environ["GOOGLE_API_KEY"] = getpass.getpass("Google API key for Gemini:\n")
We check whether the SGAI_API_KEY and GOOGLE_API_KEY environment variables are already set; if not, the script prompts the user for their ScrapeGraph and Google (Gemini) API keys via getpass and stores them in the environment for subsequent authenticated calls.
from langchain_scrapegraph.tools import (
    SmartScraperTool,
    SearchScraperTool,
    MarkdownifyTool,
    GetCreditsTool,
)
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableConfig, chain
from langchain_core.output_parsers import JsonOutputParser

smartscraper = SmartScraperTool()
searchscraper = SearchScraperTool()
markdownify = MarkdownifyTool()
credits = GetCreditsTool()

llm = ChatGoogleGenerativeAI(
    model="gemini-1.5-flash",
    temperature=0.1,
    convert_system_message_to_human=True
)
Here, we instantiate ScrapeGraph's SmartScraperTool, SearchScraperTool, MarkdownifyTool, and GetCreditsTool, and configure ChatGoogleGenerativeAI with the gemini-1.5-flash model (low temperature, system messages converted to human messages) to drive our analysis. We also bring in ChatPromptTemplate, RunnableConfig, chain, and JsonOutputParser from langchain_core to structure prompts and parse model outputs.
class CompetitiveAnalyzer:
    def __init__(self):
        self.results = []
        self.analysis_timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")

    def scrape_competitor_data(self, url: str, company_name: str = None) -> Dict[str, Any]:
        """Scrape comprehensive data from a competitor website"""
        extraction_prompt = """
        Extract the following information from this website:
        1. Company name and tagline
        2. Main products/services offered
        3. Pricing information (if available)
        4. Target audience/market
        5. Key features and benefits highlighted
        6. Technology stack mentioned
        7. Contact information
        8. Social media presence
        9. Recent news or announcements
        10. Team size indicators
        11. Funding information (if mentioned)
        12. Customer testimonials or case studies
        13. Partnership information
        14. Geographic presence/markets served

        Return the information in a structured JSON format with clear categorization.
        If information is not available, mark as 'Not Available'.
        """
        try:
            result = smartscraper.invoke({
                "user_prompt": extraction_prompt,
                "website_url": url,
            })
            markdown_content = markdownify.invoke({"website_url": url})
            competitor_data = {
                "company_name": company_name or "Unknown",
                "url": url,
                "scraped_data": result,
                "markdown_length": len(markdown_content),
                "analysis_date": self.analysis_timestamp,
                "success": True,
                "error": None
            }
            return competitor_data
        except Exception as e:
            return {
                "company_name": company_name or "Unknown",
                "url": url,
                "scraped_data": None,
                "error": str(e),
                "success": False,
                "analysis_date": self.analysis_timestamp
            }
    def analyze_competitor_landscape(self, competitors: List[Dict[str, str]]) -> Dict[str, Any]:
        """Analyze multiple competitors and generate insights"""
        print(f"🔍 Starting competitive analysis for {len(competitors)} companies...")
        for i, competitor in enumerate(competitors, 1):
            print(f"📊 Analyzing {competitor['name']} ({i}/{len(competitors)})...")
            data = self.scrape_competitor_data(
                competitor['url'],
                competitor['name']
            )
            self.results.append(data)

        analysis_prompt = ChatPromptTemplate.from_messages([
            ("system", """
            You are a senior business analyst specializing in competitive intelligence.
            Analyze the scraped competitor data and provide comprehensive insights including:
            1. Market positioning analysis
            2. Pricing strategy comparison
            3. Feature gap analysis
            4. Target audience overlap
            5. Technology differentiation
            6. Market opportunities
            7. Competitive threats
            8. Strategic recommendations

            Provide actionable insights in JSON format with clear categories and recommendations.
            """),
            ("human", "Analyze this competitive data: {competitor_data}")
        ])

        clean_data = []
        for result in self.results:
            if result['success']:
                clean_data.append({
                    'company': result['company_name'],
                    'url': result['url'],
                    'data': result['scraped_data']
                })

        analysis_chain = analysis_prompt | llm | JsonOutputParser()
        try:
            competitive_analysis = analysis_chain.invoke({
                "competitor_data": json.dumps(clean_data, indent=2)
            })
        except Exception:
            # Fall back to raw model text if the output is not valid JSON
            analysis_chain_text = analysis_prompt | llm
            competitive_analysis = analysis_chain_text.invoke({
                "competitor_data": json.dumps(clean_data, indent=2)
            })

        return {
            "analysis": competitive_analysis,
            "raw_data": self.results,
            "summary_stats": self.generate_summary_stats()
        }
    def generate_summary_stats(self) -> Dict[str, Any]:
        """Generate summary statistics from the analysis"""
        successful_scrapes = sum(1 for r in self.results if r['success'])
        failed_scrapes = len(self.results) - successful_scrapes
        return {
            "total_companies_analyzed": len(self.results),
            "successful_scrapes": successful_scrapes,
            "failed_scrapes": failed_scrapes,
            "success_rate": f"{(successful_scrapes/len(self.results)*100):.1f}%" if self.results else "0%",
            "analysis_timestamp": self.analysis_timestamp
        }
    def export_results(self, filename: str = None):
        """Export results to JSON and CSV files"""
        if not filename:
            filename = f"competitive_analysis_{datetime.now().strftime('%Y%m%d_%H%M%S')}"

        with open(f"{filename}.json", 'w') as f:
            json.dump({
                "results": self.results,
                "summary": self.generate_summary_stats()
            }, f, indent=2)

        df_data = []
        for result in self.results:
            if result['success']:
                df_data.append({
                    'Company': result['company_name'],
                    'URL': result['url'],
                    'Success': result['success'],
                    'Data_Length': len(str(result['scraped_data'])) if result['scraped_data'] else 0,
                    'Analysis_Date': result['analysis_date']
                })

        if df_data:
            df = pd.DataFrame(df_data)
            df.to_csv(f"{filename}.csv", index=False)

        print(f"✅ Results exported to {filename}.json and {filename}.csv")
The CompetitiveAnalyzer class orchestrates the full research workflow: scraping competitor websites with ScrapeGraph, consolidating and cleaning the results, and analyzing them with Gemini AI. It also tracks success rates and timestamps, and provides utility methods to export the raw and summarized findings to both JSON and CSV.
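The success-rate bookkeeping in generate_summary_stats is easy to illustrate in isolation. The sketch below uses hypothetical mock records (not real scrape results) to show the same computation, including the guard against an empty result list:

```python
def summarize(results):
    # Mirror of generate_summary_stats: count successes and failures,
    # then format a percentage success rate, guarding against division by zero.
    successful = sum(1 for r in results if r["success"])
    failed = len(results) - successful
    return {
        "total_companies_analyzed": len(results),
        "successful_scrapes": successful,
        "failed_scrapes": failed,
        "success_rate": f"{successful / len(results) * 100:.1f}%" if results else "0%",
    }

# Hypothetical scrape outcomes standing in for real results
mock = [
    {"company_name": "A", "success": True},
    {"company_name": "B", "success": True},
    {"company_name": "C", "success": False},
]
print(summarize(mock)["success_rate"])  # 66.7%
```

The ternary guard matters: a batch where every scrape failed still produces a well-formed "0%" string instead of raising ZeroDivisionError.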
def run_ai_saas_analysis():
    """Run a comprehensive analysis of AI/SaaS competitors"""
    analyzer = CompetitiveAnalyzer()
    ai_saas_competitors = [
        {"name": "OpenAI", "url": "https://openai.com"},
        {"name": "Anthropic", "url": "https://www.anthropic.com"},
        {"name": "Hugging Face", "url": "https://huggingface.co"},
        {"name": "Cohere", "url": "https://cohere.com"},
        {"name": "Scale AI", "url": "https://scale.com"},
    ]
    results = analyzer.analyze_competitor_landscape(ai_saas_competitors)

    print("\n" + "="*80)
    print("🎯 COMPETITIVE ANALYSIS RESULTS")
    print("="*80)

    print("\n📊 Summary Statistics:")
    stats = results['summary_stats']
    for key, value in stats.items():
        print(f"   {key.replace('_', ' ').title()}: {value}")

    print("\n🔍 Strategic Analysis:")
    if isinstance(results['analysis'], dict):
        for section, content in results['analysis'].items():
            print(f"\n   {section.replace('_', ' ').title()}:")
            if isinstance(content, list):
                for item in content:
                    print(f"   • {item}")
            else:
                print(f"   {content}")
    else:
        print(results['analysis'])

    analyzer.export_results("ai_saas_competitive_analysis")
    return results
The function above kicks off a competitive analysis by instantiating CompetitiveAnalyzer and defining the key AI/SaaS players to be scraped. It then runs the full workflow, prints summary statistics and strategic insights, and finally exports the detailed results to JSON and CSV for further use.
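The report-printing loop above has to handle both dict-valued and list-valued sections of the model's analysis. A minimal sketch of that rendering logic, applied to a hypothetical analysis payload shaped like a JsonOutputParser result:

```python
def render_analysis(analysis):
    # Build the same report lines run_ai_saas_analysis prints:
    # title-cased section names from dict keys, bullets for list values,
    # plain indented text for scalar values.
    lines = []
    for section, content in analysis.items():
        lines.append(section.replace("_", " ").title() + ":")
        if isinstance(content, list):
            lines.extend(f"  • {item}" for item in content)
        else:
            lines.append(f"  {content}")
    return "\n".join(lines)

# Hypothetical payload; real runs get this from the Gemini analysis chain
sample = {
    "market_positioning": "Broad developer-platform focus",
    "competitive_threats": ["Pricing pressure", "Model commoditization"],
}
print(render_analysis(sample))
```

Returning a string instead of printing directly makes the rendering trivially testable and reusable for file export.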
def run_ecommerce_analysis():
    """Analyze e-commerce platform competitors"""
    analyzer = CompetitiveAnalyzer()
    ecommerce_competitors = [
        {"name": "Shopify", "url": "https://www.shopify.com"},
        {"name": "WooCommerce", "url": "https://woocommerce.com"},
        {"name": "BigCommerce", "url": "https://www.bigcommerce.com"},
        {"name": "Magento", "url": "https://magento.com"},
    ]
    results = analyzer.analyze_competitor_landscape(ecommerce_competitors)
    analyzer.export_results("ecommerce_competitive_analysis")
    return results
The function above analyzes leading e-commerce platforms by scraping information from each site and exports the findings to both JSON and CSV under the name "ecommerce_competitive_analysis".
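export_results produces two artifacts: a JSON dump of the full nested records and a flat CSV of one row per successful scrape. The simplified stand-in below uses only the standard library (json and csv in place of pandas; the file paths and mock record are illustrative) to show the two output shapes:

```python
import csv
import json
import os
import tempfile

# Mock scrape record with the same keys export_results consumes (hypothetical data)
results = [
    {"company_name": "ExampleCo", "url": "https://example.com",
     "success": True, "scraped_data": {"tagline": "demo"},
     "analysis_date": "2024-01-01 00:00:00"},
]

outdir = tempfile.mkdtemp()
json_path = os.path.join(outdir, "analysis.json")
csv_path = os.path.join(outdir, "analysis.csv")

# JSON: full nested records, like the json.dump call in export_results
with open(json_path, "w") as f:
    json.dump({"results": results}, f, indent=2)

# CSV: one flat row per successful scrape, nested data reduced to a length
with open(csv_path, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["Company", "URL", "Data_Length"])
    writer.writeheader()
    for r in results:
        if r["success"]:
            writer.writerow({"Company": r["company_name"], "URL": r["url"],
                             "Data_Length": len(str(r["scraped_data"]))})

print(open(csv_path).read().splitlines()[0])  # Company,URL,Data_Length
```

The JSON file preserves everything for later reprocessing, while the CSV gives analysts a spreadsheet-friendly summary.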
@chain
def social_media_monitoring_chain(company_urls: List[str], config: RunnableConfig):
    """Monitor social media presence and engagement strategies of competitors"""
    social_media_prompt = ChatPromptTemplate.from_messages([
        ("system", """
        You are a social media strategist. Analyze the social media presence and strategies
        of these companies. Focus on:
        1. Platform presence (LinkedIn, Twitter, Instagram, etc.)
        2. Content strategy patterns
        3. Engagement tactics
        4. Community building approaches
        5. Brand voice and messaging
        6. Posting frequency and timing

        Provide actionable insights for improving social media strategy.
        """),
        ("human", "Analyze social media data for: {urls}")
    ])

    social_data = []
    for url in company_urls:
        try:
            result = smartscraper.invoke({
                "user_prompt": "Extract all social media links, community engagement features, and social proof elements",
                "website_url": url,
            })
            social_data.append({"url": url, "social_data": result})
        except Exception as e:
            social_data.append({"url": url, "error": str(e)})

    analysis_chain = social_media_prompt | llm
    analysis = analysis_chain.invoke({"urls": json.dumps(social_data, indent=2)}, config=config)

    return {
        "social_analysis": analysis,
        "raw_social_data": social_data
    }
Here, the decorated chain defines a pipeline that scrapes social-media-related data from each competitor URL, feeds the consolidated results to Gemini for strategic analysis, and ultimately returns both the raw data and the generated insights from a single structured call.
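Note that the per-URL loop never lets one failed scrape abort the whole batch: each URL contributes either a data record or an error record. That accumulation pattern can be sketched with a stubbed scraper (the stub function and URLs below are hypothetical, standing in for smartscraper.invoke):

```python
def collect(urls, scrape):
    # scrape: a callable that returns data or raises, mirroring smartscraper.invoke.
    # Every URL yields a record; failures are captured instead of propagated.
    records = []
    for url in urls:
        try:
            records.append({"url": url, "social_data": scrape(url)})
        except Exception as e:
            records.append({"url": url, "error": str(e)})
    return records

def stub_scrape(url):
    # Hypothetical stand-in for the real scraper tool
    if "bad" in url:
        raise RuntimeError("timeout")
    return {"links": ["https://twitter.com/example"]}

records = collect(["https://good.example", "https://bad.example"], stub_scrape)
print([("error" in r) for r in records])  # [False, True]
```

Keeping error records in the same list means the downstream LLM prompt still sees which sites failed, which can itself be useful signal.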
def check_credits():
    """Check available credits"""
    try:
        credits_info = credits.invoke({})
        print(f"💳 Available Credits: {credits_info}")
        return credits_info
    except Exception as e:
        print(f"⚠️ Could not check credits: {e}")
        return None
The function above calls GetCreditsTool to retrieve and display your available ScrapeGraph API credits, printing the result (or a warning if the lookup fails) and returning None on error.
if __name__ == "__main__":
    print("🚀 Advanced Competitive Analysis Tool with Gemini AI")
    print("="*60)

    check_credits()

    print("\n🤖 Running AI/SaaS Competitive Analysis...")
    ai_results = run_ai_saas_analysis()

    run_additional = input("\n❓ Run e-commerce analysis as well? (y/n): ").lower().strip()
    if run_additional == 'y':
        print("\n🛒 Running E-commerce Platform Analysis...")
        ecom_results = run_ecommerce_analysis()

    print("\n✨ Analysis complete! Check the exported files for detailed results.")
Finally, the last block serves as the entry point: it prints a header, checks the available API credits, kicks off the AI/SaaS analysis, and optionally runs the e-commerce analysis before confirming that all results have been exported.
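The interactive input() prompt makes the script awkward to run unattended. One common alternative is a command-line flag; the sketch below uses argparse (the flag name is an assumption for illustration, not part of the original script):

```python
import argparse

def parse_args(argv=None):
    # --with-ecommerce replaces the interactive "y/n" prompt, so the
    # pipeline can run unattended (e.g. from cron or CI).
    parser = argparse.ArgumentParser(description="Competitive analysis runner")
    parser.add_argument("--with-ecommerce", action="store_true",
                        help="also run the e-commerce platform analysis")
    return parser.parse_args(argv)

args = parse_args(["--with-ecommerce"])
print(args.with_ecommerce)  # True
```

In the entry point, `if run_additional == 'y':` would then become `if args.with_ecommerce:`.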
In conclusion, combining ScrapeGraph's scraping capabilities with Gemini AI transforms what is traditionally a time-consuming competitive analysis exercise into an efficient, repeatable pipeline. ScrapeGraph handles the heavy lifting of fetching and normalizing web data, while Gemini's reasoning turns the raw information into high-level strategic recommendations. As a result, businesses can rapidly assess market positioning, identify feature gaps, and surface growth opportunities with minimal manual effort. By adapting these steps, users gain speed and consistency, along with the flexibility to extend the analysis to new competitors or markets as needed.
Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of artificial intelligence for social good. His most recent endeavor is an AI media platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news in a form that is both technically sound and easily understandable. The platform draws more than 2 million monthly views, illustrating its popularity among audiences.



