How to Learn AI for Data Analytics in 2025

Image by Editor | ChatGPT
Data analytics has changed. It is no longer enough to know tools like Python, SQL, and Excel to become a data analyst.
As a data scientist at a tech company, I see firsthand how AI is transforming the way we work. AI tools can now access and analyze your entire dataset and help you build data analytics, machine learning, and web applications in minutes.
If you are an aspiring data analyst and are not using these AI tools, you are missing out. Soon, you will be overtaken by other data analysts; people who are using AI to improve their workflows.
In this article, I will walk through AI tools that will help you stay ahead of the competition and 10x your productivity.
With these tools, you can:
- Build and deploy creative portfolio projects to get hired as a data analyst
- Use plain English to build end-to-end data analytics applications
- Speed up your data workflows and become a more efficient data analyst
Additionally, this article is a step-by-step guide on using AI tools to build data analytics applications. We will focus on two AI tools in particular – Cursor and Pandas AI.
For a video version of this article, watch:
https://www.youtube.com/watch?v=ukidskagai
AI Tool 1: Cursor
Cursor is an AI code editor that can access your entire codebase. You simply type a prompt into Cursor's chat, and it will access all the files in your directory and edit your code for you.
If you are just starting out and cannot write a single line of code, you can even begin with an empty code folder and ask Cursor to build something. The AI tool will follow your instructions and create code files according to your needs.
Here is a guide on how you can use Cursor to build a data analytics project without writing a single line of code.
Step 1: Cursor Installation and Setup
Let's look at how we can use Cursor AI for data analysis.
To install Cursor, just go to www.cursor.com, download the version compatible with your OS, follow the installation instructions, and you will be set up in seconds.
Here is what the Cursor interface looks like:

Cursor AI interface
To follow along with this tutorial, download the train.csv file from the Sentiment Analysis Dataset on Kaggle.
Then create a folder called “Sentiment Analysis Project” and move the downloaded train.csv file into it.
Finally, create an empty file named app.py. Your project folder should now look like this:

Sentiment analysis project folder
This will be our working directory.
Now, open this folder in Cursor via File -> Open Folder.
The right side of the screen has a chat interface where you can type prompts. Note that there are a few options here. Let's choose “Agent” in the dropdown.
This tells Cursor to scan your codebase and act as an AI assistant that can write code, and find and fix your errors.
Additionally, you can choose which language model you would like to use with Cursor (GPT-4o, Gemini-2.5-Pro, etc.). I suggest using Claude-4-Sonnet, a model well known for its advanced coding capabilities.
Step 2: Prompt Cursor to Build an Application
Let's now type this prompt into Cursor, asking it to build a sentiment analysis model using the training dataset in our codebase:
Create a sentiment analysis web app that:
1. Uses a pre-trained DistilBERT model to analyze the sentiment of text (positive, negative, or neutral)
2. Has a simple web interface where users can enter text and see results
3. Shows the sentiment result with appropriate colors (green for positive, red for negative)
4. Runs immediately without needing any training
Please connect all the files properly so that when I enter text and click analyze, it shows me the sentiment result right away.
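To give a feel for what Cursor produces from a prompt like this, here is a minimal sketch of such an app. This is an illustration only, not the code Cursor actually generates: the real app would load a pre-trained DistilBERT model (for example via the transformers library), while here a tiny word-list heuristic stands in for the model so the sketch stays dependency-free.

```python
# Sketch of a sentiment analysis web app, assuming a word-list heuristic
# as a stand-in for the pre-trained DistilBERT model the prompt asks for.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

POSITIVE = {"good", "great", "love", "excellent", "happy", "amazing"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "sad", "horrible"}

def analyze_sentiment(text: str) -> str:
    """Classify text as positive, negative, or neutral."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# Colors per the prompt's spec: green for positive, red for negative
COLORS = {"positive": "green", "negative": "red", "neutral": "gray"}

class SentimentHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Read the submitted text, classify it, and render the result
        query = parse_qs(urlparse(self.path).query)
        text = query.get("text", [""])[0]
        result = analyze_sentiment(text) if text else ""
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(
            f"<form><input name='text'><button>Analyze</button></form>"
            f"<p style='color:{COLORS.get(result, 'black')}'>{result}</p>".encode()
        )

# Uncomment to serve the app at http://localhost:8000
# HTTPServer(("localhost", 8000), SentimentHandler).serve_forever()
```

The point of the exercise is that Cursor wires up all of these pieces (model, interface, and styling) for you from the plain-English prompt alone.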
After typing this prompt into Cursor, it will automatically generate the code files needed to build the sentiment analysis application.
Step 3: Accept Changes and Run Commands
As Cursor creates new files and generates code, you need to click “Accept” to confirm the changes made by the AI agent.
After writing the code, Cursor may prompt you to run some commands in the terminal. Executing these commands will install the required dependencies and launch the web application.
Just click “Run” to allow Cursor to execute these commands:

Cursor run command
Once Cursor has built the application, it will tell you to copy and paste a link into your browser:

Cursor app link
Doing so will lead you to the sentiment analysis web app, which looks like this:

Sentiment analysis app interface
This is a fully functional web application, deployed locally, that users can interact with. You can paste any sentence into this app, and it will predict the sentiment, returning the result to you.
I find tools like Cursor to be incredibly powerful if you are new to the field and want to showcase your projects.
Most data professionals don't know frontend languages like HTML and CSS, which means we can't present our projects as interactive applications.
Our code usually lives in Kaggle notebooks, which makes it hard to stand out from the hundreds of applicants doing the same thing.
A tool like Cursor, however, can set you apart from the competition. It can help you turn your ideas into reality by building exactly what you tell it to.
AI Tool 2: Pandas AI
Pandas AI allows you to harness the power of Pandas to analyze DataFrames without writing any code.
You simply type prompts in plain English, which removes much of the complexity that comes with data preprocessing and EDA.
If you don't already know it, Pandas is a Python library you can use to analyze and manipulate data.
You read data into something known as a Pandas DataFrame, which allows you to perform operations on your data.
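For readers new to Pandas, here is a quick illustration of what a DataFrame is and the kind of operations it supports. The tiny table below is made up for the example; in practice you would load a real CSV file:

```python
import pandas as pd

# A tiny made-up dataset; in practice you would load a CSV file:
# df = pd.read_csv("train.csv")
df = pd.DataFrame({
    "Name": ["Alice", "Bob", "Carol"],
    "Age": [34, 29, 41],
    "Fare": [72.5, 8.05, 26.0],
})

mean_age = df["Age"].mean()    # summarize a column
under_40 = df[df["Age"] < 40]  # filter rows by a condition
print(mean_age)                # about 34.67
print(len(under_40))           # 2
```

These are exactly the kinds of operations Pandas AI will write for you from a plain-English prompt.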
Let's now walk through an example of how you can perform data preprocessing, manipulation, and analysis with Pandas AI.
For this demo, I will use the Titanic Survival Prediction dataset from Kaggle (download the train.csv file).
For this analysis, I suggest using a Python notebook environment, such as a Jupyter Notebook, a Kaggle Notebook, or Google Colab. The full code for this analysis can be found in this Kaggle notebook.
Step 1: Pandas AI Installation and Setup
Once you have your notebook environment ready, type the command below to install Pandas AI:
!pip install pandasai
Next, load the Titanic DataFrame with the following lines of code:
import pandas as pd
train_data = pd.read_csv('/kaggle/input/titanic/train.csv')
Now let's import the following libraries:
import os
from pandasai import SmartDataframe
from pandasai.llm.openai import OpenAI
Next, we must create a Pandas AI object to analyze the Titanic train dataset.
Here is what this means:
Pandas AI is a library that connects your Pandas DataFrame to a large language model. You can use Pandas AI to connect to GPT-4o, Claude-3.5, and other LLMs.
By default, Pandas AI uses a language model called Bamboo LLM. To connect Pandas AI to this language model, you can visit this website to get an API key.
Then, enter the API key into this code block to create a Pandas AI object:
# Set the PandasAI API key
# By default, unless you choose a different LLM, it will use BambooLLM.
# You can get your free API key by signing up at
os.environ['PANDASAI_API_KEY'] = 'your-pandasai-api-key' # Replace with your actual key
# Create SmartDataframe with default LLM (Bamboo)
smart_df = SmartDataframe(train_data)
Personally, I ran into some issues retrieving the Bamboo LLM API key. Because of this, I decided to get an API key from OpenAI instead. I then used the GPT-4o model for this analysis.
One caveat with this option is that OpenAI's API keys are not free. You must purchase OpenAI tokens to use these models.
To do this, head over to the OpenAI website and purchase tokens from the billing page. You can then go to the “API Keys” page and create your API key.
Now that you have your OpenAI API key, paste it into this block of code to connect GPT-4o to Pandas AI:
# Set your OpenAI API key
os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"
# Initialize OpenAI LLM
llm = OpenAI(api_token=os.environ["OPENAI_API_KEY"], model="gpt-4o")
config = {
"llm": llm,
"enable_cache": False,
"verbose": False,
"save_logs": True
}
# Create SmartDataframe with explicit configuration
smart_df = SmartDataframe(train_data, config=config)
We can now use this Pandas AI object to analyze the Titanic dataset.
Step 2: EDA and Data Preprocessing with Pandas AI
First, let's start with a simple prompt asking Pandas AI to describe this dataset:
smart_df.chat("Can you describe this dataset and provide a summary, format the output as a table.")
You should see an output like this, with a basic summary of the data:

Description of the Titanic dataset
Normally, we would have to write specific code to produce a summary like this. With Pandas AI, however, we just need to write a prompt.
This will save you a ton of time if you are new to data analysis and don't yet know how to write Python code.
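For comparison, the plain-pandas code that a summary prompt like this replaces would look roughly like the sketch below. The small table is a stand-in for the Titanic data, since the real file lives on Kaggle:

```python
import pandas as pd

# Small stand-in for the Titanic data; with the real file you would use
# train_data = pd.read_csv('/kaggle/input/titanic/train.csv')
train_data = pd.DataFrame({
    "Survived": [0, 1, 1, 0],
    "Age": [22.0, 38.0, 26.0, None],
    "Fare": [7.25, 71.28, 7.92, 8.05],
})

summary = train_data.describe()      # count, mean, std, min, quartiles, max
missing = train_data.isnull().sum()  # missing values per column
print(summary)
print(missing)
```

With Pandas AI, the prompt alone triggers this kind of code generation and formats the result for you.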
Next, let's do some exploratory data analysis with Pandas AI:
Here, I am asking it to give me the correlation between the “Survived” variable in the Titanic dataset and the other variables in the dataset:
smart_df.chat("Are there correlations between Survived and the following variables: Age, Sex, Ticket Fare. Format this output as a table.")
The above prompt should give you the correlation coefficients between “Survived” and the other variables in the dataset.
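Behind the scenes, the code Pandas AI generates for a correlation prompt like this is likely along the following lines. Note that the categorical Sex column must be encoded numerically first; the small table is a stand-in for the real Titanic data:

```python
import pandas as pd

# Small stand-in for the Titanic train data
train_data = pd.DataFrame({
    "Survived": [1, 0, 1, 0, 1, 0],
    "Age":      [4, 40, 28, 60, 22, 35],
    "Sex":      ["female", "male", "female", "male", "female", "male"],
    "Fare":     [30.0, 8.0, 80.0, 7.9, 26.0, 13.0],
})

# Sex is categorical, so encode it numerically before it can
# enter a correlation matrix
train_data["Sex_num"] = (train_data["Sex"] == "female").astype(int)

corr = train_data[["Survived", "Age", "Sex_num", "Fare"]].corr()
print(corr["Survived"])
```

Asking for the output “as a table,” as the prompt above does, is a handy trick to get a readable correlation matrix instead of a block of prose.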
Next, let's ask Pandas AI to help us visualize the relationships between these variables:
1. Survived vs Age
smart_df.chat("Can you visualize the relationship between the Survived and Age columns?")
The above prompt should give you a histogram that looks like this:

Titanic Dataset Age Distribution
This shows us that younger passengers were more likely to survive.
2. Survived vs Sex
smart_df.chat("Can you visualize the relationship between the Survived and Sex")
You should get a bar chart showing the relationship between “Survived” and “Sex.”
3. Survived vs Fare
smart_df.chat("Can you visualize the relationship between the Survived and Fare")
The above prompt generated a box plot, telling me that passengers who paid higher fares were more likely to survive the Titanic crash.
Note that LLMs are non-deterministic, which means the output you get may differ from mine. However, you will still get a visualization that helps you understand the dataset.
Next, we can do some data preprocessing with prompts like these:
Example prompt 1
smart_df.chat("Analyze the quality of this dataset. Identify missing values, outliers, and potential data issues that would need to be addressed before we build a model to predict survival.")
Example prompt 2
smart_df.chat("Let's drop the cabin column from the dataframe as it has too many missing values.")
Example prompt 3
smart_df.chat("Let's impute the Age column with the median value.")
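The three prompts above correspond to pandas code along these lines. The small table is a stand-in for the real Titanic data, so the sketch is self-contained:

```python
import pandas as pd

# Small stand-in for the Titanic train data
train_data = pd.DataFrame({
    "Survived": [0, 1, 1],
    "Age": [22.0, None, 26.0],
    "Cabin": [None, "C85", None],
})

# Prompt 1: inspect data quality (missing values per column)
print(train_data.isnull().sum())

# Prompt 2: drop the Cabin column, which has too many missing values
train_data = train_data.drop(columns=["Cabin"])

# Prompt 3: impute missing Age values with the median
median_age = train_data["Age"].median()  # median of [22.0, 26.0] -> 24.0
train_data["Age"] = train_data["Age"].fillna(median_age)
print(train_data)
```

Pandas AI writes and runs this kind of code for you; the prompts simply describe the outcome you want.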
If you would like to follow all the preprocessing steps I took to clean this dataset with Pandas AI, you can find the complete prompts and code in my Kaggle notebook.
In under 5 minutes, I was able to preprocess this dataset by handling missing values, transforming variables, and creating new features. All of this was done without writing a single line of Python code, which is especially useful if you are new to programming.
How to Learn AI for Data Analytics: Next Steps
In my opinion, the biggest selling point of tools like Cursor and Pandas AI is that they allow you to analyze data and edit code within your programming environment.
This is far better than copying and pasting code into an external interface like ChatGPT, which has no visibility into your codebase.
Additionally, as your codebase grows (i.e., thousands of lines of code and more than 10 files), it is useful to have an integrated AI tool that can understand the connections between these code files.
If you want to learn AI for data analytics, here are some other tools that I have found helpful:
- GitHub Copilot: This tool is similar to Cursor. You can use it within your IDE for code completion, and it has a chat interface you can interact with.
- Microsoft Copilot in Excel: This AI tool helps you analyze data in your spreadsheets.
- Python in Excel: This is an integration that allows you to run Python code within Excel. While not an AI tool, I have found it incredibly helpful, as it lets you preprocess data without switching between applications.
Natassha Selvaraj is a self-taught data scientist with a passion for writing. Natassha writes on everything data science-related and is a true master of all data topics. You can connect with her on LinkedIn or check out her YouTube channel.



