Here's How I Built an MCP Server to Automate My Data Science Work


Photo by Ideogram
Most of my days as a data scientist look like this:
- Non-technical stakeholder: “Can you tell us how much revenue we made last month and what percentage came from search ads?”
- Me: Writes a SQL query to pull the data and sends it over.
- Non-technical stakeholder: “I see. What is our revenue forecast for the next three years?”
- Me: Combines data from multiple sources, talks to the finance team, and builds a forecasting model.
Tasks like these are ad hoc requests from business stakeholders. They take around 3–5 hours to complete and often conflict with the main projects I'm working on.
When data questions like these come in, they usually force me to push back deadlines on current projects or put in extra hours to get everything done. And that's where AI comes in.
When AI models like ChatGPT and Claude became available, my productivity improved, since I could use them to respond to ad hoc stakeholder requests. They drastically reduced the time I spent locating data, writing SQL queries, and chasing different teams for the information I needed. Then, once AI coding assistants became integrated with our codebase, the efficiency gains compounded. Tasks like the ones described above could be completed in half the time.
Recently, when MCP servers started gaining popularity, I thought:
Can I build an MCP server that automates this data science workflow?
I spent two days building this MCP server, and in this article, I will break down:
- The results and how much time I saved with my data science MCP
- The tools and components used to build the MCP
- The basic setup, APIs, and services I use in my workflow
Why I Created a Data Science MCP
If you don't know what MCP is, it stands for Model Context Protocol, and it is a framework that lets you connect a large language model to external services.
This video is a good introduction to MCPs.
// The main problem
The problem I wanted my new MCP server to solve was:
How do I combine information from different sources and produce results that stakeholders and team members can use directly?
To accomplish this, I built an MCP with three components, as shown in the flowchart below:


Image by Author | Mermaid
// Part 1: Query Bank Integration
As the foundation of my MCP's knowledge, I used my team's query bank (a collection of questions, a sample SQL query that answers each one, and some context about the relevant tables).
When a stakeholder asks me a question like this:
What percentage of marketplace revenue comes from search ads?
I no longer need to dig through table and column names to write a query. Instead, the MCP searches the bank for a similar question, retrieves context about which tables to query, and adapts the sample query to the question at hand. All I need to do is call the MCP server, paste in the stakeholder's request, and I get the right query within minutes.
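To make this concrete, here is a minimal sketch of how a query-bank lookup can work. The bank entries, the word-overlap scoring, and the function name are all illustrative assumptions, not the exact implementation behind my MCP:

```python
# A toy query bank: each entry stores a sample question, context on the
# tables involved, and the SQL that answers it. Contents are invented.
QUERY_BANK = [
    {
        "question": "What percentage of marketplace revenue comes from search ads?",
        "tables": "revenue_daily (revenue_usd, channel)",
        "sql": "SELECT ... FROM revenue_daily WHERE channel = 'search_ads'",
    },
    {
        "question": "How many impressions did video ads get last quarter?",
        "tables": "ad_impressions (impression_count, ad_type, quarter)",
        "sql": "SELECT SUM(impression_count) FROM ad_impressions WHERE ad_type = 'video'",
    },
]

def find_similar_query(request: str) -> dict:
    """Return the bank entry whose sample question best overlaps the request."""
    request_words = set(request.lower().split())

    def overlap(entry: dict) -> int:
        # Naive similarity: count shared words between the two questions.
        return len(request_words & set(entry["question"].lower().split()))

    return max(QUERY_BANK, key=overlap)

match = find_similar_query("What percent of revenue came from search ads?")
```

In practice the matching would use embeddings or the LLM itself rather than word overlap, but the flow is the same: find a similar past question, then hand its SQL and table context to the model to adapt.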
// Part 2: Google Drive Integration
Product documents are usually stored on Google Drive – whether it's a slide deck, a document, or a spreadsheet.
I connected my MCP server to our team's shared drive so it has access to all our documents across our many projects. This helps it pull data quickly and answer questions such as:
Can you tell us how much revenue we made last month?
I also indexed these documents by keywords and topics, so the MCP simply scans a keyword list built from document metadata rather than reading through hundreds of pages.
For example, if someone asks a question related to “video ads,” the MCP will first search the document index to identify the most relevant files before reading them.
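Here is a rough sketch of what such a keyword index can look like. The document names, keywords, and ranking logic below are hypothetical, simplified stand-ins for the real index:

```python
# Hypothetical keyword index mapping each Drive document to a keyword set
# extracted from its metadata (title, tags, headings).
DOC_INDEX = {
    "q3_video_campaign.pdf": {"q3", "video", "ads", "impressions"},
    "supply_demand_report.xlsx": {"supply", "demand", "ads", "inventory"},
    "annual_revenue_2024.docx": {"revenue", "forecast", "annual"},
}

def relevant_docs(question: str) -> list[str]:
    """Rank documents by how many of their keywords appear in the question."""
    words = set(question.lower().replace(",", " ").split())
    scored = [(len(keywords & words), name) for name, keywords in DOC_INDEX.items()]
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

docs = relevant_docs("How many impressions did our q3 video ads get?")
```

Only the top-ranked files are then opened and read in full, which keeps the context the model has to process small.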
// Part 3: Local Document Access
This is the simplest part of the MCP: I keep a local folder that the MCP searches. I add or delete files as needed, which lets me layer my own context, notes, and instructions on top of my team's project documents.
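As a sketch, the local search tool can be as simple as scanning the folder's text files for a term. The folder setup and file contents below are invented purely for illustration:

```python
from pathlib import Path
import tempfile

# Create a throwaway notes folder with two example files standing in for
# the real local folder the MCP searches.
notes_dir = Path(tempfile.mkdtemp())
(notes_dir / "q3_context.txt").write_text("Context for the Q3 video ads push.")
(notes_dir / "todo.txt").write_text("Follow up with the finance team.")

def search_local_notes(term: str, folder: Path = notes_dir) -> list[str]:
    """Return the names of text files in the folder that mention the term."""
    return sorted(
        p.name
        for p in folder.glob("*.txt")
        if term.lower() in p.read_text(encoding="utf-8").lower()
    )

hits = search_local_notes("video ads")
```

Because the folder is just files on disk, updating the MCP's context is as easy as dropping in or deleting a file.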
Summary: How My Data Science MCP Works
Here is an example of how my MCP currently works when responding to an ad hoc data request:
- A question comes in: “How many impressions did the video ad campaign we ran in Q3 get, and what was the supply and demand for the ad?”
- Document retrieval: the MCP searches the document index for keywords like “Q3,” “video ad,” “supply,” and “demand” to find the relevant project documents.
- It then pulls details about the Q3 video ad campaign, supply, and demand from those team documents.
- It searches the query bank for similar questions about ad impressions.
- Using the context retrieved from the documents and the query bank, it generates a SQL query for the Q3 video campaign.
- Finally, the query is passed to a separate MCP connected to Presto SQL, which runs it automatically.
- I then collect the results, review them, and send them to my stakeholders.
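The steps above can be sketched as a single pipeline. Every helper below is a stub standing in for the corresponding MCP tool, with invented document names and SQL, purely to show how the components hand off to each other:

```python
def retrieve_documents(question: str) -> list[str]:
    # Stub for document retrieval: search the keyword index, pull matches.
    return ["q3_video_campaign.pdf", "supply_demand_report.xlsx"]

def lookup_query_bank(question: str) -> str:
    # Stub for the query bank: find a similar question and adapt its SQL.
    return "SELECT SUM(impression_count) FROM ad_impressions WHERE quarter = 'Q3'"

def answer_ad_hoc_request(question: str) -> dict:
    """Gather document context, generate SQL, and package it for execution."""
    docs = retrieve_documents(question)
    sql = lookup_query_bank(question)
    # The real flow would now pass `sql` to the Presto-connected MCP to run;
    # that final hop is stubbed out here.
    return {"documents": docs, "sql": sql}

result = answer_ad_hoc_request("How many impressions did Q3 video ads get?")
```

The human step stays at the end: I still review the results before anything reaches a stakeholder.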
Setup Details
Here is how I set up this MCP:
// Step 1: Install Cursor
I used Cursor as my MCP client. You can install Cursor from this link. It is essentially an AI code editor that can access your codebase and use it to generate or modify code.
// Step 2: Google Drive Credentials
Almost all the documents used by the MCP (including the query bank) are stored on Google Drive.
To give your MCP access to Google Drive, Sheets, and Docs, you will need to set up API access:
- Go to the Google Cloud console and create a new project.
- Enable the following APIs: Google Drive, Google Sheets, Google Docs.
- Create credentials (an OAuth client ID) and save them in a file called credentials.json.
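The console steps above can then be wrapped in a small helper. This is only a sketch: it assumes the official google-auth-oauthlib and google-api-python-client packages and read-only scopes, and it reuses the get_google_services name that appears in the FastMCP snippet later in this article:

```python
# Scopes are illustrative; match them to the APIs you actually enabled.
SCOPES = [
    "https://www.googleapis.com/auth/drive.readonly",
    "https://www.googleapis.com/auth/spreadsheets.readonly",
    "https://www.googleapis.com/auth/documents.readonly",
]

def get_google_services():
    """Run the OAuth flow once and return Drive and Docs API clients."""
    # Imports are kept local so this module can still be loaded before
    # the Google client libraries are installed.
    from google_auth_oauthlib.flow import InstalledAppFlow
    from googleapiclient.discovery import build

    flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)
    creds = flow.run_local_server(port=0)
    drive = build("drive", "v3", credentials=creds)
    docs = build("docs", "v1", credentials=creds)
    return drive, docs
```

The first call opens a browser window for consent; in a real server you would also cache the resulting token so the flow only runs once.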
// Step 3: Set Up FastMCP
FastMCP is an open-source Python framework used to create MCP servers. I followed this tutorial to build my first MCP server using FastMCP.
(Note: The tutorial uses Claude Desktop as the MCP client, but the steps apply to Cursor or any AI code editor of your choice.)
With FastMCP, you can create an MCP server with Google Drive integration (sample snippet below):
from fastmcp import FastMCP

mcp = FastMCP("team-data-assistant")  # server name is illustrative

@mcp.tool()
def search_team_docs(query: str) -> str:
    """Search team documents in Google Drive"""
    drive_service, _ = get_google_services()
    # Your search logic here
    return f"Searching for: {query}"
// Step 4: Configure the MCP
Once your MCP server is built, you can connect it to Cursor. To do this, go to the Cursor settings window, where you will see a section for adding an MCP server. When you click it, a file called mcp.json opens, where you can enter the configuration for your new MCP server.
Here is an example of what your configuration should look like:
{
  "mcpServers": {
    "team-data-assistant": {
      "command": "python",
      "args": ["path/to/team_data_server.py"],
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "path/to/credentials.json"
      }
    }
  }
}
After saving your changes to the JSON file, you can enable the MCP and start using it inside Cursor.
Final Thoughts
This MCP server was a simple side project I decided to build to save time in my day-to-day work. It isn't perfect, but it solves a real pain point of mine: spending hours answering ad hoc data requests that pull me away from the key projects I work on. I believe tools like this barely scratch the surface of what generative AI makes possible, and they represent a broader shift in how data science work is done.
The traditional data science workflow of:
- spending hours gathering data
- writing code
- building models
is being automated away. The focus is shifting away from purely technical work, and data scientists are now expected to look at the bigger picture and solve business problems. In some cases, we are expected to weigh in on product decisions and step in as product or project managers.
As AI continues to evolve, I believe the lines between technical roles will blur. The ability to understand business context, ask the right questions, interpret results, and communicate them will matter more than ever. If you are a data scientist (or an aspiring one), the question is no longer whether AI will change your work.
You have two options: embrace AI tools and build solutions that multiply your team's impact, or let the change pass you by.
Natassha Selvaraj is a self-taught data scientist with a passion for writing. Natassha writes on everything data science related, a true master of all data topics. You can connect with her on LinkedIn or check out her YouTube channel.



