Beginner's Guide to MCP with LangChain and Gemini

April 16, 2025

What is MCP?

MCP (Model Context Protocol) is an open standard that lets language models interact with external tools like web search, calculators, or databases in a structured and clean manner.

It acts as a bridge between the model and its tools, making communication modular and manageable.
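To make the "bridge" concrete: MCP communication is based on JSON-RPC 2.0 messages. The sketch below shows the rough shape of a `tools/call` request that a client sends when the model decides to use a tool (the tool name and arguments here are illustrative, not from the project):

```python
import json

# Illustrative shape of an MCP "tools/call" request. The agent-side client
# sends this to the MCP server once the model has picked a tool to use.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "web_search",  # hypothetical tool name
        "arguments": {"query": "current CEO of OpenAI"},
    },
}

# Serialized, this is what travels over the transport (e.g. stdio).
message = json.dumps(request)
```

The server runs the named tool and replies with a matching JSON-RPC response containing the result, which the agent then feeds back to the model.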

Why Use MCP in Agents?

  • Separates model logic from tool logic
  • Allows AI agents to decide when and how to use tools
  • Keeps the system modular, scalable, and easier to maintain
  • Works great with agent frameworks like LangChain

In short, MCP makes your AI agent smarter and more capable by letting it access external knowledge or functionality as needed.


Let's Understand This with a Simple Project

To see MCP in action, let's look at a practical example: the GitHub project mcp_langchain_examples.

  • Uses LangChain to create and manage the agent
  • Uses Gemini 2.0 Flash as the base AI model
  • Adds DuckDuckGo as an external web search tool
  • Uses MCP to connect the model and tools cleanly

Here's how the project is structured:

  • server.py - Implements the MCP server and tool (DuckDuckGo)
  • prompt.py - Contains system prompts to guide the AI
  • main.py - Runs the agent logic with Gemini and MCP tools
  • requirements.txt - Lists Python dependencies
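As a rough idea of what server.py might contain, here is a minimal sketch assuming the official `mcp` Python SDK (its FastMCP helper) and the `duckduckgo_search` package; the actual project's names and details may differ:

```python
# Hypothetical sketch of server.py: expose a DuckDuckGo search as an MCP tool.
# Assumes the `mcp` SDK and `duckduckgo_search` are installed.

def web_search(query: str, max_results: int = 5) -> str:
    """Search DuckDuckGo and return a few formatted result snippets."""
    from duckduckgo_search import DDGS  # assumed dependency
    with DDGS() as ddgs:
        results = ddgs.text(query, max_results=max_results)
    return "\n\n".join(f"{r['title']}: {r['body']}" for r in results)

def serve() -> None:
    """Register web_search as an MCP tool and serve it (stdio by default)."""
    from mcp.server.fastmcp import FastMCP  # assumed dependency
    mcp = FastMCP("search-server")
    mcp.tool()(web_search)  # same effect as decorating with @mcp.tool()
    mcp.run()               # blocks, handling requests from the client

# serve() would typically run under an `if __name__ == "__main__":` guard.
```

FastMCP reads the function's signature and docstring to describe the tool to clients, which is what lets the agent discover it automatically.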

Installation & Setup

Prerequisites

  • Python 3.8+
A Google API key for the Gemini model

Method 1: Using uv (recommended)

# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create env & install
uv venv
uv sync

Method 2: Using pip

# Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate

# Install requirements
pip install -r requirements.txt

Then, create a `.env` file in the root and add your Gemini API key:

GOOGLE_API_KEY=your_api_key_here
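A small sketch of how the app might pick up that key at startup, assuming the commonly used `python-dotenv` package (the helper name here is an assumption, not from the project):

```python
import os

def load_gemini_key() -> str:
    """Return GOOGLE_API_KEY, loading .env first if python-dotenv is available."""
    try:
        from dotenv import load_dotenv  # assumed optional dependency
        load_dotenv()                   # reads .env into os.environ
    except ImportError:
        pass
    key = os.environ.get("GOOGLE_API_KEY", "")
    if not key:
        raise RuntimeError("GOOGLE_API_KEY is not set; add it to your .env file")
    return key
```

Failing fast with a clear error here is friendlier than letting the Gemini client raise an opaque authentication error later.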

Running the Project

# Start the MCP Server
uv run ./src/server.py

# In another terminal, run the main app
uv run ./src/main.py

How It Works

  • User inputs a question
  • The agent checks the prompt and decides to use the search tool
  • The MCP server handles the DuckDuckGo search
  • Results are returned and formatted as a response

Example Interaction

Question: Who is the current CEO of OpenAI?
Answer: As of the latest search, the CEO of OpenAI is Sam Altman.

Conclusion

MCP brings a structured and scalable way to empower your AI agents. By combining it with LangChain and a capable model like Gemini, you can build smart, flexible, tool-using agents easily. Try out this project and start building your own powerful AI workflows!