Programmatic MCP with LLMs

In previous lessons, you used Claude Desktop as the client. It is great for testing and chat interactions.

However, developers often need to build automated systems where an LLM (Large Language Model) accesses tools without a human in the loop. For example, you might want a backend script that automatically runs a calculation or queries a database when a specific event happens.

To achieve this, you interact with the LLM via an API.

In this lesson, you will use the Anthropic API and its MCP Connector feature. This allows you to tell the LLM: “Here is the URL of an MCP server. If you need to answer my question, feel free to call any tools hosted there.”

Project Preparation

Let’s set up a new environment for this integration.

Open your terminal and navigate into the lesson_3 directory if you are not already there:

$ cd lesson_3

Initialize the project and install the required libraries. You need the mcp package for the server and anthropic for the client:

$ uv init
$ uv add mcp anthropic

Step 1: The HTTP Server

First, you need an MCP server. Since the Anthropic API is a cloud service, it needs to connect to your server over HTTP.

Create a file named mcp_server_http.py. This is a variation of the Dog Age Calculator you built earlier, configured to run over HTTP:

from mcp.server.fastmcp import FastMCP


mcp = FastMCP("Hello World MCP", json_response=True)

@mcp.tool()
def calculate_dog_in_human_year(dog_age: int) -> int:
    """
    Calculates a dog's age in human years based on the following rules:
    - 1st year: 15 human years
    - 2nd year: +9 human years (Total 24)
    - 3rd year onwards: +4 human years for every year
    """
    if dog_age < 0:
        raise ValueError("Age cannot be negative")

    if dog_age == 0:
        return 0
    elif dog_age <= 1:
        # 1 year old = 15 human years
        return dog_age * 15
    elif dog_age <= 2:
        # 2 years old = 24 human years
        # (15 for the first year + 9 for the second)
        return 15 + ((dog_age - 1) * 9)
    else:
        # 2+ years old = 24 + 4 years for every year after 2
        return 24 + ((dog_age - 2) * 4)

if __name__ == "__main__":
    mcp.run(transport="streamable-http")

Understanding the HTTP Server

This server is intentionally minimal so the LLM has one clear tool to discover:

  1. FastMCP setup: FastMCP("Hello World MCP", json_response=True) enables JSON responses that are easy for an API client to parse.
  2. Single tool: calculate_dog_in_human_year provides deterministic math, making it easy to verify the LLM used the tool correctly.
  3. Input validation: The negative-age guard prevents invalid inputs from producing misleading results.
  4. HTTP transport: mcp.run(transport="streamable-http") exposes the server over HTTP so the Anthropic API can reach it.
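Before wiring the server to an LLM, you can sanity-check the conversion rules in plain Python. The following is a quick standalone sketch (the helper name is illustrative, not part of the server code):

```python
def dog_to_human_years(dog_age: int) -> int:
    """Same rules as the server tool: 15 for year 1, +9 for year 2, +4 per year after."""
    if dog_age <= 0:
        return 0
    if dog_age == 1:
        return 15
    if dog_age == 2:
        return 24
    return 24 + (dog_age - 2) * 4

# Spot-check the rule boundaries
print(dog_to_human_years(1), dog_to_human_years(2), dog_to_human_years(3))  # 15 24 28
```

With the prompt used later in this lesson (a 3-year-old dog), you can therefore expect the tool call to return 28, which makes it easy to verify that Claude actually used the tool.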

Step 2: The LLM Client

Now, you will write the client script. This script does not calculate dog years itself. Instead, it sends a prompt to Claude and provides the connection details for your MCP server.

To run this, you will need an Anthropic API Key.

  1. Register at https://claude.ai/.
  2. Generate a key at https://platform.claude.com/settings/keys.

Important: Commercial API Access

The Anthropic API is a paid service. It is not free.

You must have active credits in your developer console. Note: API billing is often separate from the standard chat subscription. Ensure you have topped up your credits or confirmed that your specific plan covers API usage.

Create a file named anthropic_llm.py:

import os
import anthropic

url = os.environ.get('MCP_SERVER_URL')
api_key = os.environ.get('ANTHROPIC_API_KEY')

prompt = "My dog is 3 years old. If my dog were a human, how old would she be?"

client = anthropic.Anthropic(api_key=api_key)

response = client.beta.messages.create(
    model="claude-sonnet-4-5",
    max_tokens=2000,
    messages=[{"role": "user", "content": prompt}],
    mcp_servers=[
        {
            "type": "url",
            "url": url,
            "name": "dog-age-server",
        }
    ],
    tools=[
      {
        "type": "mcp_toolset",
        "mcp_server_name": "dog-age-server"
      }
    ],
    extra_headers={
        "anthropic-beta": "mcp-client-2025-11-20"
    }
)

for block in response.content:
    if block.type == "text":
        print(block.text)

    elif block.type == "mcp_tool_use":
        print(f"[System] Calling MCP Tool: '{block.name}'")
        print(f"[System] Parameters: {block.input}")

    elif block.type == "mcp_tool_result":
        result_text = " ".join([b.text for b in block.content if b.type == "text"])
        print(f"[System] Tool Result: {result_text}\n")

Understanding the Code

This code introduces the MCP Connector, a feature that connects Claude directly to your tools.

  1. mcp_servers: This array tells Anthropic where your server lives. You provide a url and give the server a name.
  2. tools: This configuration uses mcp_toolset. It tells Claude, “Take all the tools available on the dog-age-server and make them available for this conversation.”
  3. extra_headers: This is critical. The integration is currently in beta, and you must include the specific version header (mcp-client-2025-11-20) or the API call will fail.
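One practical refinement: the script reads both values from the environment, so a missing variable only surfaces later as a confusing authentication or connection error. A small fail-fast guard can catch that up front. This is a sketch; require_env is a name invented here, not part of the SDK:

```python
import os


def require_env(*names: str) -> dict[str, str]:
    """Return the requested environment variables, exiting early if any are unset."""
    missing = [name for name in names if not os.environ.get(name)]
    if missing:
        raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in names}


# Usage at the top of anthropic_llm.py:
# env = require_env("MCP_SERVER_URL", "ANTHROPIC_API_KEY")
# url, api_key = env["MCP_SERVER_URL"], env["ANTHROPIC_API_KEY"]
```

This way the script fails with a clear message before any network call is made.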

Important Note on Networking

Because you are calling the Anthropic API (which runs in the cloud), the API must be able to reach your MCP_SERVER_URL.

  • Production: You would deploy your mcp_server_http.py to a public server (like Cloudflare or AWS).
  • Local Development: Since localhost is not accessible from the cloud, you will typically use a tunneling service (like ngrok) to expose your local port 8000 to the internet.
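As a sketch of the local workflow, assuming ngrok and the Python SDK's default /mcp path (check your server's startup logs for the exact host, port, and path):

```shell
# Start a tunnel to the port your MCP server listens on (8000 by default).
# ngrok is one option; any HTTPS tunneling service works.
$ ngrok http 8000

# ngrok prints a public forwarding URL. Append the server's HTTP path
# (/mcp is the Python SDK's default for streamable HTTP) and export it:
$ export MCP_SERVER_URL="https://<your-tunnel-id>.ngrok-free.app/mcp"
```

The placeholder tunnel URL above is illustrative; use the forwarding URL ngrok actually prints.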

Seeing it in Action

You have now set up the architecture for a fully autonomous AI agent. The Python script asks a question, Claude analyzes it, recognizes it needs a tool, communicates with your MCP server to get the math done, and returns the final answer.

In the next section, you will watch a demo video showing exactly how to run these scripts and execute the full flow.
