Run a self-hosted MCP server
This page describes how to run your own local Data Commons MCP server and connect to it from an agent. This is useful for advanced use cases, such as developing your own custom AI agent/client to use with Data Commons.
For procedures specific to Custom Data Commons instances, see Run MCP tools instead.
We provide procedures for the following scenarios:
- Local server and local agent: The agent spawns the server in a subprocess using Stdio as the transport protocol.
- Remote server and local agent: You start up the server as a standalone process and then connect the agent to it using streamable HTTP as the transport protocol.
For both scenarios, we use Gemini CLI and the sample agent as examples. You should be able to adapt the configurations to other MCP-compliant agents/clients.
Tip: For an end-to-end tutorial using a locally running server and agent over HTTP, see the sample Data Commons Colab notebook, Try Data Commons MCP Tools with a Custom Agent.
Prerequisites
In addition to a Data Commons API key, you will need the following:
- Install `uv` for managing and installing Python packages; see the instructions at https://docs.astral.sh/uv/getting-started/installation/.
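The configurations on this page read the API key from the DC_API_KEY environment variable when it is not supplied inline. If you want to confirm the key is visible to your session before starting an agent, a minimal sketch (assuming you exported it as DC_API_KEY) is:

```python
# Minimal sketch: confirm the Data Commons API key is exported before launching
# an agent. Assumes the key is stored in the DC_API_KEY environment variable,
# as in the configurations later on this page.
import os

if not os.environ.get("DC_API_KEY"):
    raise SystemExit("DC_API_KEY is not set; export it before starting the agent.")
print("DC_API_KEY is set.")
```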
Run a local server and agent
Gemini CLI
To instruct Gemini CLI to start up a local server using Stdio, replace the datacommons-mcp section in your settings.json file as follows:
{
  // ...
  "mcpServers": {
    "datacommons-mcp": {
      "command": "uvx",
      "args": [
        "datacommons-mcp@latest",
        "serve",
        "stdio"
      ],
      // Only needed if you have not set the key in your environment
      "env": {
        "DC_API_KEY": "YOUR_DC_API_KEY"
      }
    }
  }
  // ...
}
Run Gemini CLI as usual.
Sample agent
To instruct the sample agent to spawn a local server that uses the Stdio protocol, modify basic_agent/agent.py to import the required modules and set the agent initialization parameters as follows:
from google.adk.tools.mcp_tool.mcp_toolset import (
    McpToolset,
    StdioConnectionParams,
    StdioServerParameters,
)

# ...

root_agent = LlmAgent(
    model=AGENT_MODEL,
    name="basic_agent",
    instruction=AGENT_INSTRUCTIONS,
    tools=[
        McpToolset(
            connection_params=StdioConnectionParams(
                timeout=10,
                server_params=StdioServerParameters(
                    command="uvx",
                    args=["datacommons-mcp", "serve", "stdio"],
                    env={"DC_API_KEY": DC_API_KEY},
                ),
            ),
        ),
    ],
)
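The snippet above forwards DC_API_KEY to the spawned server process. If your copy of basic_agent/agent.py does not already define DC_API_KEY (alongside AGENT_MODEL and AGENT_INSTRUCTIONS), a minimal, hypothetical way to read it from the environment is:

```python
# Hypothetical helper for basic_agent/agent.py: read the API key from the
# environment so it can be forwarded to the spawned server process.
# The variable name and error handling are assumptions; adapt them to your agent code.
import os

DC_API_KEY = os.environ.get("DC_API_KEY", "")
if not DC_API_KEY:
    raise ValueError("Set the DC_API_KEY environment variable before starting the agent.")
```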
Run the startup commands as usual.
Run a remote server and a local agent
Step 1: Start the server as a standalone process
- Be sure to set the API key as an environment variable.
- Run:
uvx datacommons-mcp serve http [--host HOSTNAME] [--port PORT]
If you don't set these flags explicitly, the host defaults to `localhost` and the port to `8080`.
The server is addressable at the /mcp endpoint; for example, http://my-mcp-server:8080/mcp.
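To sanity-check that the standalone server is reachable before wiring up an agent, you can list its tools with a minimal MCP client. The sketch below assumes the MCP Python SDK (`pip install mcp`) and a server running on the default host and port; adjust the URL, and the client module path if your SDK version differs:

```python
# Minimal sketch: connect to the standalone server over streamable HTTP and
# list the tools it exposes. Assumes the MCP Python SDK (`pip install mcp`)
# and a server running at http://localhost:8080/mcp.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

SERVER_URL = "http://localhost:8080/mcp"  # replace host/port as needed


async def main():
    async with streamablehttp_client(SERVER_URL) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```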
Step 2: Configure an agent to connect to the running server
Gemini CLI
Replace the datacommons-mcp section in your settings.json file as follows:
{
  "mcpServers": {
    "datacommons-mcp": {
      "httpUrl": "http://HOST:PORT/mcp",
      "headers": {
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
        // If you have set the key in your environment
        "X-API-Key": "$DC_API_KEY"
        // If you have not set the key in your environment, use the literal key instead:
        // "X-API-Key": "YOUR_DC_API_KEY"
      }
    }
  }
}
Run Gemini CLI as usual.
Sample agent
Modify basic_agent/agent.py as follows:
from google.adk.tools.mcp_tool.mcp_toolset import (
    McpToolset,
    StreamableHTTPConnectionParams,
)

# ...

root_agent = LlmAgent(
    # ...
    tools=[
        McpToolset(
            connection_params=StreamableHTTPConnectionParams(
                url="http://HOST:PORT/mcp",
                headers={
                    "Content-Type": "application/json",
                    "Accept": "application/json, text/event-stream",
                },
            ),
        ),
    ],
)
Run the startup commands as usual.
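If the server process itself does not have the API key set, you may be able to forward it from the agent in an X-API-Key header, mirroring the Gemini CLI configuration above. This is a hypothetical variant to verify against your server version:

```python
# Hypothetical variant: forward the API key from the agent's environment via an
# X-API-Key header, mirroring the Gemini CLI configuration above. Whether your
# server version honors this header is an assumption to verify.
import os

connection_params = StreamableHTTPConnectionParams(
    url="http://HOST:PORT/mcp",
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
        "X-API-Key": os.environ["DC_API_KEY"],  # assumes DC_API_KEY is exported
    },
)
```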