# MCP Client Configuration Examples

This document provides configuration examples for connecting various MCP clients to the `hf-eda-mcp` server.

## Table of Contents

- [Kiro IDE](#kiro-ide)
- [Claude Desktop](#claude-desktop)
- [Custom MCP Client](#custom-mcp-client)
- [Environment Variables](#environment-variables)
- [Deployment Options Comparison](#deployment-options-comparison)
- [Troubleshooting](#troubleshooting)
- [Additional Resources](#additional-resources)
## Kiro IDE

### Workspace Configuration

Create or edit `.kiro/settings/mcp.json` in your workspace:

```json
{
  "mcpServers": {
    "hf-eda-mcp": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-p", "7860:7860",
        "--env-file", ".env",
        "hf-eda-mcp:latest"
      ],
      "env": {
        "HF_TOKEN": "${HF_TOKEN}"
      },
      "disabled": false,
      "autoApprove": [
        "get_dataset_metadata",
        "get_dataset_sample",
        "analyze_dataset_features"
      ]
    }
  }
}
```

### User-Level Configuration

Edit `~/.kiro/settings/mcp.json` for a global configuration:

```json
{
  "mcpServers": {
    "hf-eda-mcp": {
      "command": "pdm",
      "args": ["run", "hf-eda-mcp"],
      "env": {
        "HF_TOKEN": "your_token_here"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
```

### Using HuggingFace Spaces

```json
{
  "mcpServers": {
    "hf-eda-mcp": {
      "url": "https://your-username-hf-eda-mcp.hf.space/gradio_api/mcp/sse",
      "disabled": false,
      "autoApprove": ["get_dataset_metadata"]
    }
  }
}
```

## Claude Desktop

### Configuration File Location

- **macOS:** `~/Library/Application Support/Claude/claude_desktop_config.json`
- **Windows:** `%APPDATA%\Claude\claude_desktop_config.json`
- **Linux:** `~/.config/Claude/claude_desktop_config.json`

### Local Server Configuration

```json
{
  "mcpServers": {
    "hf-eda-mcp": {
      "command": "python",
      "args": ["-m", "hf_eda_mcp"],
      "env": {
        "HF_TOKEN": "your_token_here",
        "PYTHONPATH": "/path/to/hf-eda-mcp/src"
      }
    }
  }
}
```

### Docker Configuration

```json
{
  "mcpServers": {
    "hf-eda-mcp": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-p", "7860:7860",
        "-e", "HF_TOKEN=your_token_here",
        "hf-eda-mcp:latest"
      ]
    }
  }
}
```

### HuggingFace Spaces Configuration

```json
{
  "mcpServers": {
    "hf-eda-mcp": {
      "url": "https://your-username-hf-eda-mcp.hf.space/gradio_api/mcp/sse"
    }
  }
}
```
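The Spaces URL follows a fixed pattern built from your username and the Space name. A small sketch of that pattern, assuming the default naming scheme shown in the placeholder above (`space_mcp_url` is a hypothetical helper, not part of this project):

```python
# Hypothetical helper that builds the MCP SSE endpoint for a Space,
# following the URL pattern used in the configuration above.
def space_mcp_url(username: str, space_name: str = "hf-eda-mcp") -> str:
    """Return the /gradio_api/mcp/sse endpoint for a HuggingFace Space."""
    return f"https://{username}-{space_name}.hf.space/gradio_api/mcp/sse"

print(space_mcp_url("your-username"))
# https://your-username-hf-eda-mcp.hf.space/gradio_api/mcp/sse
```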

## Custom MCP Client

### Python Client Example

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Connect to local server
    server_params = StdioServerParameters(
        command="python",
        args=["-m", "hf_eda_mcp"],
        env={"HF_TOKEN": "your_token_here"}
    )

    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Initialize the connection
            await session.initialize()

            # List available tools
            tools = await session.list_tools()
            print("Available tools:", tools)

            # Call a tool
            result = await session.call_tool(
                "get_dataset_metadata",
                arguments={"dataset_id": "squad"}
            )
            print("Result:", result)

if __name__ == "__main__":
    asyncio.run(main())
```

### JavaScript/TypeScript Client Example

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "python",
    args: ["-m", "hf_eda_mcp"],
    env: {
      HF_TOKEN: process.env.HF_TOKEN
    }
  });

  const client = new Client({
    name: "hf-eda-client",
    version: "1.0.0"
  }, {
    capabilities: {}
  });

  await client.connect(transport);

  // List tools
  const tools = await client.listTools();
  console.log("Available tools:", tools);

  // Call a tool
  const result = await client.callTool({
    name: "get_dataset_metadata",
    arguments: {
      dataset_id: "squad"
    }
  });
  console.log("Result:", result);

  await client.close();
}

main().catch(console.error);
```

## Environment Variables

### Required Variables

- `HF_TOKEN`: HuggingFace API token. Optional for public datasets; required for private or gated datasets.

### Optional Variables

- `HF_HOME`: Directory for the HuggingFace cache (default: `~/.cache/huggingface`)
- `HF_DATASETS_CACHE`: Directory for the datasets cache
- `TRANSFORMERS_CACHE`: Directory for the transformers cache
- `GRADIO_SERVER_NAME`: Server host (default: `0.0.0.0`)
- `GRADIO_SERVER_PORT`: Server port (default: `7860`)
- `MCP_SERVER_ENABLED`: Enable the MCP server (default: `true`)
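A server reading these variables would typically fall back to the documented defaults when a variable is unset. A minimal sketch of that pattern — the variable names and defaults come from the list above, but `load_server_config` is illustrative, not the project's actual code:

```python
import os

# Illustrative sketch: read the server settings listed above, falling
# back to the documented defaults when a variable is not set.
def load_server_config(environ=os.environ) -> dict:
    return {
        "server_name": environ.get("GRADIO_SERVER_NAME", "0.0.0.0"),
        "server_port": int(environ.get("GRADIO_SERVER_PORT", "7860")),
        "mcp_enabled": environ.get("MCP_SERVER_ENABLED", "true").lower() == "true",
        "hf_token": environ.get("HF_TOKEN"),  # None means anonymous access
    }

print(load_server_config({}))
```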

### Example .env File

```bash
# HuggingFace Authentication
HF_TOKEN=hf_xxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# Cache Configuration
HF_HOME=/path/to/cache
HF_DATASETS_CACHE=/path/to/cache/datasets
TRANSFORMERS_CACHE=/path/to/cache/transformers

# Server Configuration
GRADIO_SERVER_NAME=0.0.0.0
GRADIO_SERVER_PORT=7860
MCP_SERVER_ENABLED=true
```
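When you are not launching through Docker's `--env-file`, the same file can be loaded manually. A minimal sketch of a loader for the simple `KEY=VALUE` format above — in practice a library such as `python-dotenv` handles more edge cases:

```python
import os

def load_env_file(path: str, environ=os.environ) -> None:
    """Minimal .env loader: KEY=VALUE lines, '#' comments, blank lines.

    Existing variables are not overwritten, mirroring the usual
    precedence of real environment over file contents.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            environ.setdefault(key.strip(), value.strip())
```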

## Deployment Options Comparison

| Option      | Pros                   | Cons                  | Best For        |
|-------------|------------------------|-----------------------|-----------------|
| Local (PDM) | Fast, easy debugging   | Requires Python setup | Development     |
| Docker      | Isolated, reproducible | Requires Docker       | Production, CI/CD |
| HF Spaces   | Hosted, no maintenance | Limited control       | Public sharing  |

## Troubleshooting

### Connection Issues

1. **Server not starting:** Check the logs for errors and verify that all dependencies are installed.
2. **Authentication failed:** Verify that `HF_TOKEN` is set correctly.
3. **Port already in use:** Change `GRADIO_SERVER_PORT` to a different port.
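For the port conflict in particular, a quick programmatic check can confirm whether something is already listening on the default port (illustrative; `lsof -i :7860` or similar shell tools work just as well):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) == 0

if port_in_use(7860):
    print("Port 7860 is taken; set GRADIO_SERVER_PORT to a free port.")
```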

### Tool Execution Issues

1. **Dataset not found:** Verify that the dataset ID exists on the HuggingFace Hub.
2. **Permission denied:** Ensure `HF_TOKEN` has access to the private dataset.
3. **Timeout errors:** Increase the timeout settings or request smaller sample sizes.

### Docker Issues

1. **Image build fails:** Ensure all dependencies in `pyproject.toml` are compatible.
2. **Container exits immediately:** Check the logs with `docker logs hf-eda-mcp-server`.
3. **Cache not persisting:** Verify the volume mounts in `docker-compose.yml`.

## Additional Resources