Metadata-Version: 2.4
Name: iflow-mcp_vincentf305_mcp-server-ollama
Version: 0.1.0
Summary: A Model Context Protocol server that allows Claude Desktop to communicate with an Ollama LLM server
Requires-Python: >=3.11
Requires-Dist: aiohttp>=3.9.1
Requires-Dist: fastapi>=0.109.0
Requires-Dist: httpx>=0.25.2
Requires-Dist: mcp>=0.9.1
Requires-Dist: pydantic-settings>=2.7.1
Requires-Dist: pydantic>=2.10.6
Requires-Dist: pytest-asyncio>=0.23.3
Requires-Dist: pytest>=7.4.4
Requires-Dist: python-dotenv>=1.0.0
Requires-Dist: requests>=2.31.0
Requires-Dist: uvicorn>=0.27.0
Description-Content-Type: text/markdown

# MCP Server for Ollama

A Model Context Protocol (MCP) server that allows Claude Desktop to communicate with an Ollama LLM server.

## Setup

1. Clone the repository
2. Copy `.env.example` to `.env` and configure as needed
3. Install dependencies: `pip install -r requirements.txt`
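The steps above might look like the following in a shell (the repository URL is a placeholder — substitute the actual location of this repo):

```shell
# Placeholder URL — replace with the real repository location
git clone https://github.com/vincentf305/mcp-server-ollama.git
cd mcp-server-ollama

# Create your local configuration, then edit .env as needed
cp .env.example .env

# Install the Python dependencies
pip install -r requirements.txt
```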

### Using with Claude Desktop

Add the following to your `claude_desktop_config.json` file, replacing `path-to-mcp-server` with the path to this repository:

```json
{
  "mcpServers": {
    "ollama-server": {
      "command": "python",
      "args": ["-m", "src.mcp_server.server"],
      "env": {
        "PYTHONPATH": "path-to-mcp-server"
      }
    }
  }
}
```
