FCHub.co

Deployment

Run FluentCart MCP on a VPS with Docker, because apparently localhost wasn't enough for you.

The stdio transport works brilliantly for local AI clients — Claude Desktop, Cursor, the usual suspects. But if you want ChatGPT, remote agents, or just the smug satisfaction of a public endpoint, you'll need the HTTP transport running on a server somewhere.

Docker Image

The image is on Docker Hub and GitHub Container Registry. Multi-architecture (amd64 + arm64), so it runs on basically anything with electricity.

docker run -d \
  -p 3000:3000 \
  -e FLUENTCART_URL=https://your-store.com \
  -e FLUENTCART_USERNAME=admin \
  -e FLUENTCART_APP_PASSWORD="aBcD eFgH iJkL mNoP" \
  -e FLUENTCART_MCP_API_KEY=your-secret-key \
  --name fluentcart-mcp \
  vcodesh/fluentcart-mcp

The container exposes port 3000 and starts with --transport http by default. Your MCP endpoint lives at http://localhost:3000/mcp.

Environment Variables

| Variable | Required | Description |
| --- | --- | --- |
| FLUENTCART_URL | Yes | Full URL to your WordPress site |
| FLUENTCART_USERNAME | Yes | WordPress admin username |
| FLUENTCART_APP_PASSWORD | Yes | WordPress Application Password |
| FLUENTCART_MCP_API_KEY | No | Bearer token for the MCP endpoint. If set, all requests to /mcp must include Authorization: Bearer &lt;key&gt;. If not set, the endpoint is wide open. |

Set the API key

Without FLUENTCART_MCP_API_KEY, anyone who discovers your endpoint can query your store, create orders, issue refunds — the full 200-tool experience. Thrilling for you, potentially expensive. Set the key.
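A quick way to confirm the key is actually enforced once the container is up (the URL and key below are placeholders — substitute your own). The exact status codes depend on the MCP SDK version; the point is that the unauthorized request gets rejected and the authorized one doesn't:

```shell
# Without the bearer token — should be rejected (typically 401)
curl -i https://mcp.your-domain.com/mcp

# With the token — should get past auth
curl -i https://mcp.your-domain.com/mcp \
  -H "Authorization: Bearer your-secret-key"
```

If both requests come back identical, the key isn't being read — check that the environment variable actually reached the container (`docker exec fluentcart-mcp env`).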

Dokploy Setup

If you're running Dokploy (or similar), point it at the image and you're done in about 90 seconds.

Create a new application

In Dokploy, create a new application. Choose Docker Image as the source. Set the image to vcodesh/fluentcart-mcp:latest.

Alternatively, if you prefer building from Git: point it at the repo and set the build context to fluentcart-mcp/.

Set environment variables

Add all four environment variables from the table above. FLUENTCART_MCP_API_KEY is optional but I'll judge you silently if you skip it in production.

Deploy

Hit deploy. Exposed port is 3000. The container starts the HTTP transport automatically — no extra config.

Cloudflare Tunnel

If you're using Cloudflare Tunnels (and you should be — free SSL, no port forwarding, no firewall fiddling):

  1. In Cloudflare Zero Trust dashboard, add a public hostname
  2. Set the subdomain (e.g. mcp.your-domain.com)
  3. Point it at http://localhost:3000 (or whatever your container's internal address is)
  4. Save
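If you run cloudflared yourself instead of clicking through the dashboard, the equivalent config file looks roughly like this — the tunnel name, credentials path, and hostname are all placeholders for your own setup:

```yaml
# ~/.cloudflared/config.yml — hypothetical tunnel name and paths
tunnel: fluentcart-mcp                      # your tunnel's name or UUID
credentials-file: /root/.cloudflared/<tunnel-id>.json

ingress:
  # Route the public hostname to the container's HTTP port
  - hostname: mcp.your-domain.com
    service: http://localhost:3000
  # Required catch-all rule for everything else
  - service: http_status:404
```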

Your MCP endpoint is now at https://mcp.your-domain.com/mcp. Encrypted, cached, and behind Cloudflare's network. Not bad for 4 clicks.

Health Check

curl https://mcp.your-domain.com/health

Should return:

{"status":"ok"}

If it doesn't, something's broken. Check the container logs:

docker logs fluentcart-mcp

Connecting Clients

ChatGPT

Paste your endpoint URL into ChatGPT's MCP server configuration:

https://mcp.your-domain.com/mcp

ChatGPT will negotiate the connection via Streamable HTTP. If you've set a bearer token, you'll need to configure that in the auth settings as well.

Any HTTP-capable MCP client

Same deal. Point the client at your /mcp endpoint. Set the bearer token if configured. The server speaks standard MCP over Streamable HTTP — no special sauce required.
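"Standard MCP over Streamable HTTP" boils down to JSON-RPC over POST, so you can hand-roll the handshake with curl to sanity-check the endpoint before wiring up a real client. The URL, key, and protocol version string below are assumptions — use whatever your client negotiates:

```shell
# JSON-RPC "initialize" request over Streamable HTTP.
# The Accept header must allow both JSON and SSE responses.
curl -s https://mcp.your-domain.com/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Authorization: Bearer your-secret-key" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
      "protocolVersion": "2025-03-26",
      "capabilities": {},
      "clientInfo": { "name": "curl-test", "version": "0.0.1" }
    }
  }'
```

A JSON-RPC result with the server's name and capabilities means the transport is healthy end to end — auth, tunnel, and container all included.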

Stateless mode

The HTTP transport creates a fresh server instance per request. This keeps things simple and works with scale-to-zero deployments, but it means each request resolves config and registers all 200 tools. On a reasonable VPS, this adds maybe 10-20ms overhead. Not exactly a crisis.

Docker Compose

If you prefer compose (or want to run it alongside other services):

docker-compose.yml
services:
  fluentcart-mcp:
    image: vcodesh/fluentcart-mcp:latest
    ports:
      - "3000:3000"
    environment:
      FLUENTCART_URL: https://your-store.com
      FLUENTCART_USERNAME: admin
      FLUENTCART_APP_PASSWORD: "aBcD eFgH iJkL mNoP"
      FLUENTCART_MCP_API_KEY: your-secret-key
    restart: unless-stopped
docker compose up -d

Done. Your MCP server survives reboots and keeps running until you tell it to stop. Which is more than I can say for most things I build.
